Since we’ve got a blog now, I suppose we might as well use it to preview topics and ideas that we’re looking to expand into more detailed content formats down the road (webinars, videos, e-books, etc.). While we’re at it, it would be great to get some feedback from you fine readers to see whether this is a topic that would interest you.
Now, before I go any further, let’s be clear: this idea may not be fully baked, but it’s something we’ve been discussing over here, and I think it’s worth starting to think about, even in a preliminary form.
I don’t mean to diminish or discredit it, but I do want to make sure that we’re on the same page as I jaunt through this quick bit of musing.
As we’ve discussed in prior posts, regulatory requirements often leave Life Sciences organizations behind the curve relative to other industries, from both a technology and a methodology standpoint. As we’ve also discussed, at this point much of this is really self-imposed. Faced with often-vague compliance guidelines and the potential consequences of getting them wrong, teams have implemented outdated, bloated processes that don’t really support modern tools or practices: tools and practices that could bring great enhancements to their software development and software quality activities.
These self-impositions persist largely because teams still think in terms of legacy, or traditional, compliance and validation appeasement processes born in the days when Waterfall was prevalent and manual testing was the order of the day. Technology has long since passed those days by, and new methodologies emerged that drove those very technology innovations. Other industries embraced those methodologies and technologies long ago, while Life Sciences has largely struggled to adopt them. The reality, though, is that adoption is entirely possible, and many Life Sciences organizations have made the leap successfully.
It simply required a shift in the outlook, interpretation, and execution of regulatory requirements. A breaking of tradition, if you will. Instead of practicing traditional computer system validation, these organizations let those requirements drive good software quality practices across the enterprise, with documentation as a byproduct of those practices rather than the driving force behind them or the sole consideration of software and system-related projects.
Teams are moving away from document-centric, or “document first,” mentalities in favor of much more nimble management of compliance and validation activities that supports SDLC modernization. At this point, the FDA is even starting to shift away from legacy CSV toward a “critical thinking” first approach, or Computer Software Assurance (CSA), as I’ve been hearing it called. Granted, this is being discussed only for certain use cases for the time being, but the trend seems to be moving in the right direction.
This, however, isn’t new. True, it’s encouraging to hear it echoed by the FDA, and that should give some teams more confidence to move away from traditional validation mentalities, but the model for modernizing SDLC practices while seamlessly integrating aspects like approvals, e-signatures, traceability, audit history, and other validation-relevant data was already there.
In any case, I do like the trend (one we’ve been proselytizing for years now) of moving the focus away from legacy CSV and toward software quality and good practices, with relevant compliance tasks captured and produced as a byproduct of those practices. Yes, it’s regulated software quality, so extra diligence needs to be applied, but achieving regulatory compliance shouldn’t be the driving force behind the planning and execution of your SDLC projects.
Instead, embed those compliance tasks into your projects as efficiently as possible (that means getting rid of paper- or electronic-document-based methodologies) and leverage them as checks and balances confirming that your software development and quality objectives are being achieved.
As I mentioned before, this isn’t a fully fleshed-out dissertation, just something I’d been thinking about, and getting it out there in a blog post made sense while we compose more robust and complete collateral on the subject.
In the meantime, check out this white paper we put together not too long ago, which echoes these sentiments in a preliminary way.
“Agile, Data-Driven Validation”, here.
Jason Secola is the Support Sales Manager at Tx3 Services and has been with the company since 2016. Jason began working with much of the current Tx3 team back in 2007, when he got his start in the world of application testing and later focused on testing in regulated environments.
He currently resides near Sacramento, CA.