Data-Driven CSV: The Where and the Why
As I was mulling over some new topics, I realized that we have made a lot of references to data-driven validation as a methodology that we advocate and enable, but we haven't really dedicated an entire post or series to what it is, or why it is becoming increasingly important for regulated organizations. Yes, it's been sprinkled throughout numerous posts and series, and I'm sure many of you have been able to piece it together from the context in which we've referenced it, but hey, why not carve out a little time to dedicate to it?
It is, after all, the toothpaste 4 out of 5 dentists recommend. Wait, I meant it is, after all, the methodology that we advocate and implement as life sciences teams evolve and adapt to modern solutions and processes in the digital landscape.
Let’s first take a look at what’s driving the need behind a more data-driven approach to compliance in software and GxP systems.
For a long time, documents and documentation were (and for many teams still are) the standard for compliance verification in the life sciences industry. But the advent and adoption of digital technologies and methodologies have enabled a drastically more efficient approach, one better suited to today's business needs and technology landscape: an approach that no longer relies on documents (paper or electronic), but instead allows teams and tools to manage and execute compliance deliverables at the data level, providing a more comprehensive, integrated, and collaborative effort that is not limited by the static nature of a document.
This approach, in turn, renders obsolete the overly burdensome documentation practices that, unfortunately, are so heavily emphasized in traditional validation. Instead, teams can leverage modern SDLC solutions and methodologies to drive software quality practices while producing compliance deliverables as a natural byproduct. All this while enabling a critical-thinking, strategic, risk-based approach to testing that can be applied across the various applications and application functions within an organization's GxP landscape.
This allows teams to put the emphasis back on quality development, testing, and delivery, rather than extraneous documents and documentation practices which were derived from what was – for a long time – a very prevalent “test and document everything” mentality.
I don’t want to get too much into the limitations of the document as we have certainly covered that in many previous posts, but I would like to point out that there have been many drivers of change – both on the business and the technology side – that documents simply don’t support. In fact, those documentation practices have, for a long time now, hindered the life sciences industry and largely overshadowed the ultimate goal, which is quality software, systems, and product. These legacy processes and overly burdensome documentation practices add undue strain across numerous teams while accomplishing little to add value or drive quality. In reality, we regularly see that it has quite the opposite effect and that it often reduces overall quality.
But what are some of the business and technology drivers that are prompting teams to move away from documents (again, paper or electronic) and instead leverage a data-driven approach to their software compliance practices? What changes in recent years have really begun to highlight the limitations of traditional validation methodologies?
- Globalization: As companies grow and expand beyond national borders, they must adapt their validation procedures not just to the requirements of the FDA or other local governing bodies, but to a variety of different regulatory agencies. In addition, validation teams are now geographically dispersed, making communication and collaboration critical to producing a consistent, defensible validation deliverable.
- Accelerated Pace: Novel drugs and biologics are being approved more quickly thanks to innovations and efficiency improvements embraced by the FDA, such as adaptive clinical trials. A better understanding of disease and drug mechanisms has bolstered development of new targeted therapies. And approval of medical devices has surged to unprecedented levels as consumers embrace wearables of all sorts. Today, even the applications on your cell phone might be classified as FDA-regulated medical applications. These products are coming to market quickly, but they must first undergo rigorous validation before patients can enjoy their benefits.
- Emerging Technology: The adoption of DevOps tools enables companies to keep pace with the ongoing release of enterprise software and its integration into business-critical applications. Agile software development teams (which reject a waterfall approach to the SDLC) are dictating the technology of the future, challenging the comfort zone of traditional life sciences quality assurance personnel.
- Systems Which Lack an Enterprise-Wide View of Compliance Risk: Life sciences companies, like those in most other industries, have grown up with a siloed infrastructure, one filled with separate, individual business units and operational areas. Many of these departments do not talk to each other, let alone collaborate with one another.
- Mergers and Acquisitions: While the last couple of years may not measure up to the mega-merger deals of previous years, continued merger activity can be problematic in terms of multiple – and often redundant – systems, non-integrated processes, and differing procedures/processes.
- Economic Pressures: Faced with increasing pressure to stay competitive by shortening development timelines and reducing costs, companies are finding they can no longer afford the inefficiency of lengthy, resource-intensive, manual validation processes.
- Outsourcing and Sub-Contracting: More and more companies are outsourcing major corporate functions such as research, product development, and manufacturing, which further strains the ability of current infrastructures and systems to integrate test procedures while maintaining control and consistency across multiple partners/multiple sites.
- Changing Regulatory Environment: Life sciences companies are already heavily regulated by a number of bodies, including the FDA, the EMA, and the ISO, to name a few. The industry is also affected by non-FDA regulations such as Sarbanes-Oxley and the General Data Protection Regulation (GDPR). All of these regulations routinely evolve to protect the interests of industry and consumers.
As I'm sure most of our readers can attest, likely having come from a document-centric background or being currently embroiled in a document-based framework, it's easy to see how these mounting changes in recent years have driven organizations, industry thought leaders, and even regulatory bodies to shift away from traditional validation practices. As with business drivers in any industry, when the roadblocks continue to mount, better methodologies and tools are developed to allow for adaptation. Evolution takes place.
Now entering from stage right, please welcome, data-driven validation.
Was that last sentence a bit hacky and overly dramatic? Yeah, you’re probably right. Strike it from the record.
Either way, this is where the data-driven approach comes into the picture. We’ve touched on it a bit here from a high level, but in the next post we will do a little deeper dive into what it actually is and, most importantly, the benefits it offers as life sciences teams adapt and transform to meet modern demands.
If you don’t want to have to remember to check back in, then just subscribe at the top right of this page, won’t you?
Also, we recently hosted a webinar on SDLC Modernization along with Allergan and much of the content ties in with this series. You can view it at the link below if you are interested.
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing and later began a focus on testing in a regulated environment. He currently resides near Sacramento, CA.