Data-Driven CSV: Moving Beyond Traditional Validation

Let me start by saying that if you haven't read the first post about the data-driven approach to regulatory compliance and validation, I would recommend going back and giving it a read, as it provides the context leading up to this piece. You can find that post here.

Now, with that aside, and assuming you've read the prequel to this post, let's get a bit more into what data-driven validation actually is. More importantly, let's discuss how it addresses the shortcomings of traditional validation, which is failing to satisfy the current needs and requirements of the life sciences industry.

Obviously, a data-driven approach moves away from the structure of static documents. Instead, it breaks validation and compliance deliverables down into more granular data elements (think individual requirements, tests, and defects as they move through their lifecycle), each managed and approved along with supporting metadata, for use in myriad ways.
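To make that concrete, here is a minimal sketch of what one of these granular elements might look like if you modeled it as a record with metadata and an audit history. This is purely illustrative; every name in it (ValidationElement, AuditEvent, the state values) is my own assumption, not the data model of any particular tool.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

# Illustrative assumptions only -- not any vendor's actual schema.

class ElementType(Enum):
    REQUIREMENT = "requirement"
    TEST_CASE = "test_case"
    DEFECT = "defect"

class ApprovalState(Enum):
    DRAFT = "draft"
    IN_REVIEW = "in_review"
    APPROVED = "approved"

@dataclass
class AuditEvent:
    """One immutable entry in an element's audit history."""
    actor: str
    action: str          # e.g., "created", "signed", "state_changed"
    timestamp: datetime

@dataclass
class ValidationElement:
    """A single granular compliance deliverable plus its metadata."""
    element_id: str
    element_type: ElementType
    title: str
    state: ApprovalState = ApprovalState.DRAFT
    linked_ids: list[str] = field(default_factory=list)   # trace links
    history: list[AuditEvent] = field(default_factory=list)

    def transition(self, actor: str, new_state: ApprovalState) -> None:
        """Move the element through its workflow, recording the change."""
        self.history.append(AuditEvent(
            actor,
            f"state_changed:{self.state.value}->{new_state.value}",
            datetime.now(timezone.utc),
        ))
        self.state = new_state
```

The point of the sketch is simply this: once a requirement or a test result lives as a discrete, linked, auditable record, reports become queries over data rather than documents you assemble by hand.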

Yes, within this framework teams can still create the familiar reports that they (or auditors) may be comfortable with, if they so choose, but it also allows for more rigorous capture and use of testing data, such as comparing test results and defects across multiple projects and business units.

It's important to note that while I said it allows for more rigorous use of testing data, that does not mean the actual activities associated with generating and capturing compliance and validation deliverables become more burdensome (quite the contrary).

At first glance, breaking things down to a more granular level may sound more complicated and tedious, but when applied and executed with the right solutions and processes, it actually enables a much nimbler way of managing compliance deliverables throughout their lifecycle, one that is seamlessly infused into the work itself. That process, by the way, is essential as teams look to make the shift from Waterfall to Agile and fully embrace DevOps.

As for the benefits and advantages that data-driven validation brings to the table relative to traditional, document-based validation, let's take a look; hopefully these highlights will illustrate where documents fall short of modern demands…

  • Real-time Insight and Access: Gone are the days of “reactive” discovery and analysis to identify compliance issues. Real-time access allows users to monitor and take corrective action before simple compliance concerns become significant compliance breakdowns.

  • Repeatable Process: Quality control is paramount in all life sciences business processes. Using configurable workflows, data-driven CSV helps to provide repeatability and efficiency, greatly reducing the risk of human error while enforcing the correct process (a.k.a., workflow governance).

  • Test Automation: Test automation can easily be incorporated into a test management tool and leveraged along with manual testing. These test elements and their associated execution results can be sent through a formal review and approval process, while being recorded in the audit history. In addition, historical information such as number of executions, execution times, and defect trends can be leveraged as future releases of the application are being considered.

  • Comprehensive Reporting: The data-driven approach truly provides a panoramic view of the entire CSV landscape – and captures all necessary information and data for rigorous, comprehensive reporting. Real-time reports can show exactly where you are in a validation project, including what reviews/approvals might be stuck and how you can re-assign elements to keep things moving.

  • Analytics and Analytical Reporting: Life science companies are intense collectors and users of data; however, they have historically been unable to extract intelligence from the underlying validation data. A data-driven approach provides the ability to capture relevant and meaningful data at each critical step of the process. This means users can analyze the data along the entire Application Lifecycle Management (ALM) cycle. For example, analytics can help create “what-if” scenarios or highlight productivity (or lack thereof) with different outsourced teams.

  • Traceability: The data-driven approach enables users to trace and capture data across the full lifecycle, regardless of the tools that are used (requirements tool, agile tool, testing tool, or ITSM). In traditional validation this has historically been very difficult to manage and very prone to human error; a sketch of what this looks like in practice follows this list.

  • Improved Efficiency: The data-driven approach streamlines processes to reduce processing time while ensuring accurate tracking.

  • Reduced Risk: A systematic, iterative approach, applied throughout a computer system’s lifecycle, helps drive better decision-making, ensures product quality, and minimizes supply chain disruptions. Factors that can reduce risk include repeatability, electronic signature security, visibility and an audit trail.
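To illustrate the traceability point above with a hypothetical example: once requirements and tests exist as linked data elements rather than rows in a document, a trace matrix stops being something you maintain by hand and becomes a query. The IDs and structure below are invented purely for illustration.

```python
# Hypothetical example: trace links stored as data, so a coverage
# report is a query rather than a manually maintained matrix.

requirements = {"REQ-1": "User login", "REQ-2": "Audit trail export"}
tests = {
    "TC-10": {"covers": ["REQ-1"], "result": "pass"},
    "TC-11": {"covers": ["REQ-1"], "result": "fail"},
}

def coverage_gaps(reqs: dict, test_cases: dict) -> list[str]:
    """Return the IDs of requirements with no linked test case."""
    covered = {req for tc in test_cases.values() for req in tc["covers"]}
    return [req_id for req_id in reqs if req_id not in covered]

print(coverage_gaps(requirements, tests))  # ['REQ-2'] -- a gap to close
```

The same kind of query works across projects and business units, which is exactly the cross-cutting analysis that static documents make so painful.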

When we bring all of these elements together and apply them across SDLC tools and teams, we are left with a more comprehensive and controlled, yet flexible, methodology for executing and capturing compliance deliverables. There is one more benefit that wasn't listed above, because it is the resulting outcome of all those factors. In fact, I believe it to be so crucial to software teams working in a regulated environment, and I want to stress it so much, that I am going to separate it from this paragraph and let it stand alone, here:

The data-driven methodology allows teams to take the emphasis and focus off overly burdensome documents and documentation practices. Instead, it allows teams to focus on quality, with compliance deliverables derived as a byproduct of good software and systems quality practices rather than overshadowing them.

Look, I understand that there are complexities to consider, that it's not as easy as simply adopting a new methodology or implementing tools to achieve it, and that maybe it can't happen for your team tomorrow. But at the very least, these conversations need to start happening. Reliance on traditional validation methodologies, no matter how long they have been utilized across the industry or within your organization, has simply not allowed organizations to keep pace with rapidly evolving business and technology requirements.

In recent years we have seen many prominent organizations, and even the FDA, begin to take stances and actions to shift away from these practices. I think we are seeing what might best be described as the pendulum swinging back a bit to rein in documentation and testing practices that got out of hand for a while, to the point where, in many cases, the documentation and documentation practices were actually hindering the ultimate goal of software quality rather than improving it.

In any case, the sum of all the benefits we have discussed results in increased quality and efficiency, while reducing risk, removing much of the human error from the picture, and, as I stressed above, finally allowing teams to move away from legacy documentation practices. With that in mind, broaching this topic at your next team meeting certainly seems like a worthwhile endeavor, wouldn't you say?

Of course, you can always reach out to Tx3 via the "learn more" link if you would like to discuss this in more detail with a member of our team.

At the very least, please feel free to subscribe to our blog at the top right of this page so you can be notified as we upload new posts on various topics in the world of regulated software quality.

Learn More

Also, please register for a webinar we will be hosting on 6/11/2020 where we will be covering a number of items related to digital transformation and SDLC modernization in a regulated environment. You can learn more and register here: 

REGISTER NOW

 
