The Latest Acronym on the Block: CSA (Computer Software Assurance)

Within what has traditionally been defined as computer system validation, there are a number of approaches and methodologies for achieving compliance. They’re not all the most effective, they’re not all created equal, and they’re not all cutting edge, but they should (theoretically) all achieve the same goal. The actual FDA guidance and regulations state what needs to be accomplished as an end result, and roughly some of what must happen along the way, but they don’t lay out a clear roadmap for exactly how it must be done.

Perhaps this was intentional, with the understanding that organizations, even within the exact same industry, may have very different ways of doing things. Maybe they understood that the technology landscape changes quickly and frequently, and that some teams will be quicker to adopt new tools or methodologies while others will be slower to adapt. Maybe the lack of a clear X’s-and-O’s breakdown allows for flexibility and variation, as long as the end result is verifiable compliance across an organization’s GxP systems.

Or maybe it wasn’t that well thought out and the lack of specific process definition just left the door open for a few ways to eat the same apple (is that even a saying?).

The point is that what ended up happening at a lot of organizations (and I’m sure much of this was perpetuated by overzealous FDA inspectors, along with internal and third-party auditors) is that caution born of non-compliance fears gave rise to overly rigorous validation and documentation practices governing every application and application function within an FDA-regulated organization’s designated GxP systems. Sort of a “we must test everything rigorously, and we must document everything extensively” mentality, which ended up producing a top-heavy emphasis on the documentation piece.

While I can understand how this happened, it doesn’t appear that this was the FDA’s intention; at the very least, in more recent years the FDA has come to recognize the issue. It’s clear that this type of interpretation has had adverse effects on teams that adopted these overly rigorous practices back in the days of manual testing and purely waterfall frameworks.

Another unintended consequence of this “test and document everything heavily” approach is an actual reduction in overall quality. What has been found over time and through enough case studies is that when organizations attempt to apply the same level of compliance verification, documentation, and testing across their entire GxP systems landscape, more errors and compliance gaps creep into the picture. In short, putting too much emphasis on documentation, rather than strategically applying resources and testing rigor, often results in decreased quality and increased human error.

Further, this type of legacy validation mindset leads to limited adoption of things like test automation, given the cost and complexity of maintaining an automation framework in this kind of environment, not to mention that the benefits of automation are drastically limited under this validation methodology. Automation, by the way, has the potential to drastically improve system quality and efficiency when executed with a strategic, risk-based approach.

With these considerations in mind, and in an effort to un-ring the blanket test-and-document bell a bit, what is being called Computer Software Assurance, or CSA, has been introduced. The idea is to put more emphasis and critical thinking behind strategic, quality-focused testing, with the required level of compliance verification dictated by a risk-based assessment of non-product GxP systems.

Let me just reiterate, this is specific to non-product testing.

That means systems and applications that are not embedded in, or directly associated with, medical devices or patient-related products; these are being deemed “indirect” systems. The idea is to put the emphasis and endorsement back on an effective, quality-based testing strategy, with less focus (and fewer perceived requirements) on the documentation piece.

For “direct” systems, we can expect the same levels of rigorous testing and compliance verification deliverables to be supplied (I’m trying to shy away from saying “documentation” because that implies a “document,” which isn’t necessarily a requirement), as these will be the highest-risk pieces of software in your organization: the systems with the greatest potential impact on product quality and patient safety.

However, even within those “direct” systems, there should still be room and flexibility to assess at the level of specific functions or areas. Even within a high-risk system, there is room for a more granular risk-profile assessment, provided the criteria are well defined and the conclusions can be clearly and adequately justified to an auditor.
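To make that granular, function-level idea a bit more concrete, here is a minimal, hypothetical sketch of how a team might encode its own criteria. Everything in it (the risk categories, the thresholds, and the suggested assurance activities) is illustrative and assumed, not taken from any FDA guidance; the point is simply that once your criteria are well defined, the mapping from risk to testing rigor becomes something you can state, and defend, explicitly.

```python
# Hypothetical, illustrative sketch only. The criteria and activities below are
# assumptions for demonstration; your own quality unit would need to define and
# justify the actual categories used in your organization.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    impacts_quality_or_safety: bool  # "direct" impact per your own definitions
    failure_likelihood: str          # "low", "medium", or "high" per your criteria

def suggested_assurance(feature: Feature) -> str:
    """Map a feature's risk profile to an illustrative assurance activity."""
    if feature.impacts_quality_or_safety:
        if feature.failure_likelihood == "high":
            return "Scripted testing with documented objective evidence"
        return "Limited scripted testing focused on the high-impact functions"
    # Indirect / lower-risk functions: leaner, quality-focused activities
    if feature.failure_likelihood == "high":
        return "Unscripted (exploratory) testing with a brief record of results"
    return "Ad-hoc testing and/or reliance on vendor assurance activities"

if __name__ == "__main__":
    examples = [
        Feature("Electronic signature on batch release", True, "high"),
        Feature("Report formatting in a training tracker", False, "low"),
    ]
    for f in examples:
        print(f"{f.name}: {suggested_assurance(f)}")
```

The specific outputs don’t matter; what matters is that the rationale behind each level of rigor is captured somewhere an auditor can follow it.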

Now, this is where we start getting into the actual application and execution of CSA and how teams can effectively implement it, so how about we hold off on that for another fully dedicated blog piece, shall we?

I hope this first piece offered a decent (albeit truncated) overview of the idea behind CSA. Subscribe to the blog so you can get notified when we dig into the actual adoption and utilization of CSA, as well as what teams can expect as part of the transition.

In the meantime, please request this datasheet to learn about Digital Validation, a methodology that enables enhanced CSV and easily supports CSA.

Digital Validation: A Dynamic New Approach to Regulatory Compliance
