Best Practices for Compliance in Digital Transformation, Part 1 of 2
If you’re reading this, I’m going to go ahead and assume that you work in an FDA-regulated environment. I’m also going to assume that you have first-hand experience with the stringent and serious nature of those regulations as they pertain to compliance requirements in GxP applications and systems.
Navigating this becomes increasingly crucial as more and more regulated organizations undertake or advance initiatives such as digital transformation, Waterfall-to-Agile migration, hybrid, cloud, or full-blown DevOps. Applying traditional validation and compliance processes to these initiatives can become risk-laden and unwieldy for a host of reasons. The good news is that, when done correctly, there are extremely effective ways to manage this – ways that not only assure compliance adherence but also drastically improve and streamline compliance management in regulated software quality projects.
To help our readers gain an understanding of managing adherence to cGxP guidelines, and to advise on how to maintain consistent regulated quality across the software development lifecycle (SDLC), I thought it would be good to share some best practices and potential pitfalls as life sciences teams look to modernize their software quality practices and adopt more modern solutions and methodologies.
1) Perform a Detailed “System” Analysis
Any CSV quality program must start with an extensive understanding of the overall product(s) lifecycle. From the sourcing of raw materials to the final certificate of analysis, each part of the product’s lifecycle collects and directs critical information (i.e., data elements) that needs to be understood in order to accurately assess overall “system” risk. This accomplishes two things:
- It enables your organization to develop clear-cut user and functional requirements that consider your specific operating environment.
- It allows a thorough “gap” analysis of your organization’s systems, i.e., what is currently in place, what is extraneous or impractical, and what is outdated. Many organizations lack a formal process to review what they have – and what they might be missing.
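The gap analysis described above can be reduced to a simple comparison between the data elements the product lifecycle requires and those your current systems actually capture. The sketch below is purely illustrative – the element names are hypothetical, and a real analysis would draw these sets from your requirements traceability and system inventory.

```python
# Illustrative "gap" analysis: compare required vs. currently captured
# data elements. All element names here are hypothetical examples.

required = {"lot_number", "raw_material_coa", "batch_record", "stability_data"}
covered = {"lot_number", "batch_record"}

gaps = sorted(required - covered)        # required but captured nowhere
extraneous = sorted(covered - required)  # captured but serving no requirement

print("Gaps:", gaps)
print("Extraneous:", extraneous)
```

Even this toy version makes the two outputs of the analysis explicit: what is missing (and must be remediated) versus what is extraneous (and is a candidate for retirement).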
2) Create a Comprehensive Agile Computer Validation Strategy
Many organizations still have in place validation strategies that were developed in the early 2000s, when paper documentation, a waterfall system development lifecycle, and fairly static data sets were the norm. Today, the vast amount of data being captured requires a controlled, agile computer validation approach – one that allows for change and pragmatic, critical thinking. There are three key components to a broad, integrated CSV strategy:
- Risk-Based: Your organization needs to evaluate and determine all risk factors in terms of security, data, product quality, safety, and efficacy, as well as intellectual property. This covers all system elements: hardware and software, related equipment and network components, and the operating system environment. Doing so helps determine what is practical and achievable for the critical elements of the system that affect quality assurance and regulatory compliance. It is essential to put appropriate risk-mitigation processes in place; otherwise, there is no concrete data on which to make informed decisions that balance cost vs. risk (in order to minimize both) to the organization and its products.
- Systems-Based: The interrelationship of organizational processes and procedures – and compliance within each system – directly impacts overall product output and compliance. By clearly defining inputs and outputs to a system, a direct or indirect relationship to outcomes can be determined. Only those outputs directly tied to quality need the most rigorous analysis. This systems-based approach helps to determine what is practical and achievable for critical elements of the system, focusing effort on inputs that affect quality assurance, regulatory compliance, security, privacy, and so on. It also helps limit rigorous analysis of inputs/outputs that are not directly related to quality, efficacy, and/or safety.
- Data-Based: Data integrity and control cannot be overemphasized here. It is critical to understand and map all data flows, business processes that utilize the data, and data transfer points to other systems or network components. Understanding the intended use of data and its relationships to other data points is essential to allow organizations to ensure data quality throughout the entire data lifecycle.
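One common way to make the risk-based component concrete is an FMEA-style risk priority number, which ranks system elements by severity, probability, and detectability so that validation rigor follows risk. The scoring scales, weights, and element names below are illustrative assumptions, not something prescribed by the post or by any regulation.

```python
# Illustrative FMEA-style risk scoring for ranking validation effort.
# Scales (1-5) and element names are hypothetical examples.

def rpn(severity: int, probability: int, detectability: int) -> int:
    """Risk priority number: each factor on a 1-5 scale;
    higher detectability means a failure is harder to detect."""
    return severity * probability * detectability

elements = {
    "batch-record e-signatures": rpn(5, 2, 4),
    "audit-trail storage":       rpn(4, 3, 2),
    "report formatting":         rpn(1, 3, 1),
}

# Direct the most rigorous validation at the highest-scoring elements.
for name, score in sorted(elements.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score}")
```

The exact scoring model matters less than the discipline: a documented, repeatable ranking gives you the concrete data needed to balance cost against risk, as described above.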
Key Mistakes to Avoid: It is important to understand that document-driven (physical or electronic) is not the same as data-driven. Many of today’s life sciences companies still rely on a document-driven process, one that is proving to be increasingly expensive, onerous, and hard to manage. A data-driven approach moves away from the structure of static documents. Instead, it breaks the information down into more granular data elements, which are managed and approved along with supporting metadata, for use in myriad ways. The benefits are many – real-time insight and access, configurable workflows that are repeatable, and comprehensive reporting and analysis. When deciding on the right strategic approach for your company, take these two different processes into account.
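To make the document-driven vs. data-driven distinction tangible: in a data-driven model, each requirement or test step is a granular element carrying its own status and approval metadata, rather than a paragraph locked inside a static approved document. The record structure and field names below are hypothetical, sketched only to show the shape of the idea.

```python
# Illustrative sketch of a granular, data-driven record with approval
# metadata. Field and class names are hypothetical examples.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataElement:
    element_id: str
    content: str
    status: str = "draft"                      # draft -> approved, per workflow
    metadata: dict = field(default_factory=dict)

    def approve(self, approver: str, when: date) -> None:
        """Record the approval alongside the element itself."""
        self.status = "approved"
        self.metadata.update(
            {"approved_by": approver, "approved_on": when.isoformat()}
        )

req = DataElement("REQ-001", "System shall enforce unique user logins.")
req.approve("QA Lead", date(2021, 3, 15))
print(req.element_id, req.status, req.metadata)
```

Because each element carries its own metadata, it can be queried, reported on, and routed through configurable workflows individually – the real-time insight and repeatability described above.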
Special Note: Any and all data collected in conjunction with cGxP practices must be retained for the lifetime of the applicable data retention model and are subject to audit upon request by regulatory agencies.
3) Implement Strategies for Reducing Cost and Complexity of Compliance
This sounds like an obvious practice, but time and again we have witnessed projects that were fraught with complexity, overburdened with unnecessary processes, and became an immense drain on resources. The key? Keep it simple, but make sure your process is well-defined, well-maintained, and well-recorded.
Key Mistakes to Avoid: Consolidate your purchasing process(es). Quite often, different areas within the same organization have the same business need, but due to organizational structure and the allocation of funds, similar – sometimes identical – solutions are purchased separately by different groups, whether in different business units or in the same business unit spread across different geographic locations.
Consolidate and/or integrate project management functions with your CSV approach. In many cases, the Project Management Office (PMO) doesn’t understand the detailed user requirements of a business unit and makes decisions about the “plan” strategy (i.e., what the validation strategy should be) without consulting key stakeholders in compliance and/or validation. Or the PMO itself imposes additional project management deliverables that are redundant with validation deliverables already required.
Okay, so I think that’s a decent start to this list, but we’re getting a little long for a single post, so I think we’ll break it here. Next week I will cover the rest of the topics, but in the meantime, should you have any questions or if you are undergoing similar discussion or planning within your organization and would like to discuss any of this in more detail, please feel free to contact me at the provided link below, or to reach out to Tx3 directly at firstname.lastname@example.org.
And don't forget to subscribe at the top right of this page to be notified when new blogs are posted!
Also, we recently hosted a webinar on SDLC Modernization along with Allergan and much of the content ties in with this series. You can view it at the link below if you are interested.
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing which ultimately led to a focus on testing in a regulated environment. He currently resides near Sacramento, CA.