There will be Audits: The Summary Wrap-Up
In an effort to more concisely summarize some of the longer series we post on our blog, we have decided to start putting together a summary wrap-up piece for each series. While these won’t go into great detail, they should give you the gist of what was covered. With that said, I would recommend going back and reading each piece for more in-depth coverage of these topics (I will link to the original posts throughout this piece).
Okay, enough with the setup. Let’s jump straight into the summary wrap-up for our audit series, shall we?
Audits are a certainty for teams working in a regulated environment, but everything leading up to the point of supplying an auditor with documentation and records can vary widely from team to team, and organization to organization. A successful or failed audit is the end result of everything put in place, executed, and captured leading up to the moment the FDA opens up the books.
Teams that are advanced in their software validation and compliance maturity models have achieved high levels of governance, control, risk mitigation, and data integrity by adopting and implementing digital systems and electronic records. In this summary, we’ll cover some common issues that keep teams from achieving similar results, as well as how best to start adopting and implementing these practices to better align regulatory requirements with business and quality objectives.
Assumptions vs Reality of the FDA (full post in part 1)
Many teams hold on to legacy, document-based validation methodologies because it’s what they have done in the past, and it works. It’s true that these methodologies can “work,” but they offer little else by way of benefits. Further tying teams to these processes and methodologies is the belief that this is what the FDA is familiar with and what the auditor wants to see, but that’s not necessarily the case anymore. In fact, it may be a bit of the opposite.
- FDA auditors today are much more versed in modern technologies and methodologies than has been the case in years gone by
- What they are concerned with is demonstrable and verifiable control
- Control and governance over defined processes and policies
- Control and governance over the people that operate within those processes
- A lack of systems and tools in place will raise questions about an organization’s maturity model and their ability to apply control and governance over said processes and people
- This raises additional scrutiny when reviewing records
Control and Governance (full post in part 1)
Governance and control are a cornerstone of an effective team operating within regulated software and validation requirements. Still, we often find this to be lacking when evaluating a new client’s existing systems, processes, and methodologies.
- Many teams are still rooted in document-based validation methodologies
- This often allows for little by way of governance
- A lack of control can lead to a number of errors
- Even well-defined processes are susceptible to errors and subsequent compliance gaps without proper governance applied
- People, no matter how good and capable, are prone to mistakes. Procedural and technology based governance can drastically mitigate this
- Verifiable control and its impact on record review
- Once control and governance are verified, the auditor can begin to look at the actual content of the records (which leads us to the next part)
- Without proper governance in place, more scrutiny will be placed on the content of those records as the quality and accuracy of the data within becomes potentially more suspect
Consider governance and control among the first areas to examine when weighing the benefits of moving away from traditional documents and shifting to digital systems and electronic records.
Data and the Human Variable (full post in part 2)
As valuable and crucial as governance is, it’s only as good as the processes it controls. If defined processes and data requirements are not sufficient, then governance is really only ensuring that a bad process is being adhered to, which will result in poor data capture. This, of course, stems from the human side of things before tools to apply governance can even be implemented. If poorly thought-out or insufficient processes are put in place, governance alone will not be the cure.
For example, let’s take a look at some of the most common issues we see when we do customer audits:
- Requirements that are not written as SMART requirements (Specific, Measurable, Attainable, Relevant, Time-Bound or Testable)
- Tests that do not appropriately challenge the requirement or design, or do not apply the necessary level of testing
- Risk management not applied uniformly, varying from application to application
These are all attributable to the human element and what people define as relevant and required data input. Regardless of governance in place, and regardless of the output format of the record and documentation, none of it matters if the inputs and captured data are inadequate or inconsistent.
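To make the requirement-quality point concrete, here is a minimal, hypothetical sketch of what a procedural or technology-based check on requirement inputs might look like: a small lint that flags requirement text unlikely to be SMART. The vague-term list and the numeric-threshold heuristic are illustrative assumptions for this sketch, not a validated or prescribed rule set.

```python
import re

# Illustrative list of vague terms that tend to make a requirement
# non-specific and non-testable. An assumption for this sketch only.
VAGUE_TERMS = {"fast", "quickly", "user-friendly", "robust", "easy", "appropriate"}

def lint_requirement(text: str) -> list[str]:
    """Return a list of findings; an empty list means no obvious issues."""
    findings = []
    words = {w.strip(".,").lower() for w in text.split()}
    vague = words & VAGUE_TERMS
    if vague:
        findings.append(f"vague terms: {sorted(vague)}")
    # Heuristic: measurable requirements usually carry a numeric threshold.
    if not re.search(r"\d", text):
        findings.append("no measurable criterion (no numeric threshold found)")
    return findings

if __name__ == "__main__":
    print(lint_requirement("The system shall respond quickly."))
    print(lint_requirement("The system shall respond within 2 seconds at the 95th percentile."))
```

A check like this can never replace human judgment about relevance and attainability, but it shows how even lightweight automation can catch inconsistent inputs before they reach the record.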
Beginning the Shift to Digital Systems and Electronic Records (full post in part 3)
Shifting to digital systems and electronic records may seem like a large and challenging undertaking in a regulated environment, but it offers a great opportunity to do an analysis of how things are handled today vs. what the desired state is. It’s a unique chance to really overhaul and enhance your current software validation and compliance practices, so use this chance to:
- Identify what the true data map should be
- Identify current gaps
- Evaluate, assess, and designate appropriate risk profiles in each application and function
- Identify areas where risks, errors, and gaps can be mitigated with both procedural and technology/solution-based controls
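The risk-profile step above can be sketched in code. This is a hedged, hypothetical example of a simple severity × probability × detectability score in the spirit of common risk-assessment practice; the 1–3 scales and the class thresholds are illustrative assumptions, not a standard your auditor expects.

```python
def risk_priority(severity: int, probability: int, detectability: int) -> str:
    """Classify a function's risk profile.

    Each input is rated 1 (low) to 3 (high); a high detectability rating
    means the failure is HARD to detect. Scales and thresholds are
    illustrative assumptions for this sketch.
    """
    score = severity * probability * detectability  # ranges 1..27
    if score >= 18:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

if __name__ == "__main__":
    # A severe, likely failure that is moderately hard to detect.
    print(risk_priority(severity=3, probability=3, detectability=2))
```

Keeping the scoring rule explicit like this is one way to make risk management uniform across applications instead of varying team by team.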
Where to Begin (full post in part 3)
Where to begin such a fundamental change can be a challenging decision. Unfortunately, there really isn’t a “one size fits all” answer. Understanding what your team or your organization can handle is really the only way to gauge where you should start, and what the scope of that start should entail. For some, that may mean an enterprise-wide rollout, but for most, starting small with a pilot area will likely make the most sense. Some common areas are:
- Identify longstanding problem areas where changes will yield high impact results
- Streamlining documentation is a very common area where this applies, and these changes can then be implemented across the cGxP landscape
- Identify areas where existing systems are being upgraded, or where a new system is being implemented
- Active projects will already be occurring around these initiatives, so it’s a good time to incorporate changes
- This allows for a manageable project while capturing metrics and a rollout plan that can be replicated across other systems (with system-specific adjustments, of course)
Still, whenever a change is introduced, there will likely be pushback, so be prepared to navigate those conversations.
- Get key stakeholders involved in the discussions early.
- Capture and be able to discuss metrics and successes related to the pilot rollout we discussed.
- Be able to point back to challenges in the past finding, organizing, and supplying correct information to an auditor while leveraging old systems and documentation practices.
One of the biggest advantages of digitizing these processes and moving to electronic records is being able to quickly and easily retrieve those records at the time of an audit and supply them to an auditor in an organized, simple, easy-to-digest fashion.
All of this should make navigating pushback conversations a bit easier.
Ensure Continued Success and Effectiveness (full post in part 3)
As with success in anything, it’s not as simple as just starting the engine and then letting it run. The shift to digital systems and electronic records is a great step to ensuring successful audit preparation and execution, but it should be treated as an ongoing endeavor. As part of the process and solution overhaul, we suggest teams implement:
- Regular and frequent communication between all teams and stakeholders
- This should become much simpler with the right tools in place
- Schedule and mandate periodic reviews
- This enables continuous process improvement
- Problem areas will be identified and mitigated earlier, and in smaller, more manageable efforts
- Regular and frequent training for all team members
- Regular training and review of defined processes/procedures
- Regular training in tools and solutions that will be utilized
- Tools should not be a limitation, but an extension of defined business processes
- Proficiency in utilizing these solutions will also give an auditor a “day in the life” look at how the tools are being used relative to data requirements and defined processes/procedures
- Solicit regular feedback across the cGxP landscape from users and stakeholders
- Top-down and bottom-up feedback and reviews are crucial to ensure that systems, solutions, and processes are optimized across all teams and that both compliance and business objectives are being satisfied
As I mentioned before, I know a lot of this wasn’t necessarily specific to the act of the audit itself, but everything we’ve discussed here is important for achieving a more streamlined and balanced software quality framework that enables the use of modern solutions and methodologies. The audit is the end result of everything put in place leading up to it, and the simpler, cleaner, and more efficient teams can make that process, the simpler, cleaner, and more efficient their audit preparedness and execution will become.
We recently hosted a webinar on SDLC Modernization along with Allergan, and much of the content ties into being prepared for audits. You can view it at the link below if you are interested.
Also, please feel free to request our audit tip sheet.
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing which ultimately led to a focus on testing in a regulated environment. He currently resides near Sacramento, CA.