There will be Audits: Behind the Shift to Digital Systems and Electronic Records
As we’ve covered in the previous two posts in this series (check out part 1 and part 2), there are a lot of variables and factors to account for when building robust and effective validation and software assurance processes if you’d like to avoid the “tsk, tsk, tsks” and disappointed looks of an auditor. After all, a warning letter is never a good thing to receive...and let’s not even imagine what happens if one of those doesn’t suffice.
Governance, control, quality data, good policies, good procedures - the list goes on. All of these factors play a role in an organization’s level of audit preparedness, but what can be done to effectively achieve them? If these are gaps or issues your team is currently experiencing, it’s likely that some adjustments and enhancements to your SDLC and compliance methodologies are already being discussed throughout your organization. Maybe the discussion is being driven by a recent rough audit, or maybe it’s a pre-emptive strike to make sure you have a robust, mature validation model to demonstrate when the FDA comes knocking.
Whether your motives are corrective or pre-emptive in nature, an initiative to shift to more modern, digital compliance practices may seem like a large undertaking - and it can be - but moving away from legacy methodologies and documentation practices will yield great results in both the short and the long term. Still, it will require a shift in mindset and practical approach to be successful.
So what should teams be prepared to change as they begin to shift to digital systems, modern methodologies, and the utilization of electronic records instead of legacy documents to manage their compliance deliverables and audit documentation? While this will by no means be a comprehensive layout of X’s and O’s to get your organization to that point (it’s just a blog, after all), there are some useful insights that I think we can share as teams prepare to undergo this change.
A big positive that often gets overlooked or taken for granted is that this type of initiative offers a great opportunity to do a complete evaluation and analysis of how things are handled today vs. what the desired - or required - outcome should be. It’s a good time to evaluate the whole business process: objectively and strategically determine what the true data map should be, evaluate the associated risk in each area, and identify areas where risk, errors, and gaps can be mitigated with procedural and solution controls.
Still, a question often comes up around where to actually start this process, and it’s a difficult question to provide a uniform answer for. Do you start small with a pilot area in your cGxP landscape and expand the footprint from there, or do you jump into a complete, unified shift across all cGxP-related systems at once? As with any organizational change, knowing what your organization can handle and manage effectively is really the only way to answer this, but for many organizations, starting small with a pilot before an enterprise rollout is usually a safe bet and a good way to ensure success. With this approach, there are a few areas that may make sense to target as a logical starting point.
For one, as part of your overall business process review, identify some areas that present an opportunity to apply “quick” changes that will yield a large impact. These may not even surface through the business process review itself, as it’s likely your organization has already identified some of these sticking points over the years. Since we’re talking about more effectively managing processes for audit preparedness, a great place to start is streamlining documentation. This is often one of the biggest value-adds organizations can see, and it can easily be applied across all GxP lines of business.
Teams can also start this process by identifying areas where either net-new systems are being implemented or existing systems are being upgraded. Since active changes will already be occurring on those projects, they can be good, no-brainer areas to focus on to get some early successes and metrics to share with the rest of the organization. This will make wider adoption an easier proposition to navigate, and it will help you build a rollout plan that can be replicated (with some system-specific adjustments) across other applications.
However, new processes and new solutions are not always going to be an easy pill to swallow for some. As with any organizational shift, there is likely to be some pushback, but the advantages to be had are many - once embraced - and getting key stakeholders involved in the process and discussions early is key for adoption. This is also where a smaller project-based or pilot-based rollout pays off, as you will have some metrics and benefits under your belt to help sway those in the resistance to join the cause.
Aside from practical data from a pilot rollout, there will surely be some scenarios you can reference that will resonate as pain points for folks who are not yet on board. For example, point back to a difficult (“difficult” might even be putting it lightly) time finding, organizing, and supplying correct information to an auditor while leveraging old systems and documentation practices. One of the biggest advantages of digitizing those processes and moving to electronic records (putting aside the day-to-day utilization benefits) is being able to quickly and easily retrieve those records at the time of an audit and supply them to an auditor in an organized, simple, and easy-to-digest fashion.
Whether you opt to extract those records in the form of a report or show them directly in the system of record itself is up to your discretion, but actually seeing a system in action gives an auditor a look into the real-world utilization, or “day in the life,” of how these systems are actually used. If your organization does a good job of keeping your team enabled in these solutions, your QA personnel should be more than comfortable with them and should have no issues demonstrating their proficiency and common use cases with the tool(s). After all, no solution should be a limitation to an organization’s teams or processes. It should make team members’ lives easier while acting as an effective extension of the business processes it is designed to support and automate or make electronic.
It’s not all champagne and caviar, though, dear readers. As much as I tout the shift to standardizing validation and software assurance practices across the SDLC through digitalization and the adoption of electronic records, it’s not as simple as just plugging in a new solution, using your electronic signatures, dusting your hands off after a good day's work and then walking away. This is an ongoing practice that must be managed effectively to ensure that your software and systems compliance ship is tip-top when an auditor comes to inspect.
This means that cohesion and frequent communication are needed across Quality, Validation, Business, and IT teams. The implementation of digital solutions should make this much easier and certainly remove many of the roadblocks that sometimes cause these teams to be at odds with one another (go back and check out our digital transformation wrap-up for more detail here). Still, mandated and scheduled periodic reviews are a key component of any effective quality management system - certainly more so in a regulated environment. Through these regular reviews with key stakeholders, teams should now be able to really embrace and implement a system of continuous process improvement by identifying gaps or enhancement areas in smaller, more manageable chunks.
Just as important as everything mentioned above, soliciting feedback from all appropriate users across the business process, top-down and bottom-up, on a regular basis is going to be crucial to ensure that processes, proper governance, performance, and well-defined data requirements are all up to par, functioning and operating as they should, and will continue to stay at that level.
Listen, I know we’ve gone over kind of a lot over the course of the last three posts in this series, and while some of it may not necessarily have been specific to audits (as the titles would lead you to believe), I can assure you it was all relevant as teams strive to enhance their audit readiness and the quality of their audit submissions/deliverables by shifting to more modern utilization of digital systems, automation, and electronic records.
After all, audit preparedness and effectively managing audits is not just about the actual audit itself. The audit is the end result of everything that is put in place leading up to it. That means intelligently and strategically thought out policies, processes, and procedures. That means implementing an effective combination of technologies and methodologies to ensure defined processes are being adhered to and captured easily, without being a roadblock or burden to IT and business teams. That means frequent and regular training for all team members across the cGxP landscape and mandating regular communication and review across key teams and team members.
As always, I hope you’ve found this series useful. Please feel free to download the audit readiness tip sheet below. We also have a webinar coming up that will cover a variety of topics that lead to clean and successful audit executions, and I have included a link to register for that presentation as well.
Audit Tip Sheet:
Tx3 Webinar: Standardizing Compliance to Achieve SDLC Modernization
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing and later began a focus on testing in a regulated environment. He currently resides near Sacramento, CA.