There will be Audits: Losing Control
As the old saying goes, taxes are one of life's certainties; if you work in a regulated industry, you might as well go ahead and add being audited to that list. What is not a certainty, however, is the level of process maturity, preparation, and audit readiness from team to team, and organization to organization. Being properly prepared is attributable to a number of factors, but so is being unprepared.
For those that are less prepared, the anticipation alone of being audited can be staggering. The audit itself, when one is knowingly lacking in good processes and tools to manage audit readiness, is so stressful that it has been known to cause people’s faces to actually melt. Yes, when the auditor is spotted in the lobby, it creates a scene of panic and madness in the office that can only be equated to the climax of Raiders of the Lost Ark, when the Ark of the Covenant itself is finally opened. It’s true. Look it up.
Okay, so maybe that’s not true (and is perhaps just a touch hyperbolic), but what is true is that this now marks the second Raiders of the Lost Ark scene I have referenced over the course of our less than year-long blog history.
“But Jason,” you may be asking, “how is the number of Raiders of the Lost Ark references you have made relevant to this specific blog about audits?”
That’s a great question. Fortunately for you, my lucky readers, it happens to be a question that I have a definitive answer for: It’s not relevant. It’s not relevant in the least, so let me get back on track.
As I was saying, audits are a certainty, but proper preparation and process maturity are not. Over the course of this series, I will cover several aspects related to computer systems audits, but I thought a good place to start is with something that should be a cornerstone of any good CSV process, which is control. Control over systems, processes, and people. A lack of process controls can have a trickle-down effect that leads to human error and, subsequently, missing or bad data being supplied during an audit.
Often, the culprit behind gaps in control is a lack of good systems and tools to manage governance: tools that enable proper oversight and provide assurance that defined processes are actually being adhered to. This seems to be especially true in more legacy, document-based methodologies.
For some organizations, these types of legacy processes are still in place simply because they inherited them and, for whatever reason, haven’t yet chosen to, or haven’t yet been able to, migrate to more modern approaches. Unfortunately, this is sometimes even due to a somewhat outdated assumption that auditors may be more comfortable with legacy processes, tools, and methodologies (i.e., those rooted in traditional documents). While that may have been true at some point, it certainly doesn’t seem to be the case anymore. The reality is that FDA auditors today are far more familiar with, and trained in, technological solutions than in years gone by.
What they are still focused on, however, is whether or not an organization’s defined business and quality procedures are in place and are being adhered to. Verifiable control is of utmost importance as it demonstrates that mitigation has been put in place to limit those trickle-down factors that we mentioned above.
Put the methodology for capturing and providing validation deliverables aside for a moment (although some are certainly better than others for assuring levels of control) and it boils down to the same point: control. How well does an organization control its processes? How well does an organization govern its employees as they operate within those processes? How adequately and easily is a team able to demonstrate and provide verification of that control over their regulated systems?
The reason this is important, and salient to the next few points, is that even though some teams still carry the notion that FDA auditors are more comfortable with legacy-based validation processes, or that those approaches are more tried and proven, in actuality the FDA has become much more versed in the adoption and utilization of modern tools and SDLC practices. They have fully recognized and embraced that modern tools and processes enable much higher standards of control across validation, quality, and testing teams, while also enabling higher levels of software and systems quality - which is the point and the goal in the first place.
In fact, when we engage with clients to assess their processes and we see certain “legacy” practices in place, it will immediately raise some questions related to control and governance (or lack thereof) over defined policies and procedures. You can be sure that the same will hold true of an FDA auditor.
A document-based system is simply going to raise more questions during an audit. Why is the organization not further along in its maturity model? What are they using and how are they managing their documents? Where are the control points and how are they being enforced?
That’s not to say that compliance and validation cannot be adequately achieved through document-based or manual processes; it is simply to say that those approaches leave more room for error and, therefore, will draw additional scrutiny from an auditor. I don’t know about you, but “additional scrutiny” is not a phrase I particularly like associated with an audit.
The reality is that document-based and manual processes offer little control whereas utilization of digital processes and effective tools show process maturity while enabling much higher levels of governance. Governance that allows for control over defined processes. Governance over the people and teams responsible for adhering to those processes, thus drastically mitigating the potential for error.
While governance and control are crucial components of any CSV process that need to be demonstrated during an audit, there are still those trickle-down variables we mentioned earlier that result from improper control, and that’s where we start to get into the “data” and the “human element” side of the discussion. However, since we’ve already spanned several pages, I think an intermission is in order.
So go grab your popcorn, soda, and candy, and come back for the next part in the series where we will start to cover how improper control can lead to the poor input and capturing of data within records. In the meantime, feel free to request our audit tip sheet below, and, as always, please subscribe at the top right of this page if you would like to be notified as new blogs are posted.
Also, we recently hosted a webinar on SDLC Modernization along with Allergan and much of the content ties in with this series. You can view it at the link below if you are interested.
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing which ultimately led to a focus on testing in a regulated environment. He currently resides near Sacramento, CA.