There will be Audits: Data and the Human Element
As we discussed in the last post (view here), control and governance over regulated systems and processes should be a cornerstone of CSV (or CSA) practices for any regulated organization. Without it, teams are likely to deviate from defined processes, errors become frequent, and records will face heavy scrutiny should an auditor identify a lack of proper control over personnel and processes. Fortunately, there are tools available today, designed with Life Sciences teams in mind, that can facilitate this and provide the controls necessary to apply appropriate governance.
By putting appropriate tools and methodologies in place - generally, those that enable migration away from traditional document-based methodologies toward more flexible and effective electronic records - a large element of audit readiness can be achieved. However, there are still variables that governance alone cannot account for.
While governance can apply control and reduce human error in the form of deviation from defined processes, a potential blind spot remains that leaves room for the other side of human error - and that is the data itself: the actual data contained in the record an auditor will evaluate, which is produced by personnel operating within defined policies and procedures. After all, governance is only as effective as the procedures and policies it presides over, and governance alone does not ensure that good data is being defined and captured by the people and teams operating within those processes.
Mitigating mistakes is crucial in a regulated environment. Teams that are further along in their maturity model have implemented effective tools, applied thoughtful processes with regular evaluation, and delivered frequent education to achieve high levels of governance and data integrity, drastically reducing those mistakes. Still, for many teams, human error and the resulting poor data input/output remain among the main culprits behind adverse audit results.
Let’s take a look at some of the most common errors we see when evaluating a new client that hasn’t yet adopted the methodologies, solutions, or education frameworks needed to achieve the mistake mitigation and data integrity maturity factors alluded to above. Here is what we typically find:
- Requirements that are not SMART (Specific, Measurable, Attainable, Relevant, Time-Bound or Testable)
- Test content that does not appropriately challenge the requirement/design, or does not apply the necessary level of testing
- Risk management that is not applied uniformly and varies from application to application
All of the above are attributable to the human element. With appropriate governance, organizations can better ensure that their personnel do not deviate from defined processes, but beyond that, the result can still be what we see here. This is the element that a technology solution cannot provide for you - the actual input and capture of the data itself.
Training team members in each of the areas listed above is critical to producing results that are relevant to the intended use of solutions, and to ensuring that the appropriate levels of effort, design, and control are put into place to create a robust business process that is effectively managed, executed, and captured.
Focusing too much on governance can be detrimental in its own way if other key areas of building and maintaining an effective team are neglected. To that point, what we sometimes see are over-engineered processes - either procedural or technological - rather than organizations performing more data integrity checks on the actual inputs, such as requirements, design, testing, and deviation content. For lack of a better phrase, too often we see “garbage in, garbage out” results in the content of the records themselves.
As important as governance and effective tools are to enabling and demonstrating process maturity, so too is ensuring that quality data is being captured and supplied in the records themselves. Once an auditor reaches the point of evaluating the actual data, the output - whether via user interface or a report - is almost a moot point compared to making sure the right data is captured in the first place.
Again, this is where we often see things get messy for teams using legacy approaches to their compliance and validation practices, especially as they try to graft those processes onto more modern solutions and methodologies.
The good news is that more and more teams are shifting their mindsets and modernizing their approaches to controlling, executing, and capturing their validation deliverables. This type of digital transformation across the SDLC often requires a much-needed transformation of compliance practices as well.
These transformation initiatives present a great opportunity to evaluate gaps and begin addressing how to effectively manage things like data and governance throughout the compliance chain. And let’s not forget that by achieving these things, we drastically mitigate the human error factor and ensure we can provide an auditor with robust systems and processes that demonstrate advanced maturity with quality data inputs/outputs.
In the next part of this series, we’ll look at how teams can successfully and strategically approach implementing some of these transformational changes, and the benefits they will yield. In the meantime, please feel free to click the link below to download our Audit Readiness tipsheet, and please subscribe to this blog if you find these posts useful or interesting.
Also, we recently hosted a webinar on SDLC Modernization along with Allergan and much of the content ties in with this series. You can view it at the link below if you are interested.
Jason Secola manages content marketing and channel activities at Tx3 Services and has been with the company since 2016. Jason began working with the larger portion of the existing Tx3 team dating back to 2007 when he got his first start in the world of application testing which ultimately led to a focus on testing in a regulated environment. He currently resides near Sacramento, CA.