
Validation of Computer Systems in 2024

KCR CRO
KCR Headlight Series, May 2024

Doug Bain, Director of Technology at KCR

When working in clinical trials, we are often supported (some might say “constrained”) by regulations from the FDA, EMA, and other regulatory bodies that define standards that must be met to ensure a sound research environment. They help ensure patient safety and the reliability of trial data.

In today’s blog, I would like to make the case that the way you validated software in 2014 is no longer sufficient in 2024.

First of all, I would like to discuss what “validation” should and should not be.

If regulations were abolished – e.g., ICH GCP and 21 CFR Part 11 – would I still follow the same steps to ensure that the software we use in clinical trials is “good”?

My answer would be, “Yes, I would.”

I have always viewed validation as a process of delivering better business solutions, not just meeting regulatory requirements. In fact, when I’m asked whether our systems are “validated”, I interpret that to mean that they have been validated to meet all applicable regulatory requirements and KCR business requirements.

My policy is to ensure that the software solutions we use meet high standards in both quality and performance.

Some might argue that ICH GCP and 21 CFR Part 11 achieve this. I would disagree. In previous roles, I worked with some poor technologies that passed CSV compliance audits perfectly. That didn’t make them good software. Any validation plans I define are intended to meet regulations AND achieve a “good quality solution” for the purposes for which we use the software.

The software we use today has changed quite a bit since we started using web-based systems in the late 1990s. At that time, we had what we called client/server solutions, which were then moved to the Internet. Very often they were programmed rather than configured. They were multi-instance (actually… that’s still true…).

In 2008, GAMP 5 was released, built around what is often called the V-Model. This was defined to reflect waterfall software development methods, where specifications were written in long-form documents, code was written, and then tested. For the most part, software vendors wrote and configured the software and the licensee used it.

In the 2020s, we are engaged with agile software development delivered as configurable cloud platforms. With these solutions, we increasingly have highly configurable systems that are adapted by licensees to their business requirements. The configuration contains most of the business logic that actually powers the business functions (e.g., CTMS, eTMF, etc.).

To be clear, the vendors of these systems validate the underlying platform and often validate the original application configuration on those platforms, BUT this is then handed over to the licensee, who actually “owns” the configuration and therefore has responsibility for its validation.

I also understand that some vendors not only maintain their core products but also provide services to licensees to customize product configurations. This is fine, but validation of the configuration and user acceptance testing still need to be performed to meet end-user requirements.

The next major change in the way software is developed and released is the principle of continuous deployment. In the life sciences technology space, this has been strongly adopted by Veeva since 2019 and more recently by Medidata. (Veeva achieved this by basing most of its software on a common content management platform, Veeva Vault; Medidata by having a mature code base.) This means that new software updates and configuration changes are frequently verified and applied.

GAMP 5 Second Edition was released in July 2022 and addresses many of these specific types of solutions: cloud, single-instance, agile development methods, continuous deployment, and configurability.

GAMP 5 is defined as:

a risk-based approach to the implementation, operation, and validation of GxP computerized systems in regulated industries – including life sciences.

The Second Edition adds or significantly changes the following sections:

  • IT infrastructure (new)
  • Critical thinking (new – key to targeted risk management)
  • Determining requirements (significant changes)
  • Testing of computer systems (significant changes)
  • Agile software development (new)
  • Blockchain (new)
  • AI/ML (new)
  • Electronic production records (new)

I will not analyze all the changes separately. Taken together, they define a lifecycle approach to validating cloud solutions that are built and delivered in an agile manner. Virtually all new systems developed since 2015 follow this agile, continuous-deployment approach.

CSV 2024

The revised Second Edition V-Model is summarized in my diagram above, and it is the basis on which KCR implements our risk-based validation of computerized systems.

One of the really important factors in this method is the use of a risk-based approach. It doesn’t say “define everything and test everything.” Instead, it asks you to identify risk elements (categories) and scale your activities according to the assessed risk. A good example is testing a configured system when the configuration specification is generated by the (already validated) product. Testing is intended to confirm that the configuration in use is as expected, not that the product applies the configuration correctly. At KCR, we assess the risk of, and define the controls for, every system that we introduce to the company.
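
As an illustration, here is a minimal sketch of the kind of check this implies – diffing a system’s exported configuration against the approved specification. The file names and the flat key/value format are assumptions made for the example; this is not a description of our actual tooling.

    # Minimal sketch: verify an exported configuration against the approved
    # configuration specification. File names and format are illustrative.
    import json

    def load_config(path: str) -> dict:
        # Load a configuration export or specification as a flat dict.
        with open(path) as f:
            return json.load(f)

    def diff_configs(expected: dict, actual: dict) -> list[str]:
        # Return human-readable discrepancies between spec and export.
        issues = []
        for key, value in expected.items():
            if key not in actual:
                issues.append(f"MISSING: {key} (expected {value!r})")
            elif actual[key] != value:
                issues.append(f"MISMATCH: {key}: expected {value!r}, got {actual[key]!r}")
        for key in actual.keys() - expected.keys():
            issues.append(f"UNEXPECTED: {key} = {actual[key]!r}")
        return issues

    if __name__ == "__main__":
        spec = load_config("approved_config_spec.json")    # approved specification
        export = load_config("system_config_export.json")  # export from the live system
        problems = diff_configs(spec, export)
        for p in problems:
            print(p)
        raise SystemExit(1 if problems else 0)  # nonzero exit flags a finding

A nonzero exit makes a check like this easy to wire into an automated validation run, with the printed discrepancies filed as evidence.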

This is probably the most important and underestimated aspect of validating modern computer systems. Configuration can be reduced to a point-and-click interface rather than complex programming, but this absence of programming can mask the risk of a misconfiguration going undetected. A recent example was presented by the regulator MHRA at the SCDM Live EME24 conference. The case study explains how, due to a lack of testing, an RTSM solution applied a dose increase rather than a dose reduction, leading to a SUSAR and a potentially related patient death. It was a configuration AND a testing failure.
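
To make the lesson concrete, here is a hypothetical, pytest-style test (the dosing rule and values are invented, not the actual RTSM logic) that asserts the business outcome – the direction of the dose adjustment – rather than merely that the configuration was applied:

    # Hypothetical illustration: test the *direction* of a dose adjustment,
    # not just that the system applies whatever factor was configured.

    def adjust_dose(current_dose: float, factor: float) -> float:
        # Apply a configured titration factor to the current dose.
        return current_dose * factor

    def test_dose_reduction_reduces_dose():
        # The protocol calls for a dose REDUCTION. A factor misconfigured as
        # 1.25 instead of 0.75 would still "apply the configuration correctly"
        # but would fail this business-outcome assertion.
        configured_factor = 0.75  # value read from the (illustrative) configuration
        new_dose = adjust_dose(100.0, configured_factor)
        assert new_dose < 100.0, "dose adjustment must be a reduction"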

Again, this has had a radical impact on the way we do CSV today. In the past, full validation could be performed every few years for each major software release. Documentation was mostly paper (or PDF), and testing was complete and comprehensive. Today, changes arrive frequently and continuously.

For example, today at KCR, for our Veeva-based clinical systems, we roll out changes on a monthly basis. This means we need to maintain ongoing evidence of configuration and testing. In practice, we divide our releases into major and minor ones; this gives us time to build groups (or themes) of changes, with the associated business process adjustments, over a longer period of time.
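
As a simplified sketch of that triage (the risk categories and threshold are illustrative assumptions, not a prescribed GAMP rule):

    # Simplified sketch: triage a batch of vendor changes into a major or
    # minor validation release. Categories and threshold are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Change:
        key: str      # e.g., a Jira issue key
        summary: str
        risk: str     # "high", "medium" or "low" from the risk assessment

    def classify_release(changes: list[Change]) -> str:
        # Any high-risk change, or several medium-risk ones, makes it "major".
        high = sum(c.risk == "high" for c in changes)
        medium = sum(c.risk == "medium" for c in changes)
        return "major" if high or medium >= 3 else "minor"

    changes = [
        Change("VAL-101", "New eTMF document type", "low"),
        Change("VAL-102", "CTMS milestone workflow change", "medium"),
    ]
    print(classify_release(changes))  # -> minor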

The Second Edition highlights the value (and need) of using digital tools to support the validation lifecycle. This is something KCR has adopted, in most cases replacing documents/PDFs and Excel spreadsheets with JIRA change control. This helps us maintain ongoing evidence of our validation and also ensures that our CSV process itself complies with 21 CFR Part 11.
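
As an illustration of what that can look like, here is a minimal sketch of recording a change-control record through Jira’s REST API. The base URL, project key, issue type, and credentials are placeholders; this is a sketch of the idea, not KCR’s actual workflow:

    # Minimal sketch: record validation evidence as a Jira change-control issue.
    # URL, credentials, project key and issue type are placeholders.
    import requests

    JIRA_URL = "https://your-domain.atlassian.net"      # placeholder instance
    AUTH = ("validation.bot@example.com", "API_TOKEN")  # Jira Cloud: email + API token

    def create_change_record(summary: str, description: str) -> str:
        # Create a change-control issue and return its key.
        payload = {
            "fields": {
                "project": {"key": "VAL"},        # hypothetical project key
                "issuetype": {"name": "Change"},  # hypothetical issue type
                "summary": summary,
                "description": description,
            }
        }
        resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
        resp.raise_for_status()
        return resp.json()["key"]

    key = create_change_record(
        "Monthly release 2024-05: CTMS configuration changes",
        "Risk assessment, configuration diff, and test evidence attached.",
    )
    print(f"Created change record {key}")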

Of course, for organizations like ours that primarily configure and use software rather than build it, it’s incredibly important to keep things simple. GAMP 5 Second Edition may seem complicated, even overkill; however, with a risk-based approach, in my personal experience there are fewer actual steps to perform than with traditional methods. Importantly, the principle of “critical thinking” helps ensure that the completed, approved solution not only meets regulatory requirements but also provides a “good” solution for end users.