Insights from the CBI IVT Validation Week
The 24th CBI IVT Validation Week conference took place on October 22-24 at the Coronado Island Marriott hotel in San Diego, CA. In addition to the beautiful venue and gorgeous California weather, the event brought together a great group of thought leaders and industry experts from the world of validation.
The three-day event focused on three major themes: Process Validation, Quality, and CSV/Data Integrity. NNIT's focus was primarily on the CSV/Data Integrity track, so this article addresses our major takeaways from that portion of the conference.
Updates from the Regulators (FDA)
Recent observations on 21 CFR Part 211.68(b) indicate a lack of controls in change management for production computer systems and records.
Most Life Sciences organizations have already established internal policies, procedures, and tools to enforce this regulation. However, with increasing supplier involvement (mainly cloud-based solutions), the situation is becoming murky. Many of these technology providers are new to the Life Sciences industry and may not necessarily have the right processes and validation framework in place to maintain their computer systems in a compliant manner.
Observations around 21 CFR 211.25(a) reiterate the importance of robust training and learning management programs for staff working on GxP systems.
Employee turnover is a significant challenge in the technology industry. Therefore, technology organizations responsible for maintaining GxP systems must establish a robust training framework for their staff. They must also ensure that training and access records are kept intact and up to date. Periodic reviews and internal audits must be enforced to ensure continued compliance.
Data integrity has always been at the heart of Computer Systems Validation. The topic is receiving increasing attention due to data privacy regulations such as the GDPR and recent data breaches at large technology corporations.
Even though the GDPR has been in effect since May 25, 2018 and carries significant punitive measures for non-compliance, awareness of it is evidently not widespread across many global Life Sciences organizations.
Work needs to be done to include the GDPR as a critical element within the CSV framework, including but not limited to:
Implementing automated audit trails around data capture, maintenance, and erasure
Enforcing periodic reviews to ensure GDPR compliance during the operation phase and change controls
Conducting GDPR-driven impact assessments and tailoring test strategies accordingly for all system changes
Putting contractual changes/SLAs in place to support the GDPR's 72-hour personal data breach notification obligation.
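The first of these elements, an automated audit trail covering data capture, maintenance, and erasure, can be sketched in a few lines. This is a minimal illustration, not a production design: the `AuditTrail` class and its field names are our own assumptions, and a real implementation would persist entries to write-once (append-only) storage rather than an in-memory list.

```python
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only audit trail recording who changed what, when, and why."""

    def __init__(self):
        # Illustrative only: real systems would use tamper-evident, write-once storage
        self._entries = []

    def record(self, user, action, record_id, reason, details=None):
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,        # e.g. "create", "update", "erase"
            "record_id": record_id,
            "reason": reason,        # why the change was made (supports review)
            "details": details or {},
        }
        self._entries.append(entry)
        return entry

    def export(self):
        # Serialized view for periodic review or inspection
        return json.dumps(self._entries, indent=2)

trail = AuditTrail()
trail.record("jdoe", "create", "SUBJ-001", "initial data capture")
trail.record("jdoe", "erase", "SUBJ-001", "data subject erasure request")
```

The key property being modeled is that erasure itself leaves a record: the data is removed, but the fact and reason of the removal remain reviewable.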
In addition to the GDPR, several Life Sciences organizations are currently going through mergers and acquisitions, and these are paving the way for a slew of upgrades, retirements, and application/infrastructure consolidations. Data integrity must be given the highest importance as large, distributed datasets are altered, consolidated, archived, moved, or even destroyed as part of such organizational changes.
Information security has also become a significant driver for data integrity efforts. Organizations are now including vulnerability and penetration testing within their testing strategy/stage gates as well as conducting periodic security scans during the operation phase to ensure data integrity.
Risk-based validation and testing
It was clear that a majority of Life Sciences organizations now follow GAMP 5 guidance and have moved to a risk-based testing/validation approach. However, it was also evident that the level of maturity varies across organizations.
We came across organizations that classify changes based on functional risk assessments and system complexity only, whereas more mature organizations measure both upstream and downstream impacts using a configuration management database (CMDB), adjusting testing and change control effort based on the associated risk of the change.
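To make the CMDB-driven approach concrete, here is a minimal sketch of how downstream impact might feed a change-risk score. The dependency graph, system names, and risk scores are all hypothetical; real CMDBs expose far richer relationship data, and scoring schemes vary by organization.

```python
# Hypothetical CMDB fragment: each system maps to the systems that depend on it
cmdb = {
    "LIMS": ["MES", "QMS"],
    "MES": ["ERP"],
    "QMS": [],
    "ERP": [],
}

# Illustrative inherent risk scores (e.g. based on GxP impact classification)
inherent_risk = {"LIMS": 3, "MES": 3, "QMS": 2, "ERP": 1}

def downstream_systems(system, graph):
    """Collect every system transitively affected by a change to `system`."""
    seen, stack = set(), [system]
    while stack:
        node = stack.pop()
        for dependent in graph.get(node, []):
            if dependent not in seen:
                seen.add(dependent)
                stack.append(dependent)
    return seen

def change_risk(system):
    """Change risk = the system's inherent risk plus that of all affected systems."""
    affected = downstream_systems(system, cmdb)
    return inherent_risk[system] + sum(inherent_risk[s] for s in affected)

# A LIMS change touches MES, QMS, and (via MES) ERP: 3 + 3 + 2 + 1 = 9
print(change_risk("LIMS"))
```

A score like this can then drive the testing and change control effort proportionally, which is the essence of the risk-based approach described above.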
Recent technological shifts have made Supplier Management an integral function to support Computer Systems Validation. The common theme across all sessions was that a robust supplier contract is key to enforcing supplier compliance and leveraging supplier documentation.
Portfolio-based Risk management
Organizations with a mature CSV process are taking their risk management process to the next level by introducing a portfolio-based risk management framework. As part of this approach, they categorize their applications/systems into multiple portfolios based on their inherent risk classifications. These portfolios then help leadership teams drive key strategic decisions such as business continuity/disaster recovery planning, application/risk consolidation, vendor management, and even budgeting for the next fiscal year.
Although it is a very mature framework, this approach requires a solid understanding of the current application portfolio and its history of changes and associated risks, as well as significant cost and resource investment.
In summary, Life Sciences organizations and their Computer Systems Validation functions face increasing challenges due to growing regulatory requirements and increasing technological complexity. To ensure compliance and reduce inefficiencies in the validation process, organizations must enable their CSV leadership to learn and adapt to new technologies and standards, invest in ongoing staff training, and participate in events and forums that facilitate the exchange of ideas and best practices across the industry.
To learn more about how NNIT's Compliance, Test, and Validation (CTV) practice can help your organization stay compliant or increase the maturity of your CSV processes, please reach out to Saurav Ghosh at firstname.lastname@example.org or +1 609 4807595.