Computer System Validation
(CSV) in the Pharmaceutical
Industry: Ensuring Quality and
Compliance
Antonio Visconti
President, Founder & Principal GMP Consultant at GxP VISCONTI Pharma
Consulting Services (ViPCS)
August 28, 2023
In the ever-evolving landscape of pharmaceutical manufacturing and
regulation, ensuring the quality, safety, and effectiveness of
pharmaceutical products is of paramount importance. To achieve this, the
pharmaceutical industry heavily relies on computer systems that control
various aspects of manufacturing, quality control, and regulatory
compliance. Computer System Validation (CSV) emerges as a critical
process to ensure that these computerized systems are designed,
implemented, and maintained to meet stringent regulatory requirements
and industry standards.
Understanding Computer System Validation (CSV):
CSV is a comprehensive approach that ensures the reliability, accuracy,
and integrity of computer systems used in the pharmaceutical industry. It
encompasses a series of activities, processes, and documentation that
collectively establish the validity and compliance of computer systems.
From research and development to manufacturing, distribution, and
beyond, CSV touches every facet of pharmaceutical operations.
CSV is essential for a variety of reasons:
1. Regulatory Compliance: Regulatory agencies such as the U.S. Food and
Drug Administration (FDA), the European Medicines Agency (EMA), and
others require pharmaceutical companies to validate computer systems
used in GxP (Good Practice, e.g. GMP, GLP, GCP) environments.
2. Data Integrity: CSV ensures that the data generated and processed by
computer systems are accurate, reliable, and consistent, contributing to
maintaining data integrity.
3. Patient Safety: Many computer systems in the pharmaceutical industry
control critical processes that directly impact patient safety. Ensuring the
proper functioning of these systems is vital to prevent errors that could
lead to adverse events.
4. Risk Management: CSV helps identify and mitigate risks associated with
computer systems, ensuring that potential vulnerabilities are addressed
before they impact product quality.
5. Operational Efficiency: Validated computer systems are more likely to
operate effectively, minimizing downtime and disruptions in
manufacturing processes.
CSV Process:
The CSV process involves a series of well-defined steps:
1. Planning and Strategy: Defining the scope, objectives, and resources
required for CSV.
2. User Requirement Specification (URS): Documenting user needs and
system functionalities to guide system development.
3. Functional Specification (FS): Describing how the system will meet user
requirements.
4. Design Specification (DS): Translating functional specifications into
technical design.
5. Installation Qualification (IQ): Verifying that the system is properly
installed.
6. Operational Qualification (OQ): Demonstrating that the system
functions according to specifications.
7. Performance Qualification (PQ): Ensuring that the system consistently
performs within defined parameters.
8. User Acceptance Testing (UAT): Confirming that the system meets user
requirements.
9. Risk Assessment: Identifying and addressing potential risks associated
with the system.
10. Change Control: Managing changes to the validated system to ensure
ongoing compliance.
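The thread that holds steps 2 through 8 together is traceability: every URS requirement should be covered by at least one qualification test. As a hedged illustration (the requirement and test IDs below are hypothetical, not from any specific system), a minimal traceability matrix can be sketched in Python like this:

```python
# Minimal sketch of a requirements traceability matrix linking URS items
# to qualification test cases (IQ/OQ/PQ). All IDs are hypothetical.

def build_traceability_matrix(requirements, test_cases):
    """Map each URS requirement ID to the test cases that verify it."""
    matrix = {req_id: [] for req_id in requirements}
    for test in test_cases:
        for req_id in test["covers"]:
            matrix[req_id].append(test["id"])
    return matrix

def untested_requirements(matrix):
    """Requirements with no linked test case: a gap to close before release."""
    return [req for req, tests in matrix.items() if not tests]

requirements = ["URS-001", "URS-002", "URS-003"]
test_cases = [
    {"id": "OQ-010", "covers": ["URS-001", "URS-002"]},
    {"id": "PQ-005", "covers": ["URS-002"]},
]
matrix = build_traceability_matrix(requirements, test_cases)
print(untested_requirements(matrix))  # -> ['URS-003']
```

In practice this matrix lives in the validation documentation (or a validation tool), and an uncovered requirement like URS-003 blocks release until a test is added.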
Challenges in CSV:
The complex nature of pharmaceutical operations, evolving technology,
and regulatory changes pose challenges to effective CSV. Maintaining
compliance across the entire lifecycle of a computer system, including
upgrades and modifications, requires careful planning and execution.
CSV Questions & Answers
I was asked to support younger pharmaceutical industry professionals
with a CSV Q&A. Find it below:
Q1: You are given a CSV project in the pharmaceutical industry.
How will you start, how will you decide the validation approach, and
what will you do from project start-up to release of the system for use?
A1: Here is a detailed step-by-step approach for managing a Computer
System Validation (CSV) project in the pharmaceutical industry:
1. Project Initiation and Planning:
• Define the project scope, objectives, and goals.
• Identify key stakeholders, including users, IT, quality assurance, and
regulatory teams.
• Create a project plan outlining timelines, resources, roles, and
responsibilities.
• Determine the validation approach based on system complexity, risk
assessment, and regulatory requirements.
2. System Requirements Gathering:
• Collect detailed user requirements, including functional, technical,
security, and regulatory requirements.
• Create a comprehensive User Requirements Specification (URS)
document.
3. Vendor Selection and Assessment (if applicable):
• Evaluate potential vendors or suppliers of the software.
• Conduct vendor audits to assess their quality systems and ability to
meet regulatory requirements.
4. Risk Assessment:
• Identify potential risks associated with the system’s intended use, data
integrity, patient safety, and regulatory compliance.
• Perform a risk assessment to prioritize validation activities and
determine the level of testing required.
5. Validation Plan Preparation:
• Develop a Validation Master Plan (VMP) outlining the overall validation
strategy, scope, resources, and documentation requirements.
• Include strategies for change control, deviation management, and
revalidation.
6. Functional Specification and Design:
• Develop a Functional Design Specification (FDS) based on the URS.
• Design the system architecture, including hardware, software,
interfaces, and data flows.
• Include security measures, data backup, and disaster recovery plans.
7. Configuration and Installation:
• Configure the system according to the FDS.
• Install the software, hardware, and necessary components in a
controlled environment.
8. Testing:
• Develop test scripts (IQ/OQ/PQ) for Installation Qualification,
Operational Qualification, and Performance Qualification.
• Execute IQ tests to ensure proper installation of hardware and software.
• Perform OQ tests to verify system functionality and performance.
• Conduct PQ tests to demonstrate the system meets user requirements
under realistic conditions.
9. Data Integrity and Security Testing:
• Verify data integrity controls and encryption mechanisms.
• Test access controls, user authentication, and audit trail functionality.
10. User Acceptance Testing (UAT):
• Involve end users in testing to ensure the system meets their
requirements.
• Document UAT results and any deviations or issues.
11. Validation Documentation:
• Prepare validation protocols, test scripts, and test reports.
• Document any deviations, investigations, and corrective actions taken.
12. Review and Approval:
• Conduct a review of all validation documentation by relevant
stakeholders.
• Obtain sign-off from authorized personnel.
13. Training:
• Train users on system operation, data entry, and troubleshooting.
• Provide training on data integrity and security best practices.
14. Change Control and Release:
• Implement a change control process to manage any future system
changes.
• Obtain final approval to release the system for use.
15. Periodic Review and Maintenance:
• Perform periodic reviews to ensure the system continues to meet
regulatory requirements.
• Update documentation, perform periodic testing, and address any
issues that arise.
16. Archival of Records:
• Archive all validation documentation, including protocols, reports, and
deviations.
17. Regulatory Reporting:
• Prepare necessary documentation for regulatory submissions if
required.
18. Continuous Improvement:
• Use lessons learned from the project to improve future CSV projects
and enhance the quality system.
Throughout the entire process, collaboration, communication, and
adherence to regulatory guidelines are key. Regular updates to
stakeholders and maintaining detailed documentation are critical for a
successful CSV project in the pharmaceutical industry.
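Step 9 above (data integrity and security testing) typically includes checking that the audit trail captures who did what, when, and that its entries are complete and chronological. A minimal sketch of such a check follows; the field names are illustrative, not drawn from any specific system:

```python
# Hedged sketch: verify that an audit trail records who/what/when for
# every change and that entries appear in chronological order.
# Field names are illustrative assumptions, not a real system's schema.
from datetime import datetime

REQUIRED_FIELDS = {"user", "action", "timestamp", "old_value", "new_value"}

def verify_audit_trail(entries):
    """Return a list of findings; an empty list means the trail passed."""
    findings = []
    for i, entry in enumerate(entries):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            findings.append(f"entry {i}: missing fields {sorted(missing)}")
    timestamps = [e["timestamp"] for e in entries if "timestamp" in e]
    if timestamps != sorted(timestamps):
        findings.append("entries are not in chronological order")
    return findings

trail = [
    {"user": "jdoe", "action": "update",
     "timestamp": datetime(2023, 8, 1, 9, 0),
     "old_value": "5.0", "new_value": "5.2"},
    {"user": "asmith", "action": "update",
     "timestamp": datetime(2023, 8, 1, 8, 0),
     "old_value": "5.2", "new_value": "5.1"},
]
print(verify_audit_trail(trail))  # -> ['entries are not in chronological order']
```

A real data integrity test would go further (tamper evidence, retention, access controls), but the pattern is the same: define the expected properties, then verify them against exported audit trail data.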
Q2: What are Operational Qualification (OQ) and Performance
Qualification (PQ) in CSV in the pharmaceutical industry, and what is the
difference between them?
A2: In the pharmaceutical industry, Computer System Validation (CSV) is a
critical process that ensures that computerized systems used for
manufacturing, testing, and quality control comply with regulatory
requirements and are fit for their intended use. CSV includes various
stages of testing to verify and document the system's functionality and
performance. Two important stages of CSV are Operational Qualification
(OQ) and Performance Qualification (PQ). Let's delve into the definitions
and differences between these two stages:
Operational Qualification (OQ):
Operational Qualification focuses on verifying that the computerized
system operates according to its design specifications and meets
predefined functional requirements. It is conducted after the Installation
Qualification (IQ) phase, which ensures that the system is properly
installed. During OQ, the emphasis is on demonstrating that the system's
components, interfaces, and functions work correctly and consistently
under various operating conditions. Key aspects of OQ include:
1. Test Execution: Test scripts are developed based on system
requirements and specifications. These scripts cover a wide range of
scenarios to ensure the system's functionality is thoroughly tested.
2. Functional Testing: Each system function is tested to ensure it operates
as intended. This may include testing user interfaces, data entry,
calculations, data retrieval, and reporting.
3. Performance Testing: OQ also includes testing the system's
performance under normal operating conditions. This can involve
assessing response times, transaction throughput, and data retrieval
times.
4. Boundary Testing: Boundaries of the system's functionality are tested,
including input limits, error conditions, and exceptions.
5. Security and Access Control Testing: OQ verifies that the system's
security measures, such as user authentication and access controls, are
functioning correctly.
6. Data Integrity Testing: Data integrity controls, including data entry,
storage, retrieval, and audit trails, are validated.
7. Interface Testing: If the system interfaces with other systems or
instruments, OQ verifies that the data exchange and integration are
working as intended.
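Boundary testing (point 4 above) can be made concrete with a small sketch: exercise the specification limits themselves and values just outside them, and confirm the system accepts or rejects each as expected. The limits and data-entry function below are hypothetical:

```python
# Illustrative OQ-style boundary test: a data-entry function must accept
# values within its specified limits and reject values outside them.
# The limits and function are assumptions for the sketch.

LOWER_LIMIT, UPPER_LIMIT = 0.0, 100.0  # assumed specification limits

def enter_result(value):
    """Accept a numeric result only if it is within specification limits."""
    if not (LOWER_LIMIT <= value <= UPPER_LIMIT):
        raise ValueError(f"value {value} outside limits "
                         f"[{LOWER_LIMIT}, {UPPER_LIMIT}]")
    return value

def run_boundary_tests():
    """Test each limit and a value just outside each boundary."""
    cases = [(LOWER_LIMIT, True), (UPPER_LIMIT, True),
             (LOWER_LIMIT - 0.1, False), (UPPER_LIMIT + 0.1, False)]
    outcomes = []
    for value, should_accept in cases:
        try:
            enter_result(value)
            accepted = True
        except ValueError:
            accepted = False
        outcomes.append((value, "PASS" if accepted == should_accept else "FAIL"))
    return outcomes

print(run_boundary_tests())  # all four cases should report PASS
```

In an executed OQ protocol, each of these cases would be a documented test step with expected result, actual result, and a pass/fail verdict signed by the tester.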
Performance Qualification (PQ):
Performance Qualification focuses on demonstrating that the
computerized system consistently performs as expected under real-world
operating conditions. It is the final step of the CSV process and follows
successful completion of OQ. PQ evaluates the system's ability to
consistently and reliably support the intended processes and meet
regulatory requirements. Key aspects of PQ include:
1. Real-world Testing: The system is subjected to real-world scenarios,
data, and inputs that simulate actual operational conditions.
2. Process Simulation: PQ validates the system's performance in a
simulated production environment, including production volumes, load
variations, and multiple users.
3. Stability and Reliability: The system's stability and reliability are
evaluated over an extended period to ensure it can consistently perform
without errors.
4. Fail-over and Recovery Testing: If applicable, PQ tests the system's
ability to recover from failures, data loss, or system crashes.
5. Business Continuity: PQ assesses the system's ability to continue
functioning during planned and unplanned outages.
6. Batch Processing: If the system is used for batch processing, PQ
ensures that batch records are accurately generated and processed.
7. Regulatory Compliance: PQ verifies that the system generates accurate
and compliant records required by regulatory agencies.
Difference between OQ and PQ:
The primary difference between OQ and PQ lies in the focus of testing. OQ
primarily focuses on verifying that the system's design specifications and
functional requirements are met, whereas PQ emphasizes demonstrating
the system's consistent performance in real-world conditions, ensuring it
can reliably support production and quality processes. Both stages are
essential to ensure that the computerized system is validated and fit for
its intended use in the pharmaceutical industry.
Q3: How will you perform a risk assessment during CSV in the
pharmaceutical industry?
A3: Performing a risk assessment during Computer System Validation
(CSV) in the pharmaceutical industry is crucial to identify, evaluate, and
mitigate potential risks associated with the computerized system and its
impact on product quality, patient safety, and data integrity. Here's a
comprehensive approach to conducting a risk assessment during CSV:
1. Define Scope and Objectives:
Clearly define the scope of the risk assessment, including the
computerized system, its functionalities, interfaces, and intended use. Set
objectives for the risk assessment process.
2. Assemble a Cross-Functional Team:
Form a team of experts from various relevant disciplines, such as quality
assurance, IT, compliance, regulatory affairs, process owners, and subject
matter experts.
3. Identify Hazards and Potential Risks:
Identify potential hazards and risks associated with the computerized
system, its components, interfaces, data integrity, and impact on patient
safety, product quality, and regulatory compliance.
4. Risk Identification:
Use tools such as brainstorming, process mapping, and Failure Modes and
Effects Analysis (FMEA) to systematically identify potential failure modes,
vulnerabilities, and scenarios that could result in harm.
5. Risk Assessment:
Assess the identified risks based on severity, likelihood, and detectability.
Use a risk matrix to categorize risks into low, medium, and high levels of
risk.
6. Risk Evaluation:
Evaluate the assessed risks to determine their significance and prioritize
them for mitigation. Focus on risks that have the potential to affect
patient safety, product quality, data integrity, and regulatory compliance.
7. Risk Mitigation Strategies:
Develop risk mitigation strategies for high and medium-risk scenarios.
These strategies may include process changes, system enhancements,
additional controls, procedural safeguards, or training.
8. Documenting the Risk Assessment:
Document the entire risk assessment process, including identified risks,
their assessment, prioritization, and mitigation strategies. This
documentation is crucial for regulatory compliance and audit purposes.
9. Implement Mitigation Measures:
Implement the risk mitigation measures based on the strategies
developed in the previous step. Ensure that necessary changes are made
to the system, processes, or procedures.
10. Monitor and Review:
Continuously monitor the effectiveness of the implemented risk mitigation
measures. Regularly review and update the risk assessment based on
changes to the system or operational environment.
11. Document the Outcome:
Summarize the findings of the risk assessment, including identified risks,
assessment results, mitigation strategies, and their implementation
status. This documentation serves as evidence of a systematic approach
to risk management.
12. Regulatory Compliance:
Ensure that the risk assessment process aligns with relevant regulations
and guidelines, such as ICH Q9 (Quality Risk Management), FDA's 21 CFR
Part 11, and other industry-specific guidelines.
13. Continuous Improvement:
Use the insights gained from the risk assessment to improve the
validation process, enhance system functionality, and strengthen overall
quality management practices.
The risk assessment process is iterative and should be an integral part of
the CSV lifecycle. It ensures that potential risks are addressed proactively,
leading to a robust and compliant computerized system that supports
patient safety, product quality, and regulatory compliance.
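Steps 4 and 5 above (FMEA-style identification, then scoring by severity, likelihood, and detectability) can be sketched numerically. The 1-3 scales and classification thresholds below are illustrative assumptions, not values prescribed by ICH Q9:

```python
# Minimal sketch of FMEA-style risk scoring: rate severity, likelihood,
# and detectability (1 = best, 3 = worst) and classify the product into
# low/medium/high. Scales and thresholds are illustrative assumptions.

def risk_priority(severity, likelihood, detectability):
    """Risk priority number (RPN): higher means a worse, harder-to-detect risk."""
    return severity * likelihood * detectability

def classify(rpn):
    """Map an RPN (1..27 on 1-3 scales) to a risk category."""
    if rpn >= 18:
        return "high"
    if rpn >= 6:
        return "medium"
    return "low"

# Hypothetical failure modes for a computerized system:
risks = {
    "audit trail can be disabled":       (3, 2, 3),
    "report rounding differs from spec": (2, 2, 2),
    "login screen typo":                 (1, 1, 1),
}
for name, scores in risks.items():
    print(f"{name}: {classify(risk_priority(*scores))}")
```

High-scoring items would then drive the mitigation strategies in step 7, while low-scoring items may need no action beyond documentation.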
Q4: What kind of applications need computer system validation and why?
A4: Computer System Validation (CSV) is needed for applications that are
used in regulated industries, such as pharmaceuticals, medical devices,
biotechnology, and other industries where product quality, patient safety,
and data integrity are critical. The primary purpose of CSV is to ensure
that computerized systems operate reliably, consistently, and in
compliance with regulatory requirements. Here are some examples of
applications that require CSV and the reasons why:
1. Laboratory Information Management Systems (LIMS):
LIMS are used to manage laboratory workflows, data, and sample
tracking. They play a crucial role in maintaining data integrity and
traceability in laboratories. CSV ensures accurate data recording, sample
tracking, and adherence to testing and reporting procedures.
2. Electronic Document Management Systems (EDMS):
EDMS systems manage electronic documents, records, and workflows.
They are vital for maintaining controlled and organized documentation,
including SOPs, batch records, and regulatory submissions. CSV ensures
that documents are securely stored, accessible, and in compliance with
version control.
3. Quality Management Systems (QMS):
QMS systems manage quality-related processes, such as deviations,
CAPAs, change controls, and audits. These systems are critical for
maintaining compliance, identifying and resolving quality issues, and
tracking corrective actions. CSV ensures that quality processes are
consistent and well-documented.
4. Manufacturing Execution Systems (MES):
MES systems manage manufacturing processes, batch records,
equipment, and personnel. These systems ensure that manufacturing
operations are controlled, monitored, and compliant with GMP
requirements. CSV helps prevent errors and discrepancies in batch
records, ensuring product consistency and quality.
5. Enterprise Resource Planning (ERP) Systems:
ERP systems integrate various business processes, including inventory
management, procurement, finance, and human resources. In regulated
industries, ERP systems are used to track materials, ensure accurate
financial reporting, and maintain compliance with regulatory standards.
CSV ensures data accuracy and integrity within ERP modules.
6. Clinical Trial Management Systems (CTMS):
CTMS systems manage clinical trial data, including patient enrollment,
study protocols, and regulatory submissions. These systems help ensure
the integrity and accuracy of clinical trial data, critical for regulatory
submissions and patient safety. CSV safeguards the reliability of clinical
trial data.
7. Pharmacovigilance Systems:
Pharmacovigilance systems manage adverse event reporting and safety
surveillance for pharmaceutical products. These systems are crucial for
ensuring patient safety and regulatory compliance. CSV ensures that
adverse event data is accurately captured, assessed, and reported.
8. Regulatory Information Management (RIM) Systems:
RIM systems manage regulatory submissions, approvals, and compliance
information. These systems support the timely submission of regulatory
documents and the maintenance of regulatory compliance. CSV helps
ensure that regulatory information is accurate and up-to-date.
9. Process Control Systems:
Process control systems are used in manufacturing environments to
monitor and control critical process parameters. In industries like
pharmaceuticals and biotechnology, these systems ensure consistent
product quality and adherence to GMP requirements. CSV safeguards the
accuracy of process control data.
10. Data Analysis and Reporting Software:
Any software used for data analysis, reporting, and decision-making in
regulated environments should undergo CSV. This includes statistical
analysis tools, data visualization software, and reporting tools used to
generate data-driven insights.
Overall, CSV is necessary for any application that handles critical data,
supports regulatory compliance, impacts patient safety, and contributes
to product quality in regulated industries. It ensures that these
applications are developed, implemented, and maintained in a controlled
and documented manner to mitigate risks and maintain data integrity.
Q5: What are the phases in software development life cycle in
pharmaceutical industry?
A5: In the pharmaceutical industry, the software development life cycle
(SDLC) consists of several phases that ensure the proper development,
validation, and deployment of computerized systems used in various
processes. The SDLC phases in the pharmaceutical industry typically
include:
1. Requirements Definition and Analysis:
In this phase, the requirements for the software system are gathered from
stakeholders, users, and regulatory guidelines. These requirements are
analyzed, documented, and translated into functional and non-functional
specifications.
2. System Design:
During this phase, the detailed system design is created based on the
requirements. This includes designing the architecture, data flow, user
interfaces, and interactions. Design specifications are created, which will
guide the actual development process.
3. Coding and Programming:
The coding phase involves writing the actual software code based on the
design specifications. Programming practices must follow industry
standards and good coding practices to ensure maintainability,
traceability, and future modifications.
4. Testing:
Testing is a critical phase in the SDLC. It includes unit testing, integration
testing, system testing, and user acceptance testing (UAT). The software
is tested for functionality, accuracy, performance, security, and
compliance with requirements.
5. Validation and Qualification:
This phase is specific to the pharmaceutical industry. The software
undergoes validation to ensure that it meets regulatory requirements and
is fit for its intended use. This phase covers both verification (did we
build the system right?) and validation (did we build the right system?).
Documentation is
generated to demonstrate compliance.
6. Installation and Deployment:
Once the software has passed validation and testing, it is deployed to the
intended environment. Installation processes and procedures are followed
to ensure that the software is correctly set up.
7. Operation and Maintenance:
After deployment, the software enters the operational phase. This
involves ongoing monitoring, support, and maintenance. Regular updates,
bug fixes, and enhancements are performed as needed.
8. Change Management and Version Control:
Throughout the software's lifecycle, changes may be required due to user
feedback, regulatory updates, or evolving business needs. A structured
change management process ensures that any changes are documented,
tested, and validated to maintain the software's integrity.
9. Retirement and Decommissioning:
At the end of its useful life, the software is retired and decommissioned.
Data and information are archived, and any remaining regulatory
requirements are fulfilled. This phase ensures the proper closure of the
software's lifecycle.
10. Documentation and Reporting:
Throughout the SDLC, comprehensive documentation is generated to
provide evidence of compliance, traceability, and validation efforts. This
documentation includes user requirements, design specifications, test
plans, validation reports, and change control records.
Each phase of the SDLC plays a crucial role in ensuring that computerized
systems used in the pharmaceutical industry are developed, validated,
and maintained in a controlled and compliant manner. Regulatory
agencies, such as the FDA, require adherence to these phases to ensure
the safety, efficacy, and integrity of products and processes.
Q6: What are the V Model, Agile Model, and Waterfall Model, and what are
their differences?
A6: The V Model, Agile Model, and Waterfall Model are three distinct
software development methodologies, each with its own approach to
managing the development process. Here's an overview of each model
and their key differences:
Waterfall Model:
The Waterfall Model is a linear and sequential approach to software
development. It follows a structured step-by-step process, where each
phase must be completed before moving to the next. The key phases in
the Waterfall Model include requirements gathering, system design,
implementation, testing, deployment, and maintenance. This model is
suited for projects with well-defined and stable requirements, where
changes are less likely to occur. However, it can be rigid and less adaptive
to changing requirements.
Agile Model:
The Agile Model is an iterative and incremental approach that focuses on
collaboration, flexibility, and customer feedback. It breaks the
development process into small, manageable iterations or sprints. Each
iteration includes requirements gathering, design, coding, testing, and
delivery of a working increment of the software. Agile methods prioritize
customer satisfaction and embrace changing requirements even late in
the development process. Examples of Agile methodologies include
Scrum, Kanban, and Extreme Programming (XP).
V Model:
The V Model, also known as the Verification and Validation Model, is an
extension of the Waterfall Model. It emphasizes the relationship between
development phases and their corresponding testing phases. The V Model
involves a parallel development and testing process. For every
development phase, there is a corresponding testing phase, forming a "V"
shape. For example, the requirement phase is followed by the
requirement verification phase, design phase by design verification, and
so on. This model ensures that testing and verification are closely tied to
each development step.
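In CSV, one common pairing of specification phases with their verifying qualification phases follows the earlier IQ/OQ/PQ steps: URS is verified by PQ, FS by OQ, and DS by IQ. As a small sketch (assuming that pairing, which individual projects may adapt):

```python
# Sketch of the V Model's pairing of specification phases with their
# corresponding verification phases. This URS/FS/DS -> PQ/OQ/IQ pairing
# is a common CSV convention; projects may define it differently.

V_MODEL = [
    ("User Requirements Specification (URS)", "Performance Qualification (PQ)"),
    ("Functional Specification (FS)",         "Operational Qualification (OQ)"),
    ("Design Specification (DS)",             "Installation Qualification (IQ)"),
]

def verification_phase(spec_phase):
    """Return the testing phase that verifies a given specification phase."""
    for spec, test in V_MODEL:
        if spec == spec_phase:
            return test
    raise KeyError(spec_phase)

print(verification_phase("Functional Specification (FS)"))
# -> Operational Qualification (OQ)
```

Reading the list top to bottom gives the left arm of the "V" (specification), and bottom to top gives the right arm (verification).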
Differences:
1. Approach:
- Waterfall: Linear and sequential process.
- Agile: Iterative and incremental process.
- V Model: Combination of sequential development and parallel testing.
2. Flexibility:
- Waterfall: Less flexible to changing requirements once the project
starts.
- Agile: Highly adaptable to changing requirements throughout the
project.
- V Model: Somewhat adaptable due to the parallel testing phases.
3. Phases:
- Waterfall: Sequential phases with minimal overlap.
- Agile: Iterations with phases like planning, designing, coding, and
testing in each iteration.
- V Model: Parallel phases for development and testing.
4. Customer Involvement:
- Waterfall: Limited customer involvement in the development process.
- Agile: High customer involvement, continuous feedback, and
collaboration.
- V Model: Customer involvement mainly during the requirement phase.
5. Documentation:
- Waterfall: Emphasis on comprehensive documentation.
- Agile: Documentation is important but not excessive.
- V Model: Emphasis on documentation, particularly for testing and
validation.
6. Project Size:
- Waterfall: Best suited for small to medium-sized projects with stable
requirements.
- Agile: Suitable for various project sizes, particularly beneficial for
complex and evolving projects.
- V Model: Well-suited for projects with clearly defined requirements and
significant testing needs.
Ultimately, the choice of development methodology depends on project
requirements, timelines, team dynamics, and the level of flexibility
needed to accommodate changes during the development process.
Q7: How will you handle a discrepancy in the CSV lifecycle in the
pharmaceutical industry?
A7: Handling a discrepancy in the Computer System Validation (CSV)
lifecycle in the pharmaceutical industry requires a systematic approach to
identify the root cause, assess the impact, and implement appropriate
corrective and preventive actions. Here's a step-by-step process to handle
a discrepancy in CSV validation:
1. Identification and Documentation:
- Document the nature of the discrepancy, including its description,
location, and the stage of validation where it occurred.
- Assign a unique identifier to the discrepancy for tracking purposes.
- Capture all relevant details, such as date, time, personnel involved, and
any observed deviations from expected behavior.
2. Immediate Containment:
- If the discrepancy poses an immediate risk to patient safety, product
quality, or data integrity, take necessary steps to contain the issue. This
might involve stopping the affected process or system.
3. Root Cause Analysis:
- Assemble a cross-functional team with expertise in validation, IT, quality
assurance, and relevant business areas.
- Conduct a thorough investigation to identify the root cause of the
discrepancy.
- Use tools like fishbone diagrams, 5 Whys, or Failure Mode and Effects
Analysis (FMEA) to explore potential causes.
4. Impact Assessment:
- Evaluate the impact of the discrepancy on product quality, patient
safety, data integrity, and regulatory compliance.
- Determine whether the discrepancy has affected other related systems,
processes, or data.
5. Corrective and Preventive Actions (CAPAs):
- Develop a corrective action plan to address the immediate issue and
prevent recurrence.
- Define steps to rectify the discrepancy and bring the system back into
compliance.
- Identify preventive actions to mitigate the risk of similar discrepancies
occurring in the future.
6. Change Control:
- If the discrepancy requires changes to the system, process, or
documentation, initiate a change control process.
- Update relevant documents, such as validation protocols, standard
operating procedures, and work instructions, to reflect the corrective
actions taken.
7. Revalidation and Retesting:
- Determine whether the discrepancy necessitates revalidation or
additional testing of the affected system.
- Plan and execute the necessary revalidation activities, such as IQ, OQ,
and PQ, as applicable.
8. Documentation and Reporting:
- Maintain a comprehensive record of the discrepancy, investigation, and
corrective actions taken.
- Prepare a discrepancy report that outlines the incident, investigation
findings, root cause analysis, corrective actions, and preventive measures.
- Ensure timely reporting to relevant stakeholders, including quality
assurance, regulatory affairs, and senior management.
9. Regulatory Notifications:
- If the discrepancy impacts regulatory compliance, consider whether
notification to regulatory authorities is required. Consult with regulatory
affairs experts to determine the appropriate course of action.
10. Training and Communication:
- Provide training to personnel involved in the discrepancy and its
resolution to prevent recurrence.
- Communicate the findings and actions to relevant teams to enhance
awareness and prevent similar issues.
11. Follow-Up and Monitoring:
- Monitor the effectiveness of the corrective and preventive actions over
time.
- Conduct periodic reviews to verify that the discrepancy has been
effectively resolved and that the system remains in a compliant state.
12. Continuous Improvement:
- Use the lessons learned from the discrepancy to improve validation
processes, procedures, and documentation.
- Implement changes that can enhance the overall CSV process and
prevent similar discrepancies in the future.
Handling discrepancies in CSV validation is a critical aspect of ensuring
data integrity, product quality, and patient safety. It requires a proactive
and systematic approach to identify, address, and prevent issues
throughout the lifecycle of computerized systems in the pharmaceutical
industry.
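Step 1 above calls for a documented record with a unique identifier that can be tracked through the handling workflow. A hedged sketch of such a record follows; the field names, status values, and identifier format are illustrative assumptions, not any particular quality system's schema:

```python
# Hedged sketch of a discrepancy record: a unique tracking ID, description,
# validation stage, and a status that advances through a simple workflow.
# All names and statuses are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date

WORKFLOW = ["open", "under investigation", "CAPA defined", "closed"]

@dataclass
class Discrepancy:
    identifier: str          # unique tracking ID, e.g. "DISC-2023-014"
    description: str
    validation_stage: str    # stage where it occurred, e.g. "OQ"
    raised_on: date
    status: str = "open"
    actions: list = field(default_factory=list)

    def advance(self, action_note):
        """Record an action taken and move to the next workflow status."""
        self.actions.append(action_note)
        next_index = WORKFLOW.index(self.status) + 1
        if next_index < len(WORKFLOW):
            self.status = WORKFLOW[next_index]

disc = Discrepancy("DISC-2023-014",
                   "OQ step 7 response time above acceptance limit",
                   "OQ", date(2023, 8, 28))
disc.advance("Root cause: missing database index (5 Whys)")
disc.advance("CAPA: add index; retest OQ step 7")
print(disc.status)  # -> CAPA defined
```

A real system would add approvals, impact assessment fields, and links to change control records, but the core idea is the same: every discrepancy is uniquely identified and its resolution history is preserved.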
Q8: What is change control in the CSV lifecycle in the pharmaceutical
industry?
A8: Change control in the Computer System Validation (CSV) lifecycle
within the pharmaceutical industry refers to the systematic process of
managing and documenting changes to computerized systems, software,
hardware, or related components to ensure that these changes are
implemented in a controlled and compliant manner. Change control is a
fundamental aspect of maintaining the integrity, reliability, and regulatory
compliance of computerized systems throughout their lifecycle.
Key Elements of Change Control in CSV:
1. Change Request Initiation:
- Any proposed change to a computerized system or its components
starts with a formal change request. This request includes details such as
the reason for the change, the scope of the change, and the potential
impact on the system.
2. Change Assessment and Impact Analysis:
- A cross-functional team, including representatives from IT, quality
assurance, validation, and relevant business units, assesses the proposed
change.
- The team determines the potential impact of the change on system
functionality, data integrity, regulatory compliance, and other critical
factors.
3. Risk Assessment:
- A risk assessment is conducted to evaluate the potential risks
associated with the change. This assessment considers factors such as the
criticality of the system, the nature of the change, and potential impacts
on patient safety, product quality, and data integrity.
4. Change Evaluation and Approval:
- Based on the assessment and risk analysis, the change control board or
relevant decision-making body evaluates the change request.
- The decision is made to approve, reject, or require further analysis for
the proposed change.
5. Change Implementation:
- If the change is approved, an implementation plan is developed. This
plan includes details such as the timeline, responsible individuals, and
necessary resources.
- The change is executed according to the plan, which may involve
updating software, modifying configurations, or making hardware
adjustments.
6. Validation and Testing:
- Changes to computerized systems often require validation and testing
to ensure that the changes do not adversely affect the system's
functionality, data integrity, or compliance.
- Validation activities may include executing Installation Qualification (IQ),
Operational Qualification (OQ), and Performance Qualification (PQ)
protocols.
7. Documentation and Reporting:
- All changes and associated activities are documented in change control
records. These records capture the details of the change request,
assessment, risk analysis, implementation plan, testing results, and any
deviations or issues encountered.
- A change control report is prepared summarizing the entire change
process, including the rationale, assessment outcomes, and validation
results.
8. Verification and Approval:
- After implementation and testing, the changes are verified to ensure
that they were successfully executed and have achieved the desired
outcome.
- Verification and approval may involve a final review by the change
control board to confirm that the change has been properly executed and
meets the intended objectives.
9. Communication and Training:
- Stakeholders affected by the change, including system users, are
informed about the implemented change and any relevant training
required to adapt to the changes.
10. Post-Change Monitoring:
- A period of post-implementation monitoring ensures that the change
has not introduced unintended consequences or issues.
- Ongoing monitoring also helps confirm that the system continues to
operate as intended after the change.
Change control is a crucial process in maintaining the integrity of
computerized systems and ensuring that modifications are implemented
in a controlled manner to minimize risks and maintain compliance with
regulatory requirements.
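As a hedged illustration only, the controlled sequence described above (initiate, assess, approve, implement, verify) can be sketched as a small state machine that refuses to skip steps and records each transition. All class, field, and status names here are illustrative assumptions, not a standard change-control schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    INITIATED = "initiated"
    ASSESSED = "assessed"
    APPROVED = "approved"
    IMPLEMENTED = "implemented"
    VERIFIED = "verified"

@dataclass
class ChangeRequest:
    """Minimal change-control record (illustrative, not a compliant system)."""
    cr_id: str
    description: str
    risk_level: str = "unassessed"
    status: Status = Status.INITIATED
    history: list = field(default_factory=list)

    def _move(self, allowed_from, new_status, note):
        # Enforce the controlled sequence: no step may be skipped.
        if self.status not in allowed_from:
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.history.append((new_status.value, note))  # record each transition

    def assess(self, risk_level, note=""):
        self.risk_level = risk_level
        self._move({Status.INITIATED}, Status.ASSESSED, note)

    def approve(self, note=""):
        self._move({Status.ASSESSED}, Status.APPROVED, note)

    def implement(self, note=""):
        self._move({Status.APPROVED}, Status.IMPLEMENTED, note)

    def verify(self, note=""):
        self._move({Status.IMPLEMENTED}, Status.VERIFIED, note)
```

For example, calling `implement()` on a request that has not yet been approved raises an error, mirroring the rule that implementation may only follow change-board approval.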
Q9: What is a Requirements Traceability Matrix, why is it required, and
what are its contents?
A9: A Requirement Traceability Matrix (RTM) is a structured document
used in project management, software development, and quality
assurance to ensure that each requirement identified for a system,
software, or project is successfully fulfilled. The RTM serves as a tool to
track and manage the relationships between various requirements
throughout the project lifecycle, ensuring that all requirements are
properly implemented, tested, and verified.
Purpose and Importance of Requirement Traceability Matrix:
1. Ensuring Fulfillment: The primary purpose of an RTM is to ensure that
all requirements, whether functional, technical, or regulatory, are met
during the development, testing, and validation phases of a project.
2. Risk Mitigation: An RTM helps identify gaps, inconsistencies, and
ambiguities in requirements. By tracking each requirement's
implementation and testing, potential issues can be caught early and
addressed, reducing the risk of project failure or costly rework.
3. Regulatory Compliance: In regulated industries such as
pharmaceuticals or aerospace, an RTM provides evidence that all
requirements, including those mandated by regulations, have been
addressed and validated.
4. Verification and Validation: The RTM aids in verification by confirming
that each requirement has been implemented as intended. It also assists
in validation by showing that the implemented system meets the intended
business needs and user expectations.
Contents of a Requirement Traceability Matrix:
An RTM typically includes the following elements:
1. Requirement ID: A unique identifier assigned to each requirement for
easy reference and tracking.
2. Requirement Description: A clear and detailed description of the
requirement, often accompanied by additional information such as
priority, source, and rationale.
3. Design Specification: The corresponding design or development
specification that describes how the requirement will be implemented.
4. Test Cases: The test cases or scenarios developed to verify and validate
each requirement. This includes details such as input data, expected
outputs, and pass/fail criteria.
5. Validation Criteria: The criteria that will be used to determine if the
requirement has been successfully validated during user acceptance
testing.
6. Status: The current status of each requirement, indicating whether it
has been implemented, tested, validated, or any other relevant stage.
7. Change History: Any changes made to the requirement, including
modifications, updates, and related decisions.
8. Verification and Validation Results: A record of the outcomes of
verification and validation activities related to each requirement. This
includes the results of testing, inspections, and user acceptance.
9. Traceability Links: Links to related documents, such as user stories, use
cases, functional specifications, design documents, and test cases.
10. Comments and Notes: Any additional comments, notes, or
observations that provide context or explanations for the requirement's
status or implementation.
11. Approvals: Signatures or approvals from stakeholders or relevant
authorities, confirming that the requirement has been successfully
implemented, tested, and validated.
The RTM serves as a critical tool for project managers, business analysts,
developers, testers, and other team members to ensure the successful
and accurate execution of the project's requirements. It aids in
maintaining transparency, consistency, and accountability throughout the
project lifecycle, ultimately contributing to the delivery of a high-quality
end product that meets the defined business needs and objectives.
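In its simplest form, an RTM can be represented as a table keyed by requirement ID, with one routine check that every requirement is linked to at least one test case. The IDs and field names below are illustrative assumptions, not a standard schema:

```python
# Minimal RTM sketch: each requirement maps to design items, test cases,
# and a status. Real RTMs carry more columns (change history, approvals).
rtm = {
    "URS-001": {"description": "System shall enforce unique user logins",
                "design": ["DS-010"], "tests": ["OQ-101"],
                "status": "validated"},
    "URS-002": {"description": "System shall keep an audit trail of changes",
                "design": ["DS-011"], "tests": [],
                "status": "implemented"},
}

def coverage_gaps(rtm):
    """Return requirement IDs with no linked test case - candidates for rework."""
    return [rid for rid, row in rtm.items() if not row["tests"]]

print(coverage_gaps(rtm))  # ['URS-002']
```

A gap check like this is one concrete way the RTM supports the risk-mitigation purpose described above: untested requirements surface early rather than at the validation summary stage.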
Q10: How many environments need to be used for CSV validation of
GMP-relevant software?
A10: For CSV (Computer System Validation) of a GMP (Good
Manufacturing Practice) relevant software, typically three environments
are used: Development, Validation, and Production environments. Each
environment has a specific purpose in the validation process and helps
ensure the integrity and compliance of the software system.
1. Development Environment:
- Purpose: The development environment is where the software is
created, programmed, and configured by developers.
- Activities: Developers write and test code, build software components,
and integrate features.
- Characteristics: This environment is not intended for validation or
testing; it's focused on building and coding.
- Access: Limited to development team members.
2. Validation Environment:
- Purpose: The validation environment is where the software is tested
rigorously to ensure that it meets the predefined requirements and
regulatory standards.
- Activities: Testing, validation, and verification activities are performed
here, including unit testing, integration testing, system testing, and user
acceptance testing (UAT).
- Characteristics: The validation environment closely mirrors the
production environment and should be set up to simulate real-world
conditions.
- Access: Limited to validation team members, QA personnel, and
relevant stakeholders.
3. Production Environment:
- Purpose: The production environment is the live environment where the
validated software is used for its intended purpose.
- Activities: The software is used by end-users to perform actual tasks and
operations.
- Characteristics: The production environment should be stable, secure,
and continuously monitored to ensure data integrity and compliance.
- Access: Accessible to authorized users and stakeholders.
It's important to note that the three environments should be distinct and
isolated from each other to prevent any unintended interactions or risks to
the validated system. The data and configurations in these environments
should also be consistent to ensure accurate testing and validation
results.
In addition to these primary environments, some organizations may also
have a Staging or Pre-Production environment, which serves as an
intermediate step between the Validation and Production environments.
This environment is used for final testing and validation before software is
deployed to the live Production environment.
The use of these environments helps ensure that the software system is
thoroughly tested, validated, and ready for production use, while also
adhering to regulatory requirements and GMP standards.
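The three environments and their access restrictions can be summarized in a simple configuration mapping. The names, roles, and fields here are illustrative assumptions; the point is only that each environment is distinct, access-restricted, and serves one purpose:

```python
# Illustrative environment configuration (not a standard schema).
ENVIRONMENTS = {
    "development": {"purpose": "build and code",
                    "access": {"developers"}, "live_gxp_data": False},
    "validation":  {"purpose": "IQ/OQ/PQ and UAT testing",
                    "access": {"validation", "qa"}, "live_gxp_data": False},
    "production":  {"purpose": "live GxP operation",
                    "access": {"authorized_users"}, "live_gxp_data": True},
}

def can_access(environment, role):
    """Simple access check mirroring the 'Access' bullets above."""
    return role in ENVIRONMENTS[environment]["access"]

print(can_access("production", "developers"))  # False
```

Keeping such a mapping explicit makes the required isolation auditable: a developer role, for instance, has no access path into production.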
Q11: Describe the differences between 21 CFR Part 11 and Annex 11, and
what sets Annex 11 apart from 21 CFR Part 11.
A11: 21 CFR Part 11 and Annex 11 are both regulatory guidelines that
provide requirements for electronic records and electronic signatures in
the pharmaceutical industry. However, they are associated with different
regulatory bodies and cover slightly different aspects of electronic record-
keeping and compliance.
21 CFR Part 11:
- Regulatory Body: 21 CFR Part 11 is a regulation issued by the U.S. Food
and Drug Administration (FDA).
- Scope: It specifically addresses electronic records and electronic
signatures used in FDA-regulated industries, including pharmaceuticals,
biotechnology, and medical devices.
- Requirements: Part 11 outlines the criteria for ensuring the integrity,
authenticity, and reliability of electronic records and electronic signatures.
It covers various aspects, such as system validation, audit trails,
electronic signatures, access controls, and security measures.
- Applicability: Part 11 is applicable to all FDA-regulated organizations that
use electronic records and electronic signatures in their processes.
Annex 11:
- Regulatory Body: Annex 11 is part of EudraLex Volume 4, the EU Good
Manufacturing Practice (GMP) guidelines issued by the European Commission.
- Scope: It provides guidance on the use of computerized systems in the
GMP (Good Manufacturing Practice) environment, with a focus on ensuring
data integrity, reliability, and compliance in the pharmaceutical industry.
- Requirements: Annex 11 addresses the use of computerized systems,
including electronic records and electronic signatures, in the GMP context.
It emphasizes topics such as system validation, audit trails, access
controls, data integrity, and personnel training.
- Applicability: Annex 11 applies to pharmaceutical companies operating
within the European Union (EU) or exporting products to the EU market.
Differences and Points of Emphasis:
1. Regulatory Source: The most significant difference is that 21 CFR Part
11 is a binding regulation issued by the FDA, while Annex 11 is a
guideline within the EU GMP framework (EudraLex Volume 4).
2. Applicability: Part 11 is specific to FDA-regulated industries in the
United States, while Annex 11 applies to pharmaceutical companies
operating in the EU.
3. Scope: Part 11 covers a broader range of electronic records and
signatures used in FDA-regulated industries, while Annex 11 specifically
addresses computerized systems in the GMP environment.
4. Emphasis on GMP: Annex 11 places a strong emphasis on GMP
requirements and ensuring data integrity within the manufacturing and
quality control processes.
5. Validation: Both guidelines emphasize the importance of system
validation, but the details and terminology may vary between the two.
6. Audit Trails: Both guidelines discuss the need for comprehensive and
secure audit trails, but there might be variations in the specifics of
implementation.
7. Electronic Signatures: Both guidelines address electronic signatures,
but the terminology and requirements may differ.
8. Data Integrity: Annex 11 places a significant focus on data integrity
throughout the entire data lifecycle, including creation, modification,
storage, retrieval, and archiving.
It's important for pharmaceutical companies to be aware of the specific
requirements of the regulatory bodies in their respective regions and to
ensure compliance with the relevant guidelines, whether it's 21 CFR Part
11 in the U.S. or Annex 11 in the EU.
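Both frameworks call for secure, comprehensive audit trails, and the shared principle can be illustrated with a small append-only log in which each entry hashes its predecessor, so any retroactive edit breaks the chain. This is a conceptual sketch of tamper evidence, not a compliant audit-trail implementation, and all field names are assumptions:

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(trail, user, action):
    """Append a chained audit-trail entry (illustrative only)."""
    prev_hash = trail[-1]["hash"] if trail else GENESIS
    entry = {"ts": datetime.now(timezone.utc).isoformat(),
             "user": user, "action": action, "prev": prev_hash}
    # Hash the entry content plus the previous hash to chain entries together.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)
    return trail

def verify_trail(trail):
    """Recompute every hash; any altered entry or broken link fails."""
    prev = GENESIS
    for e in trail:
        body = {k: v for k, v in e.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or digest != e["hash"]:
            return False
        prev = e["hash"]
    return True
```

If any entry's content is silently modified after the fact, `verify_trail` returns `False`, which is the behavior audit-trail requirements in both Part 11 and Annex 11 are designed to ensure by procedural and technical controls.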
Q12: How do you decide whether a software system is GxP relevant in the
pharmaceutical industry?
A12: Determining whether a software system is GxP (Good Practice)
relevant in the pharmaceutical industry involves assessing its impact on
processes that affect product quality, patient safety, and regulatory
compliance. GxP encompasses various regulations and guidelines, such as
GMP (Good Manufacturing Practice), GLP (Good Laboratory Practice), and
GCP (Good Clinical Practice). Here's how you can decide if a software
system is GxP relevant:
1. System Functionality:
- Consider whether the software system directly or indirectly affects any
GxP processes, including manufacturing, testing, packaging, labeling,
distribution, and quality control.
- Assess if the software system is involved in any decision-making, data
capture, or reporting that affects regulatory compliance, product quality,
or patient safety.
2. Data Integrity:
- Examine if the software system handles critical data, such as raw data
from laboratory equipment, batch records, or electronic signatures.
- Determine whether the system maintains data integrity, audit trails, and
electronic signatures in accordance with GxP requirements.
3. Process Control and Automation:
- Evaluate whether the software system controls and automates critical
manufacturing or testing processes that impact product quality and
safety.
- Consider whether the system ensures consistent and accurate execution
of GxP processes.
4. Documentation and Reporting:
- Determine if the software system generates reports, documents, or
records that need to comply with GxP regulations and guidelines.
- Check if the system provides traceability and auditability for all relevant
activities.
5. Regulatory Requirements:
- Review applicable regulatory requirements, such as FDA regulations, EU
directives, ICH guidelines, and local regulations, to identify if the software
system falls under their scope.
- Check if the software system needs to support compliance with specific
regulations, such as 21 CFR Part 11 or Annex 11.
6. Product Impact:
- Assess the impact of the software system on the final pharmaceutical
product's quality, safety, efficacy, or patient outcomes.
- Determine if the software system influences the batch release process,
stability studies, or other critical aspects of product quality assurance.
7. Patient Safety:
- Consider whether the software system plays a role in clinical trials,
adverse event reporting, patient data management, or pharmacovigilance
activities.
8. Supplier Qualification:
- Evaluate the software system's vendor or supplier to ensure they follow
GxP principles and provide necessary documentation.
9. Risk Assessment:
- Conduct a risk assessment to identify potential risks associated with the
software system's usage in GxP processes.
- Determine the criticality of the software's impact on product quality,
patient safety, and regulatory compliance.
10. Validation Requirements:
- Assess whether the software system requires validation activities to
demonstrate its fitness for intended use, data integrity, and regulatory
compliance.
Ultimately, the decision to classify a software system as GxP relevant
should involve cross-functional collaboration between IT, quality
assurance, regulatory affairs, and relevant business units. It's crucial to
thoroughly analyze the software system's functions, impact, and
compliance requirements to make an informed determination.
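The assessment criteria above can be condensed into a first-pass screening checklist, where any affirmative answer flags the system as a candidate for formal GxP classification and validation. The questions paraphrase the list above and the function is a hedged sketch, not a substitute for the cross-functional assessment:

```python
# First-pass GxP-relevance screen (illustrative; final classification
# requires cross-functional review by QA, IT, and regulatory affairs).
SCREENING_QUESTIONS = [
    "Does the system support a GxP process (manufacturing, QC, distribution)?",
    "Does it create, modify, or store GxP records or electronic signatures?",
    "Does it control or automate a process affecting product quality?",
    "Could a failure impact patient safety or regulatory compliance?",
]

def is_gxp_relevant(answers):
    """answers: list of booleans, one per screening question, in order."""
    # A single "yes" is enough to trigger further assessment.
    return any(answers)

print(is_gxp_relevant([False, True, False, False]))  # True
```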
Q13: What are the CSV validation deliverables required for each category
of software?
A13: Computer System Validation (CSV) deliverables can vary based on
the categories of software and their impact on GxP processes in the
pharmaceutical industry. Here are the typical CSV deliverables for
different categories of software:
1. Category 1 - GxP Impacting Software:
- User Requirements Specification (URS): Clearly defines the user's
functional requirements and expectations for the software.
- Functional Specification (FS): Detailed description of how the software
will meet the defined user requirements.
- Design Specification (DS): Detailed design of the software including
architecture, data flows, and user interfaces.
- Risk Assessment: Documented assessment of potential risks associated
with the software's usage, data integrity, and impact on GxP processes.
- Validation Plan: Outlines the approach, scope, responsibilities, and
resources for the validation process.
- Test Protocols (IQ, OQ, PQ): Detailed scripts for Installation Qualification,
Operational Qualification, and Performance Qualification testing.
- Traceability Matrix: Links requirements to tests and verifies that each
requirement is adequately tested and documented.
- Validation Summary Report: Summarizes the validation process, results,
deviations, and overall compliance status.
- User Training Documentation: Describes how users should interact with
and operate the software to ensure data integrity and compliance.
- Change Control Documentation: Tracks any changes made to the
software and their impact on validation status.
2. Category 2 - Non-GxP Impacting Software:
- Risk Assessment: A simplified assessment of potential risks associated
with the software's usage.
- Validation Plan: A streamlined validation plan focusing on critical
aspects of the software.
- Traceability Matrix: Links requirements to tests to ensure basic
functionality is verified.
- Validation Summary Report: A concise summary of validation activities
and outcomes.
- User Training Documentation: Basic user training materials to ensure
accurate usage.
- Change Control Documentation: Tracks significant changes that impact
the software's validation status.
3. Category 3 - Commercial Off-The-Shelf (COTS) Software:
- Risk Assessment: An assessment of potential risks considering the
software's intended use.
- Vendor Assessment: Documentation on the vendor's quality system and
validation practices.
- Validation Plan: An overview of how the software will be validated and
integrated into GxP processes.
- Installation Qualification (IQ) Record: Documentation confirming
successful installation of the software.
- Functional Testing Records: Basic functional testing results ensuring the
software works as intended.
- Validation Summary Report: A summary of the validation activities and
outcomes.
The level of detail and complexity of these deliverables may vary based
on factors such as the software's impact on GxP processes, criticality,
complexity, and regulatory requirements. It's essential to align the
validation approach and deliverables with the software's intended use and
the associated risks. Cross-functional collaboration between IT, quality
assurance, and business units is crucial to ensure that the appropriate
level of validation is conducted and documented for each category of
software.
Q14: What are open and closed systems?
What are the five additional requirements for open systems compared to
closed systems?
A14: In the context of computer systems and software validation in the
pharmaceutical industry, "open systems" and "closed systems" refer to
different types of software environments:
Closed System:
Under 21 CFR Part 11, a closed system is an environment in which system
access is controlled by the persons who are responsible for the content of
the electronic records on that system. Because the record owners control
access, the system's behavior is easier to predict and control, and
changes are typically managed through a formal change control process.
Open System:
An open system, by contrast, is an environment in which system access is
not controlled by the persons responsible for the content of the
electronic records, for example a system hosted by a third party or one
that exchanges data over public networks. Open systems offer flexibility,
integration with third-party software, and interoperability, but the
reduced control over access introduces additional complexity and
potential risks.
When it comes to computer systems validation in the pharmaceutical
industry, there are additional requirements for open systems compared to
closed systems:
1. Data Integrity:
Open systems need more robust data integrity controls to ensure the
accuracy, completeness, and reliability of data exchanged between
systems.
2. Security and Access Control:
Open systems require more advanced security measures to prevent
unauthorized access and protect sensitive data during information
exchange.
3. Interoperability and Compatibility:
Open systems must be designed to seamlessly communicate and
integrate with other systems, requiring compatibility testing and
validation.
4. Change Management:
Open systems may undergo frequent changes due to evolving external
systems or data sources. A robust change management process is needed
to control updates and assess their impact.
5. Risk Assessment and Mitigation:
Open systems introduce additional risks related to data integrity, security
breaches, and interoperability issues. A thorough risk assessment and
mitigation strategy are essential to address these risks.
Overall, open systems provide increased flexibility and potential benefits
through their ability to interact with other systems, but they also require
more comprehensive validation efforts and stringent controls to ensure
data integrity, security, and compliance with regulatory requirements. It's
important for pharmaceutical companies to carefully assess the nature of
the system (open or closed) and tailor their validation approach and
requirements accordingly.
Q15: How are document approval and execution performed in the CSV
lifecycle in the pharmaceutical industry?
A15: In the computer system validation (CSV) lifecycle in the
pharmaceutical industry, document approval and execution play crucial
roles in ensuring that the validation process is conducted effectively and
in compliance with regulatory requirements. Here's how document
approval and execution are typically managed:
Document Approval:
Document approval involves the process of reviewing and approving
validation-related documents before they are used for validation activities.
This process ensures that the documents are accurate, complete, and
align with regulatory expectations. The steps involved in document
approval include:
1. Document Preparation: Validation-related documents are created,
including validation plans, protocols, test scripts, and reports.
2. Document Review: The documents are reviewed by designated
personnel, such as validation team members, quality assurance (QA)
personnel, and subject matter experts. The review ensures that the
content is accurate, consistent, and compliant with regulatory standards.
3. Document Approval: After the review, the documents are submitted for
approval to authorized personnel, such as validation managers, project
managers, and QA managers. Approval signifies that the documents meet
the required standards and can be used for validation activities.
4. Document Version Control: Documents are often assigned version
numbers or revisions to track changes and updates. Any changes made to
the documents are documented and reviewed to ensure they do not
compromise the integrity of the validation process.
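The approval workflow above amounts to a simple state gate: a document may only be used for execution once it has been reviewed and approved, and any revision sends it back through the cycle under a new version. The sketch below is illustrative, with assumed state names, not a document management system:

```python
class Document:
    """Minimal draft -> reviewed -> approved lifecycle (illustrative only)."""

    def __init__(self, doc_id):
        self.doc_id = doc_id
        self.version = 0
        self.state = "draft"

    def review(self):
        if self.state != "draft":
            raise ValueError("only drafts can be reviewed")
        self.state = "reviewed"

    def approve(self):
        if self.state != "reviewed":
            raise ValueError("review must precede approval")
        self.state = "approved"
        self.version += 1  # each approval issues a new effective version

    def revise(self):
        # Any change returns the document to draft for re-review.
        self.state = "draft"
```

Attempting to approve a document that has not been reviewed raises an error, which mirrors the rule that approval always follows a documented review.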
Execution of Validation Activities:
Execution refers to the process of conducting the actual validation
activities as outlined in the approved validation documents. This phase
ensures that the system or software meets its intended specifications and
functions as expected. The steps involved in execution include:
1. Test Script Execution: Following the approved test scripts, the
validation team executes various tests on the system or software. These
tests may include functionality tests, performance tests, security tests,
and more.
2. Data Collection: During test script execution, data is collected to
document the outcomes of the tests. Data collected includes
observations, measurements, screenshots, and any deviations
encountered.
3. Issue Identification: If any issues or deviations are identified during test
script execution, they are documented and reported. These issues may
include system failures, unexpected behaviors, or discrepancies from
expected outcomes.
4. Issue Resolution: Any identified issues are investigated and resolved.
The resolution process involves determining the root cause of the issue
and implementing corrective and preventive actions to address it.
5. Review and Approval: The results of test script execution, including
data collected and issue resolution, are reviewed by validation team
members and QA personnel. Once the results are deemed satisfactory and
compliant, they are approved.
6. Documentation: The outcomes of test script execution, including any
issues encountered and their resolutions, are documented in validation
reports. These reports provide a comprehensive overview of the validation
activities conducted.
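The execution steps above can be sketched as a small test-record helper that captures expected versus actual results and flags deviations for follow-up. Field names and the deviation-numbering scheme are assumptions for illustration:

```python
# Illustrative test-script execution record (not a validated test tool).
def execute_step(step_id, expected, actual):
    record = {
        "step": step_id,
        "expected": expected,
        "actual": actual,
        "result": "pass" if actual == expected else "fail",
    }
    if record["result"] == "fail":
        # Failed steps get a deviation reference for investigation.
        record["deviation"] = f"DEV-{step_id}"
    return record

run = [execute_step("TS-01", "login accepted", "login accepted"),
       execute_step("TS-02", "record locked", "record editable")]
failed = [r["step"] for r in run if r["result"] == "fail"]
print(failed)  # ['TS-02']
```

Each record carries the data needed for the later review and approval step: what was expected, what was observed, and whether a deviation was opened.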
Both document approval and execution ensure that the validation process
is thorough, accurate, and compliant with regulatory requirements. These
processes contribute to the overall quality and reliability of computer
systems used in the pharmaceutical industry.
Q16: What is computer system assurance?
A16: Computer System Assurance (CSA) refers to the comprehensive and
systematic approach taken by organizations to ensure the reliability,
security, and compliance of computer systems and software applications.
CSA encompasses a wide range of activities aimed at providing
confidence in the proper functioning and integrity of computer systems
used in various industries, including pharmaceuticals, healthcare, finance,
and more. It goes beyond traditional validation processes and focuses on
the ongoing assurance of the entire computerized environment.
Key components and aspects of Computer System Assurance include:
1. Risk Management: Identifying, assessing, and managing risks
associated with computer systems to ensure data integrity, patient safety,
and compliance with regulatory requirements.
2. Quality Management: Implementing quality management practices to
maintain the reliability, accuracy, and consistency of computer systems
throughout their lifecycle.
3. Lifecycle Management: Addressing the full lifecycle of computer
systems, from planning and design to operation, maintenance, and
retirement, while ensuring compliance with relevant regulations and
guidelines.
4. Change Control: Managing changes to computer systems, software, and
configurations in a controlled and documented manner to prevent
unintended consequences and maintain system integrity.
5. Data Integrity: Ensuring the accuracy, consistency, and reliability of
data generated and processed by computer systems, including preventing
unauthorized changes, deletions, or tampering.
6. Validation and Verification: Conducting thorough validation and
verification activities to ensure that computer systems meet their
intended specifications, functions, and user requirements.
7. Audit and Inspection Readiness: Preparing computer systems and
associated documentation to be ready for regulatory audits and
inspections.
8. Documentation: Creating and maintaining accurate and comprehensive
documentation, including procedures, policies, user manuals, and
validation documentation, to support CSA efforts.
9. Cybersecurity: Implementing robust cybersecurity measures to protect
computer systems from unauthorized access, data breaches, and other
cyber threats.
10. Training and Competency: Ensuring that personnel using, managing,
and maintaining computer systems are adequately trained and competent
in their roles.
11. Continuous Improvement: Continuously monitoring, evaluating, and
improving computer systems to adapt to changing technologies,
regulations, and business needs.
CSA emphasizes the holistic approach of maintaining and assuring the
performance, reliability, and compliance of computer systems over time. It
aligns with the principles of good manufacturing practices (GMP), quality
management systems (QMS), and risk-based decision-making. Ultimately,
CSA contributes to the overall assurance that computer systems
consistently produce accurate and reliable results while meeting
regulatory and business requirements.
Q17: Summarize the contents of a Validation Summary Report.
A17: A Validation Summary Report is a comprehensive document that
provides a summary of the validation activities, results, and conclusions
related to the validation of a system, process, or equipment in various
industries, including pharmaceuticals. The contents of a Validation
Summary Report typically include:
1. Introduction:
- Brief overview of the system, process, or equipment being validated.
- Purpose and scope of the validation activities.
- Reference to relevant documentation and regulations.
2. Validation Approach:
- Description of the validation strategy, methodology, and approach
adopted.
- Explanation of any risk-based decisions and rationale for the approach
taken.
3. Validation Activities:
- List of validation activities performed, including IQ, OQ, PQ, etc.
- Description of the testing procedures, protocols, and test cases
executed.
- Mention of any deviations, changes, or unexpected events encountered
during the validation process.
4. Results:
- Summary of the results obtained from each validation activity.
- Presentation of data, observations, and measurements collected during
testing.
- Comparison of actual results against acceptance criteria and
specifications.
5. Critical Findings and Deviations:
- Identification of any deviations, non-conformances, or discrepancies
encountered during validation.
- Explanation of the impact of these findings on the validation process
and the system's integrity.
6. Conclusions:
- Overall assessment of the validation results and their alignment with
pre-defined criteria.
- Determination of whether the system, process, or equipment meets the
required standards and specifications.
- Statement of the system's suitability for its intended use.
7. Recommendations and Actions:
- Any corrective actions, preventive actions, or follow-up measures to
address deviations or findings.
- Recommendations for improvements, modifications, or enhancements
based on validation outcomes.
8. Final Approval:
- Signatures and approvals of relevant stakeholders, including validation
team members, quality assurance, and management.
9. Appendices:
- Detailed data and test results from each validation phase.
- Copies of protocols, standard operating procedures, and other relevant
documentation.
- References to relevant regulations, guidelines, and standards.
10. Annexes:
- Supplementary information, charts, graphs, or diagrams that provide
additional context to the validation process and results.
The Validation Summary Report serves as a comprehensive
documentation of the validation efforts and outcomes, providing a clear
and concise overview for regulatory agencies, auditors, and other
stakeholders. It helps demonstrate compliance with regulatory
requirements and confirms the suitability of the system, process, or
equipment for its intended purpose.
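The comparison of actual results against acceptance criteria (item 4 above) can be sketched as a simple check. This is a minimal illustration only; the test IDs, criteria, and values are hypothetical, not drawn from any specific protocol.

```python
# Illustrative sketch: comparing recorded test results against acceptance
# criteria for a Validation Summary Report. All IDs and values are
# hypothetical examples.

def evaluate_results(acceptance_criteria, actual_results):
    """Return a per-test pass/fail verdict and an overall outcome."""
    verdicts = {}
    for test_id, criterion in acceptance_criteria.items():
        actual = actual_results.get(test_id)
        # A test passes only if a result was recorded and the
        # criterion predicate holds for it.
        verdicts[test_id] = actual is not None and criterion(actual)
    overall = all(verdicts.values())
    return verdicts, overall

criteria = {
    "OQ-01": lambda v: v == "login rejected",  # invalid credentials refused
    "PQ-03": lambda v: v <= 2.0,               # response time in seconds
}
results = {"OQ-01": "login rejected", "PQ-03": 1.4}

verdicts, overall = evaluate_results(criteria, results)
```

In practice the verdicts, along with any deviations, would feed directly into the Results and Conclusions sections of the report.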
Q18: Describe the Contents of validation plan for CSV validation project
A18: A Validation Plan for a Computer System Validation (CSV) project is a
comprehensive document that outlines the strategy, scope, objectives,
and approach for validating a software application or computerized
system. The plan serves as a roadmap for the entire validation process
and provides a clear framework for all stakeholders involved. The contents
of a typical Validation Plan for a CSV project include:
1. Introduction:
- Purpose and scope of the validation plan.
- Overview of the system or software application being validated.
- Identification of key stakeholders and their roles.
2. Validation Strategy and Approach:
- Explanation of the overall validation strategy, including the rationale for
the approach chosen.
- Description of the risk-based approach used to prioritize validation
efforts.
- Explanation of how the project aligns with regulatory requirements and
industry best practices.
3. Project Organization:
- Listing of project team members, their roles, and responsibilities.
- Communication and reporting structure within the project team.
4. Scope and Objectives:
- Definition of the system boundaries, including in-scope and out-of-scope
components.
- Clear statement of the validation objectives and goals.
5. System Description:
- Detailed description of the system or software being validated.
- Overview of its functionalities, features, and intended use.
6. Regulatory and Standards Compliance:
- Reference to relevant regulations, guidelines, and standards that the
project aims to comply with.
7. Validation Phases and Deliverables:
- Breakdown of the validation activities into phases (IQ, OQ, PQ, etc.).
- List of key deliverables expected at each validation phase.
- Explanation of what each deliverable entails and its purpose.
8. Validation Requirements:
- Identification of validation requirements, including functional and
technical requirements.
- Traceability matrix linking requirements to specific tests and
documentation.
9. Acceptance Criteria:
- Establishment of clear and measurable acceptance criteria for each
validation phase.
- Definition of pass/fail criteria for tests and activities.
10. Risk Assessment:
- Description of the risk assessment process undertaken, including
identification of critical areas.
- Explanation of how identified risks will be managed and mitigated.
11. Testing Approach:
- Overview of the testing methodologies and techniques to be used.
- Description of how tests will be designed, executed, and documented.
12. Change Control and Deviation Management:
- Explanation of how changes to the system or deviations from the plan
will be managed.
- Description of the change control process and how deviations will be
documented and resolved.
13. Training and Qualification:
- Outline of the training requirements for project team members.
- Explanation of how team members' qualifications will be verified.
14. Validation Schedule and Timeline:
- Presentation of the project schedule, including key milestones and
timelines.
- Gantt chart or timeline illustrating the project phases and activities.
15. Documentation and Reporting:
- Description of the documentation standards to be followed.
- Explanation of the reporting structure, including progress reports and
final documentation.
16. Approval and Sign-off:
- Sign-off section for validation plan approval by relevant stakeholders.
The Validation Plan provides a structured approach to ensure that the CSV
project is conducted systematically, aligns with regulatory requirements,
and produces reliable and compliant results. It serves as a reference
document for project execution, ensuring that all activities are carried out
as planned and that the software application or system is validated
effectively.
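The traceability matrix mentioned in item 8 above can be pictured as a simple mapping from requirements to the tests that cover them. The requirement and test IDs below are illustrative assumptions.

```python
# Hypothetical sketch of a requirements traceability matrix linking
# user requirements to the test cases that cover them.

trace_matrix = {
    "URS-001": ["OQ-TC-01", "OQ-TC-02"],  # requirement -> covering tests
    "URS-002": ["PQ-TC-05"],
    "URS-003": [],                        # no coverage yet
}

def uncovered_requirements(matrix):
    """Requirements with no linked test case: gaps to close before execution."""
    return sorted(req for req, tests in matrix.items() if not tests)
```

Running `uncovered_requirements(trace_matrix)` flags `URS-003`, exactly the kind of coverage gap the traceability matrix exists to expose before testing begins.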
Q19: What is CSV Qualification summary report?
A19: A Computer System Validation (CSV) Qualification Summary Report
is a comprehensive document that provides an overview of the validation
process and results for a software application or computerized system in
the pharmaceutical industry. It serves as a summary of the entire
validation effort, capturing key information and findings from the various
validation phases, including Installation Qualification (IQ), Operational
Qualification (OQ), and Performance Qualification (PQ). The report is
intended to provide a clear and concise summary of the validation
activities and outcomes to stakeholders, regulatory authorities, and other
interested parties.
The content of a CSV Qualification Summary Report typically includes:
1. Introduction:
- Overview of the purpose and scope of the report.
- Explanation of the system being validated and its intended use.
2. Project Summary:
- Brief description of the project background, objectives, and goals.
- Identification of key stakeholders and their roles.
3. Validation Strategy and Approach:
- Summary of the validation approach used, including any risk-based
considerations.
- Explanation of the rationale behind the chosen strategy.
4. System Description:
- Concise overview of the system or software application, including its
functionalities and features.
5. Validation Phases:
- Summary of the activities carried out in each validation phase (IQ, OQ,
PQ).
- Overview of the key tests, test scripts, and test cases executed.
6. Acceptance Criteria:
- Presentation of the acceptance criteria used to determine the success or
failure of each test.
7. Test Results and Findings:
- Summary of the test results obtained during each validation phase.
- Highlighting of any deviations, discrepancies, or issues encountered and
their resolutions.
8. Summary of Critical Process Parameters:
- Listing of critical process parameters identified during the validation.
- Explanation of their significance and impact on the system's
performance.
9. Risk Assessment:
- Summary of the risk assessment process conducted, including identified
risks and mitigations.
10. Change Control and Deviations:
- Overview of any changes made to the system during the validation
process.
- Explanation of how deviations were handled and resolved.
11. Documentation and Training:
- Summary of the documentation generated as part of the validation
process.
- Explanation of the training provided to project team members.
12. Conclusion:
- Overall assessment of the validation effort and its success in meeting
the objectives.
- Statement on whether the system is deemed validated and ready for
use.
13. Recommendations:
- Any recommendations or actions for ongoing maintenance, monitoring,
or improvement.
14. Signatures and Approvals:
- Sign-off section for approval by relevant stakeholders, including project
team members and management.
The CSV Qualification Summary Report serves as a concise record of the
validation process, results, and conclusions. It provides stakeholders with
a high-level view of the validation effort and helps ensure transparency,
compliance, and accountability throughout the validation lifecycle.
Q20: How is CAPA evaluated in the CSV lifecycle?
A20: Corrective and Preventive Actions (CAPA) play a crucial role in the
Computer System Validation (CSV) lifecycle to address and rectify issues,
deviations, and non-conformances identified during the validation process.
Evaluating CAPA within the CSV lifecycle involves a systematic approach
to ensure that any identified issues are appropriately investigated,
corrected, and prevented from recurring. Here's how CAPA is evaluated in
the CSV lifecycle:
1. Identification of Issues:
During various validation phases (e.g., IQ, OQ, PQ), discrepancies,
deviations, or non-conformances may be identified. These issues could
include deviations from requirements, failures in tests, or other problems
affecting the system's integrity or functionality.
2. CAPA Initiation:
When an issue is identified, the CAPA process is initiated. A detailed
investigation is conducted to determine the root cause of the issue. This
may involve examining the system, reviewing documentation, and
analyzing relevant data.
3. Root Cause Analysis:
The investigation aims to identify the underlying reasons for the issue.
Root cause analysis involves a thorough examination of the process,
system, personnel, and environmental factors that contributed to the
problem.
4. CAPA Plan Development:
Based on the findings of the root cause analysis, a CAPA plan is
developed. This plan outlines the corrective actions to address the
immediate issue and the preventive actions to prevent similar issues from
occurring in the future.
5. Implementation of Corrective Actions:
Corrective actions involve addressing the immediate problem identified.
This could include fixing the issue, modifying the system, updating
documentation, or making other necessary changes.
6. Implementation of Preventive Actions:
Preventive actions involve addressing the root cause of the issue to
prevent its recurrence. This could include process improvements, training,
procedural changes, or system enhancements.
7. Verification and Effectiveness:
Before closing the CAPA, it's essential to verify that the corrective and
preventive actions taken are effective. This may involve retesting the
affected system components, reviewing updated documentation, and
confirming that the issue has been resolved.
8. CAPA Review and Approval:
The CAPA process should be well-documented and reviewed by relevant
stakeholders, including quality assurance and management. The CAPA
plan, actions taken, and their effectiveness are reviewed and approved
before closing the CAPA.
9. CAPA Closure:
Once the corrective and preventive actions have been implemented and
verified as effective, the CAPA can be closed. A formal closure report is
generated, summarizing the issue, actions taken, and their outcomes.
10. Documentation and Reporting:
All aspects of the CAPA process, including the initial issue, investigation,
actions taken, verification, and closure, are documented. These records
serve as a historical record of the issue and its resolution.
11. CAPA Effect on CSV Lifecycle:
The successful closure of CAPA ensures that any identified issues are
addressed and the system is brought into compliance. CAPA outcomes are
considered during the final review and approval of the validation efforts,
as they demonstrate the system's integrity and readiness for use.
Evaluating CAPA in the CSV lifecycle ensures that any deviations or non-
conformances identified during the validation process are appropriately
managed, documented, and resolved. This contributes to the overall
quality, compliance, and reliability of the validated computerized system.
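The CAPA lifecycle described above (initiation through closure) can be sketched as a record that only moves between allowed statuses. The status names and fields are illustrative assumptions, not a prescribed model.

```python
# Minimal sketch of a CAPA record and its lifecycle, following the steps
# above. Status names, transitions, and fields are illustrative.

ALLOWED_TRANSITIONS = {
    "open": {"investigation"},
    "investigation": {"actions_in_progress"},
    "actions_in_progress": {"verification"},
    "verification": {"closed", "actions_in_progress"},  # reopen if ineffective
}

class CapaRecord:
    def __init__(self, capa_id, issue):
        self.capa_id = capa_id
        self.issue = issue
        self.status = "open"
        self.history = ["open"]  # audit trail of status changes

    def advance(self, new_status):
        """Move the CAPA to a new status only along an allowed transition."""
        if new_status not in ALLOWED_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.history.append(new_status)

capa = CapaRecord("CAPA-023", "OQ test failure: audit trail gap")
for step in ["investigation", "actions_in_progress", "verification", "closed"]:
    capa.advance(step)
```

The point of the transition table is that a CAPA cannot jump straight from "open" to "closed": the investigation, action, and verification steps cannot be skipped, mirroring the workflow above.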
Q21: How do you perform a system assessment?
A21: Performing a system assessment is a crucial step in the Computer
System Validation (CSV) process to ensure that a software system meets
regulatory requirements, business needs, and quality standards. Here's
how you can perform a comprehensive system assessment:
1. Define Scope and Objectives:
Clearly define the scope of the system assessment, including the specific
software application, its functionalities, and the intended use. Identify the
objectives of the assessment, such as ensuring compliance, functionality,
and security.
2. Gather Requirements:
Collect and document the system requirements, including functional,
technical, regulatory, and business requirements. This information serves
as a benchmark for evaluating the system's capabilities.
3. Review Documentation:
Examine existing documentation related to the software system, such as
user requirements, technical specifications, design documents, and user
manuals. This helps in understanding the system's architecture, design,
and intended use.
4. Functional Assessment:
Evaluate the system's functionalities against the documented user
requirements. Verify that the software meets the intended purpose and
performs as expected.
5. Technical Assessment:
Review the technical aspects of the system, including its architecture,
interfaces, data flows, and integrations. Ensure that the system is
technically sound, scalable, and compatible with other systems.
6. Security Assessment:
Assess the system's security features and controls. Ensure that
appropriate access controls, authentication mechanisms, data encryption,
and other security measures are in place to protect sensitive data.
7. Data Integrity Assessment:
Verify that the system maintains the integrity of data throughout its
lifecycle. Check for data validation, audit trail capabilities, data backups,
and recovery processes.
8. Risk Assessment:
Conduct a risk assessment to identify potential risks associated with the
system's use, functionality, security, and data integrity. Evaluate the
impact and likelihood of these risks.
9. Compliance Assessment:
Evaluate the system's compliance with relevant regulations and industry
standards, such as 21 CFR Part 11, GAMP 5, and other applicable
guidelines.
10. Vendor Assessment (If Applicable):
If the system is purchased from a vendor, assess the vendor's
qualifications, support, and validation documentation to ensure that the
system meets regulatory requirements.
11. Documentation Review:
Review and update relevant documentation, such as the Validation Plan,
User Requirements Specification, Functional Specifications, and Test
Scripts, based on the assessment findings.
12. Gap Analysis:
Identify any gaps or discrepancies between the system's current state
and the desired state. Document these gaps and develop plans to address
them.
13. Risk Mitigation Strategies:
Develop strategies to mitigate identified risks and gaps. Determine
whether changes, enhancements, or additional controls are needed to
address the identified issues.
14. Validation Strategy Adjustment:
Based on the assessment findings, adjust the overall CSV validation
strategy to ensure that all risks and requirements are adequately
addressed.
15. Approvals and Sign-offs:
Obtain approvals and sign-offs from relevant stakeholders, including
quality assurance, regulatory affairs, and system users, to proceed with
any necessary changes or enhancements.
16. Continuous Improvement:
As part of a continuous improvement approach, document lessons
learned from the system assessment to enhance future validation efforts.
Performing a comprehensive system assessment allows you to identify
potential issues early in the CSV process, address them effectively, and
ensure that the software system is fit for its intended use while meeting
regulatory and quality requirements.
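The gap analysis step (item 12 above) is, at its core, a set comparison between what is required and what the system currently provides. The capability names below are hypothetical.

```python
# Sketch of the gap analysis step: compare required capabilities against
# those the system currently implements. Capability names are illustrative.

required = {"audit_trail", "e_signatures", "role_based_access", "data_backup"}
implemented = {"audit_trail", "role_based_access", "data_backup"}

gaps = sorted(required - implemented)    # capabilities still missing
extras = sorted(implemented - required)  # implemented but not required
```

Each entry in `gaps` would then get a remediation plan and a risk mitigation strategy, as described in items 12 and 13.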
Q22: How will you decide if a system is GxP relevant?
A22: Determining if a system is GxP (Good Practice) relevant is an
important step in the Computer System Validation (CSV) process within
the pharmaceutical industry. Here's how you can decide if a system is GxP
relevant:
1. Understand the Purpose and Impact:
Consider the purpose of the system and its impact on regulated
processes, data, products, or services. If the system plays a role in
producing, testing, controlling, or documenting GxP-related activities, it is
likely GxP relevant.
2. Evaluate Regulatory Requirements:
Review the applicable regulations and guidelines that govern your
organization's activities. Check whether the system's functionalities, data,
or processes fall under regulatory oversight. Common regulations include
21 CFR Part 11, EU GMP Annex 11, and ICH Q7.
3. Assess Data Integrity and Criticality:
Determine if the system manages, generates, or stores data critical to
patient safety, product quality, or regulatory compliance. Systems
handling data subject to data integrity requirements are often considered
GxP relevant.
4. Consider Process Impact:
Assess how the system interacts with critical processes, including
manufacturing, testing, quality control, and distribution. If the system's
functionalities impact these processes, it may be GxP relevant.
5. Audit Trail and Accountability:
Evaluate whether the system supports robust audit trail capabilities that
ensure accountability for changes and actions. Systems with audit trail
functionality are commonly required for GxP purposes.
6. User Access and Authentication:
Check if the system enforces appropriate user access controls,
authentication, and authorization mechanisms. GxP systems often require
strict control over user roles and permissions.
7. Electronic Signatures:
Determine if the system allows electronic signatures that meet regulatory
requirements for approval, review, or other critical actions. Electronic
signatures are essential in GxP environments.
8. Data Accuracy and Completeness:
Verify that the system maintains accurate, complete, and reliable data.
GxP-relevant systems must ensure data accuracy and integrity throughout
their lifecycle.
9. Process Controls and Workflows:
Consider whether the system supports controlled workflows and
processes to ensure compliance with GxP requirements. This includes
approval workflows and change control processes.
10. Validation Complexity:
Assess the complexity of validating the system. GxP-relevant systems
often require thorough validation efforts to ensure compliance and data
integrity.
11. Impact on Patient Safety and Product Quality:
Determine if the system's functionality directly impacts patient safety
and product quality. If there's a risk to either, the system is likely GxP
relevant.
12. Risk Assessment:
Conduct a risk assessment to evaluate the potential risks associated with
the system. Consider risks related to data integrity, process control,
regulatory compliance, and patient safety.
13. Organizational Policies and Procedures:
Align your decision with the organization's policies and procedures
regarding GxP systems. Follow established criteria for categorizing
systems as GxP relevant.
14. Consultation with Stakeholders:
Engage with relevant stakeholders, including quality assurance,
regulatory affairs, and subject matter experts, to determine if the system
meets GxP criteria.
15. Document Decision Rationale:
Document the rationale for considering the system GxP relevant. Clearly
outline the factors, regulations, and considerations that led to the
decision.
It's important to note that the decision of whether a system is GxP
relevant is a critical one, as it will impact the level of validation effort and
regulatory requirements. Always consult with internal regulatory experts
and quality professionals to ensure accurate classification.
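The criteria above can be summarized as a checklist-style screen. This is only an illustration of the logic; the field names are invented for the example, and a real determination is made by quality and regulatory experts, not by code.

```python
# Hedged sketch: a checklist-style first screen for GxP relevance based on
# a few of the criteria above. Field names are hypothetical; the final
# decision always rests with quality and regulatory professionals.

def is_gxp_relevant(system):
    """Flag a system as GxP relevant if any trigger criterion applies."""
    triggers = (
        system.get("handles_gxp_data", False),          # criterion 3
        system.get("impacts_critical_process", False),  # criterion 4
        system.get("uses_e_signatures", False),         # criterion 7
        system.get("impacts_patient_safety", False),    # criterion 11
    )
    return any(triggers)

lims = {"handles_gxp_data": True, "uses_e_signatures": True}
hr_portal = {"handles_gxp_data": False}
```

Note the `any()`: a single trigger (for example, electronic signatures on batch release) is enough to pull the system into GxP scope.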
Q23: What are backup, restore, archival, and business continuity plans?
A23: Backup, restore, archival, and business continuity plans are critical
components of an organization's data management and disaster recovery
strategies. In the context of the pharmaceutical industry, these plans are
essential for ensuring data integrity, regulatory compliance, and the
ability to maintain operations even in the face of unexpected disruptions.
Here's an overview of each concept:
1. Backup Plan:
A backup plan involves creating copies of critical data and information to
protect against data loss due to hardware failures, software glitches,
human errors, or other unexpected events. Backups are typically stored in
a separate location from the original data to ensure their availability if the
primary data is compromised.
2. Restore Plan:
A restore plan outlines the process of recovering data from backups to
the original or replacement systems. This plan defines the steps to restore
data, the order in which data should be restored, and the resources
required to perform the restoration. Regular testing of the restore process
is crucial to ensure its effectiveness.
3. Archival Plan:
An archival plan focuses on the long-term retention of data for regulatory
compliance, historical records, and reference purposes. It involves
identifying which data should be retained, establishing retention periods,
and organizing data in a structured manner. Archival plans ensure that
data remains accessible, searchable, and secure over time.
4. Business Continuity Plan (BCP):
A business continuity plan is a comprehensive strategy that outlines how
an organization will continue to operate during and after disruptive
events, such as natural disasters, power outages, cyberattacks, or other
emergencies. It encompasses not only data recovery but also overall
business processes, resources, personnel, and communication strategies.
Key Components of These Plans:
- Data Classification: Identify data types, categories, and their criticality to
prioritize backup, restoration, and archival efforts.
- Backup Frequency: Define how often backups are performed (e.g., daily,
weekly), considering the frequency of data changes and the acceptable
level of data loss.
- Backup Locations: Determine where backups will be stored, ensuring
they are physically separate from primary data to protect against data
loss.
- Retention Policies: Specify how long backups and archived data will be
retained based on regulatory requirements and business needs.
- Testing and Validation: Regularly test backup, restore, and archival
processes to ensure their effectiveness. Validation ensures that data can
be recovered accurately and within acceptable timeframes.
- Data Encryption: Ensure that data backups and archives are encrypted
to maintain data confidentiality and integrity.
- Restoration Procedures: Document step-by-step procedures for data
restoration, including required tools, roles and responsibilities, and
communication protocols.
- Documentation: Maintain detailed documentation of all backup, restore,
archival, and business continuity activities. This documentation helps
ensure compliance, track changes, and aid in audits.
- Incident Response: Outline the steps to be taken when a data loss or
disruption occurs. Define the roles and responsibilities of individuals
involved in responding to incidents.
- Testing and Drills: Regularly conduct testing and drills of the business
continuity plan to ensure that all stakeholders are familiar with their roles
and responsibilities during a crisis.
These plans collectively contribute to safeguarding critical data,
maintaining compliance, and enabling a quick recovery from unforeseen
disruptions. In the pharmaceutical industry, where data integrity and
patient safety are paramount, these plans play a crucial role in ensuring
operational resilience and regulatory adherence.
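The key components listed above (frequency, locations, retention, encryption, periodic restore tests) can be captured as a policy structure. All values below are illustrative assumptions, not regulatory minimums.

```python
# Illustrative backup/retention policy as a configuration structure,
# reflecting the key components above. All values are hypothetical
# examples, not regulatory requirements.

backup_policy = {
    "frequency": {"full": "weekly", "incremental": "daily"},
    "locations": ["onsite_vault", "offsite_datacenter"],  # physically separate
    "retention_days": {"backups": 90, "archives": 3650},  # ~10 years archival
    "encryption": "AES-256",
    "restore_test_interval_days": 180,  # periodic restore drill
}

def needs_restore_test(days_since_last_test, policy):
    """True when the periodic restore verification is overdue."""
    return days_since_last_test >= policy["restore_test_interval_days"]
```

Encoding the policy as data rather than prose makes it straightforward to audit (does every location appear in the schedule?) and to alert when a restore drill is overdue.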
Q24: In which phase of CSV are deviations handled?
A24: Deviations in the context of Computer System Validation (CSV) are
typically handled during the "Execution and Testing" phase of the CSV
lifecycle. This phase involves testing the system to ensure that it meets
the defined requirements and functions correctly. During this phase, any
deviations or discrepancies that are identified are documented,
investigated, and resolved.
Here's how deviations are typically handled during the "Execution and
Testing" phase of CSV:
1. Identification of Deviations:
During testing, if any discrepancies, errors, or deviations from the
expected behavior or requirements of the system are identified, they are
documented as deviations.
2. Documentation of Deviations:
The deviations are documented in a formal manner, which includes
capturing details such as the nature of the deviation, where and how it
occurred, the impact on the system, and the potential risk associated with
it.
3. Assessment of Deviations:
The identified deviations are assessed to determine their significance and
potential impact on the system's functionality, data integrity, and
compliance.
4. Investigation:
A thorough investigation is conducted to understand the root cause of the
deviation. This may involve analyzing logs, code, configuration settings, or
any other relevant data to identify why the deviation occurred.
5. Root Cause Analysis:
The root cause analysis aims to identify the underlying reasons for the
deviation. This helps in implementing corrective and preventive actions to
prevent similar deviations in the future.
6. Risk Evaluation:
The impact and severity of the deviation are evaluated to determine the
level of risk associated with it. This assessment guides the decision-
making process on how to address the deviation.
7. Decision-Making:
Based on the assessment of the deviation and associated risks, a decision
is made on how to proceed. This decision could involve accepting the
deviation, implementing corrective actions, or deciding to reject the
system if the deviation is critical.
8. Correction and Resolution:
Corrective actions are planned and executed to address the deviation.
This may involve modifying the system configuration, fixing the code,
updating documentation, or other necessary actions.
9. Verification and Re-Testing:
After corrective actions are implemented, the affected area of the system
is retested to ensure that the deviation has been effectively resolved
without introducing new issues.
10. Documentation of Actions:
All actions taken to investigate, address, and resolve the deviation are
documented, including a description of the deviation, the investigation
findings, actions taken, and verification of the resolution.
11. Approval and Sign-Off:
The resolution of the deviation, along with supporting documentation, is
reviewed, approved, and signed off by appropriate stakeholders, such as
quality assurance, validation, and project management.
12. Reporting:
A deviation report is generated summarizing the details of the deviation,
investigation, root cause analysis, corrective actions, and verification.
Handling deviations in the "Execution and Testing" phase ensures that
any issues are identified, addressed, and documented in a systematic
manner. This helps in maintaining the integrity of the validation process
and ensures that the system meets the defined requirements and
compliance standards.
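The decision-making step (item 7 above) routes each deviation according to its assessed risk. The risk levels and dispositions below are illustrative assumptions about how such a routing might look.

```python
# Sketch of the decision-making step: route a deviation based on its
# assessed risk level. Levels and dispositions are illustrative.

def route_deviation(risk_level):
    """Map an assessed risk level to a handling decision."""
    routing = {
        "low": "accept with documented justification",
        "medium": "implement corrective action and retest",
        "high": "halt testing; escalate before proceeding",
    }
    if risk_level not in routing:
        raise ValueError(f"unknown risk level: {risk_level}")
    return routing[risk_level]
```

The explicit `ValueError` matters: an unclassified deviation should stop the process rather than silently fall through to a default disposition.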
Q25: In which phase do you prepare the FRA?
A25: FRA (Functional Risk Assessment) is typically prepared during the
"Planning" phase of Computer System Validation (CSV). The "Planning"
phase is the initial phase of the CSV lifecycle and involves defining the
scope, objectives, and approach for the validation project. FRA is an
important component of the planning process as it helps in identifying
potential functional risks associated with the computer system.
Here's how FRA preparation fits into the "Planning" phase of CSV:
1. Scope Definition:
In the "Planning" phase, the scope of the validation project is defined.
This includes identifying the computer systems that require validation,
determining the functionalities that need validation, and understanding
the criticality of the systems to the overall operation.
2. Objective Setting:
The objectives of the validation project are established. This involves
clarifying the goals of the validation, such as ensuring data integrity,
compliance with regulatory requirements, and system reliability.
3. Approach and Strategy:
The approach and strategy for validation are determined. This includes
deciding whether to use a risk-based approach, defining the validation
plan, and identifying the methodologies and tools to be used.
4. FRA Preparation:
As part of the "Planning" phase, the FRA is prepared. The FRA involves
identifying and assessing potential functional risks associated with the
computer system. This includes analyzing the system's functionalities,
interactions, interfaces, and potential failure points.
5. Risk Identification:
During FRA preparation, potential functional risks are identified. These
risks could relate to data integrity, system functionality, process impact,
regulatory compliance, and patient safety.
6. Risk Assessment:
Each identified risk is assessed to determine its potential impact and
likelihood. The risk assessment helps prioritize risks based on their
significance and the potential consequences they could have.
7. Risk Mitigation Strategies:
Based on the risk assessment, strategies to mitigate identified risks are
developed. These strategies could involve implementing controls, process
improvements, validation activities, or other measures to reduce the
likelihood or impact of identified risks.
8. Documentation:
The FRA is documented, detailing the identified risks, their assessment,
potential consequences, and proposed mitigation strategies. This
documentation becomes an important reference for the validation project.
9. Incorporation into Validation Plan:
The FRA findings and mitigation strategies are incorporated into the
overall validation plan. This ensures that the validation activities are
aligned with the identified risks and that appropriate focus is given to
addressing them.
10. Stakeholder Review and Approval:
The FRA documentation is reviewed and approved by relevant
stakeholders, such as quality assurance, validation, and project
management. Their input ensures that all critical risks are adequately
addressed.
11. Execution Planning:
The FRA outcomes play a role in planning the execution of validation
activities. It guides the focus of testing and validation efforts toward the
areas of highest risk.
The FRA is a foundational document that sets the stage for the validation
activities to follow. By identifying and assessing functional risks early in
the planning phase, organizations can develop a targeted and effective
validation strategy that addresses potential challenges and ensures the
integrity of the computer system and its impact on regulated processes.
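The risk assessment step above (impact and likelihood per identified risk) is often reduced to a simple score-and-sort. The 1-5 scale and the example risks below are illustrative assumptions.

```python
# Sketch of FRA risk prioritization: score each identified functional risk
# by impact x likelihood and sort descending. Scale and risks are
# hypothetical examples.

risks = [
    {"id": "FR-01", "desc": "audit trail can be disabled", "impact": 5, "likelihood": 2},
    {"id": "FR-02", "desc": "report rounding error",       "impact": 3, "likelihood": 3},
    {"id": "FR-03", "desc": "slow screen refresh",         "impact": 1, "likelihood": 4},
]

def prioritize(risk_list):
    """Return risks sorted by descending risk score (impact x likelihood)."""
    return sorted(risk_list, key=lambda r: r["impact"] * r["likelihood"], reverse=True)

ordered = prioritize(risks)
```

The ordering then guides execution planning (item 11): the highest-scoring risks receive the most testing attention.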
Q26: If GxP-relevant discrepancies are found during PQ and the customer
wants to take the application live, what is your advice to the client on
using the application in production, and how will the discrepancies be
tracked for closure?
A26: When discrepancies are found during Performance Qualification (PQ)
that are GxP relevant and the customer wants to proceed with the
application going live in production, it's important to carefully consider the
potential risks and implications. Here's my advice to the client on how to
proceed and ensure proper tracking for closure of the discrepancies in the
future:
Advice to Client:
1. Risk Assessment:
Conduct a thorough risk assessment to evaluate the impact of the
discrepancies on GxP compliance, patient safety, data integrity, and
overall system functionality. This assessment will help you make an
informed decision.
2. Risk Mitigation:
Implement risk mitigation strategies to minimize the impact of the
discrepancies. This could involve additional procedural controls, manual
workarounds, enhanced monitoring, or other measures that reduce the
risk associated with the identified discrepancies.
3. Go-Live Decision:
If the risk assessment indicates that the identified discrepancies do not
pose significant risks to GxP compliance and patient safety, you can
consider proceeding with the application going live in production.
However, ensure that appropriate controls and monitoring mechanisms
are in place to manage the identified discrepancies.
4. Communication:
Communicate transparently with all relevant stakeholders, including
quality assurance, regulatory affairs, IT, and business units. Clearly
explain the identified discrepancies, the risk assessment, the proposed
risk mitigation strategies, and the rationale behind the decision to proceed
with the go-live.
5. Document Decision:
Document the decision to proceed with the application going live, along
with the risk assessment, risk mitigation strategies, and the approval of
relevant stakeholders. This documentation will serve as a record of the
decision-making process.
6. Ongoing Monitoring:
Implement an ongoing monitoring plan to closely track the performance
of the application in production. This includes monitoring for any adverse
events, deviations, or incidents related to the identified discrepancies.
Tracking for Closure:
1. Discrepancy Tracking System:
Set up a discrepancy tracking system that captures all identified
discrepancies, their associated risks, and the actions taken for mitigation.
This system will serve as a central repository for managing and tracking
discrepancies.
2. Action Items:
Assign action items to responsible individuals or teams for addressing
each identified discrepancy. Specify the corrective and preventive actions
that need to be taken to resolve the discrepancies.
3. Due Dates:
Assign due dates for each action item to ensure that the discrepancies
are addressed within a reasonable timeframe. Due dates should consider
the urgency and potential impact of each discrepancy.
4. Review and Approval:
Establish a review and approval process for the corrective and preventive
actions. This ensures that proposed actions are well-documented,
effective, and aligned with regulatory requirements.
5. Follow-Up and Verification:
After implementing the corrective and preventive actions, conduct follow-
up and verification to ensure that the discrepancies have been effectively
addressed and resolved. This may involve testing, validation, and
documentation review.
6. Closure and Documentation:
Once the discrepancies have been satisfactorily addressed and resolved,
close them in the discrepancy tracking system. Document the actions
taken, the outcomes, and any relevant supporting evidence.
7. Periodic Review:
Periodically review the discrepancy tracking system to ensure that all
identified discrepancies have been properly addressed and closed. This
review is important for maintaining GxP compliance and ensuring the
ongoing integrity of the application.
By following these steps, the client can make an informed decision about
proceeding with the application going live while managing the identified
discrepancies and ensuring that they are effectively addressed and
tracked for closure in the future.
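The tracking-for-closure workflow above (log each discrepancy, assign actions and due dates, verify, close with evidence, and periodically review open items) can be sketched as a minimal in-memory tracker. This is an illustrative sketch only; the field names and class design are this example's own, not a prescribed system:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Discrepancy:
    """One identified discrepancy and its path to closure."""
    description: str
    risk: str                       # e.g. "low", "medium", "high"
    action: str                     # corrective/preventive action assigned
    owner: str                      # responsible individual or team
    due: date                       # due date for the action item
    closed: bool = False
    evidence: Optional[str] = None  # supporting evidence recorded at closure

class DiscrepancyTracker:
    """Central repository for managing and tracking discrepancies."""
    def __init__(self):
        self.items: list[Discrepancy] = []

    def log(self, d: Discrepancy):
        self.items.append(d)

    def close(self, description: str, evidence: str):
        """Close a discrepancy once its action is verified, recording evidence."""
        for d in self.items:
            if d.description == description:
                d.closed = True
                d.evidence = evidence

    def open_items(self, as_of: date) -> list[Discrepancy]:
        """Periodic review: list open discrepancies, overdue ones first."""
        pending = [d for d in self.items if not d.closed]
        return sorted(pending, key=lambda d: d.due >= as_of)
```

A real system would add review/approval workflow and an audit trail; the sketch only captures the core record-assign-close-review loop described above.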
Q27: What are the procedures that need to be followed after a system is
released for use, to maintain the system in compliance during its life
cycle?
A27: After a system is released for use, several procedures need to be
followed to ensure that the system remains in compliance and continues
to operate effectively throughout its lifecycle. Here are the key procedures
that should be followed:
1. Change Control Process:
Implement a robust change control process to manage any changes to
the system, including software updates, configuration changes, and
hardware modifications. All changes should be properly evaluated,
documented, reviewed, approved, and validated to ensure that they do
not negatively impact system compliance or functionality.
2. Periodic Review and Monitoring:
Conduct regular periodic reviews and monitoring of the system's
performance, data integrity, and compliance. This includes reviewing
system logs, audit trails, and user access records to identify any
anomalies or deviations from expected behavior.
3. User Access Control:
Maintain strict user access control to ensure that only authorized
personnel have access to the system. Regularly review and update user
access permissions based on job roles and responsibilities.
4. Security and Data Integrity:
Implement and maintain security measures to protect the system from
unauthorized access and ensure the integrity of data. This includes
measures such as encryption, password policies, and user authentication.
5. Backup and Restore Procedures:
Establish and maintain robust backup and restore procedures to ensure
that critical system data is regularly backed up and can be restored in
case of data loss or system failure.
6. Disaster Recovery Plan:
Develop and maintain a comprehensive disaster recovery plan that
outlines procedures for recovering the system and its data in the event of
a major system failure, natural disaster, or other emergencies.
7. Training and Documentation:
Provide ongoing training to system users to ensure that they are familiar
with the system's functionality and compliant use. Maintain up-to-date
documentation, including user manuals and standard operating
procedures (SOPs).
8. Vendor Management:
If the system is supported by external vendors or suppliers, establish a
vendor management program to ensure that the vendors adhere to GxP
requirements and provide timely support and updates.
9. Validation Maintenance:
Periodically review and update the system validation documentation to
ensure that it remains current and accurately reflects the system's
configuration, functionality, and compliance.
10. Incident Management:
Implement an incident management process to promptly address and
investigate any incidents, deviations, or issues related to the system's
compliance, functionality, or data integrity.
11. Regulatory Compliance:
Stay informed about relevant regulatory updates and ensure that the
system remains compliant with evolving regulatory requirements
throughout its lifecycle.
12. Retirement and Data Archival:
Develop procedures for the eventual retirement of the system, including
proper data archival and migration to ensure data integrity and
compliance with data retention requirements.
13. Continuous Improvement:
Continuously seek opportunities for system improvement and
optimization. Implement lessons learned from past incidents and issues to
enhance the system's compliance and performance.
By following these procedures, organizations can maintain the system in
compliance, ensure data integrity, and uphold GxP standards throughout
the entire lifecycle of the system.
Q28: What is root cause analysis in pharmaceutical industry and what are
the tools used?
A28: Root cause analysis (RCA) is a systematic process used in the
pharmaceutical industry to identify the underlying factors that contribute
to problems, issues, or non-conformances. The goal of RCA is to determine
the primary cause of a problem rather than just addressing its symptoms.
By identifying and addressing the root cause, organizations can
implement effective corrective and preventive actions to prevent
recurrence of the issue.
Tools used in root cause analysis in the pharmaceutical industry include:
1. Fishbone Diagram (Ishikawa Diagram or Cause-and-Effect Diagram):
This visual tool helps identify potential causes of a problem by
categorizing them into different categories (such as people, process,
equipment, materials, environment). It helps to organize brainstormed
ideas and visualize the relationships between different factors.
2. 5 Whys: This technique involves asking "Why?" repeatedly (usually five
times) to explore the cause-and-effect relationships underlying a problem.
It helps to drill down to the root cause by uncovering multiple layers of
causation.
3. Fault Tree Analysis (FTA): FTA is a systematic approach that uses
logical diagrams to analyze the relationships between various potential
causes and their effects. It is particularly useful for complex systems with
multiple interrelated factors.
4. Failure Modes and Effects Analysis (FMEA): FMEA is a proactive
approach that assesses potential failure modes, their causes, and their
potential effects. It assigns a risk priority number (RPN) to each failure
mode to prioritize corrective actions.
5. Pareto Analysis: Also known as the 80/20 rule, Pareto Analysis helps
identify and prioritize the most significant contributing factors by focusing
on the few vital factors that account for the majority of the issues.
6. Change Analysis: Examining changes that were made before the issue
occurred can help identify whether they are related to the problem. This
tool can be particularly useful for investigating deviations and
discrepancies.
7. Process Mapping: Visualizing the process involved in the problem can
help identify potential areas where issues might arise. Process maps help
in understanding the sequence of steps and interactions.
8. Data Analysis and Trending: Analyzing data, trends, and statistical
information can provide insights into patterns and anomalies that could be
causing the problem.
9. Human Performance Analysis: If human error is suspected, human
performance analysis tools can be used to assess factors such as training,
procedures, cognitive workload, and environmental conditions that might
contribute to errors.
10. Root Cause Tree Analysis: Similar to FTA, this method breaks down
causes and sub-causes in a tree structure to systematically identify root
causes.
The choice of tool depends on the complexity of the problem and the
context in which it occurs. Often, a combination of tools may be used to
thoroughly investigate and identify the root cause. It's important to note
that the RCA process should be documented, and the identified root
causes should be verified and validated before implementing corrective
actions.
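The RPN prioritization mentioned under FMEA (RPN = severity x occurrence x detection, each commonly rated on a 1-10 scale) can be sketched as follows. The failure modes and ratings here are invented purely for illustration:

```python
# Hypothetical failure modes: (description, severity, occurrence, detection),
# each factor rated 1-10. Higher RPN = higher priority for corrective action.
failure_modes = [
    ("Wrong batch number printed", 8, 3, 4),
    ("Sensor drift above tolerance", 6, 5, 7),
    ("Operator skips verification step", 9, 2, 5),
]

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: severity x occurrence x detection."""
    return severity * occurrence * detection

# Prioritize corrective actions by descending RPN.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for mode, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {mode}")
```

With these made-up ratings, "Sensor drift above tolerance" (RPN 210) would be addressed first, even though its severity is lower, because it occurs more often and is harder to detect.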
Q29: What is data integrity, and what are ALCOA and ALCOA++?
A29: Data integrity refers to the accuracy, completeness, consistency, and
reliability of data throughout its entire lifecycle, from creation to archival.
It is a critical aspect in the pharmaceutical industry to ensure the quality,
safety, and efficacy of products and to maintain compliance with
regulatory requirements.
ALCOA is an acronym that represents key principles of data integrity. It
stands for:
1. Attributable: Data should be traceable to the individuals who created,
modified, or reviewed it. This includes recording electronic signatures and
maintaining audit trails.
2. Legible: Data should be easily readable and understandable.
Handwritten entries should be clear and indelible.
3. Contemporaneous: Data should be recorded at the time of the activity
or event. Delayed recording of data can raise concerns about accuracy
and authenticity.
4. Original: Data should be the first recording of an observation or result.
Transcription errors and copies of data should be avoided whenever
possible.
5. Accurate: Data should be error-free and reflect the true values and
observations. Calculations, measurements, and other data should be
precise.
ALCOA++ extends the ALCOA principles with further attributes that
strengthen data integrity:
6. Complete: Data should be comprehensive, including all necessary
information to understand the context of the observation or event.
7. Consistent: Data should be uniform and coherent, both within a single
record and across related records.
8. Enduring: Data should be retained for the required retention period and
remain accessible and legible throughout its lifecycle.
9. Available: Data should be easily retrievable and accessible when
needed for review, audits, and regulatory inspections.
10. Originality: Data should be generated in its original form, and any
changes should be appropriately documented and justified.
11. Traceable: There should be a clear and documented trail of data,
showing its creation, modification, and review history.
ALCOA and ALCOA++ principles are essential to maintain data integrity
and ensure that data is trustworthy and reliable. Regulatory agencies,
such as the FDA and EMA, emphasize the importance of adhering to these
principles in various guidance documents and regulations to prevent data
manipulation, errors, and fraud in the pharmaceutical industry. Proper
implementation of these principles helps to build confidence in the
accuracy and authenticity of data generated during various stages of drug
development, manufacturing, and distribution.
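Several ALCOA attributes can be illustrated with an append-only, hash-chained audit trail: each entry is attributable (a named user), contemporaneous (timestamped at the moment of writing), and traceable and tamper-evident (each record carries the hash of its predecessor, so any later alteration breaks the chain). This is a teaching sketch under those assumptions, not a validated audit-trail implementation:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only log; altering any past entry breaks hash verification."""
    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the chain

    def record(self, user: str, action: str, detail: str):
        entry = {
            "user": user,                                    # Attributable
            "time": datetime.now(timezone.utc).isoformat(),  # Contemporaneous
            "action": action,
            "detail": detail,
            "prev": self._last_hash,                         # Traceable chain
        }
        # Hash the entry contents plus the previous hash (tamper evidence).
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self) -> bool:
        """Re-compute the whole chain; any altered entry fails verification."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

The chaining is what makes the trail enduring and trustworthy for review: a regulator (or the `verify` check) can detect whether any historical record was silently changed.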
Conclusion:
In the pharmaceutical industry, where precision, accuracy, and
compliance are paramount, Computer System Validation plays a pivotal
role. It safeguards patient safety, data integrity, and regulatory
compliance by ensuring that computer systems operate as intended and
contribute to the production of high-quality pharmaceutical products.
Embracing CSV not only meets regulatory requirements but also instills
confidence in the industry's commitment to excellence and patient
welfare.
What is meant by Computer System Validation?
By: Gerry Creaner Eng and Donagh Fitzgerald Eng. Last Updated: December 2022
Computer Systems Validation (CSV) is a process used to test, validate
and formally document that a regulated computer-based system does
exactly what it is designed to do in a consistent and accurate manner
that is secure, reliable and traceable.
Regulated sectors such as the pharmaceutical and medical device
industries use computer systems to operate and record a range of
processes and activities, from clinical trials to manufacturing, product
testing, distribution, storage and logistics, so it's critical that these
systems can be relied upon to:
produce data accurately and consistently
create an indelible electronic data trail that is transparent, traceable
and tamper-proof
store those electronic data records in a way that is safe, secure and
can stand the test of time
It is equally critical that standard operating procedures (SOPs) and
processes are put in place to manage these systems.
In addition, the CSV process is used when replacing paper records and
handwritten signatures with electronic data records and electronic
signatures in highly regulated environments that impact public health and
safety such as the pharmaceutical and medical device industries.
Where is Computer System Validation Used?
Computer System Validation (CSV) is applied to GxP computerized system
applications used at any point in the research, clinical testing,
manufacturing, distribution and storage processes. Examples might
include:
Process Control Software
o PLC for Controlled Packaging Equipment
o Supervisory Control and Data Acquisition (SCADA)
o Distributed Control System (DCS)
Commercial Off-The-Shelf (COTS) or Software as a Service
(SaaS) Software
o Laboratory Information Management System (LIMS)
o Clinical Trial Monitoring Systems
o Chromatography Data System (CDS)
o Enterprise Resource Planning (ERP) Systems
o Manufacturing Execution System (MES)
o Batch Record System
o Building Management Systems (BMS)
o Cloud-based software services
o Spreadsheets
Why Do We Need CSV?
The pharmaceutical and medical device industries are regulated, meaning
that what goes on inside the factory walls is subject to the law of the land.
We have to be confident that the computer system generates and stores
data accurately and reliably and doesn’t do anything that could harm the
patient. In addition, in the 90s, the regulatory authorities for these
industries took the decision to replace paper records and handwritten
signatures with electronic records.
CSV is a process that creates an indelible electronic data trail that allows
us to treat the regulated data and electronic signatures captured in drug
discovery, drug trials, manufacturing, distribution and storage as the legal
equivalent of paper records and handwritten signatures and have the
equivalent level of confidence in their accuracy, reliability and data
integrity.
What Are Examples of Computer System Validation Regulation?
The Food and Drug Administration (FDA) provides detailed controls for
electronic records and electronic signatures in the Code of Federal
Regulations (CFR) under 21 CFR Part 11. Part 11 mandates the
requirements for electronic records and signatures to be accurate,
reliable, readily retrievable, and secure and to be able to legally replace
paper records and handwritten signatures. This code applies to
(bio)pharmaceutical and medical device manufacturers, biotechnology
companies and other FDA-regulated industries. Some examples of the
controls required are:
1. Data should be stored in electronic format and be archivable.
Electronic records should be as trustworthy as paper records.
2. The system must ensure that electronic signatures are as trustworthy
and secure as handwritten signatures. Controls on electronic
signatures should include:
1. the name of the signing user
2. the day/time the signature was executed
3. and the meaning of the signature
3. Signed records cannot be altered by users. The system can determine
when a record was created, altered, or deleted by a user of the
system.
4. Password masking facility should be available in the system.
5. Password complexity should be required i.e. password should be of
minimum character length, etc. Previous passwords can’t be re-used.
6. Screen lock should trigger after a defined period of time.
7. Only authorised people can use the system.
8. The system should have different access levels based on the criticality
of the system.
9. The system must be able to generate an audit trail, i.e. every
activity should be recorded in the system, capturing who did what,
and when.
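Controls 4 and 5 above (password complexity and no re-use of previous passwords) can be sketched as a simple policy check. The minimum length, required character classes, and history depth below are illustrative placeholders; the actual thresholds come from a site's security SOP, not from Part 11 itself:

```python
import re

MIN_LENGTH = 8   # assumed minimum character length (example value)
HISTORY = 5      # assumed number of previous passwords barred from re-use

def password_acceptable(candidate: str, previous: list[str]) -> bool:
    """Illustrative complexity policy: minimum length, mixed character
    classes, and no re-use of recent passwords."""
    if len(candidate) < MIN_LENGTH:
        return False
    # Require at least one lowercase letter, one uppercase letter, one digit.
    if not (re.search(r"[a-z]", candidate)
            and re.search(r"[A-Z]", candidate)
            and re.search(r"\d", candidate)):
        return False
    # Reject re-use of any of the most recent previous passwords.
    if candidate in previous[-HISTORY:]:
        return False
    return True
```

In a real system the previous passwords would of course be stored as salted hashes rather than plain strings; the list here just keeps the re-use rule visible.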
For medical device makers, the FDA requires them to validate the
software and systems used in manufacturing medical devices using 21
CFR 820.
Within the EU, EudraLex – Volume 4 – Good Manufacturing Practice
(GMP) provides guidance on the validation of Computerised Systems
under Annex 11: Computerised Systems.
The Pharmaceutical Inspection Co-operation Scheme (PIC/S) provides
guidance with its document on Good Practices for Computerised Systems
in Regulated GxP Environments.
Computer System Validation is also required by the ISO 13485:2016
standard for medical devices.
In late 2022, the FDA released a draft guidance document on Computer
Software Assurance for Production and Quality System Software. As this
document is still in draft format, it is unclear what impact it will have on
the approach to software validation, so we have made no reference to it
throughout this post.
What are GAMP Guidelines and How Do They Work Together With
the Regulations?
Good Automated Manufacturing Practice (GAMP) is a set of guidelines and
procedures that pharmaceutical manufacturers and users of automated
systems use to validate computer systems and achieve compliance
with the above regulations, such as FDA 21 CFR 11.
Think of the controls outlined in FDA 21 CFR 11 as what the regulators
want to see the computer system be able to do. Think of GAMP as the
approach taken to achieve those controls.
GAMP has enjoyed the support of numerous regulatory authorities and is
now a recognized good practice worldwide.
How do you Validate a Computer System?
In the pharmaceutical and medical device sector, the most widely used
method is to follow the GAMP® 5 guidelines and break the process down
into its life cycle phases. According to the GAMP, there are four life cycle
phases in a computer system and each phase contains a number of
individual activities. (See diagram below)
In order, these phases are:
1. Concept – a high-level overview of the system and design
considerations
2. Project – detailed description of the system and objectives
3. Operation – how the system will be managed during operations
4. Retirement – how to retire the system
In addition, there are a lot of supporting processes or activities that
take place across the phases of the life cycle such as risk management,
document management, repair activity, security management, etc and
you should include them when visualizing the CSV process. You will often
use various suppliers across different phases of the lifecycle and you
should leverage their expertise as much as possible.
Now, let’s take a look at the individual activities in each of the four phases
as well as the relevant supporting processes of each phase viewed
through the project life cycle.
1. Concept Phase
The concept phase is where the idea for a system is conceived. At this
point, there is a general idea of what the system should do. This phase
would include data gathering (i.e. vendors, costs, high-level project plan,
resources, timelines, benefits, restrictions and justification of the
requirement for the system). The first risk assessment should take place
at this point and should ask questions such as, what are the risks to the
business of implementing or not implementing this system? Where will
further risk assessment be required?
This phase contains the following specific activities:
1.1 Planning
1.2 System Software Categorization
1.3 Risk Assessment
1.4 Supplier Assessment
1.1 Planning
In this activity, you typically define the following:
the overall requirements of the system
the critical functions it will be controlling
the available options for hardware and software platforms
the overall regulatory impact of the proposed system
the novelty of the system and its complexity
the requirement for document control
the testing that is required (what are the considerations for operating
and maintaining the system?)
what data and records will be generated that will subsequently need
to be retained
1.2 System Software Categorization
The range of activities required to validate a computerized system varies
greatly depending on the type of software you are using.
The GAMP 5 guidelines now categorize software into 4 types. The category
your software falls into will determine the validation approach, the
amount of time the project takes and the deliverables.
The 4 categories are 1, 3, 4 and 5 (note: Category 2 was discontinued):
Software Category 1 – Infrastructure software
This is software that applications are built on and used to manage the
operating environment.
Examples: Operating Systems (Windows, Mac OS, Android, Linux etc),
Database engines, Middleware, Programming languages, Statistical
packages, Network monitoring tools, Scheduling tools, Version control
tools
Software Category-3 – Non-Configured Software
Software in this category allows parameters to be entered and stored in
the system, but the software cannot be configured to suit the business
process.
Examples: Setting the time (i.e. entering a parameter) on the alarm on
your smartphone, Firmware-based applications, Commercial Off-The-Shelf
(COTS) software used in its default configuration, Instruments (i.e. for
measuring temperature, pressure, flow volume).
Software Category-4 – Configured Software
This software is often very complex. It can be configured by the user to
meet the specific needs of their business process but the software code
itself is not altered. Much of the software you come across in the
Pharma/Medtech sector is category 4.
Examples: Laboratory Information Management System (LIMS),
Enterprise Resource Planning (ERP), Clinical Trial Monitoring, Distributed
Control System (DCS), Chromatography Data System (CDS), Building
Management Systems (BMS), Spreadsheets
Software Category-5 – Custom Software
This software is custom designed and coded to suit the specific business
process.
Examples: Internally and externally developed IT applications,
Internally and externally developed process control applications,
Custom firmware, Spreadsheets (macro)
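As a rough sketch of how categorization drives the validation approach, the mapping below pairs each GAMP 5 category with a shorthand description of the relative validation effort. The "effort" wording is this sketch's own summary, not GAMP text:

```python
# Illustrative only: category names follow the text above; the effort
# descriptions are a simplified paraphrase, not quoted from GAMP 5.
GAMP_CATEGORIES = {
    1: {"name": "Infrastructure software",
        "effort": "record the version and verify correct installation"},
    3: {"name": "Non-configured software",
        "effort": "verify against requirements as supplied"},
    4: {"name": "Configured software",
        "effort": "verify the configuration against requirements"},
    5: {"name": "Custom software",
        "effort": "full lifecycle: design and code review plus full testing"},
}

def validation_approach(category: int) -> str:
    """Summarize the validation approach for a GAMP 5 software category."""
    if category == 2:
        raise ValueError("Category 2 was discontinued in GAMP 5")
    info = GAMP_CATEGORIES[category]
    return f"Category {category} ({info['name']}): {info['effort']}"
```

The point of the mapping is the trend it encodes: the more bespoke the software (category 1 up to category 5), the more validation effort and deliverables the project demands.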
1.3 Risk Assessment
Think of a risk assessment as a way of visualising catastrophe before it
happens (by trying to think of as many failure modes as possible) and
coming up with methods to mitigate those risks. A quality risk assessment
is carried out to determine if the computer system has the potential to
impact product quality, patient safety or data integrity in a way that could
ultimately harm the patient.
1.4 Supplier Assessment
Part of the concept phase will also be narrowing down the vendor options
and getting an idea of the cost to include in the business case. This
typically involves sending out a Request for Services document with some
general details of system requirements to several vendors. The response
from the vendors will detail how their system can meet those
requirements, alongside an estimated cost. Where the response looks
promising a request for a demonstration of the system will be made
before an in-depth supplier assessment is carried out.
2. Project Phase
In this phase you finalise the planning; develop the user requirement
specifications, functional specifications and design specifications; have
the project built; and have it commissioned and qualified. Finally, you
hand over the completed project to the end-user or client.
Specific activities in this phase include:
2.1 Planning
2.2 Utilising the CSV Process V model
2.3 Risk assessment
2.4 Writing Standard Operating Procedures
2.5 Training
2.6 Handover
2.1 Planning
This stage defines what will be validated and what approach you will use.
It also defines individual roles and responsibilities, and the acceptance
criteria. It is typically completed by the end-user or client. The planning
step of the project phase will contain activities that may overlap with the
concept phase – there is not always a sharp delineation between the tasks
of both phases.
Documentation completed within this stage includes:
2.1.1 Validation Master Plan (VMP)
Every regulated pharma organisation should have a validation master
plan in place to govern its approach to validation. A subsection of this is
“computerized system quality and compliance”. Depending on the size of
an organisation there may be several sub validation plans within the
validation master plan (e.g. site, departmental or system-specific) or there
may be multiple validation master plans, one for each business unit.
2.1.2 System Overview
In order to satisfy the regulatory inspectors, you need to give a brief
description of the system.
It's best to take a top-down approach and start with its operating
environment. This would include other networked or standalone
computerised systems, media (how you store your electronic records),
people, equipment and procedures.
In addition, you need to describe:
Hardware /Firmware
Software
Computer System (Controlling System)
Operating procedures and people
Data managed by the system
Equipment controlling function or process
During this phase, you take the V-model approach (see diagram below) to
determine the specifications and the verification and testing deliverables.
2.2 Computer System Validation Process V-Model
In pharmaceutical manufacturing, most companies and organisations
follow the Good Automated Manufacturing Practice (GAMP®5) V-Model to
validate their systems as it meets the requirements of the industry
regulators. The model is used to visualize the relationship between user
requirements and specifications, and the verification and testing
performed on them (see diagram below).
You start at the top left (planning), proceed down through configuration
and/or coding, and then back up the right-hand side, ending at reporting. The
left-hand side of the V represents what the computer system does (i.e.
what you use the software for), along with how the computer system
works. The right-hand side of the V represents how you test the system to
confirm that the system is fit-for-purpose (i.e. will make safe medicines for
patients).
2.2.1 User Requirements Specification (URS)
The user requirements specification maps out what the user needs from
the software and how they will use it. It also contains any constraints such
as regulations, safety requirements or operational requirements. The
specifications must be agreed between the end-user/customer and
supplier. The URS will have input from the process owners, system
owners, technical subject matter experts (SMEs), quality and the supplier
where required.
2.2.2 Functional Specification
The functional specification contains a detailed description of how the
system will meet each of the requirements outlined in the URS such as:
how the software works
what data needs to be captured
user interfaces
2.2.3 Design Specification
The design specification describes in detail how each function is to be
designed or configured. It is a more technical follow-on document from
the functional specification and may contain configuration specifications
or code – its intended audience is the system developer.
2.2.4 Configuration and/or Coding
In this step you build, develop or purchase the software (depending on the
software category) and then configure it to meet the criteria laid out in
the previous specification documents. This is typically completed by the
vendor.
2.2.5 Installation Qualification (IQ)
Installation qualification (also referred to as “configuration or integration
testing”) confirms that the software or system is installed and set up
according to the design specification.
2.2.6 Operational Qualification (OQ)
Operational qualification (also referred to as “functional testing”) confirms
that all functionality defined in the functional specification is present and
working correctly. In the case of bespoke software, it confirms that there
are no software bugs.
2.2.7 Performance Qualification (PQ)
Performance qualification (also referred to as “user requirement testing”)
confirms that the software will meet the user’s needs and is suitable for
their intended use, as defined in the user requirements specification.
2.2.8 Final Report
The last step in this validation method is to write the summary report
declaring that the system is fit for its intended use and that every
deliverable that was planned has been delivered.
In the V-model, you'll notice a link between the two sides of the V. For
example, the reporting stage verifies the planning stage, and the
performance qualification stage verifies the user requirements
specification stage, ensuring that the specifications have been achieved.
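The left/right pairing of the V-model described above can be captured in a small lookup, with each specification document on the left of the V mapped to the testing deliverable on the right that verifies it:

```python
# Pairings as described in the text: plan <-> report, URS <-> PQ,
# FS <-> OQ, DS <-> IQ.
V_MODEL_PAIRS = {
    "Validation Plan": "Final Report",
    "User Requirements Specification (URS)": "Performance Qualification (PQ)",
    "Functional Specification (FS)": "Operational Qualification (OQ)",
    "Design Specification (DS)": "Installation Qualification (IQ)",
}

def verifying_stage(specification: str) -> str:
    """Return the testing deliverable that verifies a given specification."""
    return V_MODEL_PAIRS[specification]
```

Reading the table top to bottom mirrors walking down the left arm of the V; each value on the right is the matching step back up.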
2.3 Risk Assessment
Risk assessment may be utilised at various stages of the validation
lifecycle. Within the project phase, the risk assessment will review the
same document at each stage of the V model to ensure the risk
classification and controls are accounted for in the validation actions that
are carried out.
2.4 Standard Operating Procedures
Standard Operating Procedures (SOPs) are a key part of CSV
documentation. They outline how the computer system should be used.
Any staff using a particular system will be thoroughly trained on the
relevant SOPs to ensure they are using the system correctly, in the way it
was intended.
2.5 Training
You need to train key users of the system on how to use the system
software, applications and procedures.
2.6 Handover
You need to write a handover plan to define when the application will
move into the operation phase and how any disruption will be managed,
making sure that the system can be used and supported in a controlled
manner.
The handover process must verify the following:
1. The system is fit for purpose.
2. Roles and responsibilities are defined e.g. process owner/s, system
owner.
3. All personnel are appropriately trained e.g. standard user, system
admin.
4. Operational and support procedures/personnel are in place.
5. Supporting quality controls and personnel are in place to maintain
compliance.
6. Any residual risks have been accepted.
3. Operation Phase
The following processes must be in place before the system goes live, and
be observed throughout operation:
3.01 Change Management
3.02 Configuration Management
3.03 Patch and Update Management
3.04 Security Management
3.05 Business Continuity Management
3.06 Disaster Recovery Planning
3.07 Service Level Agreement
3.08 Backup and Restore
3.09 Incident Management
3.10 Periodic Review
3.11 On-Going Projects
3.12 Electronic Data Archiving
3.13 Documented Control
3.01 Change Management
You must implement procedures and processes to document, evaluate
and approve any changes to the system once it’s live.
3.02 Configuration Management
You must identify and document the functional and physical attributes of
the system and control their status (e.g. live, retired, under
amendment).
3.03 Patch and Update Management
Similar to Microsoft Windows or Apple OS, you may need to install updates
on the system on both a regular and ad-hoc basis. This will be done on the
advice of the supplier but the customer will be included in the planning.
3.04 Security Management
You need to define the controls required for securing a computerized
system in its operational environment. This typically involves
authentication controls and access-level controls so that system data
can’t be tampered with. In addition, you must provide technical security
measures (such as antivirus software) and a company firewall to protect
against spyware and malware.
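The access-level controls described above can be sketched as a deny-by-default permission table. The role names and permissions here are illustrative assumptions, not from any regulation:

```python
# Minimal sketch of access-level controls for a computerized system.
# Role names and permission strings are illustrative assumptions.
ROLE_PERMISSIONS = {
    "standard_user": {"read", "enter_data"},
    "reviewer": {"read", "enter_data", "approve"},
    "system_admin": {"read", "enter_data", "approve", "configure", "manage_users"},
}

def is_allowed(role: str, action: str) -> bool:
    """Check whether a role may perform an action; unknown roles and
    actions are denied by default."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default design means any role or action not explicitly listed is refused, which is the conservative choice for systems where data must not be tampered with.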
3.05 Business Continuity Management
You need to develop a detailed plan to regain access to the system and its
data following a disaster such as a power outage, fire damage, water
damage or a virus attack. The plan must outline how you will restore
critical business processes following a disruption while continuing to
provide products or services.
3.06 Disaster Recovery Planning
This is a subset of Business Continuity Management (BCM). It contains the
steps to follow to regain access to the hardware, software and data
following a disastrous event.
3.07 Service Level Agreement
This is an agreement with both the system supplier and a data centre
provider for escalation of issues that includes agreed response and
resolution times depending on the issues categorisation.
3.08 Backup and Restore
Backup and restore is a mechanism to protect electronic information and
records against loss of original data. Copies of the system configuration
and data are made on a regular basis (e.g. daily) so that it is retrievable in
the event of a disaster. To do this, you must copy the software, data and
electronic records to a separate, safe and secure area where it is available
and protected so you will be able to restore it in its original format if
required.
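The copy-and-verify step described above can be sketched as follows. This is a minimal illustration of backing up a single file with integrity verification, assuming a SHA-256 checksum comparison; it is not a complete backup solution:

```python
import hashlib
import shutil
from pathlib import Path

def backup_file(source: Path, backup_dir: Path) -> str:
    """Copy a file to a separate backup area and verify the copy matches
    the original byte-for-byte; return the SHA-256 checksum."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / source.name
    shutil.copy2(source, dest)  # copy2 also preserves file timestamps
    original = hashlib.sha256(source.read_bytes()).hexdigest()
    copy = hashlib.sha256(dest.read_bytes()).hexdigest()
    if original != copy:
        raise IOError(f"Backup verification failed for {source}")
    return original
```

Recording the checksum at backup time also lets you confirm, at restore time, that the data comes back in its original format.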
3.09 Incident Management
This will involve recording issues encountered by the system. End users
will be able to raise these issues using an electronic IT helpdesk system.
3.10 Periodic Review
You need to conduct periodic reviews to ensure a computerized system
remains compliant with regulatory requirements throughout its
operational life, remains fit for intended use, and continually satisfies
company policies and procedures. For example, the validation team (in
conjunction with relevant stakeholders) might conduct a periodic review
every 2 years.
3.11 On-Going Projects
In the operation phase of the life cycle, you might discover that certain
aspects of the system need to be updated or changed. For example, the
user interface might be causing problems with users not being able to
input data accurately. Or you might want to expand the reach of the
system or update and improve its processes. You would manage these
changes like mini-projects using the V-model approach (as outlined
above).
3.12 Electronic Data Archiving
You need to develop a suitable data archiving strategy that moves data
that is no longer actively used in the environment it was created, to a
separate data storage area for long-term retention. You would also
complete data archiving in the retirement phase of the validation life
cycle.
3.13 Documented Control
The computer system is now in operation. You must keep all aspects of
the system and the operating environment in a state of documented
control to maintain its validated status.
4. Retirement Phase
In the retirement phase, the system might be shut down completely or
upgraded to a new system. Either way, you need to assess what data
needs to be migrated from the existing computerised system, what data
needs to be retained and what data needs to be destroyed.
Specific activities in this phase include:
4.1 Data Migration
4.2 Electronic Data Archiving
4.3 Data Destruction
4.1 Data Migration
Data migration involves moving data from one platform to another. To do
this you need to create a data migration plan. The following
considerations should be taken into account:
1. That all required data has been moved.
2. The context of the data, its attributes and metadata have been
preserved.
3. Any requested transformation has yielded the expected result.
4. No unexpected transformation has been introduced.
Similarly to implementation, you need to take a risk-based approach to
migration. You need to perform a data flow analysis exercise to identify
points of weakness in the transition from the old to the new system. And
you’ll need to provide documented evidence of these actions.
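The four migration checks listed above (completeness, preserved attributes, expected transformations, no unexpected transformations) can be sketched as a fingerprint comparison between the source and target data sets. The record structure is an illustrative assumption:

```python
import hashlib

def record_fingerprint(record: dict) -> str:
    """Deterministic fingerprint of one record; keys are sorted so that
    attribute order does not affect the hash."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify_migration(old_records: list, new_records: list) -> list:
    """Return a list of discrepancies between source and target data sets:
    count mismatches, missing/altered records, unexpected records."""
    old_fps = {record_fingerprint(r) for r in old_records}
    new_fps = {record_fingerprint(r) for r in new_records}
    issues = []
    if len(old_records) != len(new_records):
        issues.append(f"record count mismatch: {len(old_records)} vs {len(new_records)}")
    if old_fps - new_fps:
        issues.append(f"{len(old_fps - new_fps)} record(s) missing or altered in target")
    if new_fps - old_fps:
        issues.append(f"{len(new_fps - old_fps)} unexpected record(s) in target")
    return issues
```

An empty result supports the documented evidence that all required data was moved and no unexpected transformation was introduced; any discrepancy list becomes input to the risk-based investigation.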
4.2 Electronic Data Archiving
Data archiving is the process of moving data that is no longer actively
used to a separate data storage device for long-term retention. You need
to develop a suitable data archiving strategy for moving data in this way.
The considerations for data archiving are essentially the same as for data
migration, as you are moving data from one platform to another. The
archiving platform may be the same as the live system but with slower
and less costly storage for data that is still required to be readily available
for trending or updating.
4.3 Data Destruction
Where data is no longer required you may completely remove it from the
live system, the archiving solution and any back-ups. You must ensure
that the method of destruction completely removes all the data,
particularly where it pertains to personal data governed by data integrity
regulations (e.g. overwriting or destroying hard disks, shredding or
burning rendered off paper records).
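The overwriting approach mentioned above can be sketched for a single file as follows. This is only an illustration: on journaling or copy-on-write filesystems and on SSDs, overwriting in place does not guarantee the original blocks are destroyed, so physical destruction or full-disk sanitisation tools remain the reliable route:

```python
import os
from pathlib import Path

def overwrite_and_delete(path: Path, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes several times,
    flush each pass to disk, then delete the file. Illustrative only:
    not sufficient on SSDs or copy-on-write filesystems."""
    size = path.stat().st_size
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force each overwrite pass to disk
    path.unlink()
```

As with every destruction activity, the execution of the procedure itself should be documented so the disposal can be evidenced later.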
Supporting Processes
Finally, let’s take a look at the supporting processes or activities that take
place across different phases of the life cycle such as risk management,
document management, repair activity and traceability matrix.
Specific activities in this phase include:
1. Risk Management
2. Traceability Matrix
3. Repair Activity
4. Document Management
1. Risk Management
You need to apply risk management throughout the lifecycle of a
computerized system and decide how to manage the process for various
categories of systems. You should also look at an approach to conducting
risk assessments on computerized systems based on their impact on
product quality, patient safety and data integrity.
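A common risk-based pattern is to rate each impact dimension and let the worst rating drive the overall classification. The 1–3 scale and thresholds below are illustrative assumptions, not from any guideline:

```python
# Illustrative risk classification driven by a system's impact on
# product quality, patient safety and data integrity.
# Each impact is rated 1 (low) to 3 (high); the scale is an assumption.
def classify_system_risk(product_quality: int, patient_safety: int,
                         data_integrity: int) -> str:
    """Overall risk class is driven by the highest individual impact,
    so a system critical to any one dimension is treated as high risk."""
    worst = max(product_quality, patient_safety, data_integrity)
    return {1: "low", 2: "medium", 3: "high"}[worst]
```

Taking the maximum rather than an average reflects the principle that a system with a severe impact on even one dimension (say, patient safety) warrants the fullest validation rigour.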
2. Traceability Matrix
You need to develop a traceability matrix – this is an important project
document for tracing all user requirements to design specifications and
appropriate verification tests.
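A traceability matrix is naturally represented as a mapping from each user requirement to its design specifications and verification tests, which makes gaps easy to detect. The requirement and test IDs below are invented for illustration:

```python
# Hypothetical traceability matrix: each user requirement ID (invented
# for illustration) is traced to design specifications and tests.
TRACEABILITY = {
    "URS-001": {"design_specs": ["DS-010"], "tests": ["PQ-001"]},
    "URS-002": {"design_specs": ["DS-011"], "tests": ["PQ-002", "PQ-003"]},
    "URS-003": {"design_specs": ["DS-012"], "tests": []},  # a validation gap
}

def untested_requirements(matrix: dict) -> list:
    """Return requirement IDs with no verification test recorded."""
    return [req for req, links in matrix.items() if not links.get("tests")]
```

Running the check over the example matrix flags `URS-003`, the requirement with a design specification but no verifying test, which is exactly the kind of gap a traceability review is meant to catch.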
3. Repair Activity
You need to develop a process by which non-functional systems (i.e.
systems which have stopped working) are returned to a functional state
under the control of a repair activity procedure.
4. Document Management
The documentation associated with CSV is extremely detailed. Someone
reading through it should be able to repeat the process simply by
following the steps outlined in the document. As a result, the language
used must be clear and concise.
Before any computer system is used, documents will outline specifics such
as:
Defining the purpose of the computer system in question
The features it needs
The hardware it needs
When it will be used
The requirements that it is expected to meet
The specifications defined here will then be used throughout the CSV
process – continuing throughout the life of the computer system.
In addition to this, the system will continue to be tested throughout its
lifecycle. Rigorous routine testing will be used to show that the system
continues to meet the predefined requirements that were laid out in the
design phase.
All CSV documentation can be called for review and audit at any point of
the system life cycle. It would be expected that the documentation meets
appropriate standards at all points.
Even after a company has stopped using a particular computer system,
you will need to keep the documentation showing that it was correctly
validated while in use in the event that the regulatory authorities need to
retrospectively check the data integrity of a past event.
Final Summary
As you can see, the computer system validation process is time-
consuming and expensive, but it is necessary to keep data safe,
accurate and secure.
Watch Video on CSV
Check out this video from INTERPHEX 2015 for a great introductory
conversation about CSV and how it’s implemented within the pharma
industry:
The Growth of Computer System Validation Opportunities
As manufacturing processes become increasingly automated, the need for
CSV professionals is growing. This trend is only expected to continue.
There is also an acute shortage of trained CSV professionals in certain
geographic areas, including Ireland. For this reason, salaries for these
roles are extremely competitive.
One of the single biggest misconceptions of the CSV role is that you need
to be able to code. This is usually not the case. However, we do
sometimes see a requirement for the ability to code in some roles where
the job description overlaps with automation engineering. And you will
always need a solid understanding of the computer process you will be
validating.
If you have the relevant skills, as well as experience of pharmaceutical
or medical device manufacturing, you might be closer than you think to
being a great candidate for CSV roles within pharmaceutical companies.
Computer System Validation Course
If you want to learn computer system validation, check out our 10-week
online Computer System Validation training course.
About the Authors
Gerry Creaner
President
Senior Lecturer with GetReskilled
Gerry Creaner has over 30-years of experience in the Life Sciences
Manufacturing industry across a range of technical, managerial and
business roles. He established a very successful engineering consultancy
prior to founding GetReskilled, an online education and learning business,
with offices in Singapore, Ireland and Boston (USA), focussed on the
manufacture of safe and effective medicines for the public.
He is also a founding Director of two Singapore based philanthropic
organizations, the Farmleigh Fellowship and the Singapore-Ireland Fund,
both of which deepen the well established and historical Singapore –
Ireland relationship and deliver long-term benefits to both countries.
Gerry has an undergraduate degree in Chemical Engineering (UCD, 1980)
and an MSc (Management) from Trinity College Dublin (2003) and is
currently doing research for his Ph.D.
Donagh Fitzgerald
Head of Marketing & Product Development
Mechanical/Production Engineer
Donagh looks after the marketing and product development including the
training and pedagogical elements of our programs and makes sure that
all GetReskilled’s users can have a great online learning experience.
Donagh has lived and worked in many countries including Ireland,
America, the UK, Singapore, and Hong Kong. Donagh has also served as
the Program Manager for the Farmleigh Fellowship based out of
Singapore.
Donagh holds Degrees in Production Engineering and Mechanical
Engineering from South East Technological University, Ireland.