Pharmaceutical Validation Framework
Understanding Verification, Validation, and Qualification in CSV/CSA with Real-World Examples
Verification
Confirming that design outputs meet design inputs through objective evidence.
Focus
Design vs. Requirements alignment
Technical specification compliance
As-designed vs. specifications
Typical Activities
Design reviews
Engineering calculations
Code reviews
Documentation audits
Scope
Design & Development Phase
Validation
Establishing by objective evidence that system requirements can be consistently fulfilled.
Focus
Functional requirements vs. actual use
System behavior in operating conditions
Real-world performance
Typical Activities
IQ/OQ/PQ protocols
Performance testing
Data integrity testing
Stress & edge-case testing
Scope
Implementation & Operation
Qualification
The process of proving and documenting that systems/equipment/processes meet predetermined specifications.
Focus
Installation (hardware/software)
Operational readiness
Environmental conditions
Typical Activities
IQ – Installation Qualification
OQ – Operational Qualification
PQ – Performance Qualification
Environmental monitoring
Scope
Infrastructure & Deployment
Side-by-Side Comparison
| Dimension | Verification | Validation | Qualification |
|---|---|---|---|
| Core Question | Did we design it right? | Did we build the right thing? | Is it ready to operate? |
| Definition | Design meets specifications | System meets user needs consistently | System ready for intended use |
| Primary Question | Are outputs consistent with inputs? | Does system perform as intended in actual use? | Is equipment/software properly installed & configured? |
| Timing | Design & Development | Implementation & Operation | Pre-deployment & Ongoing |
| Key Documents | URS, FRS, Design specs, DVP/DVR | Validation protocol, test cases, reports | IQ/OQ/PQ protocols, equipment specs |
| Evidence Type | Design reviews, calculations, audits | Functional & performance test results | Equipment certifications, calibration data |
| Who Performs | Design/Engineering team | Quality/Validation team with users | Quality/Engineering with vendors |
| Risk Focus | Design defects, specification gaps | Functional failures, data integrity | Installation errors, environmental factors |
| Scope | Software/hardware design logic | Full system behavior & data handling | Infrastructure, installation, environment |
Verification Evidence
Design HAZOP analysis confirms risk mitigation
Code review confirms adherence to secure coding practices
Traceability matrix shows all specs are addressed
Calculation verification for batch size formulas
FRS review against user requirements document
Technical design review sign-off
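The traceability idea above (every spec is addressed) can be sketched as a simple coverage check; the requirement IDs and matrix layout below are illustrative, not from any particular tool.

```python
# Illustrative traceability check: every URS requirement must map to at
# least one design output (IDs below are hypothetical examples).
urs_ids = {"URS-001", "URS-002", "URS-003", "URS-004"}

# Traceability matrix: URS ID -> design outputs that address it
matrix = {
    "URS-001": ["FRS-1.1", "DS-2.3"],
    "URS-002": ["FRS-1.2"],
    "URS-003": ["FRS-2.1", "DS-4.0"],
    "URS-004": ["FRS-3.3"],
}

def uncovered(requirements, trace):
    """Return requirement IDs with no design output traced to them."""
    return sorted(r for r in requirements if not trace.get(r))

gaps = uncovered(urs_ids, matrix)
print("Traceability gaps:", gaps or "none")
```

A verification report would list `gaps` as open items blocking design sign-off.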
Validation Evidence
System produces correct outputs for known inputs
Data integrity under stress conditions (10x normal load)
Audit trail captures all critical operations
System behaves correctly with concurrent users
Electronic signature compliance with 21 CFR Part 11
Recovery/business continuity testing
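A minimal sketch of checking that an audit-trail record carries the fields listed above (user ID, timestamp, old/new value, reason); the field names are hypothetical, not any specific system's schema.

```python
# Fields expected in every audit-trail entry per the evidence list above
REQUIRED_FIELDS = {"user_id", "timestamp", "field", "old_value", "new_value", "reason"}

def missing_fields(entry):
    """Return audit-trail fields that are absent or empty in a record."""
    return sorted(f for f in REQUIRED_FIELDS if not entry.get(f))

entry = {
    "user_id": "jsmith",
    "timestamp": "2024-03-01T10:15:02Z",
    "field": "result_mg",
    "old_value": "50.1",
    "new_value": "50.0",
    "reason": "transcription error",
}
print("Missing:", missing_fields(entry) or "none")
```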
Qualification Evidence
IQ: Servers installed per specification, OS versions documented
OQ: Database performs lookups within SLA parameters
PQ: Batch processing produces validated data output
Environmental: Temperature/humidity controls verified
Network connectivity meets uptime requirements
Calibration certificates for analytical instruments
Example 1: Laboratory Information Management System (LIMS)
Verification – Design Phase
LIMS Implementation (Commercial Off-The-Shelf)
What we’re asking: Does the vendor’s LIMS design properly address our User Requirements Specification?
Evidence: Design Verification Report (DVR) showing FRS-to-URS traceability matrix
Activities:
Review vendor’s Functional Requirements Specification against our URS (sample tracking, result calculation, audit trail)
Verify automated calculation for QC result limits matches our standard operating procedures
Confirm audit trail design captures User ID, timestamp, and data changes per 21 CFR Part 11
Review database schema design for referential integrity constraints
Verify security architecture includes role-based access controls for analyst vs. QA vs. admin
Validation – Implementation & Testing
LIMS Implementation (Commercial Off-The-Shelf)
What we’re asking: Does the configured LIMS produce correct, traceable results under actual lab conditions?
Evidence: Validation Protocol (VP), IQ/OQ/PQ reports, test case execution log, Validation Summary Report (VSR)
Activities:
Execute Operational Qualification: Test LIMS sample entry, result logging, report generation
Execute Performance Qualification: Run 100+ samples with known reference values; verify calculated results match manual calculations ±0.5%
Data Integrity Testing: Verify audit trail records all result edits with User ID, timestamp, reason, old/new value
Stress Testing: Load 10,000 concurrent sample records; verify query response time <3 seconds
Backup/Disaster Recovery: Simulate database corruption; verify restoration from backup produces identical data
Electronic Signature Testing: Verify digital signatures (if used) remain verifiable throughout the 10-year data retention period (e.g., after archival and restoration)
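The ±0.5% acceptance criterion above reduces to a relative-tolerance comparison between LIMS-calculated and manually calculated results; the result pairs below are hypothetical.

```python
def within_tolerance(calculated, reference, pct=0.5):
    """True if a LIMS result agrees with the manual reference within +/- pct percent."""
    return abs(calculated - reference) <= abs(reference) * pct / 100.0

# Hypothetical PQ data: (LIMS result, manually calculated reference value)
runs = [(99.8, 100.0), (100.3, 100.0), (49.95, 50.0)]
failures = [pair for pair in runs if not within_tolerance(*pair)]
print("PQ failures:", failures or "none")
```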
Qualification – Installation & Readiness
LIMS Implementation (Commercial Off-The-Shelf)
What we’re asking: Is the LIMS hardware/software infrastructure properly installed and ready for production use?
Evidence: Installation records, OQ test results, System Readiness Report, Environmental monitoring setup
Activities:
IQ: Verify database servers installed per specifications (CPU, RAM, OS version); document server serial numbers, BIOS versions
IQ: Confirm network connectivity (bandwidth test); verify firewall rules restrict unauthorized access
OQ: Test database startup/shutdown procedures; verify automatic failover to backup server <2 minutes
OQ: Verify system clock synchronized via NTP; confirm timestamp accuracy ±100ms for audit trail compliance
PQ: Run full workflow: create sample → enter results → generate report → archive data; verify all steps complete without errors
Environmental: Document server room temperature 18-22°C, humidity 45-55%, backup power availability
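The environmental ranges above (18-22°C, 45-55% RH) lend themselves to an automated limits check against monitoring data; parameter names and readings are illustrative.

```python
# Qualified ranges from the IQ specification above
LIMITS = {"temp_c": (18.0, 22.0), "humidity_pct": (45.0, 55.0)}

def out_of_range(readings, limits=LIMITS):
    """Return (parameter, value) pairs that fall outside their qualified range."""
    return [(k, v) for k, v in readings.items()
            if not (limits[k][0] <= v <= limits[k][1])]

print(out_of_range({"temp_c": 21.4, "humidity_pct": 57.0}))
```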
Example 2: Analytical Balance (Laboratory Instrument)
Verification – Design Phase
Analytical Balance (Hardware)
What we’re asking: Do the balance manufacturer’s design specifications meet our weighing requirements?
Evidence: Equipment specification review, manufacturer datasheet assessment, technical design agreement
Activities:
Confirm balance readability (0.1 mg) supports our USP <41> requirements for tablet weight variation
Review manufacturer specifications: repeatability, accuracy, linearity, temperature compensation
Verify calibration frequency (daily, weekly) aligns with our internal standards
Check design includes data output (USB/Ethernet) to export weights to our LIMS
Confirm balance can be sealed to prevent unauthorized calibration changes (if applicable)
Validation – Performance Testing
Analytical Balance (Hardware)
What we’re asking: Does the balance actually perform its weighing function accurately under real laboratory conditions?
Evidence: Analytical Balance Validation Report (ABVR), calibration data records, statistical analysis of test results
Activities:
Repeatability Test: Weigh same reference standard (NIST-traceable) 10 times; verify standard deviation <0.2 mg (per manufacturer spec)
Linearity Test: Weigh certified weights from 10 mg to 100 g; verify measured vs. nominal values agree within ±0.5 mg across the range
Accuracy Test: Use calibration weights at 30%, 50%, 100% of capacity; verify within ±0.2 mg
Temperature Stability: Measure weights at 18°C and 28°C; verify difference <0.3 mg
Data Integrity: If balance interfaces with LIMS, verify transmitted weights match display value; check for data corruption
Environmental Impact: Test balance performance in typical lab (airflow from HVAC, vibration from equipment); document any drift
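The repeatability criterion above (standard deviation <0.2 mg over 10 replicate weighings) can be evaluated directly; the readings below are hypothetical.

```python
import statistics

def repeatability_ok(weighings_mg, limit_mg=0.2):
    """Sample standard deviation of replicate weighings must not exceed the limit."""
    return statistics.stdev(weighings_mg) <= limit_mg

# Hypothetical 10 replicate weighings of a reference standard, in mg
readings = [100.02, 100.01, 100.03, 100.02, 100.02,
            100.01, 100.02, 100.03, 100.02, 100.02]
print("Repeatability pass:", repeatability_ok(readings))
```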
Qualification – Installation & Operation
Analytical Balance (Hardware)
What we’re asking: Is the balance properly installed, calibrated, and ready for regulatory use?
Evidence: Installation Checklist, Operational Qualification Report, Performance Qualification Summary, Calibration Certificate
Activities:
IQ: Document balance unpacked, serial number recorded, placed on level surface (leveling verified with built-in indicator); no visible damage
IQ: Confirm calibration weights (OIML class E2, suitable for a 0.1 mg readability balance) on-site; certificate of traceability present
OQ: Run internal calibration routine; confirm balance accepts calibration and returns to zero
OQ: Verify draft shield operates smoothly; confirm balance not affected by typical lab door opening
PQ: Weigh 5 routine tablet samples; confirm weights reproducible within specification
Environmental: Document weighing area temperature stability ±2°C; humidity 45-60%; vibration from adjacent equipment monitored
Calibration Schedule: Establish calibration frequency (e.g., daily internal cal, monthly external certification, annual service)
Example 3: Electronic Batch Record (EBR) System
Verification – Software Design
Manufacturing EBR (Custom-Developed Software)
What we’re asking: Does the vendor’s software code/design properly implement our batch record requirements?
Evidence: Design Review Report (DRR), Requirements Traceability Matrix (RTM), Code Review Checklist, Security Assessment Report
Activities:
Requirements Traceability: Map each FRS item (e.g., “System shall prevent batch completion without QA review”) to design document section and code module
Design Review: Verify data validation rules prevent manual entry errors (e.g., weight entry must be numeric, ≤ 5 significant figures)
Code Security Review: Confirm no hardcoded passwords, SQL injection vulnerabilities, or insecure data storage
Audit Trail Design: Verify database schema captures User ID (not just username), timestamp, field name, old value, new value, and reason for change
Electronic Signature Design: Confirm system enforces intention-to-sign workflow per 21 CFR Part 11.100
Testing Strategy: Review test plan covers normal workflows, edge cases (blank fields, system timeouts), and error scenarios
Validation – Functional & Data Integrity Testing
Manufacturing EBR (Custom-Developed Software)
What we’re asking: Does the EBR correctly execute batch operations and maintain data integrity in production?
Evidence: Validation Protocol (VP), Test Execution Report (TER), Validation Summary Report (VSR), Data Integrity Assessment Report
Activities:
Functional Testing: Execute 50+ test cases covering batch creation, material weighing, equipment operation, in-process testing, QA release
Data Integrity Testing: Edit batch record field (e.g., correct batch weight from 50.1 kg to 50.0 kg); verify audit trail records: timestamp, user, field name, “50.1” → “50.0”, reason code
21 CFR Part 11 Testing: Verify electronic signatures cannot be forged/altered; test signature validity after 10 years; confirm signature invalid if batch record is modified post-signature
Stress Testing: Simulate 100 concurrent users editing same batch; verify no data corruption, proper locking mechanisms prevent overwrites
Edge Case Testing: Test batch completion at 23:59:59 on Dec 31; verify system clock transition handled correctly; timestamp recorded accurately
Backup/Recovery: Corrupt batch record database; restore from backup; verify all audit trails, signatures, and data intact
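One common way to make a signature invalid after any post-signature edit, as required above, is to bind the signature to a hash of the record content at signing time; this is a sketch of the idea, not any specific EBR vendor's mechanism, and assumes a JSON-serializable record.

```python
import hashlib
import json

def record_hash(record):
    """Canonical SHA-256 hash of the batch-record content at signing time."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def signature_valid(record, signed_hash):
    """A signature stays valid only while the record matches what was signed."""
    return record_hash(record) == signed_hash

batch = {"batch_id": "B-1001", "weight_kg": 50.0, "qa_reviewed": True}
sig = record_hash(batch)              # captured when QA signs the record

batch["weight_kg"] = 50.1             # post-signature modification
print("Signature valid after edit:", signature_valid(batch, sig))
```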
Qualification – System Infrastructure & Production Readiness
Manufacturing EBR (Custom-Developed Software)
What we’re asking: Is the EBR infrastructure (servers, network, backups, security) properly installed and ready for manufacturing use?
Evidence: Installation Checklist, OQ Test Results, Network Architecture Diagram, Disaster Recovery Test Report, System Readiness Sign-Off
Activities:
IQ: Document application servers, database servers, backup systems installed per specifications; OS versions, patches, database versions recorded
IQ: Verify security: firewall rules restrict access to manufacturing floor only; VPN for remote users requires multi-factor authentication
IQ: Confirm disaster recovery: secondary data center has real-time database replication; network connectivity documented
OQ: Test system startup/shutdown; verify all services start in correct order; database initializes without corruption
OQ: Verify system performance: batch record load time <3 seconds; query results returned within SLA
OQ: Test failover: simulate application server failure; verify automatic failover to secondary within 30 seconds; no data loss
PQ: Simulate full batch workflow: create batch → enter materials → start equipment → record in-process test → QA review → release; verify all steps complete without errors
Environmental: Document server room temperature 18-22°C; humidity 45-55%; UPS capacity for 4-hour graceful shutdown; backup power tested monthly
Example 4: Autoclave (Sterilization Equipment)
Verification – Design Phase
Steam Sterilizer / Autoclave (Equipment)
What we’re asking: Does the autoclave manufacturer’s design meet our sterilization requirements per ISO 17665 and 21 CFR 211.63?
Evidence: Equipment Specification Review, Manufacturer Technical Data, Pressure Vessel Certification, Design Assessment Report
Activities:
Verify autoclave design specification: 121°C saturated steam, 15-30 min exposure time (depending on load configuration)
Confirm pressure vessel certification (ASME Section VIII or equivalent) and safety relief valve design
Review steam supply quality: dry, saturated steam ≥99% quality to prevent liquid carryover and contamination
Verify temperature uniformity: thermocouple/RTD sensor placement ensures temperature measured at coldest point of chamber
Confirm instrumentation design: recording thermometer or data logger captures temperature vs. time profile with ±1°C accuracy
Verify exhaust air quality: steam condensate drain design prevents moisture accumulation; air removal system functional
Review door lock mechanism design: door cannot open until pressure <0.2 bar; safety interlocks prevent accidental chamber opening during sterilization
Confirm drain valve design: thermostatic trap prevents live steam from escaping; condensate drained completely
Validation – Sterilization Efficacy Testing
Steam Sterilizer / Autoclave (Equipment)
What we’re asking: Does the autoclave actually achieve sterilization (≥6 log reduction, i.e., 99.9999% kill of worst-case microorganisms)?
Evidence: Autoclave Validation Report (AVR), Temperature Profile Charts, Biological Indicator Test Results, Physical Mapping Data, Pressure/Time Graphs
Activities:
Physical Validation: Run empty chamber; place temperature probes at top, center, bottom of chamber; verify all reach & maintain 121±2°C for specified hold time (e.g., 15 min)
Loaded Validation: Full test load (worst-case: wrapped instruments, dense packs); verify all locations reach sterilization temperature within <3 min rise time
Packs with Thermocouples: Place calibrated thermocouples inside wrapped instrument trays at various positions; record temperature profiles; verify all reach 121°C ±2°C
Biological Indicators (BI): Use Geobacillus stearothermophilus spores (worst-case for steam sterilization); inoculate with >10^6 spores; place in hardest-to-reach locations (center of dense pack)
BI Results: Post-sterilization incubation >24 hours; growth indicates cycle failure; NO growth confirms ≥6 log kill (99.9999%)
Pressure & Time Profile: Document pressure rise rate, plateau duration, exhaust phase; verify no pressure spikes (indicates steam quality issues)
Air Removal Efficacy: If autoclave has vacuum stage, verify <2% residual air (measured by Helix test packs with thermal/mechanical indicators)
3+ Sterilization Cycles: Run minimum 3 replicate cycles with BI at worst-case location; all BIs must show no growth to demonstrate repeatability
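The temperature-vs-time profiles above are commonly summarized as accumulated lethality F0 (equivalent minutes at 121.1°C with z = 10°C, a standard formula); the probe readings below are hypothetical.

```python
def f0(temps_c, interval_min=1.0, t_ref=121.1, z=10.0):
    """Accumulated lethality F0: equivalent minutes at 121.1 degC (z = 10 degC)."""
    return sum(interval_min * 10 ** ((t - t_ref) / z) for t in temps_c)

# Hypothetical cold-point probe readings (degC), logged once per minute:
# come-up, 15 min plateau at 121 degC, then exhaust
profile = [100, 110, 118] + [121.0] * 15 + [115, 105]
print(f"F0 = {f0(profile):.1f} min")
```

Note how the come-up and cool-down phases contribute almost nothing: lethality falls tenfold for every 10°C below the reference temperature.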
Qualification – Installation, Calibration & Ongoing Maintenance
Steam Sterilizer / Autoclave (Equipment)
What we’re asking: Is the autoclave properly installed, instrumented, and ready for routine sterilization of pharmaceutical products?
Evidence: Installation Checklist, IQ/OQ/PQ Test Reports, Thermometer/Gauge Calibration Certificates, Biological Indicator Records, Maintenance Schedule, Annual BI Challenge Documentation
Activities:
IQ – Installation: Document equipment serial number, model, year manufactured; verify manufacturer manual provided; confirm unpacking inspection (no visible damage, all accessories present)
IQ – Utilities: Verify steam supply: boiler capacity adequate for autoclave demand (≥150 kg/h for typical pharmaceutical autoclave); steam quality tested (dryness >99%, no oil/salt contamination)
IQ – Utilities: Confirm water supply for steam generation; verify drainage system adequate for condensate; electrical supply verified (correct voltage, grounding)
IQ – Instrumentation: Calibrate primary recording thermometer (Hg-in-glass or electronic) against NIST-traceable standard; accuracy ±1°C at 121°C
IQ – Instrumentation: Calibrate pressure gauge (0-2 bar range); accuracy ±2% of full scale; install low-cost backup mechanical gauge as secondary verification
IQ – Safety: Verify pressure vessel safety relief valve set at 2.0 bar nominal; test per manufacturer/bench procedure; confirm valve opens at set point ±0.1 bar
IQ – Safety: Confirm door interlock functional: attempt to open door at operating pressure (~1 bar gauge at 121°C); door must not open; once pressure <0.2 bar, door opens freely
OQ – Performance: Empty chamber run: verify steam admission, temperature rise, plateau at 121°C, exhaust phase, residual pressure <0.2 bar at cycle end
OQ – Repeatability: Run 3 consecutive empty chamber cycles; verify temperature, pressure, timing consistent within ±2°C, ±0.1 bar, ±1 min
OQ – Drain System: Run cycle with collection container under drain valve; weigh condensate; verify >90% of steam condensed (indicates proper steam utilization); no live steam escaping
OQ – Vacuum Function (if applicable): Verify vacuum pump removes air to <2% residual; test with manometer or vacuum gauge; confirm exhaust filter not clogged
PQ – Worst-Case Load: Sterilize worst-case load per validation protocol (e.g., 5 wrapped trays, 50 instruments per tray); verify all reach & maintain 121°C for hold time
PQ – Routine Validation: Quarterly or semi-annual BI test cycles with Geobacillus stearothermophilus (per AAMI or ISO 17665); NO growth required to remain qualified
Ongoing Monitoring:
Daily: Pre-sterilization empty run; verify temperature reaches 121°C, pressure plateau; visual check of steam quality (condensate clear, no oil)
Weekly: Chemical indicator test (autoclave tape or Helix indicator pack); confirms temperature >121°C achieved
Monthly: Pressure gauge calibration check against standard; document reading (±2% tolerance)
Quarterly/Semi-Annual: Biological indicator challenge test; use Geobacillus stearothermophilus spore strips; incubate post-sterilization >24 hours; NO growth confirms continued efficacy
Annual/Biennial: Safety relief valve re-certification by qualified technician; pressure vessel inspection by authorized inspector
As-Needed: Change control: any equipment modifications (e.g., new drain valve, modified timer setting) require change control documentation & revalidation of affected cycle(s)
Example 5: HPLC System (Analytical Instrument)
Verification – Design & System Architecture
HPLC System (Instrument + Software/Chromatography Data System)
What we’re asking: Does the HPLC system design (hardware + software) meet our analytical requirements per USP <621>, <1010>, and 21 CFR Part 11?
Evidence: Equipment Specification Sheet, CDS Technical Specifications, System Architecture Diagram, Design Review Report, Requirements Traceability Matrix (RTM)
Hardware Design Activities:
Verify pump design: capable of isocratic & gradient elution at specified flow rates (e.g., 0.5-2.0 mL/min); pressure rating adequate for method (e.g., >300 bar for reverse-phase)
Confirm detector type (UV-Vis, diode array, fluorescence) suitable for analyte detection; sensitivity & wavelength range specified
Verify column oven temperature control ±0.5°C to ensure chromatographic reproducibility (temperature stability critical for retention time precision)
Confirm autosampler design: cool sample storage (4°C or ambient), injection precision ±0.5 µL, needle washing between injections
Verify flow cell design: minimal peak broadening; cell volume <10 µL (typical); optical path length matches method requirements
Confirm plumbing materials: PEEK or stainless steel fittings; no leachable contaminants into mobile phase or samples
Software/CDS Design Activities:
Review Chromatography Data System (CDS) architecture: automated method setup, peak detection algorithms, integration parameters (tangent skim baseline, exponential tail fitting)
Verify calibration workflow: standards can be organized in calibration curve structures; curve fitting methods (linear, quadratic) with R² calculation
Confirm sample result calculation: automated unknown concentration calculation from calibration curve; appropriate significant figures
Verify audit trail design: all method changes, calibration adjustments, result modifications logged with user ID, timestamp, old value, new value, reason
Confirm data integrity: audit trail immutable after result is signed; electronic signatures per 21 CFR Part 11.100 (hash value, timestamp, user identity)
Verify system suitability (SST) design: CDS automatically verifies peak resolution (Rs ≥1.5), symmetry (0.8-1.2), repeatability (RSD <2%) before allowing sample analysis
Confirm peak purity assessment (if diode array): CDS calculates peak homogeneity; flags co-eluting peaks to prevent false results
Verify backup & recovery: database replication, encrypted backup, disaster recovery procedures documented
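The SST gate described above (block sample analysis unless criteria pass) reduces to a table of pass/fail rules; the criteria values come from the text, while the CDS result names are invented for illustration.

```python
# SST acceptance rules from the design spec above (names are hypothetical)
SST_CRITERIA = {
    "resolution":  lambda v: v >= 1.5,          # Rs for critical pair
    "symmetry":    lambda v: 0.8 <= v <= 1.2,   # peak symmetry factor
    "rsd_percent": lambda v: v < 2.0,           # replicate-injection RSD
}

def sst_failures(results, criteria=SST_CRITERIA):
    """Return system-suitability parameters that fail their criterion."""
    return [name for name, ok in criteria.items() if not ok(results[name])]

run = {"resolution": 2.1, "symmetry": 1.05, "rsd_percent": 0.8}
print("SST failures:", sst_failures(run) or "none")
```

A CDS would only release the sequence for sample analysis when this list is empty.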
Validation – Analytical Performance & Data Integrity
HPLC System (Instrument + Software/Chromatography Data System)
What we’re asking: Does the HPLC system produce accurate, repeatable, reliable analytical results under actual laboratory conditions?
Evidence: HPLC Validation Report (HVR), Analytical Performance Summary, System Suitability Data, Data Integrity Report, Electronic Signature Test Results, Backup/Recovery Test Documentation
Analytical Performance Validation:
Linearity: Prepare 5-7 reference standard concentrations (10%, 50%, 100%, 150%, 200% of target); inject in triplicate; plot peak area vs. concentration; verify R² ≥0.99; residual plot shows random distribution
Accuracy: Analyze reference standards at 80%, 100%, 120% of target concentration (n=3 each); verify recovery of 98-102% with acceptable RSD; demonstrates no systematic error
Precision: Intra-day: inject same standard 6 times within 1-2 hour window; RSD ≤2%; Inter-day: repeat over 5-10 days; RSD ≤5%; demonstrates instrument stability
System Suitability: CDS verifies resolution (Rs ≥1.5 for critical peaks), peak symmetry (0.8-1.2), theoretical plates (N ≥5000); SST must pass before sample analysis allowed
Detector Response Linearity: Confirm detector output (peak area/height) is linear across analyte concentration range; no saturation or baseline drift at high concentrations
Specificity/Selectivity: Inject blank (no analyte); placebo formulation; standard; verify no interference at analyte retention time; peak purity ≥95% (if diode array detector available)
Range: Establish lower limit of quantitation (LLOQ – S/N ratio 10:1) and upper limit of quantitation (ULOQ – maintain linearity, accuracy)
Robustness: Deliberately vary method parameters (pH ±0.2 units, flow rate ±0.1 mL/min, oven temperature ±2°C, mobile phase composition ±2%); confirm results remain within specification
Data Integrity Validation (21 CFR Part 11):
Audit Trail Testing: Modify calibration point (e.g., change conc. from 100 to 101 ppm); verify audit trail records: timestamp, user ID, old value (100), new value (101), reason code; audit trail accessible but not editable
Electronic Signature Testing: User signs analytical result; verify signature captured with user ID, timestamp, intent statement (“I attest this data is accurate”); signature invalid if result modified post-signature
Backup & Recovery: Simulate database corruption; restore from backup; verify all chromatograms, calibrations, audit trails, signatures intact and unchanged
Access Control: Test role-based permissions: analyst creates data; QA cannot modify without authorization; audit trail shows attempted unauthorized access (if applicable)
System Clock Accuracy: Verify CDS computer clock synchronized to atomic time server (NTP); timestamp accuracy ±1 second; critical for audit trail temporal integrity
Data Archival & Retrieval: Archive validated batch to read-only storage; verify data retrievable after 30 days; bit-for-bit integrity check (checksum or hash value) confirms no data corruption
Concurrent User Testing: 3+ analysts simultaneously log in, create methods, run calibrations; verify no data conflicts, proper locking mechanisms prevent overwrites; all audit trails recorded accurately
System Failure/Recovery: Simulate power failure during data collection; restart system; verify partial chromatogram not corrupted, data complete & accurate from previous successful runs
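The bit-for-bit integrity check mentioned above is typically a stored-vs-recomputed hash comparison; this sketch uses SHA-256 on placeholder bytes rather than a real chromatogram file.

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest used for bit-for-bit archive integrity checks."""
    return hashlib.sha256(data).hexdigest()

archived = b"chromatogram raw data ..."   # placeholder for the archived bytes
stored = digest(archived)                 # recorded at archival time

restored = b"chromatogram raw data ..."   # bytes read back from the archive
print("Integrity OK:", digest(restored) == stored)
```

Any single-bit change in the restored data produces a different digest, which is what makes the comparison a corruption detector.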
Qualification – Installation, Configuration & Ongoing
HPLC System (Instrument + Software/Chromatography Data System)
What we’re asking: Is the HPLC system properly installed, configured, and ready for GMP analytical use?
Evidence: HPLC Installation Report, Hardware/Software Inventory List, IQ Checklist, OQ Test Results (pump flow, detector wavelength, injection precision), PQ Method Performance Data, System Suitability Trending Reports, Ongoing Monitoring Log, Annual Maintenance Records, Change Control Documentation
Installation Qualification (IQ):
Hardware Documentation: Document all system components: instrument model, serial number, firmware version, detector type, column oven serial #, pump serial #, autosampler version
Software Documentation: CDS version/build, operating system version, database engine, security patches applied; generate CDS system information report
Utility Verification: Confirm stable electrical power supply (voltage within specified tolerance); backup power available; room temperature stable 18-25°C; humidity 45-60%; no vibration from adjacent equipment
Network Setup: If CDS networked: document IP addresses, firewall rules, network stability (packet loss <0.1%); verify secure connection (encrypted, VPN if remote access)
Detector Lamp (if applicable): Record lamp serial number, installation date, hours of operation (baseline for maintenance); lamp certification provided
Column Installation: Verify correct column installed per method specification (particle size, pore size, bonding phase); document lot number, expiration date, installation date
Mobile Phase Preparation: Confirm degasser functional (vacuum, helium sparging, or other); mobile phase prepared per method (HPLC-grade solvents, filtered <0.2 µm)
Calibration of Balances/Pipettes: All balances & pipettes used to prepare standards & solutions within calibration intervals (±tolerance documented)
Operational Qualification (OQ):
Pump Performance: Verify pump delivers specified flow rate (e.g., 1.0 mL/min); measure actual vs. expected; within ±5% (or ±0.05 mL/min, whichever is more stringent)
Pump Pressure Stability: Run isocratic method; monitor back-pressure over 30 min; verify <±10 bar drift; indicates good column packing & no leaks
Detector Response: Inject standard solution; detector generates signal; confirm wavelength accurate (e.g., ±2 nm for UV detector); response linear across range
Autosampler Temperature Control: If cooled sampler: set to 4°C; verify temperature achieved within 30 min & maintained within ±2°C throughout day-long use
Column Oven Temperature Control: Set to 30°C; verify achieved within 15 min & maintained ±0.5°C; repeat at 25°C & 40°C to confirm across range
Injection Precision: Inject same standard 6 times; verify peak area RSD ≤2%; indicates autosampler needle positioning reproducible
Gradient Linearity (if applicable): Run gradient program from 5% to 95% B over 30 min; verify actual solvent composition matches programmed; use calibrated refractive index or other method
System Suitability Program: Verify CDS automatic SST function operational: calculates Rs, symmetry, N before allowing sample analysis; passes SST criteria per method
Data System Functionality: Create test method in CDS; run sequence; verify chromatogram acquired, peak detected, integration performed, result calculated; save to database; retrieve & view archived data
Detector Wavelength Accuracy: If applicable, use holmium oxide filter or wavelength standard; verify detector wavelength reading within ±2 nm at multiple settings (254 nm, 280 nm, 210 nm)
Performance Qualification (PQ):
Method Performance: Execute full analytical method (real assay): prepare standards, run calibration, analyze QC samples, analyze test sample; verify expected results recovered; document pass/fail
System Suitability Pass: For each analytical run, SST must pass (Rs ≥1.5, symmetry 0.8-1.2, N ≥5000, RSD ≤2%); verify CDS blocks result release if SST fails
Accuracy Confirmation: Analyze reference standard at expected concentration; verify recovery of 98-102% with acceptable RSD; confirms system producing valid quantitative results
Repeat Precision: Run same sample 3 times on same day (intra-day); RSD ≤2%; indicates reproducibility within acceptable limits
Data Integrity Workflow: Analyst creates & signs result; QA reviews, approves, & counter-signs; verify audit trail captures all actions; data immutable after approval
Ongoing Monitoring & Maintenance:
Daily: System Suitability test run before production analysis; document pass/fail; if fail, troubleshoot (column reequilibration, mobile phase degassing, detector lamp check) before proceeding
Daily-Weekly: Visual inspection: check for mobile phase leaks, pump noise changes, detector lamp intensity (if variable wavelength); document observations
Monthly: Pump performance check: measure flow rate accuracy; pressure stability at nominal setting; record results
Monthly: Column performance monitoring: calculate theoretical plates (N) from SST peaks; if N declines >10%, consider column aging or backpressure issues
Quarterly: Detector wavelength accuracy verification (holmium oxide filter or standard); document wavelength readings ±tolerance
Semi-Annual: Column oven temperature verification: run standards at multiple oven temperatures (±5°C from nominal); verify SST parameters still pass
Annual: Full system performance revalidation: repeat key OQ/PQ tests (pump flow, detector response, injection precision, method accuracy); document pass/fail
Annual: Preventive maintenance by vendor: detector lamp replacement (if due); pump seal inspection; column oven recalibration; software updates applied & tested
As-Needed: Change control: any method modifications (new wavelength, different column lot, gradient change, new reference standard lot) requires revalidation of affected analytical performance parameters
Software Security: Monthly user access review (active accounts, permission levels); quarterly backup testing (restore sample data file, verify bit-for-bit identity); annual security patch application & testing
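The >10% plate-count decline rule above can be trended automatically against the qualified baseline; the baseline and monthly N values below are hypothetical.

```python
def plate_decline_pct(baseline_n, current_n):
    """Percent decline in theoretical plates relative to the qualified baseline."""
    return (baseline_n - current_n) / baseline_n * 100.0

baseline = 8200                 # hypothetical N established during PQ
monthly = [8100, 7950, 7300]    # SST-derived N values over successive months

# Flag any reading that has declined more than 10% from baseline
flags = [n for n in monthly if plate_decline_pct(baseline, n) > 10.0]
print("Investigate column aging for N =", flags or "none")
```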
Example 6: Water System (Purified Water)
Verification – Design Phase
Purified Water System (Equipment/Infrastructure)
What we’re asking: Does the water system design meet our water quality and pharmaceutical use requirements?
Evidence: Engineering Design Package (EDP), PFD/P&ID review, Material certification documents, Design specifications document
Activities:
Verify system design produces water meeting USP <645> Purified Water specification (conductivity <2.1 µS/cm, TOC <500 ppb, bacteria <100 CFU/mL)
Confirm piping material: 316L stainless steel with mechanical polish (Ra ≤0.5 µm) per ASME BPE
Review reverse osmosis membrane specifications and replacement frequency (per manufacturer guidelines)
Review heating/circulation design: system maintains >65°C to inhibit bacterial growth; hot water returns to source
Verify sampling ports at: source (post-treatment), post-RO, post-polishing, distribution loop (multiple points)
Confirm system includes conductivity monitoring, temperature sensors, alerts for out-of-spec conditions
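One way to make a design-verification checklist like the one above machine-checkable is a small limits table compared in code. The parameter names, design values, and limits below are illustrative assumptions, not the actual URS values (the conductivity figure assumes the USP <645> stage 1 limit):

```python
# Hypothetical checklist: each entry pairs a design value with its requirement
# limit; "max" means the design value must not exceed the limit, "min" means
# it must not fall below it (e.g., the hot loop must stay above 65 degC).
design_checks = {
    "conductivity_uS_cm": {"design": 1.0, "limit": 1.3, "compare": "max"},
    "toc_ppb":            {"design": 250, "limit": 500, "compare": "max"},
    "loop_temp_C":        {"design": 70,  "limit": 65,  "compare": "min"},
}

def verify_design(checks: dict) -> list:
    """Return the names of parameters whose design value violates its limit."""
    failures = []
    for name, c in checks.items():
        ok = c["design"] <= c["limit"] if c["compare"] == "max" else c["design"] >= c["limit"]
        if not ok:
            failures.append(name)
    return failures

assert verify_design(design_checks) == []  # all design values within limits
```

In practice this comparison lives in the design review record with a traceability reference back to each URS line item, not in ad hoc code; the sketch only shows the logic of output-vs-input verification.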
Validation – Performance & Microbial Testing
Purified Water System (Equipment/Infrastructure)
What we’re asking: Does the water system actually produce purified water meeting specification under routine operation?
Activities:
Evidence: Water System Validation Report (WSVR), Microbial Study Report, Chemical Analysis Report, Temperature Monitoring Data
Chemical Composition: Collect samples from source, post-RO, post-polishing, distribution loop; test for conductivity, TOC, ions, heavy metals; document results vs. USP <645>
Microbial Validation: Collect samples (100 mL per location) weekly for 4 weeks; culture for total aerobic microbial count and gram-negative bacteria per USP <61>; all results must be <100 CFU/mL
Biofilm Testing: If system >5 years old, swab internal pipe surfaces post-CIP; test for biofilm growth; if detected, perform chemical passivation and retest
Temperature Stability: Document loop maintains >65°C at all points; verify temperature sensors calibrated; alarm triggers if <60°C
System Efficiency: Measure RO membrane rejection rate; if <95%, schedule membrane replacement
CIP Effectiveness: Validate cleaning-in-place (CIP) cycle removes residues; conduct TOC/conductivity tests before and after CIP
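The RO efficiency criterion above is simple arithmetic: rejection rate = 100 × (1 − permeate/feed), typically measured via conductivity. A minimal sketch with hypothetical readings:

```python
def ro_rejection_percent(feed_conductivity: float, permeate_conductivity: float) -> float:
    """Membrane rejection rate: 100 * (1 - permeate/feed), both in the same units (uS/cm)."""
    return 100.0 * (1.0 - permeate_conductivity / feed_conductivity)

def membrane_needs_replacement(feed: float, permeate: float, threshold: float = 95.0) -> bool:
    """Per the efficiency criterion above: schedule replacement when rejection falls below 95%."""
    return ro_rejection_percent(feed, permeate) < threshold

# Hypothetical readings: feed 500 uS/cm, permeate 10 uS/cm -> 98% rejection, membrane OK
ok = not membrane_needs_replacement(feed=500.0, permeate=10.0)
```

A real program would trend rejection rate over time and act on a sustained decline rather than a single reading, since individual conductivity measurements are noisy.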
Qualification – Installation, Commissioning & Ongoing
Purified Water System (Equipment/Infrastructure)
What we’re asking: Is the water system properly installed, flushed, and ready for production use?
Activities:
Evidence: Installation Records, Commissioning Report, OQ Test Results, PQ Documentation, Maintenance & Monitoring Schedule
IQ: Document all equipment serial numbers: RO unit, UV lamp, conductivity sensor, temperature probes; verify material certificates (316L SS piping)
IQ: Inspect piping welds for defects; confirm mechanical polish completed per specification (surface roughness Ra ≤0.5 µm)
IQ: Verify all instrumentation calibrated and in-service (conductivity probe, temperature sensors)
OQ: Perform system flushing: circulate water at elevated flow for 8+ hours to remove particles; monitor conductivity (should decrease as contaminants flush out)
OQ: Test all alarms: simulate high conductivity condition, low temperature, low flow; verify alerts trigger and notify operations
OQ: Run CIP cycle; verify spray pattern covers all internal surfaces; drain volumes and CIP chemical usage documented
PQ: Verify system performance under production load: water supplied to 3 points simultaneously; conductivity, temperature remain within spec
Ongoing: Weekly conductivity/TOC sampling; monthly microbial sampling; quarterly temperature verification; annual UV lamp replacement
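The OQ alarm test above (simulate high conductivity, low temperature, low flow; verify alerts trigger) boils down to threshold logic that the test script can exercise against simulated sensor values. The setpoints below are illustrative assumptions, except the <60°C low-temperature trigger taken from the validation section above:

```python
def water_loop_alarms(conductivity_uS_cm: float, temp_C: float, flow_L_min: float) -> list:
    """Evaluate the three OQ alarm conditions; return the alarms that should trigger."""
    alarms = []
    if conductivity_uS_cm > 1.3:   # high-conductivity alarm (assumed setpoint)
        alarms.append("HIGH_CONDUCTIVITY")
    if temp_C < 60.0:              # low-temperature alarm, per the <60 degC trigger
        alarms.append("LOW_TEMPERATURE")
    if flow_L_min < 5.0:           # low-flow alarm (assumed setpoint)
        alarms.append("LOW_FLOW")
    return alarms

# OQ simulation: force each out-of-spec condition and confirm the expected alarm fires
assert water_loop_alarms(2.0, 55.0, 10.0) == ["HIGH_CONDUCTIVITY", "LOW_TEMPERATURE"]
assert water_loop_alarms(1.0, 70.0, 10.0) == []
```

During an actual OQ the conditions are induced physically (or via the control-system simulation mode) and the evidence is the alarm log plus operator notification records, not a unit test; the sketch only captures the expected trigger logic.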
Key Takeaways from Real-World Examples
Verification is about DESIGN: Review documents, design specifications, and technical requirements. Verify the design LOGIC is sound.
Validation is about PERFORMANCE: Test the actual system/equipment under realistic conditions. Prove it works as intended and maintains data integrity.
Qualification is about READINESS: Verify installation is correct, equipment calibrated, and infrastructure ready. Confirm the environment supports operation.
All Three Work Together: A poorly verified design will fail validation testing. Unqualified infrastructure will cause validation failures. All three are essential.
Timeline Matters: Verification happens early (design phase). Qualification happens at installation. Validation spans both and includes ongoing periodic reviews.
📋
Verification – Regulatory Basis
21 CFR 820.30 (Design Controls)
EU Annex 15 (Design Qualification)
Key Requirements
Document design specifications & inputs
Perform design reviews at appropriate stages
Establish design output verification methods
Maintain design history file
Traceability from requirements to design outputs
GAMP 5 Context
Part of overall validation lifecycle. Design verification by the supplier is the vendor's responsibility; the regulated user conducts a supplier assessment.
FDA Inspection Focus →
Inspectors review design documentation, verify design reviews occurred, and assess traceability. Missing design verification is a common 483 finding.
🔬
Validation – Regulatory Basis
21 CFR 211.68 (Automatic, Mechanical, and Electronic Equipment)
21 CFR 820.75 (Process Validation)
EU Annex 11 (Validation)
21 CFR Part 11 (Electronic Records; Electronic Signatures)
Key Requirements
Prospective or concurrent validation protocols
Document user requirements & acceptance criteria
Include edge cases, stress conditions, disaster recovery
Data integrity & audit trail validation
Validation reports with deviations resolved
GAMP 5 Context
Core activity. Risk-based approach determines scope & depth. CSV includes IQ/OQ/PQ for computerized systems.
FDA Inspection Focus →
Inspectors expect prospective validation, adequate protocols, and documented change controls post-validation. Inadequate validation scope is a frequent major finding.
✅
Qualification – Regulatory Basis
21 CFR 211.63 (Equipment)
21 CFR 820.70 (Equipment)
EU Annex 15 (Installation & Operational Qualification)
Key Requirements
IQ – Verify equipment received and installed per specifications
OQ – Verify equipment operates within limits
PQ – Verify equipment produces valid results
Environmental monitoring & maintenance schedules
Change control for equipment modifications
GAMP 5 Context
Part of supplier management. Risk-based approach determines extent of IQ/OQ/PQ. Routine calibration is an operational activity, not validation.
FDA Inspection Focus →
Inspectors review IQ/OQ/PQ documentation, equipment certifications, and calibration records, and verify that qualification is maintained through periodic reviews and change controls.
Critical Compliance Notes
ALCOA+ Principles: Verification, Validation, and Qualification are foundational to data integrity – ensuring data are attributable, legible, contemporaneous, original, and accurate (plus complete, consistent, enduring, and available).
Lifecycle Context: All three are required and interconnected. Validation cannot succeed without a verified design, and it cannot be sustained without qualified infrastructure.
Risk-Based Approach: GAMP 5 emphasizes proportionate effort. Lower-risk COTS systems require less extensive validation than bespoke or complex systems.
Ongoing Responsibility: Post-approval, change control and periodic review maintain validation status. Revalidation is triggered by changes exceeding established limits.
Documentation: A Validation Master Plan (VMP) and Validation Summary Reports (VSRs) are expected for regulated submissions and FDA inspections.