A Concise Review of Analytical Method Development and Validation



What is Analytical Method Development?

Analytical method development and validation are ongoing, interdependent activities that span research and development, quality control, and quality assurance. Analytical procedures are essential for assessing and managing equivalence and risk; they support the development of product-specific acceptance criteria and the consistency of results. Validation should demonstrate that the analytical procedure is appropriate for its intended purpose. Design of experiments is a valuable tool for method characterization and validation, and analytical scientists should be comfortable using it to characterize and optimize analytical methods. Effective method development and validation can produce considerable gains in accuracy, reduce bias errors, and help avoid costly and time-consuming rework.

Analytical Method Development

Analytical chemistry is a branch of science that employs advanced technology to determine the composition of a substance using an analytical approach, and it can provide both qualitative and quantitative results. Analytical equipment plays a critical role in achieving high-quality, trustworthy analytical results. As a result, everyone in the analytical laboratory should be concerned with equipment quality assurance.


The analytical method may be spectroscopic, chromatographic, electrochemical, hyphenated, or of another type. Analytical method development is the design of an objective assay to identify the composition of a formulation. It is the process of demonstrating that an analytical method is suitable for use in a laboratory to determine the concentration of future samples. Analytical methods intended for GMP and GLP environments must be developed in accordance with the procedures and acceptance criteria outlined in ICH guideline Q2. The following are the requirements for method development:


  • Qualified analysts

  • Qualified and calibrated instruments

  • Reliable reference standards

  • Documented methods

  • Sample selection and integrity

  • Change control


An analytical procedure is developed to test a specific property of the material against accepted acceptance criteria. The selection of analytical instruments and techniques for a new analytical procedure should be based on its intended purpose and scope. Specificity, linearity, detection and quantitation limits, range, accuracy, and precision are critical considerations during method development; these qualities ultimately help decide which method will be approved. The development of analytical procedures is generally based on a combination of mechanistic understanding of the underlying technique and prior experience, and experimental data from early procedures can be used to guide further development.


The following are the steps that are often taken during method development:


  • Standard analyte characterization

  • Method requirements

  • Literature search

  • Selecting the method

  • Instrumental setup and preliminary studies

  • Optimization of parameters

  • Documentation of analytical figures of merit

  • Evaluation of the developed method with representative samples

  • Determination of percent recovery of the sample

  • Demonstration of quantitative sample analysis


The analytical chemist's goal is to deliver accurate, trustworthy, and consistent results. Method development is a complex, time-consuming, and costly effort. An analytical method describes the steps and procedures required to conduct an analysis: preparation of samples, standards, and reagents; use of apparatus; construction of the calibration curve; application of equations for calculation; and so on. Analytical method development is necessary for the following tasks:


  • Herbal products and their potency

  • New process and reactions

  • New molecules development

  • Active ingredients

  • Residues 

  • Impurity profiling

  • Components of interest in different proportions

  • Degradation studies


Need for analytical method development and validation

The need for analytical method development and validation arose from worldwide competition, the need to maintain the standard of goods with high economic and market value, and ethical considerations. Various international regulatory agencies have established the standards and procedures for approval, authentication, and registration. The following are some well-known organizations that manage quality standards:


  • World Health Organization (WHO)

  • The United States Food and Drug Administration (US FDA)

  • ISO/IEC 17025

  • Current Good Manufacturing Practice (cGMP) regulations

  • The International Council for Harmonisation (ICH)

  • Good Laboratory Practice (GLP) regulations

  • Pharmaceutical Inspection Cooperation Scheme (PIC/S)


When modifications are made to validated nonstandard methods, the impact of those modifications should be documented and a new validation performed. If standard methods for a particular sample test are available, the most recent version should be used. Validation entails specifying requirements, determining method characteristics, verifying that the requirements can be satisfied using the method, and issuing a statement of validity.


To fully grasp the influence of changes in method parameters on an analytical technique, adopt a systematic methodology for the method robustness study, beginning with an initial risk assessment and followed by multivariate experiments. Such approaches make it possible to understand the effect of each parameter on method performance. The performance of a procedure may be evaluated by analyzing samples taken from in-process production stages through to the finished product, and the data gathered on the sources of method variation can be used to assess the method's performance.

Validation of the method

Four components combine to ensure data quality: analytical instrument qualification, analytical method validation, system suitability testing, and quality control checks. The purpose of validating an analytical method is to demonstrate that it is appropriate for its intended use. A method is typically validated under the following conditions:


  • During the method development process

  • Examining the system's suitability

  • Application, environment, and analyst changes

  • When used after a lengthy period of time

  • Examining reliability and consistency


The kind and scope of the validation studies required will be determined by the method and the analytical technique used. The most common validation procedures are identification, assay, and impurity determination.


The validation report describes the validation study's findings. Its objective is to offer information about the qualities examined during the study, the results achieved, and the interpretation of those data. A validation report will typically contain the following information:


  • The results

  • Analytical method

  • Validation protocol

  • Details of batch number

  • The validation parameters

  • Interpretation of the results

  • Relevant validation information

  • Details of the reference materials

  • References to the laboratory details

  • Details about the study's equipment


The following are typical validation parameters recommended by the FDA, USP, and ICH:


  • Accuracy

  • Precision

  • Robustness

  • Specificity

  • Solution stability

  • Limit of Detection 

  • Linearity and Range

  • Limit of Quantification 


Method validation is a broad field covering various validation parameters, with different methodologies for different requirements depending on the intended application of the analytical method. A method that has not been validated may reveal unexpected or previously unknown problems during routine use, and confidence in such a method is low. After a method is developed, it must therefore be validated against the requirements to provide a defined level of confidence in its intended application.

Criteria of Validation

The validation of an analytical method demonstrates the scientific soundness of the measurement or characterization. It is required, to varying degrees, throughout the regulatory filing process. The validation process shows that an analytical method measures the correct substance, correctly, and within the appropriate range for the intended samples. It enables the analyst to understand the method's behaviour and to determine the method's performance limits.


To validate a method, the laboratory should follow a documented standard operating procedure (SOP) that specifies the method validation process. The laboratory's instrumentation should be qualified and calibrated. Before validation, there should be a well-developed and documented test procedure as well as an approved protocol. The protocol is a systematic plan that specifies which method performance parameters will be checked and how they will be evaluated against the protocol's acceptance criteria. For pharmaceuticals, an API or drug product, placebos, and reference standards are required to conduct the validation trials.


Accuracy refers to the closeness of agreement between the value found and the value accepted either as a conventional true value or as a recognized reference value. Several approaches to determining accuracy are available. First, it can be assessed by applying the analytical method to an analyte of known purity, or by comparing the results of the proposed method with those of a second, well-characterized procedure whose accuracy is stated and defined. Second, accuracy can be inferred once precision, linearity, and specificity have been established.
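As a minimal sketch of an accuracy assessment by spiked recovery, the percent recovery for each spiked sample can be computed as (amount found / amount added) x 100; the figures below are invented for illustration, not taken from any real study.

```python
# Hypothetical spiked-recovery calculation for an accuracy assessment.
# All numeric values are assumed, purely for illustration.

def percent_recovery(found: float, added: float) -> float:
    """Percent recovery = (amount found / amount added) * 100."""
    return found / added * 100.0

# Triplicate spikes at a nominal level of 10.0 mg (assumed data)
spiked = [(9.87, 10.0), (10.12, 10.0), (9.95, 10.0)]
recoveries = [percent_recovery(f, a) for f, a in spiked]
mean_recovery = sum(recoveries) / len(recoveries)

print([round(r, 1) for r in recoveries])  # [98.7, 101.2, 99.5]
print(round(mean_recovery, 1))            # 99.8
```

Typical acceptance criteria express the mean recovery as a percentage with an allowed window around 100%; the window itself depends on the product and development phase.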


The precision of an analytical method reflects the closeness of agreement between a series of measurements obtained from multiple samplings of the same homogeneous sample under prescribed conditions. It is further subdivided into repeatability, intermediate precision, and reproducibility. For each type of precision studied, the standard deviation, relative standard deviation (coefficient of variation), and confidence interval should be reported.
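The precision statistics named above can be computed with a short script; the six assay results below are assumed values for the sketch, not real data.

```python
import statistics

# Illustrative repeatability data: six assay results (% of label claim).
# These values are assumed for the example.
results = [99.2, 100.1, 99.8, 100.4, 99.5, 100.0]

mean = statistics.mean(results)
sd = statistics.stdev(results)   # sample standard deviation
rsd = sd / mean * 100.0          # relative standard deviation (%CV)

print(round(mean, 2))  # 99.83
print(round(sd, 3))
print(round(rsd, 2))   # 0.43
```

A confidence interval for the mean could be added on top of these statistics (e.g. via a t-distribution), but the standard deviation and %RSD shown here are the figures most commonly reported.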


1. Repeatability should be evaluated using a minimum of nine determinations covering the procedure's specified range (for example, three concentrations with three replicates each), or a minimum of six determinations at 100% of the test concentration.


2. Intermediate precision reflects how the procedure performs within a single laboratory under the conditions in which it is intended to be used. The random events that affect the precision of the analytical method include the day, the analyst performing the work, and the equipment. It is not considered essential to investigate these effects separately; the use of an experimental design is encouraged.


3. Reproducibility is measured by means of an inter-laboratory trial. It should be taken into account when standardizing an analytical method.


4. Specificity is the ability to assess the analyte unequivocally in the presence of other components that may be present. It can be established using a variety of approaches, depending on the method's intended purpose. The method's ability to assess the analyte of interest in a drug product is evaluated by checking for placebo interference. Specificity can also be determined by measuring the API in samples spiked with impurities or degradants. If API-related compounds are not available, the drug may be stressed or force-degraded to generate degradation products. In chromatographic separations, the apparent separation of degradants can be verified by peak purity assessment with a photodiode array detector, by mass purity measurement with mass spectrometry (MS), or by demonstrating the separation on an alternative column chemistry. Degradation of the API is targeted at 5 to 20% in forced degradation studies to avoid concerns about secondary degradation. A lack of specificity in an individual analytical procedure may be compensated for by other supporting analytical procedures.


5. The detection limit of an individual analytical procedure is the lowest amount of analyte in a sample that can be detected. It may be determined visually, from the signal-to-noise ratio, or from the standard deviation of the response and the slope. The signal-to-noise approach is applicable only to analytical methods that exhibit baseline noise: signals measured from samples with known low analyte concentrations are compared with those from blank samples to establish the minimum concentration at which the analyte can be reliably detected. A signal-to-noise ratio of 3:1 or 2:1 is generally considered acceptable for estimating the detection limit. Based on the standard deviation of the response, the detection limit may be expressed as DL = 3.3σ/S, where σ is the standard deviation of the response and S is the slope of the calibration curve. The slope S may be determined from the analyte's calibration curve, and σ may be estimated in several ways, for example from the standard deviation of blank responses or from the calibration curve.
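The DL = 3.3σ/S estimate can be sketched in a few lines; the σ and S values below are assumed for illustration, and the companion quantitation-limit formula QL = 10σ/S from ICH Q2 is included for comparison.

```python
# Sketch of the detection-limit estimate DL = 3.3*sigma/S described above,
# plus the ICH Q2 companion quantitation limit QL = 10*sigma/S.
# sigma and slope values below are assumed for illustration.

def detection_limit(sigma: float, slope: float) -> float:
    """DL = 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def quantitation_limit(sigma: float, slope: float) -> float:
    """QL = 10 * sigma / S (ICH Q2)."""
    return 10.0 * sigma / slope

sigma = 0.05   # standard deviation of the response (assumed units)
slope = 1.25   # calibration-curve slope (response per unit concentration)

print(round(detection_limit(sigma, slope), 3))    # 0.132
print(round(quantitation_limit(sigma, slope), 3)) # 0.4
```

Both limits are in concentration units because dividing a response-unit σ by the slope (response per unit concentration) converts the noise estimate onto the concentration axis.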


6. Linearity is the ability of an analytical procedure to produce test results that are directly proportional to the concentration of analyte in the sample. Test results should be evaluated using appropriate statistical methods, such as calculating a regression line by the least-squares method. The correlation coefficient, y-intercept, slope of the regression line, and residual sum of squares should be reported, using a minimum of five concentrations.
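A least-squares fit reporting the parameters listed above can be sketched in plain Python using the standard regression formulas; the five calibration points below are assumed data for illustration.

```python
# Least-squares regression sketch for a linearity study (pure stdlib).
# The five calibration levels and responses are assumed example data.

def linear_fit(x, y):
    """Return slope, intercept, residual sum of squares, and R^2."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    residuals = [b - (intercept + slope * a) for a, b in zip(x, y)]
    rss = sum(r * r for r in residuals)          # residual sum of squares
    syy = sum(v * v for v in y) - sy * sy / n    # total sum of squares
    r2 = 1.0 - rss / syy                         # coefficient of determination
    return slope, intercept, rss, r2

conc = [20, 40, 60, 80, 100]                 # % of nominal concentration
resp = [0.202, 0.399, 0.605, 0.801, 0.998]   # detector response (assumed)
slope, intercept, rss, r2 = linear_fit(conc, resp)
print(round(slope, 5), round(intercept, 4), round(r2, 5))
```

A near-zero intercept and an R^2 very close to 1 support a claim of linearity, although formal acceptance criteria should come from the validation protocol rather than from rules of thumb.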


7. The range of an analytical procedure is the interval between the upper and lower analyte concentrations in the sample for which the procedure has been demonstrated to have a suitable level of precision, accuracy, and linearity.


8. Robustness is often determined from the effect of small, deliberate modifications to chromatographic conditions on system suitability metrics such as peak retention, resolution, and efficiency. The following experimental conditions are commonly varied during robustness evaluation:


  1. age of standards and sample preparations 

  2. sample analysis time 

  3. variations to pH of mobile phase 

  4. variation in mobile phase composition 

  5. analysis temperature 

  6. flow rate 

  7. column manufacturer 

  8. filter type, and filtration versus centrifugation


Robustness studies provide an excellent chance to use the statistical design of experiments, which provides data-driven technique control.
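As a minimal sketch of such a designed experiment, a two-level full factorial over three of the parameters listed above enumerates every combination to be run; the factor names and nominal ranges below are assumptions chosen for illustration.

```python
from itertools import product

# Two-level full-factorial robustness design over three assumed HPLC
# parameters. Each generated row is one experiment in the design.
factors = {
    "mobile_phase_pH": (2.9, 3.1),     # nominal 3.0 +/- 0.1 (assumed)
    "flow_rate_mL_min": (0.9, 1.1),    # nominal 1.0 +/- 0.1 (assumed)
    "column_temp_C": (28, 32),         # nominal 30 +/- 2 (assumed)
}

# Cartesian product of the factor levels: 2^3 = 8 runs
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run, conditions in enumerate(design, start=1):
    print(run, conditions)

print(len(design))  # 8
```

In practice the run order would be randomized, and fractional factorial designs are often used instead when the number of factors makes a full factorial too expensive.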


The ICH validation guidelines define method types based on the method's purpose and specify the appropriate kinds of evaluation for each. They provide specific validation approaches based on the aim of the procedure and specify the data that should be reported for each validation parameter. Validation acceptance criteria must be based on the method's historical performance, product specifications, and development phase.


The route to validation, as noted above, is a continuum. It begins in the early stages of development as a series of informal experiments to establish the method's suitability for its intended purpose. During the regulatory submission process, it grows into the fully documented report required for commercial production. Finally, validation is repeated whenever there is a substantial change in the instruments, technique, standards, or process.


Conclusion


Analytical method development aids the understanding of critical process parameters and their impact on accuracy and precision. Analytical methods should be established using the procedures and acceptance criteria outlined in ICH guideline Q2. Method validation confirms the analytical method across a range of concentrations, so that changes in formulation or concentration do not necessitate additional validation. Once the procedures have been established, qualified, and validated, their influence on out-of-specification rates and process capability must be assessed to establish their efficacy for future use.

