What Has Changed in Method Development and Validation for Pharmaceuticals and Why You Need to Know

The 4 best practices that assure robust method validation

The traditional approach to preparing Drug Master Files (DMFs), New Drug Applications (NDAs), and Abbreviated New Drug Applications (ANDAs) is no longer acceptable.  The expectations for analytical methods have changed over time, and a successful filing depends on designing methods around these new expectations.  The concept of developing or validating a method for a drug substance or drug product has changed drastically, largely due to two factors:

  • Issuance of a new set of guidelines by the FDA and ICH.
  • Advancements in analytical instrumentation.

These two factors are now intertwined.  Under the new regulations, FDA and ICH guidelines serve only as a foundation: following them alone is no longer enough to meet compliance expectations.  If a guideline calls for routinely monitoring genotoxic impurities or a residual solvent at 1 ppm to sub-ppm levels, for example, the instrumentation for doing so has to be available at the facility.

New regulatory requirements and their implications

To be in compliance, the sponsor and CDMO must now prove to the FDA that:

  • They are in control of the entire process.
  • They understand and monitor the entire process.
  • They have the instrumentation to monitor the process.
  • A system is in place to ensure the integrity of the data.
  • The data are aligned with the latest regulations.

The implication of these changes is that, before sponsors grant projects to a CDMO, it is more important than ever to understand the CDMO's capabilities and experience.  The one-size-fits-all concept no longer applies when choosing a cGMP manufacturing facility.  To meet the new regulatory requirements, the CDMO must have a group of well-qualified analytical and organic scientists, chemical engineers, and process chemists, as well as a well-equipped quality control (QC) department with the latest technology, in order to deliver a product that meets specifications to the satisfaction of the FDA.

TECH TRANSFER

Critical knowledge the CDMO team must possess

One of the criteria for developing a successful drug product is selecting an API manufacturer that has an experienced, knowledgeable team of experts who understand:

  • The process chemistry.
  • The components the sample may contain.
  • The physicochemical characteristics of the molecule.
  • The impurity profile of the sample.
  • Whether the sample contains any genotoxic components.
  • Whether the impurities and genotoxic components will remain in the sample or be purged out.
  • How to set specifications for each sample that align with ICH and regulatory guidelines.

Method development – the key to optimization

In conjunction with the process chemistry, the analytical scientists will develop a method or methods for each step of the process. In most cases this will be a chromatographic technique that tracks the process chemistry every step of the way.

The analytical department, organic scientists, engineers, and regulatory experts then design suitable specifications for each step and develop methods suitable for their intended use.  A well-equipped and well-staffed analytical department is crucial for a manufacturing facility.  It is the type of expertise sought by savvy sponsors.

Sponsors should also focus on the strength of the analytical department’s ability to measure the performance of a process. If one can’t measure it, one can’t optimize the process. And if the process is not optimized, the FDA filing will be delayed, and delays are often costly. 

How poor validation design stalls the drug development process

The intent of the FDA’s updated requirements, based on new scientific discoveries, is to ensure drug safety and efficacy. Method validation is a key component of the equation and includes checks and balances as assurance of data integrity.

Successful method validation depends on determination of the following:

  • Is the method developed suitable for its intended use? 
  • Is the sample specification designed for its purpose and justifiable? 
  • Is the process chemistry well understood, and is it under control? 
  • Does the specification comply with ICH and FDA guidelines? 
  • What is the maximum daily dose (MDD) of the drug product? 
  • Are the validation protocol and acceptance criteria designed around the sample specification? 
  • Have meaningful forced degradation studies been conducted for the product? 

If any of these factors are not considered carefully, the FDA will request further justification, additional method development and validation, changes to the specifications, and/or process changes. Therefore, a poor design in addressing the above tasks will certainly delay or sometimes even stall the product development or product launch. 

In one example of poor validation design, the maximum daily dose of the drug meant that the concentration of each unknown impurity should not exceed 0.05% in the sample. However, the validation protocol and the method were based on a 0.10% specification (which is typically the norm in the industry). This shortcoming will almost certainly be followed by an FDA deficiency letter, ensuring a filing delay and a need to revisit the poorly executed validation.
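To make the arithmetic behind this example concrete, here is a minimal sketch assuming ICH Q3A(R2)-style identification thresholds for a drug substance (0.10% at or below 2 g/day MDD, 0.05% above it) and ignoring the total-daily-intake alternative for brevity; the function name and values are illustrative and should be checked against the current guideline before being used in any protocol.

    # Minimal sketch (illustrative only): how the maximum daily dose (MDD)
    # drives the tightest acceptable limit for any single unidentified
    # ("unknown") impurity in a drug substance. Thresholds follow the
    # ICH Q3A(R2) pattern, where the identification threshold drops from
    # 0.10% to 0.05% once the MDD exceeds 2 g/day; the total-daily-intake
    # alternative is ignored here for brevity. Confirm the exact values
    # against the current guideline before relying on them.

    def unknown_impurity_limit_pct(mdd_g_per_day: float) -> float:
        """Tightest acceptable spec for a single unidentified impurity (% of API)."""
        return 0.10 if mdd_g_per_day <= 2.0 else 0.05

    mdd = 2.5              # hypothetical drug with an MDD above 2 g/day
    required = unknown_impurity_limit_pct(mdd)
    protocol_spec = 0.10   # the "industry norm" the protocol was written around

    print(f"Required limit: {required}%")       # 0.05%
    print(f"Protocol spec:  {protocol_spec}%")  # 0.1%
    print("Protocol is too loose" if protocol_spec > required else "Protocol is acceptable")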
In a second example, let’s say a molecule has component X with a specification of 6.5 to 7.5%. Upon close examination of the validation protocol, one finds that the relative standard deviation (RSD) for the precision study is set at not more than 5% (which, again, is typically an acceptable precision value in the industry at this concentration level). Unfortunately, based on these criteria, if component X in the sample is close to the upper or lower specification limit, one can test and fail a good batch, or pass a failing batch. This can happen when the method precision is not tight enough for its intended use.
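A rough calculation shows why a 5% RSD is too loose for a 6.5 to 7.5% window. Treating the RSD as the relative standard deviation of the reported result and assuming roughly normal variability (both simplifying assumptions for illustration), the sketch below estimates how often a single result lands outside the specification.

    # Rough sketch: probability that one reported result falls outside the
    # 6.5-7.5% specification window for component X, assuming the 5% RSD
    # describes the relative standard deviation of the reported result and
    # that variability is approximately normal (illustrative assumptions).
    from statistics import NormalDist

    low, high = 6.5, 7.5   # specification window for component X (%)
    rsd = 0.05             # precision acceptance criterion: RSD not more than 5%

    def prob_out_of_spec(true_value: float) -> float:
        """Chance that a single measurement of a sample at true_value reports out of spec."""
        dist = NormalDist(true_value, rsd * true_value)
        return dist.cdf(low) + (1 - dist.cdf(high))

    for x in (7.0, 7.4, 7.6):
        print(f"true value {x}%  ->  P(result out of spec) ~ {prob_out_of_spec(x):.0%}")
    # true value 7.0%  ->  P(result out of spec) ~ 15%
    # true value 7.4%  ->  P(result out of spec) ~ 40%  (conforming batch, ~40% chance of failing)
    # true value 7.6%  ->  P(result out of spec) ~ 61%  (non-conforming batch, ~39% chance of passing)

Near the limits, the method's own variability, not the batch quality, largely decides the outcome, which is exactly the false-pass/false-fail risk described above.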

The point of these two examples is that the validation protocol and the acceptance criteria must be carefully designed and scientifically driven, adhering to the guidelines rather than defaulting to the standard criteria values used in previous, similar studies.

The 4 best practices to assure method validation that conforms to today’s standards

Adherence to these best practices will help avoid costly regulatory delays and missed milestones.

  1. Design a unique validation protocol for each unique product
    Analytical laboratories must refrain from a cookie-cutter approach that applies the same test designs and acceptance criteria to every method. Each validation protocol needs to be unique, designed around the specific goals and acceptance criteria that apply to that analysis.
  2. Conduct a forced degradation study prior to method validation
    In developing a stability-indicating method, conduct a meaningful forced degradation study. If faced with a poor mass balance, determine why the result was unacceptable and how the missing mass can be justified, or detected and quantified. If a poor mass balance is not addressed, the method cannot be claimed to be stability-indicating.
  3. Choose an appropriate technique
    Choosing the appropriate technique and the right instrumentation is the foundation of successful method development and method validation. A method has to be scientifically sound and justifiable. The specification has to fit its purpose, and the protocol acceptance criteria need to be defensible.
  4. Design a realistic method robustness study
    Robustness and ruggedness studies determine the margin of error in your method and help avoid problems when others attempt to replicate it. Outcomes can change under a variety of conditions, including different equipment, instruments, reagents, temperatures, elapsed times, mobile phase preparations, and other factors. It is better to conduct the robustness study earlier rather than later in the process; a simple factor grid like the one sketched after this list is a practical starting point. I have seen methods that worked only in the originator's lab, and sometimes even the originating lab cannot duplicate what it did six months earlier. Such shortcomings will delay process chemistry optimization, delay the product launch, and increase project cost.
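As a starting point for planning such a study, the sketch below lays out a simple one-factor-at-a-time grid for a hypothetical HPLC method; the factor names, nominal values, and ranges are assumptions for illustration only, and a real study may instead justify a designed-experiment (DoE) layout.

    # Hypothetical sketch: a one-factor-at-a-time robustness plan for an HPLC
    # method. Each factor is varied to a low and a high setting while the others
    # stay at their nominal (middle) values. Factor names and ranges are
    # illustrative assumptions only.

    factors = {
        "column_temp_C":    (28, 30, 32),
        "flow_rate_mL_min": (0.9, 1.0, 1.1),
        "mobile_phase_pH":  (2.9, 3.0, 3.1),
        "organic_percent":  (38, 40, 42),
    }

    nominal = {name: levels[1] for name, levels in factors.items()}
    runs = [("nominal", nominal)]
    for name, (low, _, high) in factors.items():
        for setting in (low, high):
            runs.append((f"{name}={setting}", dict(nominal, **{name: setting})))

    for label, conditions in runs:
        print(f"{label:24s} {conditions}")
    # 1 nominal run + 2 runs per factor = 9 runs for these four factors

Writing the runs out up front makes the size of the study visible and forces each deliberate variation to be compared against a pre-defined nominal result.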

Summary

There is no substitute for experience, know-how, an understanding of current regulations, and scientifically driven validation protocols.  Method validation, together with systems that demonstrate the integrity of the data, ensures the safety and efficacy of each unique drug substance.  The success of each method validation depends on a well-thought-out process and specification, qualified and experienced scientists, a well-equipped analytical department, and a fully compliance-driven lab.  If any of these elements is missing, there is a good chance that the product launch will experience costly delays.

For more articles about method development, check out “Mehdi Yazdi and the Art of API Method Development,” “Analytical Method Development in API Manufacturing: The Key to Successful Commercialization,” “Five Questions to Ask to Ensure You Pick the Right CDMO for the Job,” “Top Early Stage Method Validation Mistakes and How to Avoid Them,” and “Top Analytical Method Validation Mistakes and How to Avoid Them – Part 2: Mistakes to Avoid During and After Method Validation.” Or call us at (978) 462-5555.