One of the many factors that must be considered when manufacturing a pharmaceutical product is the control and prevention of contamination. The presence of something unexpected and unwanted within a dosage form can pose serious risks to patients and, rightly, regulators demand that any contaminants are documented, and that their levels are below specific tolerances deemed safe. Ensuring that the levels are controlled within these limits is an important part of any manufacturing process, and suitable analytical techniques must be developed to prove that these standards are being met.
Various regulatory agencies publish guidance documents to assist manufacturers in meeting the pertinent regulatory specifications. Examples of these documents would be USP General Chapter <1086> in the United States, or EP General Chapter 5.10 in Europe. Other relevant documents include the ICH guidelines Q3A, Q3B, Q3C and Q3D, and EMA document CPMP/QWP/1529/04. Every global regulatory agency will have some variation of guidance document to assist in meeting its own individual requirements for impurity control.
In the U.S., regulators require screening to be carried out for three types of impurities: organic impurities, inorganic impurities and residual solvents. The organic impurities are generally process-related, being created as by-products during the synthesis or purification of the molecule, or they might be an unreacted starting material or reagent. Inorganic impurities are typically substances that have been added during the process, such as catalysts and other chemicals that might be required for the reaction to proceed. Residual solvents can remain from the synthesis or, more likely, be left over from the purification process. Again, these have been deliberately added during the process.
Identifying true unknowns is usually a very slow and expensive business, so the first step in monitoring for contaminants is to develop a thorough understanding of the process. This allows predictions to be made for what might be present, and therefore what should be tested for.
Because of the regulatory requirements for routine testing, elemental impurities are the most commonly tested class. Arsenic, cadmium, mercury and lead levels must always be tested, regardless of how likely they are to be present in any given sample. They can occur naturally, with lead the most common, and their toxicity has led to their inclusion in testing monographs. Any other metals that might have been added during the synthesis, such as catalysts, must also be checked for.
Inorganic impurities are typically monitored via inductively coupled plasma techniques, either via optical emission spectrometry or mass spectrometry. Atomic absorption spectroscopy can also be appropriate, depending on the levels that are being tested.
The safe levels of any metals that may be present must also be taken into account. What is deemed acceptable greatly depends on the size and form of the dosage that will be administered to patients. Limits for oral formulations are generally less strict than they are for products that will be administered parenterally. The more frequently the tablets are taken, or injections are given, the lower those limits will have to be in order to preserve patient safety.
However, the dosage method and level is often unknown early on in the development process. This poses a particular challenge for a contract lab, as the client may have no idea whether the product they are developing will be given as, say, a 250 mg oral dose, or a 100 µg injection, and the limits will be very different. The contract laboratory’s job is to advise the client what their options are. It might be appropriate to test each individual component that goes into the finished drug product, or it could be preferable to focus solely on the finished product.
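The dose dependence described above can be made concrete with a small sketch. Under ICH Q3D, a permitted concentration limit for an elemental impurity can be derived by dividing its permitted daily exposure (PDE, in µg/day) by the maximum daily dose of the drug product (in g/day). The PDE figures and the helper function below are illustrative only and should be verified against the current revision of the guideline before any real use:

```python
# Illustrative sketch of a dose-based concentration limit calculation
# (ICH Q3D "Option 2a" style): limit (ug/g) = PDE (ug/day) / dose (g/day).
# PDE values below are illustrative oral/parenteral figures; verify
# against the current ICH Q3D revision before relying on them.

PDE_UG_PER_DAY = {
    # element: (oral PDE, parenteral PDE), in micrograms per day
    "Cd": (5, 2),
    "Pb": (5, 5),
    "As": (15, 15),
    "Hg": (30, 3),
}

def concentration_limit(element: str, daily_dose_g: float,
                        parenteral: bool = False) -> float:
    """Permitted concentration (ug/g) for a given maximum daily dose."""
    oral, par = PDE_UG_PER_DAY[element]
    pde = par if parenteral else oral
    return pde / daily_dose_g

# The same element yields very different limits for a 250 mg/day oral
# dose versus a 100 ug/day injection.
print(concentration_limit("Pb", 0.25))          # oral, 0.25 g/day
print(concentration_limit("Pb", 0.0001, True))  # parenteral, 0.0001 g/day
```

The point of the sketch is simply that the concentration limit scales inversely with the total daily intake, which is why the limits cannot be finalized until the dosage form and level are known.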
As organic impurities are most likely to be process-related, a sound knowledge of the synthetic route, and an understanding of what side-products might be generated during the reaction, are important. These will give a starting point for identifying what impurities could be present, rather than having to undergo the slow, laborious and expensive process of identifying true unknowns.
It is, of course, impossible to predict everything that might be present. Any unexpected peak in a mass spectrometry analysis should be investigated and evaluated for potential toxicity. If the culprit is not toxic at this level, no further action is required besides its monitoring. Anything truly unknown needs to be identified and characterized before its toxicity can be determined. This can often be a slow and laborious process. Modern advances in mass spectrometry detection, coupled with gas chromatography (GC) and high-performance liquid chromatography (HPLC) separation techniques, have greatly improved this process.
Any problematic organic impurities will have to be controlled in some way in the manufacturing process. This may involve changes to the purification protocol to ensure the impurity is more effectively removed; if this proves impossible, then the only solution may be to modify the conditions of the synthesis so that the impurity is not formed, or is formed only at a level low enough to meet specifications.
Either way, if it is known that a potentially hazardous organic impurity can be produced, then the process must be carefully monitored for its presence on a routine basis. The monitoring does not involve further complex identification, but uses a validated testing protocol that can definitively say whether the impurity is present, and at what level. This testing is an important part of controlling impurity levels in a product, and preventing problems further down the line. Any resulting product recall would obviously be much more expensive.
Most organic impurity testing is carried out by HPLC techniques. Where appropriate, other methods can be employed, including ultraviolet (UV) spectroscopy or even simple thin-layer chromatography, depending on the level of specificity required.
Although they are also organic in nature, identifying residual solvents is much more straightforward. The identities of all the solvents that have been used throughout the process are known, whether they were employed as a reaction medium or in purification, and thus, it is usually a relatively simple task to develop a method for routine monitoring.
Gas chromatography is the most common technique for residual solvents. A few solvents, such as high-boiling solvents or organic acids, do not respond well using traditional GC detection techniques. Customized GC methods or ion chromatography techniques must be developed and validated in order to monitor these types of solvent.
Most manufacturers will have a good idea of exactly what chemical contaminants they need to control before approaching a CRO for assistance, so the analytical process rarely needs to be started from scratch. They are much more likely to be looking for help in developing a validated method for detecting a contaminant that they know may be problematic. Such a method must be precise, linear and robust, which requires thorough validation testing, performed according to a well-documented protocol appropriate for a regulatory filing.
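One of the validation checks behind a claim of linearity can be sketched in a few lines: fit an ordinary least-squares line through a calibration series and compute the correlation coefficient, as is commonly done under ICH Q2. The concentration and peak-area values below are invented for illustration, and the acceptance criterion shown is a typical example rather than a universal requirement:

```python
# Hypothetical linearity check for method validation: least-squares fit
# of a calibration series and the correlation coefficient r.
# All data values below are invented for illustration.

def linear_fit(x, y):
    """Ordinary least-squares slope, intercept and r for paired data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# Five-level calibration series: concentration (ug/mL) vs. peak area.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
area = [10.2, 20.5, 40.1, 81.0, 160.3]

slope, intercept, r = linear_fit(conc, area)
# A common acceptance criterion is r >= 0.999 across the working range.
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.5f}")
```

In a real validation protocol, this linearity assessment would sit alongside precision (repeatability), accuracy, specificity and robustness studies, each with predefined acceptance criteria.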
If an unexpected and unknown contaminant is identified, then its toxicology will have to be evaluated, and the safe exposure level established. Based on dosages, this will inform whether the level at which it is present is acceptable and nothing needs to be done, or too high, in which case some remedial action will have to be taken. There are many points at which it might have entered the manufacturing process other than as a by-product of the reaction. A thorough investigation of the entire process is often required to discover the cause.
Organic impurities are the most difficult to control because they are process-related. Both inorganics and solvents are added deliberately during manufacture, which makes them easier to control. However, the source of the starting materials is key, so supplier qualification, and verification of their certificates of analysis, are essential to ensure that what is received is as advertised. As the market becomes more global, managing the supply chain takes on increasing significance.
In the long run, it is much faster to run the tests in-house. Once the testing protocol has been developed and validated, it will normally be transferred to the client, provided they have the necessary expertise and equipment, so that they can use it routinely on their manufacturing train.
The big advantage of using an outsourcing partner for method development is the experience they bring to bear. Having worked with a huge number of widely varying samples and processes, the issues that have arisen in the past, and the solutions that have been created, can help inform future strategies. In-line process testing for contaminants is only as good as the methods used, so it is essential to employ carefully developed and thoroughly validated protocols.