One of the many factors that must be considered when manufacturing a pharmaceutical product is the control and prevention of contamination. The presence of something unexpected and unwanted within a dosage form can pose serious risks to patients and, rightly, regulators demand that any contaminants are documented, and that their levels are below specific tolerances deemed safe. Ensuring that the levels are controlled within these limits is an important part of any manufacturing process, and suitable analytical techniques must be developed to prove that these standards are being met.
Various regulatory agencies publish guidance documents to assist manufacturers in meeting the pertinent specifications. Examples include USP General Chapter <1086> in the United States and European Pharmacopoeia general chapter 5.10 in Europe. Other relevant documents include the ICH Q3A, Q3B, Q3C and Q3D guidelines and EMA document CPMP/QWP/1529/04. Every global regulatory agency will have some variation of guidance document to assist in meeting its own requirements for impurity control.
In the U.S., regulators require screening for three types of impurities: organic impurities, inorganic impurities and residual solvents. The organic impurities are generally process-related, created as by-products during the synthesis or purification of the molecule, or they might be an unreacted starting material or reagent. Inorganic impurities are typically substances that have been added during the process, such as catalysts and other chemicals required for the reaction to proceed. Residual solvents can remain from the synthesis or, more likely, be left over from the purification process. Again, these are substances that have been deliberately added during the process.
Identifying true unknowns is usually a very slow and expensive business, so the first step in monitoring for contaminants is to develop a thorough understanding of the process. This allows predictions to be made for what might be present, and therefore what should be tested for.
Because of the regulatory requirements for routine testing, elemental impurities are the most commonly run tests. Arsenic, cadmium, mercury and lead levels must always be tested, regardless of how likely they are to be present in any given sample. These elements can occur in natural source materials, lead being the most common, and their toxicity has led to their inclusion in testing monographs. Any other metals that might have been introduced during the synthesis, such as catalysts, must also be checked for.
Inorganic impurities are typically monitored via inductively coupled plasma techniques, either optical emission spectrometry (ICP-OES) or mass spectrometry (ICP-MS). Atomic absorption spectroscopy can also be appropriate, depending on the levels being tested.
The safe levels of any metals that may be present must also be taken into account. What is deemed acceptable greatly depends on the size and form of the dosage that will be administered to patients. Limits for oral formulations are generally less strict than they are for products that will be administered parenterally. The more frequently the tablets are taken, or injections are given, the lower those limits will have to be in order to preserve patient safety.
However, the dosage method and level is often unknown early on in the development process. This poses a particular challenge for a contract lab, as the client may have no idea whether the product they are developing will be given as, say, a 250 mg oral dose, or a 100 µg injection, and the limits will be very different. The contract laboratory’s job is to advise the client what their options are. It might be appropriate to test each individual component that goes into the finished drug product, or it could be preferable to focus solely on the finished product.
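The relationship between dose size and impurity limit described above can be sketched as a simple calculation: a permitted daily exposure (PDE, in µg/day) divided by the maximum daily dose of the product gives the highest concentration of the impurity that can be tolerated in that product. The sketch below is illustrative only; the function name is hypothetical, the 5 µg/day PDE is an example figure rather than a value for any specific element, and the current ICH Q3D tables should be consulted for authoritative PDEs.

```python
# Illustrative sketch: converting a permitted daily exposure (PDE) into a
# concentration limit for a given dosing regimen. PDE and dose values here
# are examples only -- consult the current ICH Q3D tables for real limits.

def concentration_limit_ug_per_g(pde_ug_per_day: float,
                                 daily_dose_g: float) -> float:
    """Maximum impurity concentration (ug/g) such that the daily intake
    from the product stays at or below the PDE (ug/day)."""
    if daily_dose_g <= 0:
        raise ValueError("daily dose must be positive")
    return pde_ug_per_day / daily_dose_g

# Hypothetical 250 mg tablet taken four times daily -> 1 g of product/day
oral_limit = concentration_limit_ug_per_g(pde_ug_per_day=5.0, daily_dose_g=1.0)

# Hypothetical 100 ug injection -> 0.0001 g of product/day; the allowable
# concentration is much higher because so little material is administered
# (note that parenteral PDEs themselves are often lower than oral ones)
parenteral_limit = concentration_limit_ug_per_g(5.0, 0.0001)

print(oral_limit)        # 5.0 ug/g
print(parenteral_limit)  # 50000.0 ug/g
```

This is why the contract laboratory needs to know the intended route and dose before fixing a specification: the same PDE can translate into very different concentration limits depending on how much product a patient receives each day.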
As organic impurities are most likely to be process-related, a sound knowledge of the synthetic route, and an understanding of what side-products might be generated during the reaction, are important. These will give a starting point for identifying what impurities could be present, rather than having to undergo the slow, laborious and expensive process of identifying true unknowns.
It is, of course, impossible to predict everything that might be present. Any unexpected peak in a mass spectrometry analysis should be investigated and evaluated for potential toxicity. If the culprit is not toxic at the level observed, no further action is required beyond routine monitoring. Anything truly unknown needs to be identified and characterized before its toxicity can be determined, which can often be a slow and laborious process. Modern advances in mass spectrometry detection, coupled with gas chromatography (GC) and high-performance liquid chromatography (HPLC) separation techniques, have greatly improved this process.
Any problematic organic impurities will have to be controlled in some way in the manufacturing process. This may involve changes to the purification protocol to ensure the impurity is more effectively removed; if this proves impossible, then the only solution may be to modify the conditions of the synthesis to ensure that it is not made, or is made at a level sufficiently low to meet specifications.
Either way, if it is known that a potentially hazardous organic impurity can be produced, then the process must be carefully monitored for its presence on a routine basis. This monitoring does not involve further complex identification, but uses a validated testing protocol that can definitively say whether the impurity is present, and at what level. Such testing is an important part of controlling impurity levels in a product and preventing problems further down the line; any resulting product recall would be far more expensive than the routine testing that could have prevented it.