A New Era in Handheld Raman Devices for Pharmaceutical Applications

June 27, 2012
FDA’s John Kauffman describes the vast potential of portable Raman devices, even for those who are not trained spectroscopy experts.

Portable Raman spectroscopic devices have become part and parcel of every pharmaceutical manufacturer’s QA/QC toolbox. These mobile devices are being embraced for an increasing range of applications, from raw materials identification to API classification to counterfeit detection. They may soon become prevalent even as a means of characterizing finished dosage forms.

Portable Raman is experiencing something of a heyday, thanks to an obvious need within industry for better, faster quality and inspection tools, and to regulators who have pushed vendors and manufacturers to collaborate on developing them.

Inside FDA, a leading proponent of Raman as a solution for ensuring the safety and purity of raw materials, APIs, and finished product is John Kauffman, a research chemist within CDER’s Division of Pharmaceutical Analysis (DPA). Kauffman has written and presented extensively on Raman and other analytical technologies. Here, Pharmaceutical Manufacturing Senior Editor Paul Thomas asks Kauffman what manufacturers need to know about portable Raman as the technology improves and becomes easier to use.

PhM: Portable Raman and other spectroscopic devices have rapidly gained in popularity for pharmaceutical raw material inspection (with increasing encouragement by FDA). How significantly have these devices improved over the past few years? Would you say that it’s possible to get “lab quality” results in the field?

J.K.: The technologies that enabled the development of portable Raman spectrometers were holographic sharp-cutoff and notch filters and battery-powered, narrow-band laser diodes. These components have been available for many years. Incremental improvements in the sharpness of the filter cutoff can improve the wavelength range of portable Raman spectrometers, and frequency stabilization of the diode lasers may improve instrument resolution when it is limited by the laser bandwidth.

But many recent innovations in portable Raman spectrometers have focused on other design elements that often depend on the vendors’ target markets. For example, some instruments are designed for ruggedness, while others are designed to be small enough to fit in a pocket. One encouraging development is that vendors are improving the user interface to simplify the use of these instruments by non-experts, and some vendors are also developing chemometric tools that offer flexibility for method development scientists.

The typical figures of merit for Raman spectrometers are spectral range, resolution, signal collection time, and signal-to-noise ratio. Vendors of portable instruments must also consider the size, weight, and form factor of their instruments, and are therefore faced with a set of trade-offs between measurement characteristics and portability during instrument development. Signal collection time depends on laser power, detector sensitivity, and the optical throughput of the instrument, and it is often more important in field applications than in lab applications.

Signal-to-noise ratio can be improved by cooling the detector to temperatures at which thermal noise is negligible. Unfortunately, this feature generally requires heavy components and has a high power demand. There are some portable instruments with low-temperature detectors (i.e., below -40 °C), but they tend to be larger and heavier. Most portables with cooled detectors use single-stage thermoelectric cooling to drop the detector temperature 20 °C or so below ambient, and this helps, but very few portable spectrometers can match the signal-to-noise performance of laboratory instruments.

Resolution is determined in part by the focal length of the spectrograph, so vendors of portable instruments have to strike a compromise between size and resolution. Laboratory instruments usually do not have these constraints, so one can expect better resolution and better signal-to-noise ratios from benchtop Raman spectrometers. Similarly, if spectral range is important for a given application, laboratory spectrometers with double or triple monochromators are available.

Having said this, we have utilized portable Raman instruments from a number of vendors, and their resolutions, ranges, signal-to-noise ratios and collection times have been adequate for most of our applications.

PhM: Are drug manufacturers making use of these portable inspection technologies to the degree that they should—with APIs and also with excipients?

J.K.: Nearly every major pharmaceutical innovator firm has a program in Raman spectroscopy. These programs support applications such as raw materials identification, API characterization, counterfeit detection, and characterization of finished dosage forms. Raman spectroscopy is often very useful for APIs because they tend to be strong Raman scatterers, whereas many excipients exhibit Raman spectra with broad features and high background.

I am not aware of any applications of Raman spectroscopy for excipient characterization within the pharmaceutical industry, with the exception of excipient identification. As you know, the FDA Division of Pharmaceutical Analysis initiated an Excipient Library program about a year ago, in collaboration with IPEC. We hope to develop chemometric methods that allow us to assess certain quality attributes of excipients.

With respect to the use of Raman spectroscopy in process control, it’s best to consider this question by comparison to near infrared (NIR) absorbance applications. NIR has been utilized in the pharmaceutical industry since the late ’80s and early ’90s, but it has only been in the last 8-10 years that we have seen a significant number of NIR methods in new drug applications. Raman spectroscopy is probably 10 years behind NIR in terms of its development as a tool for pharmaceutical manufacturers, so while we have seen very few Raman methods in new drug applications to date, we anticipate an increasing number in the future.

PhM: It seems like a simple thing, but you’ve mentioned that “user-friendliness” is a big issue. Why, and are you seeing portable devices improve in this regard?

J.K.: User-friendliness is an issue for our rapid screening program because the methods are being implemented in the field by non-experts. When a vendor’s instrument control software assumes that the user is a trained Raman spectroscopist, the sequence of steps required to measure and analyze a spectrum can present a significant hurdle to method adoption. In some cases, we have addressed this issue by creating our own user interface that makes appropriate selections for a number of instrument parameters, such as laser intensity and collection time. These instrument settings are part of the method, and do not require user action. Intuitive interfaces are always more inviting than idiosyncratic interfaces, even for experts. We have been working with portable spectrometers for a little over four years now, and we’ve seen some very intuitive user interfaces in recent years.

PhM: With regards to Raman, you’re a strong proponent of library-based spectral correlation methods (over visual inspection), but have noted that some methods don’t have the sensitivity to detect adulterants. What sort of methods should be used, therefore?

J.K.: When we began our rapid screening program, we developed PLS [partial least squares] models to demonstrate the full capability of Raman spectroscopy coupled to chemometrics for pharmaceutical surveillance. This initial phase of the program was very successful, but the amount of work required to develop a single PLS method was substantial.

In the second phase of the program, we wanted to assess the capabilities of spectral correlation methods that are widely used in a variety of applications. These methods are relatively simple, requiring only a single representative spectrum of a material in a spectral library. But the capability of spectral correlation is limited, as we demonstrated in some recent publications.
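
To make the idea concrete, here is a minimal Python sketch of library-based spectral correlation. It assumes the sample and library spectra have already been baseline-corrected and interpolated onto a common wavenumber axis, and the squared-correlation (“hit quality index”) formulation shown is one common choice, not necessarily the one used by any particular vendor or by DPA.

```python
import numpy as np

def hit_quality_index(sample: np.ndarray, reference: np.ndarray) -> float:
    """Squared correlation between a measured spectrum and one library spectrum.
    Assumes both arrays are baseline-corrected and sampled on the same axis."""
    s = sample - sample.mean()
    r = reference - reference.mean()
    return float((s @ r) ** 2 / ((s @ s) * (r @ r)))

def best_library_match(sample: np.ndarray, library: dict) -> tuple:
    """Return (material name, score) of the best-matching library entry.
    `library` maps material names to single representative spectra."""
    scores = {name: hit_quality_index(sample, ref) for name, ref in library.items()}
    return max(scores.items(), key=lambda item: item[1])
```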

We are currently evaluating the capabilities of chemometric methods based on principal component analysis for detection of adulterants in raw materials as well as substandard finished drug products. The results to date are very promising, and we think these methods may offer a good compromise between sensitivity to adulterants and the work required to develop and utilize the method.
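
As a rough illustration of how a PCA-based screen can flag suspicious material, the sketch below fits a principal component model to spectra of known-good material and flags new spectra whose reconstruction residual (Q statistic) is unusually large. The three-component model and the 99th-percentile threshold are assumptions made for this sketch, not a description of DPA’s procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_good_material_model(good_spectra: np.ndarray, n_components: int = 3):
    """Fit PCA to spectra of authentic material and set a residual limit."""
    pca = PCA(n_components=n_components).fit(good_spectra)
    recon = pca.inverse_transform(pca.transform(good_spectra))
    q_train = np.sum((good_spectra - recon) ** 2, axis=1)
    q_limit = np.percentile(q_train, 99)  # illustrative empirical limit
    return pca, q_limit

def is_suspicious(spectrum: np.ndarray, pca: PCA, q_limit: float) -> bool:
    """Flag a spectrum whose reconstruction residual exceeds the limit."""
    recon = pca.inverse_transform(pca.transform(spectrum.reshape(1, -1)))
    q = float(np.sum((spectrum - recon.ravel()) ** 2))
    return q > q_limit
```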

PhM: Why are calibration transfer procedures important for users to consider? What is the risk of wavelength shift as libraries are distributed to different instruments in different geographical locations?

J.K.: Calibration transfer methods have two important advantages. The first is that they allow a method developed on a single spectrometer to be distributed to many other spectrometers at remote sites without the need to deliver the remote instruments to the lab. We have done a lot of work to confirm that the calibration transfer methods give comparable results on different instruments from the same vendor as well as different instruments from different vendors.

The second advantage is that calibration transfer procedures can correct for small instrument drifts that may occur over time. To date we have not seen evidence that any of our portable Raman spectrometers are drifting, but the difference between instruments simulates some of the characteristics such as spectral sensitivity and wavelength shift that could occur as an instrument ages.

We have evaluated two methods for calibration transfer. The method that we’ve applied to spectral libraries involves instrument correction for wavelength shift and spectral intensity on the basis of reference material spectra. This approach appears to be adequate for spectral correlation methods, but may not be suitable for methods based on PLS or PCA. For those methods we currently use multivariate methods of calibration transfer that were developed in the early ’90s, such as direct standardization or piecewise direct standardization.
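
Direct standardization, in its classical form, estimates a transformation that maps spectra measured on a “slave” instrument into the response space of the “master” instrument on which the model was built. The Python sketch below shows the basic least-squares version; it assumes both instruments report spectra on a shared wavenumber axis and omits the regularization often used in practice.

```python
import numpy as np

def direct_standardization_matrix(master_spectra: np.ndarray,
                                  slave_spectra: np.ndarray) -> np.ndarray:
    """Estimate F so that slave_spectra @ F approximates master_spectra.
    Rows are transfer samples measured on both instruments; columns are
    wavenumber channels on a shared axis (an assumption of this sketch)."""
    # Least-squares solution via the pseudo-inverse.
    return np.linalg.pinv(slave_spectra) @ master_spectra

def standardize(new_slave_spectrum: np.ndarray, F: np.ndarray) -> np.ndarray:
    """Map a slave-instrument spectrum into the master's response space so
    the master's PLS or PCA model can be applied."""
    return new_slave_spectrum @ F
```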

The primary challenge in implementing multivariate methods of calibration transfer is the proper selection of the calibration transfer samples. When the transfer samples are properly chosen, the transferred method can perform well on many instruments, but if they are not chosen properly the method capability may vary from one instrument to another. We expend considerable effort to make sure we’re selecting the proper calibration transfer samples.
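
One widely used way to choose transfer samples that span the spectral space evenly is the Kennard-Stone algorithm; the sketch below shows that approach purely for illustration, without implying it is the selection method used by DPA.

```python
import numpy as np

def kennard_stone(spectra: np.ndarray, n_select: int) -> list:
    """Return row indices of `spectra` chosen to cover the data set evenly:
    start with the two most distant spectra, then repeatedly add the spectrum
    farthest from everything already selected."""
    dist = np.linalg.norm(spectra[:, None, :] - spectra[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [int(i), int(j)]
    while len(selected) < n_select:
        remaining = [k for k in range(len(spectra)) if k not in selected]
        # Max-min criterion: farthest from its nearest selected neighbor.
        selected.append(max(remaining, key=lambda k: dist[k, selected].min()))
    return selected
```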

PhM: Can you clarify limited range vs. global calibration models?

J.K.: Let’s consider the detection of diethylene glycol (DEG) in glycerin. It is well known that glycerin is hygroscopic, and in fact USP glycerin may have up to 5% water. When we began to develop a PLS method to detect DEG in glycerin, we wanted a model that covered the entire range of possibilities: 0-100% glycerin, DEG, and water. We used design of experiments methodology to develop a calibration set that covered the entire ternary phase diagram, and we developed this model, which we referred to as a global model. It had good predictive capability, but it is well known that the minimum error in any model occurs close to the center point of the concentration ranges.

Because this model was intended to detect DEG as a contaminant in glycerin, we felt it would be beneficial to develop a second model whose concentration ranges were limited to the vertex close to 100% glycerin in the phase diagram. We developed a limited range model over the ranges of 0-20% DEG, 0-10% water, and 70-100% glycerin. This reduced our predictive error in the important region near 100% glycerin, thereby reducing our statistical detectability threshold.

As an added bonus, we found that the limited range model was linear up to about 80% DEG. The error was higher at 80% DEG than at 5% DEG, but we can tolerate larger error at 80% contamination, and in either case the decision would be the same: identify the sample as suspicious and send it to a lab for confirmatory analysis.
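
For readers who want to see what such a limited-range calibration might look like in code, the Python sketch below fits a PLS model to (hypothetical) spectra of glycerin-rich calibration mixtures and flags samples whose predicted DEG level exceeds an illustrative decision threshold. The threshold, component count, and data are assumptions of the sketch, not the values used in DPA’s method.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def train_limited_range_model(spectra: np.ndarray, deg_percent: np.ndarray,
                              n_components: int = 4) -> PLSRegression:
    """Fit PLS to predict %DEG from spectra of a calibration set confined to
    the glycerin-rich vertex (e.g., 0-20% DEG, 0-10% water, 70-100% glycerin)."""
    model = PLSRegression(n_components=n_components)
    model.fit(spectra, deg_percent)
    return model

def screen_sample(model: PLSRegression, spectrum: np.ndarray,
                  threshold_pct: float = 2.0) -> bool:
    """Return True if the predicted DEG level warrants confirmatory lab
    analysis; the 2% threshold is purely illustrative."""
    predicted = float(model.predict(spectrum.reshape(1, -1)).ravel()[0])
    return predicted > threshold_pct
```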

PhM: Finally, you mentioned recently that you hope to begin doing more work on inspection of finished dosage forms. What are the main challenges?

J.K.: The DEG/glycerin application above is an example of a specific adulterant in a specific material, and in this case a PLS model can be developed. The main challenge with finished dosage forms is that we don’t necessarily know what we’re looking for. It could be an unknown adulterant, a counterfeit with no API, or some other marker of substandard quality. Couple this with the fact that finished dosage forms are mixtures, and may be manufactured in more than one facility, and the problem of identifying an adulterated or substandard product becomes much more difficult.

We understand that there are sources of spectral variability that are acceptable, and we must be able to distinguish acceptable variability from the variability that reflects a substandard product. To address these challenges, we are collecting products manufactured in different countries, from different lots manufactured at the same facility, different dosage forms, etc. We measure multiple spectra from each source and then use PCA-based methods to identify significant sources of variability. This helps us to establish appropriate classification schemes for pharmaceutical surveillance.
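
As a simple illustration of using PCA scores to see which factors dominate spectral variability, the sketch below groups scores by a source label such as country or lot and reports the mean score for each group. The metadata fields are hypothetical, and the summary is deliberately minimal compared with a full classification scheme.

```python
import numpy as np
from sklearn.decomposition import PCA

def score_by_source(spectra: np.ndarray, labels: list, n_components: int = 2) -> dict:
    """Project spectra onto a few principal components and return the mean
    scores for each source label (e.g., country, facility, lot) as a rough
    indicator of which factors drive the variability."""
    scores = PCA(n_components=n_components).fit_transform(spectra)
    means = {}
    for label in set(labels):
        idx = [i for i, lab in enumerate(labels) if lab == label]
        means[label] = scores[idx].mean(axis=0)
    return means
```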

About the Author

Paul Thomas | Senior Editor