Launching from the theme of space travel, former astronaut Larry DeLucas, PhD, director of the Center for Biophysical Science and Engineering at the University of Alabama at Birmingham, discussed the Center's work in High-Throughput Self-Interaction (HTSI) Chromatography and the prototype that has been developed. The technique would help optimize protein expression, crystallization and formulation in protein therapeutics, where solubility and stability are issues.
The technology is based on the second virial coefficient, a light-scattering term that indicates how much the protein interacts with itself. DeLucas describes the technique as not only quantitative but predictive. In addition, he says, it does not require large volumes of sample. The Center is working to enable the technique to be used with even smaller volumes of protein, and eventually aims to develop a high-throughput version of the technique.
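The measurement DeLucas describes can be sketched in a few lines. The relation below, in which the second virial coefficient (B22) is obtained from the chromatographic retention factor, follows a formulation commonly used in the self-interaction chromatography literature; it is an assumption here, not something stated in the talk, and every numeric input is an illustrative placeholder.

```python
# Hedged sketch: estimating the osmotic second virial coefficient (B22)
# from self-interaction chromatography (SIC) retention data, using the
# relation B22 = B_hs - k' / (rho_s * phi) reported in the SIC literature.
# Parameter values and units below are illustrative placeholders only.

def retention_factor(v_r: float, v_0: float) -> float:
    """Chromatographic retention factor k' = (V_R - V_0) / V_0."""
    return (v_r - v_0) / v_0

def b22_from_sic(k_prime: float, rho_s: float, phi: float, b_hs: float) -> float:
    """Second virial coefficient inferred from SIC retention.

    rho_s : surface density of immobilized protein (molecules per unit area)
    phi   : phase ratio (immobilized surface area per unit dead volume)
    b_hs  : hard-sphere (excluded-volume) contribution, same units as output
    """
    return b_hs - k_prime / (rho_s * phi)
```

In this picture, a strongly negative B22 signals net attraction between protein molecules (aggregation-prone conditions), while a modestly positive value signals net repulsion, which is why the coefficient is useful for screening formulation and crystallization conditions.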
Analyzing excipients could be an important application. Today, DeLucas said, the drug industry must try 30 to 50 different excipients in different concentrations for each formulation project. Typically, equilibrium sedimentation studies are performed, but the technique is low-throughput and inconvenient.
Center researchers aimed to develop a method that would require very little protein but provide highly sensitive measurements. They performed validation studies on the technique with pharma companies, focusing on vaccine proteins. As measured by the companies, solubility data tracked very well with measured values. The technique has been used to study the NamNet protein and membrane proteins such as the cystic fibrosis transmembrane conductance regulator (CFTR) protein. We were able to eliminate aggregation, DeLucas explained.
Industrial applications for a high-throughput Self-Interaction Chromatography system would include: optimizing crystallization conditions to produce diffraction-quality crystals; determining protein formulations; and quantitatively screening one protein for possible interactions with a library of other proteins. Potentially, with chip technology, this could be done over an entire genome, DeLucas noted. Currently, the project is receiving funding from the Cystic Fibrosis Foundation.
Janet Woodcock, FDA Deputy Commissioner, then discussed the Critical Path and how it ties in with pharmaceutical manufacturing, and how the 21st Century GMP initiatives established a paradigm for the Critical Path.
Woodcock admitted that 2007 was a difficult year for FDA, but said the outlook has improved this year. However, she noted, the challenge of linking product attributes to clinical outcomes continues.
Woodcock then reviewed the reasons for the Critical Path. She observed that so far, investment in basic biomedical science has surpassed investment in medical product development, and as a result, the development process is becoming a serious bottleneck to the delivery of new products. The public is waiting, she urged.
The industry is at a challenging point. Genomics and other new sciences will take another 10-15 years to reach their full potential, Woodcock said. At the same time, mergers and acquisitions have decreased the number of new drug candidates, because post-merger, similar candidates are often dropped. Easy targets have been taken, and treatments for chronic diseases are often less attractive from a financial viewpoint, she said. With costs rapidly escalating, companies are less willing and able to bring new drug candidates forward.
Woodcock also provided some sobering statistics: Today, new compounds entering Phase I have roughly an 8% chance of reaching the market vs. a 14% chance 15 years ago, she said. Even more worrisome is the fact that, even after significant investment and sunk costs, the Phase III failure rate is now 50%, vs. 20% a decade ago. Our ability to weed out compounds early is decreasing, she said.
Woodcock went on to discuss the reasons for the Critical Path program, noting that society's investment in R&D is needed to improve the drug development process. There has been huge private and public investment in basic research and in specific product development, but only minor investment in development tools and public standards, she indicated.
We haven't built the generalized knowledge base that is required, Woodcock said. In 2004, we decided to make an issue of this, she pointed out. Basic research isn't enough. We have to look at the critical path that a product must follow before it is introduced to market.
Policymakers in the U.S. government still don't understand that the science required to evaluate new product safety and efficacy and to enable manufacture is different from basic discovery science. NIH now has more interest in translational research, to get lab discoveries into clinics and Phase I, but they're just starting down this path, she said.
We tried to make a simple message that policymakers could understand, Woodcock remarked. We applied science to three key dimensions: safety assessment, proof of efficacy, and industrialization/commercial scale-up, which will result in a product with consistently high quality.
The Critical Path concept was recently expanded to cover foods and veterinary drugs, and a generic drug white paper was published. In addition, the concept was broadened to include post-market surveillance since, as she explained, a robust safety net is needed if development programs cannot answer all safety questions.
Woodcock noted that today, 65% of all prescriptions are generics, but there are tremendous challenges in getting them to market and ensuring that they are truly equivalent to brand-name drugs. Much more robust conversation is needed about the need for market safety, she said.
Industry needs to share existing knowledge and databases and develop enabling standards, Woodcock added. This information should not stay within the walls of one organization, but must be shared to develop generalized knowledge. It isn't of such a proprietary nature that it couldn't be shared, she said. Such sharing will be required in order to develop public standards that everyone can use and that will enable the field to evolve.
This year, the government funded an additional $7.5 million for the Critical Path Initiative. While it may not seem like a huge sum, Woodcock explained, this is the first time that Congress has explicitly authorized money for FDA to do research, so it is significant. It has energized programs at FDA, and our staff members are now thinking about the scientific work that needs to be done, she said.
She pointed out that the Reagan-Udall Foundation was established as part of the FDA Amendments Act to fund Critical Path research, education, and training, and Rachel Berman was named director of FDA's Critical Path Office. Priorities include:
- Biomarker qualification for in vitro diagnostics, imaging, and preclinical toxicogenomics
- Clinical trial modernization
In modernizing manufacturing, there is a big challenge to develop international standards, Woodcock said, but harmonization is essential.
The C-Path Institute, meanwhile, is examining such topics as cross-validation of animal toxicology markers. The first set of biomarkers for drug-induced nephrotoxicity is now being reviewed by FDA. Cardiac safety markers are another topic of interest, as is evaluating the genetic basis of rare adverse events. Eventually, the sequencing of the human genome will permit further study. NIH, FDA, PhRMA, BIO and others are involved in the biomarker work, while Duke University is participating in the cardiac safety consortium. This type of work may not win Nobel prizes, but it is extremely necessary science, Woodcock said.
Other consortia in formative stages will focus on modernizing clinical trials (Duke is participating in this work) and on nanotechnology. FDA's Sentinel Network, meanwhile, will focus on post-marketing surveillance.
As Woodcock explained, the 21st Century GMPs document was a prototype for the larger Critical Path Initiative. We are accelerating the pace of introduction of new science and technology, she said. PAT was the poster child. It may be a small piece of the picture, but is emblematic of the problems facing the industry.
She added, We want to move from empirically derived trial and error methods (e.g., formulation, excipient selection) to rigorous, mechanistically based and statistically run processes. We need to break down silos between R&D and production and to be less conservative.
Global harmonization of regulations will be critical, Woodcock noted. We're developing regulatory standards with ICH and developing technical standards within standards organizations, she said. FDA and EMEA are discussing modernizing the regulation of manufacturing. Another key task will be to bring other nations' regulators, those outside ICH, to the table.
There remains a disconnect between the clinical and manufacturing sides, Woodcock said, adding, We still don't know how the attributes that you measure and control for pharmaceuticals actually control the clinical performance, or the extent to which manufacturing failures adversely affect clinical outcomes.
Much of today's drug safety debate has to do with intrinsic safety problems inherent to the drug itself, rather than errors in manufacturing. However, Woodcock said, suboptimal formulations still occasionally get to market and fail to improve over time. For science-based manufacturing, you need to have a better idea of how parameters impact clinical performance and to exercise tighter controls, she stated.
Generic drugs, and some consumers' reactions to them, illustrate the problem. We're having a huge debate right now about switching from innovator drugs to generics, Woodcock said. Pharmacists can switch freely from name-brand to generic, so patients are switched from one product to another based on cost, she noted.
On a total population basis, this is okay, but many individual medical and patient groups are becoming upset about this because they believe that, in individual cases, there is nonequivalence between the two drugs, she explained. We don't know the critical quality factors that are driving these clinical results, but consumers just feel that the name-brand product is inherently better.
FDA and industry will have to investigate this much further, said Woodcock. It gets right to the issue of QbD and critical-to-quality parameters, but the fact is that we really don't know if consumer concerns are justified.
The Agency and Commissioner von Eschenbach are committed to moving the Critical Path Initiative forward, she noted, and manufacturing is a significant component of that work.
A mature, dedicated FDA group is working on this initiative, and we expect continued progress this year, Woodcock said, adding that CDER funding and staffing will be considerably improved this year.
Pharma Needs to Think Physics
Ray Scherzer, senior vice president of Engineering Technology and Capital Management at GSK, discussed QbD and PAT from an engineering perspective. A renaissance is under way, he said, and industry is getting a new life. Creativity is being encouraged and CEOs are energizing the business. Technology is the key to this change.
Starting with the example of tablet making, Scherzer remarked, We can remove product variability, but one can't remove patient variability. A variety of targets need to be looked at, he said. For instance, when the target is a coating that dissolves in a defined [period of] time, you need a firm understanding of coating material dissolution properties. Other key parameters are coating thickness, material properties, effects of corners and surfaces, and testing methodology.
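The coating target above can be made concrete with a deliberately minimal model: if the coating erodes at a roughly constant (zero-order) rate, dissolution time is simply thickness divided by erosion rate, and thickness variability (thin corners, uneven surfaces) sets the spread in that time. This is a sketch under that assumption, not a model from the talk; all function names and numbers are illustrative.

```python
# Minimal sketch of the coating-dissolution target: zero-order surface
# erosion, so time-to-dissolve = thickness / erosion rate. Thickness
# variation across the tablet then bounds the dissolution window.
# All parameter values are illustrative placeholders.

def dissolution_time_min(thickness_um: float, erosion_rate_um_per_min: float) -> float:
    """Minutes for the coating to erode through at a constant rate."""
    return thickness_um / erosion_rate_um_per_min

def dissolution_window(t_min_um: float, t_max_um: float, rate_um_per_min: float):
    """Earliest and latest dissolution times over a thickness range."""
    return (dissolution_time_min(t_min_um, rate_um_per_min),
            dissolution_time_min(t_max_um, rate_um_per_min))
```

Even this toy model shows why Scherzer's parameters matter: a coating spanning 80 to 120 microns at a fixed erosion rate dissolves over a 3:2 time window, so tightening thickness control directly tightens release timing.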
Now, consider the case of improving tablet core design, to make a tablet strong enough to withstand static and dynamic forces, he said. Most of these forces are measurable and predictable, based on the mechanical properties of the excipients. For instance, we need to know the axial compression force and the rate of compression.
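The "strong enough" requirement is typically quantified with the standard diametral-compression (Brazilian) test, where tablet tensile strength is sigma = 2F / (pi * D * t) for a crushing force F, diameter D and thickness t. The formula is standard pharmaceutics; the inputs below are illustrative placeholders, not GSK data.

```python
# Tablet tensile strength from the standard diametral-compression test:
# sigma = 2F / (pi * D * t). Inputs here are illustrative placeholders.
import math

def tablet_tensile_strength(force_n: float, diameter_m: float, thickness_m: float) -> float:
    """Tensile strength in pascals from crushing force, diameter and thickness."""
    return 2.0 * force_n / (math.pi * diameter_m * thickness_m)
```

A 10 mm tablet, 4 mm thick, that fails at 100 N works out to roughly 1.6 MPa, which is the kind of measurable, predictable number Scherzer argues should drive core design rather than trial and error.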
Compression technology with good control exists, Scherzer observed, but the industry needs a better understanding of materials science and of how material properties can be improved.
Consider another goal: core disintegration that occurs in a controlled pattern in a required amount of time, he said. This will require an understanding of the mechanisms of percolation and adsorption, including the material properties of the disintegrants. He noted that current technology can analyze quantity and distribution.
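The percolation mechanism he mentions can be illustrated with a toy model: disintegrant particles wick water through the core only if they form a connected network, and connectivity appears sharply once their volume fraction crosses a percolation threshold. The sketch below runs a Monte Carlo site-percolation experiment on a 2D square lattice (threshold near 0.593); real tablet microstructure is 3D and far more complex, so this is illustrative only, and all parameters are assumptions.

```python
# Toy 2D site-percolation model of disintegrant connectivity: occupy
# lattice sites at random with a given fraction, then test whether the
# occupied sites connect the top row to the bottom row. Illustrative only.
import random
from collections import deque

def spans(grid) -> bool:
    """True if occupied sites connect the top row to the bottom row (4-neighbor BFS)."""
    n = len(grid)
    queue = deque((0, c) for c in range(n) if grid[0][c])
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if r == n - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

def spanning_probability(fraction: float, n: int = 30, trials: int = 200, seed: int = 0) -> float:
    """Fraction of random n-by-n lattices at the given occupancy that percolate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < fraction for _ in range(n)] for _ in range(n)]
        hits += spans(grid)
    return hits / trials
```

Sweeping the occupancy fraction shows the behavior Scherzer alludes to: well below the threshold the network almost never spans, and well above it almost always does, which is why disintegrant quantity and spatial distribution, not just identity, govern disintegration behavior.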
Finally, he offered another target: minimizing adherence of material to process equipment. This would require a better understanding of material science and force determination. Research is needed into the blend's coefficient of friction, he urged.
He then analyzed QbD and where the industry stands in its adoption of the concept. Good progress has been made, but many different areas need work, Scherzer said. Good science exists in some areas, but science and engineering will be the foundation for furthering QbD. We need to go beyond mere correlation of data.