Launching from the theme of space travel, former astronaut Larry DeLucas, PhD, director of the Center for Biophysical Science and Engineering at the University of Alabama at Birmingham, discussed the Center's work in High-Throughput Self-Interaction (HTSI) Chromatography and the prototype that has been developed. The technique could help optimize protein expression, crystallization, and formulation in protein therapeutics, where solubility and stability are recurring issues.
The technology is based on the second virial coefficient, a light-scattering term that indicates how strongly a protein interacts with itself. DeLucas described the technique as not only quantitative but predictive. In addition, he said, it does not require large sample volumes. The Center is working to enable the technique to be used with even smaller volumes of protein, and eventually aims to develop a high-throughput version.
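For context, the second virial coefficient (often written B22 for protein self-interaction) is conventionally obtained from static light scattering via the Debye relation. This is a standard textbook result, not a detail of the Center's prototype:

```latex
\frac{Kc}{R_\theta} = \frac{1}{M_w} + 2\,B_{22}\,c
```

where $c$ is the protein concentration, $R_\theta$ is the excess Rayleigh ratio at scattering angle $\theta$, $K$ is an optical constant, and $M_w$ is the weight-average molecular weight. A positive $B_{22}$ indicates net repulsive protein-protein interactions (favoring solubility), while a negative value indicates net attraction.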
Analyzing excipients could be an important application. Today, DeLucas said, the drug industry must try 30 to 50 different excipients at different concentrations for each formulation project. Typically, equilibrium sedimentation studies are performed, but that technique is low-throughput and inconvenient.
Center researchers aimed to develop a method that would require very little protein but provide highly sensitive measurements. They performed validation studies on the technique with pharma companies, focusing on vaccine proteins; solubility predicted by the technique tracked very well with the values the companies measured. The technique has also been used to study the NamNet protein and membrane proteins such as the cystic fibrosis transmembrane conductance regulator (CFTR) protein. "We were able to eliminate aggregation," DeLucas explained.
Industrial applications for a high-throughput Self-Interaction Chromatography system would include optimizing crystallization conditions to produce diffraction-quality crystals, determining protein formulations, and quantitatively screening one protein for possible interactions with a library of other proteins. "Potentially, with chip technology, this could be done over an entire genome," DeLucas noted. Currently, the project is receiving funding from the Cystic Fibrosis Foundation.
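As a rough illustration of how such screening output might be used downstream, the sketch below ranks hypothetical excipient conditions by a measured B22. The condition names, B22 values, and "crystallization slot" cutoffs are all illustrative assumptions, not the Center's actual software; the sign convention (positive B22 = repulsive = more soluble, slightly negative B22 = crystallization-prone) is the standard one in the literature:

```python
# Illustrative only: rank formulation-screening conditions by measured B22.
# Convention: B22 > 0 means net repulsive protein-protein interactions
# (favoring solubility); a slightly negative B22 falls in the empirical
# "crystallization slot" (roughly -8 to -1 x 10^-4 mol*mL/g^2).

CRYSTALLIZATION_SLOT = (-8e-4, -1e-4)  # mol*mL/g^2, empirical range

def rank_for_formulation(b22_by_condition):
    """Sort conditions from most repulsive (most soluble) downward."""
    return sorted(b22_by_condition, key=b22_by_condition.get, reverse=True)

def crystallization_candidates(b22_by_condition):
    """Return conditions whose B22 lies in the crystallization slot."""
    lo, hi = CRYSTALLIZATION_SLOT
    return [c for c, b in b22_by_condition.items() if lo <= b <= hi]

# Hypothetical B22 measurements for four excipient conditions:
measured = {
    "sucrose 5%": 3.2e-4,
    "NaCl 150 mM": -2.5e-4,
    "arginine 50 mM": 1.1e-4,
    "PEG 4000 2%": -9.5e-4,
}

print(rank_for_formulation(measured)[0])     # most repulsive condition
print(crystallization_candidates(measured))  # conditions in the slot
```

With the hypothetical values above, the sucrose condition ranks first for formulation, while only the NaCl condition lands inside the crystallization slot (the PEG value is below its lower bound).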
Janet Woodcock, FDA Deputy Commissioner, then discussed the Critical Path initiative, how it ties in with pharmaceutical manufacturing, and how the 21st Century GMP initiatives established a paradigm for the Critical Path.
Woodcock admitted that 2007 was a difficult year for FDA, but said the outlook has improved this year. However, she noted, the challenge of linking product attributes to clinical outcomes continues.
Woodcock then reviewed the reasons for the Critical Path. She observed that, so far, investment in basic biomedical science has outpaced investment in medical product development, and as a result the development process is becoming a serious bottleneck to the delivery of new products. "The public is waiting," she urged.
The industry is at a challenging point. Genomics and other new sciences will take another 10-15 years to reach their full potential, Woodcock said. At the same time, mergers and acquisitions have decreased the number of new drug candidates, because post-merger, similar candidates are often dropped. Easy targets have been taken, and treatments for chronic diseases are often less attractive from a financial viewpoint, she said. With costs rapidly escalating, companies are less willing and able to bring new drug candidates forward.
Woodcock also provided some sobering statistics: today, new compounds entering Phase I have roughly an 8% chance of reaching the market, vs. a 14% chance 15 years ago, she said. Even more worrisome, even after significant investment and sunk costs, the Phase III failure rate is now 50%, vs. 20% a decade ago. "Our ability to weed out compounds early is decreasing," she said.
Woodcock went on to discuss the rationale for the Critical Path program, noting that society's investment in R&D is needed to improve the drug development process. There has been huge private and public investment in basic research and in specific product development, but only minor investment in development tools and public standards, she indicated.
"We haven't built the generalized knowledge base that is required," Woodcock said. "In 2004, we decided to make an issue of this," she pointed out. "Basic research isn't enough. We have to look at the critical path that a product must follow before it is introduced to the market."
Policymakers in the U.S. government still don't understand that the science required to evaluate new product safety and efficacy, and to enable manufacturing, is different from basic discovery science. "NIH now has more interest in translational research, to get lab discoveries into clinics and Phase I, but they're just starting down this path," she said.
"We tried to make a simple message that policymakers could understand," Woodcock remarked. "We applied science to three key dimensions: safety assessment, proof of efficacy, and industrialization/commercial scale-up, which will result in a product with consistently high quality."
The Critical Path concept was recently expanded to cover foods and veterinary drugs, and a generic drug white paper was published. In addition, the concept was broadened to include post-market surveillance since, as she explained, a robust safety net is needed if development programs cannot answer all safety questions.
Woodcock noted that today, 65% of all prescriptions are filled with generics, but there are tremendous challenges in getting them to market and ensuring that they are truly equivalent to brand-name drugs. A much more robust conversation about post-market safety is needed, she said.
Industry needs to share existing knowledge and databases and to develop enabling standards, Woodcock added. This information should not stay within the walls of one organization, but must be shared to build generalized knowledge. "It isn't of such a proprietary nature that it couldn't be shared," she said. Such sharing will be required to develop public standards that everyone can use and that will enable the field to evolve.