Ram Sasisekharan on Systems Biology and Standardization

Feb. 7, 2007
In his plenary speech at IFPAC 2007, Dr. Ram Sasisekharan of MIT, a specialist in glycoengineering, discussed the need for standardization in the techniques used for characterization, process understanding, and clinical trials. The technology already exists, he says, but must be used wisely.

This is an abridged and edited excerpt. To hear the audio file of Dr. Sasisekharan's entire presentation, please click the Download Now button below this transcript.

To familiarize yourself with Dr. Sasisekharan and his work, click here to access a National Institutes of Health article. For more information on orthogonal analytical techniques being applied to address glycobiological systems, click here to read an article by Dr. Sasisekharan from the journal Nature Methods. To view the PowerPoint on follow-on biologics that Dr. Sasisekharan and colleagues presented to FDA earlier this year, click here.


...We need to redefine our thinking, and to think outside the box, for both new biopharmaceutical and biosimilar development… There are some real issues between “small” molecules and large complex molecules (which include polysaccharides and polypeptides, in drugs such as erythropoietin and heparin) that really illustrate some of the fundamental challenges, and, from those challenges, opportunities to improve our understanding.

[Biomolecules] are complex molecules that represent a continuum of technical, regulatory and legislative challenges… Technology is available to [define and characterize] them, but the critical issue is to use these technologies effectively, to understand and demystify the challenges… so that we can manage the risks associated with these molecules. And I also believe that there is an opportunity for science to play a role in shaping the cockpit – so that we are able to think of the framework [that will be needed].

I’ll focus first on characterization and process-related issues, then on Critical Path Initiative issues such as mechanism of action and fully clinically based aspects… using biologicals… to get into some of the mechanistic issues that we heard Dr. Woodcock talk about.

Consider the historical view of the globe, and the world as we now know it. Technology… brings everything together, which in many ways has brought about globalization and all its challenges. But, enabling this connectivity to happen is the very important message of “standardization.”

As Dr. Woodcock has said, science, and especially life science, has been done, up until now, in a rather “hit or miss” fashion, without the proper framework, and many of the mechanisms have not been understood. But modern biology is different now.

The old biology was largely a reductionist biology. You had the concept of one gene, one protein – a very important “magic bullet” target of a reductionist science. And of course, much of that was based on very important technologies relating to sequencing DNA and protein… which led to the biotech revolution.

But within the last 5-10 years, and after the human genome sequencing, we have taken a more integrated approach – a more systems approach to life sciences. We see that it’s not the pieces, and how you get a point-to-point comparison, but it’s how you get things holistically – how you apply genomics, proteomics, high-throughput technologies… that has really brought a very different framework in terms of looking at the life sciences.

So, you’ve moved from a hypothesis-driven, reductionist approach to a more integrated way of looking at complexity. That takes us into very interesting dimensions – allowing us to view not only the number of components that make up the body but how these components are hierarchically organized, to an “integrated” approach to structure/function relationships, to get a truly mechanistic understanding of what is going on. In many ways you can look at this as a simple circuit board of how the cell or tissue components come together.

…Standardization enables us to come up with a more systematic way of being able to manipulate systems, use a variety of different techniques to measure the manipulation, store this information in databases, and use the data to develop models. This is at the heart of how the life sciences are really changing – not just in the way we’re thinking about or approaching the problem, but in the way we’re framing it, so that we have a more integrated way of looking at it… If you do not standardize the manipulations and measurements, it’s sort of like garbage in, garbage out.

Genomics is a classical example…we’ve come to realize that standardization will be central to our being able to…derive meaningful information from the human genome.

So let’s think about problems from a systems approach, to address the challenges that we face in “hit or miss” clinical trials. Regarding characterization of complex molecules, as Dr. Nasr very eloquently said, in order to demystify the “black box” associated with them, one must leverage cutting-edge technology to achieve characterization… of the… chemical constituents. You need to look at the relationship between the process and the product to be able to eventually get to the mechanistic underpinnings of function…

…I think that those two spaces – the space of characterization and the space of mechanism – are where there is a gap… that we truly need to understand… using an integrated approach… not only the biology but the chemistry and processes that result in a product.

Unlike with small molecules, the challenge with biopharmaceuticals is that, very often, analytical techniques are used as stand-alone techniques to check a box; in other words, if you use NMR or a mass spec to get a certain measurement, you’re looking at it as an individual measurement and not really looking at that measurement in the context of a complex mixture.

Historically, attributes have been captured as an ensemble of averages, in other words, when you’re looking at a mixture of proteins or a mixture of peptides, you’ve tried to capture an average property. But this is a compendial way of looking at characterization, based on point-to-point measurements.

If you’re looking at a complex mixture, whether it’s a protein or a peptide, I think you need to move from the space of an average property to a way where we can capture an ensemble of attributes quantitatively and describe the mixture – and, once the mixture has been characterized, map that into a process space in a meaningful way.
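The move from one averaged number to a quantitative ensemble of attributes can be sketched in code. This is an editor's illustrative sketch, not from the talk; all species masses and abundances below are hypothetical:

```python
# Illustrative sketch: an average property versus a quantitative
# ensemble of attributes for one mixture. All values are hypothetical.
from collections import Counter

# Hypothetical masses of the individual species observed in a mixture:
species_masses = [1500.2, 1500.2, 1662.3, 1662.3, 1662.3, 1824.5]

# Compendial-style average property: one number for the whole mixture.
average_mass = sum(species_masses) / len(species_masses)

# Ensemble description: each attribute with its relative abundance,
# so the composition of the mixture is captured, not just its average.
counts = Counter(species_masses)
ensemble = {mass: n / len(species_masses) for mass, n in counts.items()}

print(round(average_mass, 1))
print(ensemble[1662.3])  # 0.5: half the mixture is this species
```

Two mixtures with very different compositions can share the same average, which is why the ensemble view carries information the average discards.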

You can now use orthogonal analytical techniques that can do different types of measurements, and begin to think about describing orthogonal space in a fashion that we have never done before.

Why would you even think of relating an NMR measurement to a mass spec measurement to an HPLC measurement in some sort of quantitative way? Because you get a much more comprehensive picture of the measurements you made, and of the inter-relationships between those measurements, when you describe what is going on.

In order to do that, we need to get into the concept of data integration… And I’m going to use characterization as an example: if you integrate unique, complementary… data sets from multiple analytical techniques, you are able not only to capture the structure of this mixture, but also to very accurately describe the very important features that capture this mixture. Then, when you move from the characterization space to a process space, you need to understand this relationship, and how this relationship can be described, in order to manage the risk associated with running this process in a robust manner, as Dr. Nasr said.
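The kind of data integration described here – placing orthogonal measurements of the same mixture into one quantitative description – can be sketched as follows. This is an editor's illustrative sketch, not from the talk; the technique names, feature labels, and values are all hypothetical:

```python
# Illustrative sketch: integrating orthogonal measurements of one
# mixture into a single quantitative descriptor. Values hypothetical.

def integrate_measurements(datasets):
    """Merge per-technique feature dicts into one descriptor.

    Each dataset maps a feature (e.g. a peak or signal) to a
    quantitative value; prefixing by technique keeps the orthogonal
    measurements distinguishable while placing them in one space.
    """
    descriptor = {}
    for technique, features in datasets.items():
        for name, value in features.items():
            descriptor[f"{technique}:{name}"] = value
    return descriptor

# Hypothetical measurements of one complex mixture:
datasets = {
    "NMR":  {"peak_1.2ppm": 0.31, "peak_3.4ppm": 0.12},
    "MS":   {"mz_1500": 0.44, "mz_1750": 0.08},
    "HPLC": {"rt_6.1min": 0.27},
}

descriptor = integrate_measurements(datasets)
print(len(descriptor))  # 5 quantitative features describing the mixture
```

Once each lot or process intermediate is reduced to such a descriptor, lots can be compared feature by feature rather than by a single averaged number.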

…It’s important to understand complex molecules on three levels: starting at the cure level, through the process of transformation and production, and at the final purification and isolation level.

You have test points that enable you to accurately capture the important features associated with this process. But part of the challenge is: as you go through the entire process, how do you bring the characterization hand in glove through the process, to be able to determine this “process/product relationship” from the point of view of characterization? That becomes the central thing when you’re looking at biopharmaceuticals from the point of view of a very integrated approach of orthogonal characterization techniques.

You want to define the transformation… space through characterization and understanding the process, so that you can very seamlessly go back and forth and really define how each of these steps is related in a very quantitative… way – bringing in a characterization approach, or an integrated characterization approach, that enables us to look at the micro-heterogeneity associated with biopharmaceuticals. Compendial tests do not capture this information.

What becomes important is a rapid way to integrate different techniques. So when you’re looking at DOE (design of experiments), you can look at different parameters in a very quantitative way.

To get to QbD, knowledge gained at every step is very important, looking at the product and process through characterization all the way to the final product. This can be achieved with existing technologies. You have informatics techniques to map or connect various measurements, captured as object-based relationships.

You then bring in a middleware interface and a database. Informatics has advanced to the point where it is easy to integrate this information seamlessly. Bringing the characterization space into the manufacturing space can define the discrete attributes that define a product, rather than “averaging.”
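One way to read "measurements captured as object-based relationships" is a data model that ties each measurement object to the process step that produced the material. This is an editor's illustrative sketch, not from the talk; all class names and values are hypothetical:

```python
# Illustrative sketch: measurements as objects, related to process
# steps, so the characterization space can be mapped onto the process
# space. All names and values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Measurement:
    technique: str   # e.g. "NMR", "MS", "HPLC"
    attribute: str   # the discrete attribute measured
    value: float

@dataclass
class ProcessStep:
    name: str
    measurements: list = field(default_factory=list)

    def attach(self, m: Measurement):
        # Relating each measurement to the step that produced the
        # material is what links product attributes to the process.
        self.measurements.append(m)

step = ProcessStep("purification")
step.attach(Measurement("HPLC", "purity_peak", 0.98))
step.attach(Measurement("MS", "mass_variant_fraction", 0.02))
print(len(step.measurements))  # 2
```

In practice such objects would live in a database behind a middleware layer, so characterization and manufacturing systems can query the same process/product relationships.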

We should move away from point-to-point measurement, to orthogonal measurements that allow us to measure attributes of the entire mixture, rather than points… Understanding the process and product relationship is very important in addressing today’s product and process challenges.

Now, when you have a biopharmaceutical, studies and trials are done in a rather random fashion.

An integrated systems approach can help you be more focused so that these studies become more predictable, and offer a better “signal.”

To me, genomics is a tool. How do you use the tool to understand mechanisms?

...If you look at different races or disease categories, how do you define the different patient groups? You can then ask yourself, “Am I looking at the right patient group?” …Choosing the right filters can segregate patient groups, so you can use genomics as a tool.

If you choose a population that won’t respond, you will be dooming your drug to fail.

Accurate definition of clinical trials will require choosing the right patient populations, a common vocabulary, databases, and measurements. Standardization can make this work.

Look at function as an integrated signaling output. Get a more comprehensive picture of genomic and functional genomic effects.

I believe that the right framework is needed to link the right properties and studies. What needs to be done is to bridge the technologies from characterization to process… We’re at a very exciting crossroads for clarifying the new-drug and generic-drug opportunities. Technology is available to define and capture these molecules. It must be used wisely.