There is little more irksome to senior budget-holders than when an IT and process transformation initiative is purely a cost to the business, especially when it’s driven by a need to conform to the latest demands of industry regulators. As much as financial decision-makers may buy into the bigger picture, and have patient safety at heart, they can find it galling when the direct gains for the organization from any investment appear to be minimal. And in life sciences, it would be forgivable for finance directors to resent the ever-widening circles of spending currently required. Risk mitigation is one thing, but even the wealthiest pharma leaders do not have a bottomless budget for regulatory compliance.
Yet, what choice do they have? The call to transform regulatory information management (RIM) is growing louder with every new reporting requirement set out by international authorities. The upcoming ISO IDMP (Identification of Medicinal Products) standard set continues to command a lot of media attention, with its emphasis on data completeness and quality. It’s not the only driver of regulatory spending either. Other international regions are adapting to IDMP at their own pace or putting their own twist on the requirements, while alternative transparency initiatives exist throughout the global life sciences industry. In Europe, there is also the influence of Brexit which has created some uncertainty around the timelines of new standards. At the latest check, IDMP was due for implementation across the EU in mid-2019, though this could slip into 2020.
There is no risk of IDMP fading into the background though. The U.S., Canada, and non-EU countries like Switzerland appear keen to embrace the ISO standards because of the improved visibility and accountability they promise. The European Medicines Agency has been facilitating discussions through the International Pharmaceutical Regulators Forum, an adjunct to the International Council for Harmonisation (ICH), and set up an IDMP implementation group to foster more discussion between international regulators about the global potential of the standards.
But what approaches are organizations taking to manage critical regulated data — and what is required to ensure that companies derive some operational and strategic benefits from their investments? For pharma companies, success will depend on taking a holistic and strategic approach to regulatory information management while also preparing for the future, as AI tools enter the scene.
Forming a Data Strategy
Given all of this continued regulatory diversity and uncertainty, life sciences organizations have decided that finding new budgets and initiating dedicated new projects each time a new regulation comes out, or is updated, is neither an efficient nor an effective way to go about compliance in a complex, continuously evolving global market.
While nothing moves quickly in life sciences, organizations’ plans to get their product lifecycle data management in order started to move up a gear in 2017, and momentum has been building steadily since.
At the same time, companies have now begun to grasp the strategically important role that product data could play in future — especially as a means of driving new productivity, efficiency and competitive differentiation. Success depends on making the right provisions, and finding a practical way to make regulated data work harder and deliver more across a range of use cases.
An Evolution Towards Reusable Data Assets
At a recent conference, independent industry expert Andrew Marr, who recently hung up his hat after more than 30 years in the business, emphasized the rising importance of making product and regulatory information more shareable between and beyond specific functions in the business. His perspective on companies’ evolving data strategies was particularly pertinent, given that he has spent his career helping life sciences organizations improve product lifecycle visibility and keep accurate records for regulatory reporting purposes — developments that have taken place in parallel with advances in software and data management technology.
To emphasize how far things have come, Marr recalled the shift from manual paper records to rudimentary electronic regulatory submissions, followed by the evolution of relatively static, 2D documents into more dynamic and intelligent digital versions that can be searched more readily, paving the way for smarter, data-oriented RIM. That holds as long as information is complete, captured in a standard form, readily accessible, and reliable as a robust source of product truth.
In a global regulatory context, we’re also seeing the growing synchronization of efforts to drive up data quality and consistency, a consistent emphasis on transparency and data sharing, and the promotion of online portals for submitting and interacting with data. All of this creates considerable potential to do something more clever with this richer, more holistic and meaningful data bank that companies are building about their products across the complete lifecycle, and their status at any given time in the global market.
It is here that artificial intelligence begins to have significant appeal, as a means of making sense of and doing more with these increasingly substantial data assets.
Smarter Resource Allocation: The Emerging Role of AI
Combined with machine learning capabilities, artificial intelligence algorithms can both draw connections and spot trends in masses of data, and become increasingly efficient at this over time in response to the conditions they are exposed to and the results they find. This offers a wealth of potential to transform the way the life sciences industry manages and extracts value from data.
The last year has seen a surge in next-generation technology-themed events for the life sciences industry, which is no coincidence. A broad spectrum of industry conferences and exhibitions have explored AI and its potential from a number of different angles. “Next-generation healthcare” has become a central theme at pharma tradeshows, and AI summits have also been hosted around the world, featuring prominent speakers from across the industry. Whatever their specialist field, all of the major players now recognize they need to have a position on AI.
Automation is a big attraction of the AI proposition: identifying, checking and preparing the right data for a given purpose, and doing so increasingly effectively and efficiently over time. This means that companies can keep improving their data quality and shortening data preparation cycles, leaning on machines for the “grunt work” that is time-consuming, error-prone, and unrewarding for skilled professionals.
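To make the idea of automated data checking concrete, the following is a minimal sketch in Python. The record fields, validation rules, and example values are all invented for illustration and do not reflect any particular standard or vendor system; in practice such checks would run against a real product master database.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical product record; field names are illustrative assumptions,
# not taken from IDMP or any specific RIM system.
@dataclass
class ProductRecord:
    product_name: str
    substance: str
    market: str
    authorization_number: Optional[str] = None

def check_record(record: ProductRecord) -> List[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    if not record.product_name.strip():
        issues.append("missing product name")
    if not record.substance.strip():
        issues.append("missing active substance")
    if record.market and len(record.market) != 2:
        issues.append(f"market code '{record.market}' is not a 2-letter country code")
    if record.authorization_number is None:
        issues.append("no marketing authorization number on file")
    return issues

records = [
    ProductRecord("Examplamab", "examplamab", "DE", "EU/1/23/0001"),
    ProductRecord("", "examplamab", "GBR"),
]
# One issue report per record, keyed by product name.
reports = {r.product_name or "<unnamed>": check_record(r) for r in records}
```

Rules like these are deterministic; the AI angle described above comes from learning new checks and corrections from the patterns human reviewers apply over time, rather than hand-coding every rule.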
Structured Authoring and Regulatory Submissions
If machines can get to grips with routine knowledge work, and do it more rapidly without needing breaks to sleep, rest and refuel, then it makes sense to apply technology to sift, fill, find and organize. As long as humans are overseeing and sense-checking the results, why not let IT systems take the load and let experts do the more interesting and mentally demanding tasks?
It is in this context that structured content authoring has begun to attract a lot of interest in regulatory affairs. The ability to automate the assembly of routine documents such as regulatory submissions, labeling or even translations of these assets to some degree — using already-approved content assets or content “fragments” — is among the opportunities now ripe for exploitation because the right conditions have converged to make this viable and trustworthy.
The ability to tag content and link documents to databases adds to the growing range of possibilities for firms to be smarter about how they track and manage knowledge. Teams no longer lose time hunting for the latest version of something, or recreating documents from scratch when already-approved building blocks exist somewhere on the company network.
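The fragment-reuse idea behind structured content authoring can be sketched in a few lines of Python. The fragment IDs, their text, and the label template below are entirely hypothetical; a production system would manage approval workflows, versioning, and rendering far more rigorously.

```python
from typing import Dict, List

# A library of already-approved, versioned content "fragments".
# IDs and wording are invented for illustration only.
fragments = {
    "indication.v3": "Indicated for the treatment of condition X in adults.",
    "dosage.v2":     "The recommended dose is 10 mg once daily.",
    "storage.v1":    "Store below 25 \u00b0C in the original package.",
}

# A document template is just an ordered list of fragment references.
label_template = ["indication.v3", "dosage.v2", "storage.v1"]

def assemble(template: List[str], library: Dict[str, str]) -> str:
    """Build a document from approved fragments, failing if any are missing."""
    missing = [fid for fid in template if fid not in library]
    if missing:
        raise KeyError(f"unapproved or missing fragments: {missing}")
    return "\n\n".join(library[fid] for fid in template)

label_text = assemble(label_template, fragments)
```

The key property is that an approved fragment is written and reviewed once, then referenced everywhere it is needed; updating `dosage.v2` to a `dosage.v3` would propagate to every template that adopts the new version.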
Ensuring greater consistency and depth to the way information is recorded across an organization, then, will not only enable new process efficiency in its own right, but it will also pave the way for companies to exploit advanced technology options to improve output and make better use of people’s time.
The scope for AI to transform life sciences as we know it today is considerable. But this is not a fast-moving industry, and a number of things need to happen first if companies are to adapt to and exploit the potential ahead of them in a sufficiently timely fashion.
The first is a recognition and acceptance of the fact that change is coming, and no industry is immune to disruption from emerging market entrants or new potential competitors with bold ideas and the advantage of not being tethered to legacy thinking and ways of working.
The second is preparing an IT and data environment that allows for new experimentation and insights — within the restrictions of regulatory control and privacy protection, of course.
Completing the Picture: Why Stop With RIM?
All information management progress starts with complete, consistent, high-quality data that can flow across departmental and even organizational boundaries, and with a willingness to look beyond traditional regulatory data wherever a broader perspective could add greater value for the business.
The critical enabler for all of this is the creation of a comprehensive master data model — one that also includes inter-dependencies between the data, so sources can be exploited to maximum potential.
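A master data model with explicit inter-dependencies can be illustrated with a small Python sketch. The entity names below loosely echo concepts such as substance, product, and registration, but the structure, fields, and sample data are simplified assumptions for illustration, not a real IDMP model.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical master data entities with explicit references between them.
@dataclass
class Substance:
    substance_id: str
    name: str

@dataclass
class Product:
    product_id: str
    name: str
    substance_ids: List[str]  # dependency: a product references master substances

@dataclass
class Registration:
    registration_id: str
    product_id: str           # dependency: a registration references one product
    market: str

def registrations_affected_by(substance_id, products, registrations):
    """Trace inter-dependencies: which registrations rely on a given substance?"""
    product_ids = {p.product_id for p in products if substance_id in p.substance_ids}
    return [r for r in registrations if r.product_id in product_ids]

substances = [Substance("SUB-1", "examplamab")]
products = [Product("PRD-1", "Examplamab 10 mg", ["SUB-1"])]
registrations = [
    Registration("REG-1", "PRD-1", "DE"),
    Registration("REG-2", "PRD-1", "FR"),
]

affected = registrations_affected_by("SUB-1", products, registrations)
```

Because the dependencies are recorded in the model itself, a change to one source record (say, a substance definition) can be traced to every downstream product and market registration it touches, which is exactly the kind of exploitation "to maximum potential" described above.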
Regulated product data has proven to be the ideal place to start with data hygiene, standardization, management and automation ambitions, because of the expanding and intensifying compliance demands already outlined.
Maintaining close, regular dialogue with national and regional regulatory agencies can help companies keep ahead of evolving requirements with their investments and project focus. But from an internal benefits perspective, there is every reason to expand the same capabilities to a much broader pool of knowledge and content — not just immediate operational insight, but also wider knowledge about markets and the competitive playing field.
Do IT Delivery Mechanisms Matter?
Although once the subject of frequent debate, whether cloud-based IT infrastructure and software will play a role in accelerating progress with data management ambitions matters less for now.
For all that cloud service propositions have matured and become more sophisticated, the option to connect and run systems via the cloud will not in itself drive an advanced RIM scenario. Unfortunately, there are no shortcuts to the more dynamic, data-driven world organizations need to be working towards. As one client has described it, the key to the step change in data utility that companies crave is being able to bottle good data at the source. Once every part of the business is drinking from the same cup, the benefits can flow — but not before.
When definitive, accurate records, documents and database entries from one end of an organization to the other can be readily accessed, processed, searched and relied upon as an accurate narrative and status marker for the business and the markets it operates in, then the scope for making better decisions, and driving greater efficiencies through targeted automation, will be unlimited.