Enhancing Analytical Productivity by Breaking Down Silos (Part 1)


From upstream to downstream labs

The pharmaceutical drug discovery and development process is often represented in a simplified graphic form, such as the following:

[Figure: simplified linear representation of the drug discovery and development pipeline]

Seen this way, the process appears to be a straightforward linear endeavor in which a molecule moves in one direction through a series of steps. In reality, it is not. A molecule will often advance downstream only for analytical issues to surface (for example, data artifacts or newly discovered impurities), sending it back upstream for further analysis and development before it resumes its journey.

Moreover, if you look at the number of drug candidates in play at each stage and overlay a timeline, the whole process is more accurately depicted as a funnel:

[Figure: drug development pipeline depicted as a funnel]

Source: Saurav Patyal, BenevolentAI, Digital Initiative, Harvard Business School

Consider how many drug candidates initially investigated upstream are eliminated at various stages along the development pathway, and that it takes on average 10 to 15 years for a molecule to complete the journey to an FDA-approved drug, and it is easy to understand why the average cost of developing a new drug today (including failures) falls in the $800M to $1B range.

Add to this regulatory requirements that are becoming ever more stringent (especially for large, increasingly complex biopharmaceuticals) and mounting pressure to accelerate time-to-market while reducing development and production costs (fueled by intense competition, a fast-emerging biosimilars market, and governments looking to contain healthcare costs), and it becomes clear that the biopharmaceutical industry needs to achieve much higher levels of productivity.

To that end, one area where significant productivity losses routinely occur today is the transfer of data and methods across labs and geographies, especially when moving from upstream labs into regulated downstream labs (late-stage development, manufacturing, and QC), where method and system validation, data integrity, and compliance are of critical importance.

As referenced in a recent Waters blog post, “What is Changing Analytical Method Transfer Today? Part 3: Methods Across Borders,” a scientist at a major global pharma company estimated that an analytical method could see as many as 100 transfers in its lifetime. What is less commonly understood are the many hidden complexities and risks of transferring data and methods, especially into and between regulated downstream laboratories. Instrument platforms and software often vary as you move down the drug discovery and development pipeline. As a result, methods must often be re-developed and re-validated, and historical data and the rationale for method adjustments can become trapped in analytical silos that are hard to access.

If not handled properly, compliance and data integrity issues are likely to be flagged by regulatory agencies, which in turn can result in significant delays, added work, and cost overruns.

In her recent article, “An Analysis of FDA FY2017 Drug GMP Warning Letters,” Barbara Unger of Unger Consulting Inc. reports that 2017 saw a doubling in the number of FDA warning letters issued to drug product manufacturing sites. The majority of these related to compliance and data integrity issues, including method and system validation, data acquisition and storage, and audit trails.

As the biopharma industry continues to expand and as an increasing number of products enter the market, including biosimilars and biobetters, the challenges of demonstrating analytical compliance and data integrity will only become more pronounced. To enhance development speed and productivity, companies must have a clear organization-wide strategy for breaking down analytical silos and meeting compliance and data integrity requirements.

At Waters, we are working to create a single, compliance-ready informatics framework in which all of our fit-for-purpose tools, as well as third-party tools, can operate: one that enables the smooth flow of data and information from upstream to downstream labs, from central to distributed labs, and across all geographies; one that maintains data integrity and compliance; and one that creates workflow efficiencies that can fuel significant gains in productivity.

In my next blog, I will discuss the challenges of working with outside partners and breaking down silos between central and distributed labs.

Visit waters.com/tamethechaos for more information.

Previous blog posts on biopharma data harmonization: