Andrew Anderson (AA), vice president of innovation and informatics strategy with ACD/Labs, spent some time talking with Outsourcing-Pharma (OSP) about the rapid evolution in drug development over the past year and shared his thoughts on trends the industry likely will encounter in 2021.
OSP: Before we get into the industry at large, could you please share some of the things that happened at ACD/Labs in the past year—feel free to mention new products/services, growth, new facilities and hires, etc.?
AA: The biggest development at ACD/Labs in 2020 was our transition to a completely virtual company. Due to COVID-19 travel restrictions, we had to transition our customer engagement from face-to-face and physical presence to completely virtual; thanks to modern technology and bandwidth, this was made possible, but it was certainly an interesting transition for our company, as it is for many companies around the world.
At the same time, we saw significant growth in 2020 in terms of our products and innovation. In August, we released our annual updates, v2020.1, which contained a wealth of enhancements to existing products as well as some core technology innovation that supports customers’ digital transformation interests and strategies.
Within digital transformation, many of our customers have been looking to replace paper-based, human-driven processes with automated laboratory experimentation. When capturing information around these automated experiments, customers move from a conceptual design to a physical design, ultimately culminating in an analysis that gleans insights from a single experiment or a set of experiments.
Being able to capture information about the experiment in a digital fashion, and without a significant amount of human effort (i.e., transcription of information from different systems to a single system), is the goal. We are helping our customers with the implementation of their digital transformation efforts; for example, Pfizer’s analytical R&D team has been using ACD/Labs tools to enhance digital workflows with automated data extraction and processing.
OSP: Could you please share an overview on how data collection and analysis has evolved in the pharma industry in recent years?
AA: Looking at the lessons learned from the days of traditional single-pot synthesis in small-molecule drug discovery, we are seeing increased innovation around high-throughput and parallel experimentation, and a resurgence in R&D investment. Rather than performing diversity-oriented synthesis, we see more directed parallel synthesis; this allows for a faster, more effective lead optimization cycle.
When going through the scientific experimentation lifecycle, we usually have to perform various unit operations that require different software applications to source data. Ultimately, we associate these data to confirm that the end product reflects the initial goal: a registered material that can be assayed.
In discovery, we also see consequences of supply chain heterogeneity and fracturing, and the effort to digitalize and establish traceability across the supply chain. While global supply chains have obvious benefits in reducing costs and allowing for more flexibility from a manufacturing perspective, there are also risks.
To that end, ensuring transparency across the supply chain, particularly as it pertains to quality, is a challenge. However, the ability to have digital information flow through that supply chain, albeit a fractured or geographically disparate one, is an important initiative in which every major pharmaceutical company is investing.
ACD/Labs’ Luminata and corresponding technologies have helped make the supply chain and associated quality data more transparent through improved, digitalized data sharing and traceability.
OSP: What are some of the biggest benefits associated with the data “explosion”?
AA: The more data an organization has, the more statistical significance and insights it can share. Overall, the data explosion allows for more informed decisions and reduces the degree of blind spots in decision making. One caveat is that with increasing volumes of data comes the demand for strategies to handle it, like an insights-gleaning interface powered by machine learning (ML).
OSP: Then, what are some of the challenges and obstacles associated with the ever-increasing volume (and increasing complexity) of data?
AA: One tension with the increasing volume of data is that the data destination is rarely on premises anymore. For many organizations, even their own "on-premises" systems are colocation facilities, so although the organization owns the hardware, the data still resides physically away from its source.
The beauty of modern instrumentation is that organizations can quickly collect a lot of data, but they are then limited if insights can only be drawn by presenting that data to a system that is physically off premises. In this case, organizations need a plan for submitting data to the off-premises system.
Another challenge that comes with handling vast amounts of data is that the data is often heterogeneous, coming from different sources, and therefore has to be normalized. In fact, about 80% of the effort in data science is data engineering (taking data from its source and normalizing it) so it can be structured alongside data from other sources; ML helps reduce the human effort around that data marshalling and data engineering.
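To make the data-engineering step concrete, here is a minimal sketch in Python. The instrument exports, column names, and units are invented examples (not an ACD/Labs or any real instrument format); the point is only to show what "normalizing heterogeneous data into a common schema" looks like in practice.

```python
# Hypothetical example: two instruments report the same measurements with
# different column names and units; normalization maps both onto one schema.
import pandas as pd

# Invented exports from two hypothetical HPLC systems.
hplc_a = pd.DataFrame({"sample": ["lot-1", "lot-2"],
                       "rt_min": [4.2, 4.3],
                       "area_pct": [99.1, 98.7]})
hplc_b = pd.DataFrame({"Sample ID": ["lot-3"],
                       "RT (s)": [258.0],
                       "Purity %": [99.4]})

def normalize_a(df: pd.DataFrame) -> pd.DataFrame:
    # Instrument A already reports minutes; only column names change.
    return df.rename(columns={"sample": "sample_id",
                              "rt_min": "retention_min",
                              "area_pct": "purity_pct"})

def normalize_b(df: pd.DataFrame) -> pd.DataFrame:
    # Instrument B reports seconds; convert to minutes and rename columns.
    out = df.rename(columns={"Sample ID": "sample_id",
                             "RT (s)": "retention_min",
                             "Purity %": "purity_pct"})
    out["retention_min"] = out["retention_min"] / 60.0
    return out

# One homogeneous table that downstream analysis (or ML) can consume.
combined = pd.concat([normalize_a(hplc_a), normalize_b(hplc_b)],
                     ignore_index=True)
print(combined)
```

Most of the real-world effort lives in writing and maintaining the per-source `normalize_*` functions, which is exactly the "80% data engineering" burden described above.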
At ACD/Labs, we have been working with clients on data marshalling and normalization efforts, looking for ways to present data to users when they need it. ACD/Labs has been at the forefront of standardizing analytical data for more than two decades.
Beyond data standardization, we encourage clients to think about the use cases of data—how do they plan to use it? For what purpose? Do they need it immediately? Are they using it to build a larger data set that they will ultimately glean insights from? These types of questions help to determine client organizations’ enterprise informatics infrastructure.
OSP: Can you share some of the missteps or shortcomings common to drug development teams, and how these add to their challenges?
AA: I think the biggest tension in drug development is that teams are ultimately supporting product development headed into the pharmaceutical industry, which we know is a highly regulated market. With that in mind, it's important for drug development teams to look at interactions with regulatory authorities and how those authorities ultimately authorize products for introduction into that market.
Looking at the interaction between a pharmaceutical company and a health authority, we see a lot of documents. So, the traditional paradigm is document-driven submissions, where drug development teams must first conduct document-driven decision-making before submitting to health authorities.
For pharmaceutical companies with well-founded quality assurance (QA) policies, standard operating procedures, and conformance to GMP or CGMP, much of their researchers' and developers' work culminates in constructing the document necessary for that work to be reviewed and approved by the appropriate signatories.
It is important for companies to first determine what regulatory and QA practices require in terms of data formats. The shortcoming is that development teams think about what's immediately necessary rather than future-proofing through more extensive digitalization for a future in which, I believe, electronic submissions will become the norm.
Looking ahead, digitalization of the pharmaceutical development process would increase efficiency; however, the challenge lies in the process behind this digitalization. Once a drug development team has conducted its R&D and commissioned a CMO or CDMO, for example, to make a material necessary to the supply chain, the lack of digital integration between nodes in the supply chain presents a continuing challenge for these digital strategies.
Most CMO/CDMO-driven manufacturing processes supporting clinical trials are operating with a fractured supply chain—the nodes within the network don't interact digitally. Pharmaceutical companies are recognizing this challenge. ACD/Labs has been working with these pharmaceutical companies to help address and overcome this challenge of the continuing lack of integration and interoperability between data in a set of nodes in the supply chain.
OSP: What are some ways pharma companies and their drug development partners have worked to improve data collection, sharing and analysis?
AA: In my opinion, there are two things that are important for pharmaceutical companies to either have done or currently be doing. First is ensuring material quality. The process for a sponsor company to determine that a material produced by a CMO/CDMO is suitable for use is traditionally a document-driven, human-reviewed paradigm.
The sponsor company receives a material, a batch record which describes the manufacturing of that material, and a certificate of analysis which confirms the quality is of an acceptable level and is appropriate for its intended usage. Companies have implemented digital capability to allow for comparative assessment of “entities” across the product lifecycle.
While it's easy for a human to review one set of data for one material, without such digital capabilities the process stalls when a comparative analysis is required, particularly for distributed supply chains in which the sponsor company must review materials with the same identity and intended usage from different CMOs/CDMOs to determine how the materials differ from a compositional perspective.
To make matters more difficult, the summary of a material's composition on a per-lot basis could be in different formats. From there, companies are tasked with normalizing the data prior to analyzing it.
Additionally, pharmaceutical companies are digitally connecting the characterization and compositional data obtained from a CMO/CDMO. Companies are now innovating to establish a way to get digital representation to allow for comparative analysis across CMOs/CDMOs. Benefits of this include clear insights into the level of variation in terms of composition for materials of the same identity; and therefore, the ability to really judge CMO/CDMO performance for cost-benefit and ROI purposes.
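As a simplified illustration of that comparative analysis, the sketch below uses invented lot data (no real certificate-of-analysis format or real CMO results) to show how, once results are normalized, cross-supplier variation for the same material can be summarized directly.

```python
# Hypothetical example: comparing normalized purity results for the same
# material produced by two CMOs. All data and field names are invented.
import statistics

lots = [
    {"cmo": "CMO-A", "lot": "A-001", "purity_pct": 99.2},
    {"cmo": "CMO-A", "lot": "A-002", "purity_pct": 99.0},
    {"cmo": "CMO-B", "lot": "B-001", "purity_pct": 98.4},
    {"cmo": "CMO-B", "lot": "B-002", "purity_pct": 98.9},
]

def summarize_by_cmo(records):
    """Group lots by CMO and report mean purity and lot-to-lot spread."""
    by_cmo = {}
    for r in records:
        by_cmo.setdefault(r["cmo"], []).append(r["purity_pct"])
    return {
        cmo: {"mean": statistics.mean(vals), "stdev": statistics.pstdev(vals)}
        for cmo, vals in by_cmo.items()
    }

summary = summarize_by_cmo(lots)
print(summary)
```

A higher lot-to-lot spread for one supplier is the kind of compositional-variation signal the text describes feeding into CMO/CDMO cost-benefit and ROI judgments.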
OSP: What advice would you give to drug development professionals looking to optimize their data practices in the year ahead, and to be prepared for what the future holds?
AA: I would advise drug development professionals looking to optimize data practices in 2021 to establish a digital strategy for analytical and quality control laboratories that approaches, or at least aims toward, a fully digital decision-support environment. From what we have learned in 2020 from the global COVID-19 pandemic, with its race to get therapeutics and vaccines approved as quickly as possible, I think those in the pharmaceutical industry should establish a vision for how fast a clinical development paradigm can be.
In trying to anticipate the next pandemic or the next global health issue, how can we become even more efficient than we are today? We often talk about the digital transformation in terms of a business benefit, but I think what we're realizing is that it is not only beneficial for business, but also for the greater society, to be able to introduce treatments faster with digital support.
Moreover, by virtue of going digital leading to increased statistically significant insights and greater data integrity, organizations can make more informed decisions more efficiently—and with confidence. Where the themes of 2020 align is in innovation for overall societal benefits.