Breaking barriers: from molecule to market, can data integration revolutionize healthcare?

By Liza Laws


David Menzies is the executive director of technology solutions at ICON, a company providing outsourced development and commercialisation services to pharmaceutical, biotechnology, medical device, and government and public health organisations.

In an interview with OSP, he spoke about data integration and overcoming challenges to optimise clinical trial delivery.

OSP: How has the use of data evolved in clinical trials and what are the challenges that the industry is facing in this area?

Data is the lifeblood of the pharmaceutical industry. Data integration from a myriad of sources supports many important areas of research – from trial site identification, patient identification and eligibility, and protocol development, to patient support services and health economics and outcomes research. As the industry embraces the virtues of Big Data, we have experienced an explosion of healthcare data from a multitude of specialised sources. The primary challenge is no longer generating data but integrating it in ways that maintain its integrity to produce vital insights for the wide range of use cases now available. Overcoming this challenge requires a robust and flexible data integration framework.

OSP: What data integration strategies can be used to overcome data challenges and optimise clinical trial delivery?

Now more than ever, it is important to ensure your data integration framework can deliver on your evolving needs. While conceptually simple, an influx of solution providers has led to a litany of false starts and wasted effort, which translate into sunk costs and programme delays. There are more data sources than ever, and this increasing disaggregation requires a high degree of specialisation to integrate, manage and link them appropriately.

Pharmaceutical companies are keen to expand their capacity to apply data across more use cases, from patient journey analytics to site identification, patient identification, predictive modelling and brand strategy. However, this success hinges on an agile integration framework that can pivot to accommodate a continually evolving use case landscape.

Successful data integration strategies must be flexible and agile to respond to the various needs, changing regulations and potential applications. When devising a strategy, there are four high-level concepts to consider for a holistic approach:

Data disaggregation – Data suppliers are fragmenting the marketplace. Gone are the days of acquiring all data from a single data aggregator. Life science companies now must acquire the same type of data from multiple sources and integrate them into a consolidated view.

New, novel data types – New datasets are constantly emerging which can offer a more powerful indicator of patient health, allowing measurement beyond traditional metrics. Can these datasets and data types be easily incorporated under this strategy?

Fluid brand priorities – The use cases for data have proliferated in recent years, and one can only expect these novel applications to continue to develop. Brand priorities and strategy will shift, too. Data integration frameworks should be adaptable enough to accommodate these changes; if not, sponsors could find themselves boxed into a corner.

Privacy and compliance – A patient 360° assessment should not come at the cost of privacy. Managing clean and compliant data will require a clear strategy and an investment of time and resources.

OSP: What are the real-world data integration challenges that sponsors need to be aware of and how can they overcome them?

After considering the overarching concepts for a successful approach, sponsors can turn their attention to the specific challenges they are likely to face when integrating real-world data. The first is data access and storage constraints, and how these stipulations may affect their strategy. Providers will have different requirements and permissions for data storage and access, which complicate the integration and ultimate use of the data. As the number of data providers increases, these limitations pose greater challenges.

Multiple data sources introduce patient consent considerations, so it is important to ensure that the data gathered carries the appropriate consents for the intended use cases. Otherwise, patients may need to re-consent, or a waiver of consent may be required. In the best case, these additional consent measures impact timelines; in the worst case, they prevent the data from being used at all.

The relative completeness and collection lag of the data itself can also be challenging. Contractual restrictions or other factors can create information gaps, and update frequencies will differ across suppliers and data types. These incongruities complicate data blending and affect use case applications, so it is important to curate the data with the values that will be most useful and most easily integrated. Deciding between contracting directly with specialised data providers and leveraging a single aggregator will also be shaped by these challenges.
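As a toy illustration of why differing collection lags complicate blending (a minimal sketch; the "complete through" metadata and the conservative cut-off rule are assumptions, not a described ICON practice): when sources refresh at different cadences, one common convention is to restrict blended analyses to the earliest date through which every source is complete.

```python
from datetime import date

# Hypothetical refresh metadata for two data suppliers (assumed values).
sources = {
    "claims": {"complete_through": date(2023, 3, 31)},     # quarterly lag
    "wearables": {"complete_through": date(2023, 6, 15)},  # near real time
}

# Blend conservatively: only analyse events up to the earliest complete date,
# otherwise the blended view silently under-counts the lagging source.
blend_cutoff = min(meta["complete_through"] for meta in sources.values())
print(f"Restrict blended analyses to events on or before {blend_cutoff}")
```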

OSP: What considerations should sponsors have in mind when building the foundation of their data integration strategy?

Not all data tokenisation and integration schemes are created equal, so it's important to be intentional about your needs in order to identify the provider and framework best suited to your strategy. At a foundational level, rich and robust data requires three key tiers: a tokenisation engine, a patient master, and patient and consumer data assets.

The tokenisation engine is the core of the strategy. It turns personally identifiable information (PII) into encrypted tokens. Token quality is directly related to the accuracy and completeness of the original data, so it is imperative that robust privacy and validation processes are in place from the first step. A stringent onboarding privacy review ensures compliance, integrity and anonymity from the outset, avoiding false starts and backtracking.
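To make the idea concrete, here is a minimal sketch of the kind of transformation a tokenisation engine performs. It is illustrative only: the normalisation rules, the keyed-hash approach and the field choices are assumptions, not ICON's implementation, and a production system would manage keys in a secure vault and apply far richer matching logic.

```python
import hmac
import hashlib

# Secret key held by the tokenisation engine; in practice this would live in
# a key vault or hardware security module (assumption for this sketch).
SECRET_KEY = b"replace-with-securely-managed-key"

def normalise(first_name: str, last_name: str, dob: str, postcode: str) -> str:
    """Normalise PII fields so the same patient yields the same token
    despite minor formatting differences (simplified assumption)."""
    return "|".join(
        part.strip().lower().replace(" ", "")
        for part in (first_name, last_name, dob, postcode)
    )

def tokenise(first_name: str, last_name: str, dob: str, postcode: str) -> str:
    """Turn identifying fields into an irreversible, keyed token."""
    message = normalise(first_name, last_name, dob, postcode).encode("utf-8")
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

# The same patient supplied by two providers with slightly different
# formatting resolves to the same token.
print(tokenise("Anna", "Smith ", "1980-02-14", "BT1 1AA"))
print(tokenise("anna", "smith", "1980-02-14", "bt1 1aa"))
```

The sketch also shows why token quality depends on the source data: if a provider supplies an incomplete or misspelled field, the normalised input changes and the same patient no longer resolves to the same token.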

The patient master is the centre of data management and a critical component of an agile data integration strategy. It is what validates the tokens and enables a single, consistent patient identifier, which is essential for reliably linking the same patient across data sources. With a patient master, patient data can be linked end to end, from molecule to market.
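A minimal sketch of the single-identifier idea, assuming tokens arrive from multiple sources (the class, field names and record shapes below are hypothetical, not a description of any specific vendor system):

```python
import uuid

class PatientMaster:
    """Toy patient master: maps source-specific tokens to one master ID."""

    def __init__(self):
        self._token_to_master: dict[str, str] = {}

    def resolve(self, token: str) -> str:
        """Return the master patient ID for a token, minting one if new."""
        if token not in self._token_to_master:
            self._token_to_master[token] = str(uuid.uuid4())
        return self._token_to_master[token]

    def link(self, records: list[dict]) -> list[dict]:
        """Attach a consistent master_id to records from any data source."""
        return [
            {**record, "master_id": self.resolve(record["token"])}
            for record in records
        ]

master = PatientMaster()
claims = [{"token": "a1b2", "source": "claims", "event": "Rx fill"}]
wearable = [{"token": "a1b2", "source": "wearable", "event": "step count"}]

# Both sources resolve to the same master_id, so the patient can be
# followed end to end across datasets.
print(master.link(claims))
print(master.link(wearable))
```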

An ideal partner would own or have access to curated patient and consumer data assets that enable the clearest and most holistic picture of the patient healthcare experience. Data from claims, patient services, wearables, digital therapeutics, diagnostics, websites, audience activations and social determinants of health are all important pieces of the puzzle that, when accurately linked, can provide the deepest insights for the widest set of use cases.

OSP: What are your thoughts on the future of clinical trial data and application of data insights?

The data landscape is rapidly changing. As we find novel ways of generating and applying data insights, we continue to optimise clinical research efficiencies and create more meaningful outcomes for patients. However, the challenges will also continue to evolve alongside the innovation. Sponsors need a robust, formalised data integration framework with the flexibility to adapt to this dynamic environment and carry their carefully curated data across the development continuum – from early phase through commercialisation. This framework is key to satisfying the diverse needs of the various business stakeholders and is foundational to a pharma company’s pathway to a digital enterprise. 
