Getting the right tech tools vital to decentralized trial transition

By Jenni Spinner


(elenabs/iStock via Getty Images Plus)


A representative from Cmed Technology explains how taking stock of existing tech and adding the right new tools can help when switching to virtual clinical studies.

Outsourcing-Pharma spoke with Mark Tomlinson, senior director of technology operations with Cmed Technology, about what a trial team needs before planning to launch decentralized studies, and what tech tools can help make the transition easier.

OSP: Could you please share your thoughts on all the considerations trial teams have to weigh when planning to use decentralized features in a study?

MT: When planning to use decentralized clinical trials (DCTs), teams need to first evaluate their existing tools and vendors in relation to this new paradigm of patient centricity. Internal tools need the capacity to support more virtual and remote activities, and any vendors need to be able to demonstrate experience of supporting decentralized study processes and expectations, with appropriate software and processes.

With normal studies, the range of data sources is manageable and easily controlled through current processes. With decentralized trials, however, there are more data types and more volume, and most, if not all, of the data is eSource (e.g., DDC, ePRO, eCOA, central labs), both structured and unstructured, so strong data provenance is vital. DCT tools need to include this provenance as an inherent aspect of their architecture, both to ease set-up and to avoid even more manual reconciliation and clinical management during the study than normal.
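
As a rough illustration of what "provenance as an inherent aspect of the architecture" can look like, the sketch below attaches origin metadata to every eSource data point at the moment of capture, so it never has to be reconciled back in later. The field names and structures are hypothetical, not Cmed's implementation.

```python
# Illustrative sketch only: hypothetical structures showing provenance
# captured alongside each eSource data point. Not Cmed/Encapsia code.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Provenance:
    source: str           # e.g. "ePRO", "DDC", "central_lab", "wearable"
    device_id: str        # instrument or app that produced the value
    captured_by: str      # subject, nurse, or site-user identifier
    captured_at: datetime

@dataclass
class DataPoint:
    subject_id: str
    field_name: str       # e.g. "heart_rate"
    value: object
    provenance: Provenance

# A wearable reading arrives already stamped with where it came from,
# so downstream review and analysis need no manual reconciliation.
reading = DataPoint(
    subject_id="SUBJ-0042",
    field_name="heart_rate",
    value=72,
    provenance=Provenance(
        source="wearable",
        device_id="watch-1234",
        captured_by="SUBJ-0042",
        captured_at=datetime.now(timezone.utc),
    ),
)
```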

There are a lot of solutions offered by the industry, but many are a collection of loosely linked point solutions, often needing parallel build processes for each solution. Although these can collect and share the data, they are fundamentally inefficient and create additional work for study teams, especially with regards to protocol updates. Small changes in one area can have significant knock-on effects across other areas, all of which need high levels of awareness, control, and management.

To run DCTs, teams ideally need a single platform that encompasses the collection, review, management, analysis, and visualization of the trial data, however disparate that may be, using a single build process. This ensures consistency across the data sources and inherent provenance, as well as efficient management of any protocol changes. This single platform needs to offer the various teams involved a holistic, real-time view of all the data in that trial, with visibility of trends across patients, sites, and possibly trials.

With this new trial paradigm and associated tooling come new opportunities for efficiency and cost savings. Processes must be carefully updated, and teams aligned to make the most of what the technology can offer, to minimize resourcing. Keeping existing processes whilst switching out the technology is not sufficient and may even be detrimental to the trial. 

OSP: What kinds of technological tools might come into play? Feel free to talk about wearables, devices home-health nurses might use to take vitals, patient-facing apps, etc.

MT: DCTs inherently require more mobile solutions and, as such, can create multiple interfaces with the subject. Increasing use of telephone or video visits, wearable technologies, mobile apps on the subject’s phone, and even home delivery of an investigational product is now possible. Together these shift the balance towards the patient, transforming their experience. A trial that fits around the subject’s life - fewer site visits, for instance - can increase patient commitment, data quality, and trial compliance, as well as improve patient retention overall.

New technologies such as wearables would all need to be qualified for capture of clinical-grade endpoints, whilst providing a modern, easy-to-use, intuitive user interface for the subject. Where these patient-facing tools are not sufficient on their own, such as for clinician-led assessments, tooling needs to offer a way to capture data efficiently and directly in the subject’s home.

With increasing subject data, a key area of consideration would be to ensure data privacy is appropriately managed, especially where trials are multinational - including the associated user management and security controls.

OSP: Can you share some of the challenges typically associated with technology in DCTs and things sites/sponsors might miss the mark on?

MT: A key challenge will be providing tools that the average man in the street wants to use. Most clinical trial systems used now are built on older foundations that are designed for ‘trained’ users in a fixed context.

However, the digital/virtual world of today is anything but a fixed context, and I see a huge gap between the flexibility and user compatibility of consumer devices and current clinical systems. Moving to DCTs needs a system with the same level of user-friendliness and intuitive operation that we’ve all come to expect from our mobile apps. Ease of use is critical to ensuring high-quality data, patient retention, and satisfaction.

Looking at this from another angle, DCTs will include multiple data sources as well as unstructured data, so a challenge will be updating existing technology to adequately support the associated processes, such as aggregation of the disparate data and structural processing of unstructured data before assimilation. On top of this, each data source will necessitate additional effort and time.

Once the study is live, teams need to decide how the data will be monitored, the quality maintained, and the patients supported. Assuming the existing processes will all translate to the new range of data sources can create frustration and increase the team’s workload without adding value. Sponsors need to consider that what was needed to manage transcribed data is different from what is needed for site-entered DDC, and could be different again for direct-from-patient data.

To truly enable DCTs, we need new thinking; an obvious area is the automation of data aggregation to reduce the effort needed and decrease lag time between data capture and analysis whilst increasing quality. The tech needs to be flexible to accommodate new input sources, whilst maintaining an ability to provide immediacy of data visualization across all the data, patients, sites, and even across the trial. In combination with conduct activities, this new world needs multi-modal data monitoring as well as performance feedback to further hone the new processes.

Ultimately, increasing data quality, improving patient safety, facilitating rapid data aggregation, and the subsequent improvement in decision making is what we’ve all been looking for.

OSP: Could you tell us about Encapsia—what it is, how it works, and what challenges associated with DCTs it seeks to address?

MT: Encapsia is simply the most innovative, powerful, and holistic data capture platform available. It brings a complete solution to collect and manage trial data, with EDC, eSource, and home visit data capture options, supported by a host of other modalities to manage medical coding, data review, third-party data as well as self-service study extracts and archiving.

Our fully featured API enables two-way integrations with other systems, allowing immediate aggregation and visualization and streamlining site and review team interactions. This is particularly relevant to DCTs, where there can be a range of data sources, often including unstructured data.
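
To make the integration idea concrete, here is a minimal sketch of the two-way pattern described above: pull records from a third-party source (for example an ePRO vendor) and push them to a central trial data platform for aggregation. The endpoints, parameters, and field names are invented for illustration; this is not the Encapsia API.

```python
# Hypothetical sketch of a pull-then-push integration between a third-party
# data source and a central trial data platform. Invented endpoints only.
import requests

SOURCE_URL = "https://example-epro-vendor.test/api/v1/observations"
PLATFORM_URL = "https://example-platform.test/api/v1/trial-data"
TOKEN = "replace-with-a-real-token"

def sync_epro_to_platform(study_id: str) -> int:
    """Fetch new ePRO observations and forward them for aggregation."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    observations = requests.get(
        SOURCE_URL, params={"study": study_id}, headers=headers, timeout=30
    ).json()

    pushed = 0
    for obs in observations:
        resp = requests.post(
            PLATFORM_URL,
            json={"study": study_id, "record": obs},
            headers=headers,
            timeout=30,
        )
        resp.raise_for_status()
        pushed += 1
    return pushed
```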

Encapsia recognizes the need to access your data on your own terms with no barriers, so all data is accessible via self-service download or direct connection so client teams can get what they want, when they want it, 24/7.

As all the data in Encapsia is available live, it empowers faster, better-informed decisions throughout the trial. For trial teams, there’s no downtime for trial updates, and our industry-leading point-of-entry checks ensure cleaner data on entry, enhancing speed and cost savings.
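
The value of point-of-entry checks is that an implausible value is queried the moment it is typed, rather than weeks later during batch data cleaning. The sketch below shows the general idea with a hypothetical range rule; it is not Encapsia's actual check logic.

```python
# Illustrative point-of-entry edit check (hypothetical rule and range),
# showing a value being validated at the moment of entry.
def check_systolic_bp(value) -> list[str]:
    issues = []
    try:
        v = float(value)
    except (TypeError, ValueError):
        return ["Systolic BP must be a number."]
    if not 60 <= v <= 250:
        issues.append("Systolic BP outside plausible range (60-250 mmHg).")
    return issues

# At entry time the form can refuse or query the value immediately.
problems = check_systolic_bp("420")
if problems:
    print("Query raised at entry:", problems)
```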

Encapsia operates in a live, cloud-hosted repository purposely built to leapfrog the current legacy systems and their limitations - supporting where trials are going, not where they were yesterday. Encapsia enables teams to interact with live, up-to-the-minute data, wherever they are, increasing efficiency, reducing the burdens on the site and patient whilst improving the quality of investigator-patient relationships. Flexibility is built in; it’s future-proof out of the box.

OSP: Please tell us about the Home Visit app, and how it works.

MT: Encapsia Home Visit lets you conduct a trial visit at the patient’s home, instead of getting the patient to arrange travel to the site. Study nurses capture data as eSource directly on an iPad, with instant validation of the entered data, significantly increasing data quality.

Like the iPad itself, the app is fully mobile and works seamlessly online and offline, connecting to known wi-fi and registering all data with the server automatically, with multimedia options such as image capture or dictated notes. Home Visit also supports a ‘patient mode’ where the patient can enter their own data, for instance, a quality-of-life assessment.
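
An offline-capable capture app of this kind is typically built around a local queue: records entered during a home visit are stored on the device and pushed to the server automatically once connectivity returns. The sketch below shows that pattern in outline; it is a hypothetical illustration, not the Home Visit implementation.

```python
# Hypothetical offline-first capture queue: visit data is stored locally
# and flushed to the server when a network connection is available again.
import json
import sqlite3

class OfflineQueue:
    def __init__(self, path="pending_visits.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS pending (id INTEGER PRIMARY KEY, payload TEXT)"
        )

    def capture(self, record: dict) -> None:
        """Save a visit record locally, whether or not we are online."""
        self.db.execute(
            "INSERT INTO pending (payload) VALUES (?)", (json.dumps(record),)
        )
        self.db.commit()

    def flush(self, upload) -> int:
        """Send queued records via the given upload callable, then clear them."""
        rows = self.db.execute("SELECT id, payload FROM pending").fetchall()
        sent = 0
        for row_id, payload in rows:
            upload(json.loads(payload))  # e.g. an HTTP POST to the trial server
            self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
            sent += 1
        self.db.commit()
        return sent
```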

Encapsia Home Visit can hold any number of studies and sites, so each nurse needs only one iPad to access all their active studies.

OSP: Because not every patient is tech-savvy, patient-facing technology tools can be challenging to design and implement. Can you share some strategies, and things to consider when using such elements in a DCT?

MT: Tools need to be easy to use for anyone - whether eight or 80. There are significant differences between a UI for eight-year-olds and one for 80-year-olds - in terms of physicality, visual acuity, and recognition abilities, to name but a few.

DCT tooling needs to be flexible enough to accommodate the differences in subject populations and ensure quality data in all cases. We need transformative tools that are intuitive and consumer-friendly, with a clean and clear user interface.

Technology choices must support the appropriate engagement of the patient, whilst also supporting the needs and perspectives of the clinical, review, and analytic teams that use that collected data. Any new digital technologies must incorporate seamlessly into the team workflows to avoid too much disruption in the transition from where we are today into this new world.

OSP: Do you have anything to add?

MT: DCTs come in so many different variations that it’s vital for sponsors to change the mindset from “how can trials be made to fit the tools we have?” to “we need to find the tools that fit our trial.” The standard technology used pre-pandemic is no longer fit for purpose; the industry must embrace agile, nimble technologies built specifically to support virtual and decentralized trials.

To do data differently, change must be embraced, working with partners who can provide that revolutionary technology, as well as the technical consultancy and expertise to support that shared vision.

Right now, it’s probably riskier to stay fixed to the old way of doing things than to move on to the new options provided by new technology. Are you ready to step forward? It’s time to transform your trial today.
