Rapid increase of protocol complexity contributing to clinical trial delays, inefficiencies: Tufts report

By Melissa Fassbender


(Image: Getty/alphaspirit)


The industry needs “to strike a balance” between science and feasible execution, as increasingly complex clinical trial protocols are impeding efficiency and driving up costs, says an industry expert.

Ken Getz, associate professor and director of sponsored research at the Tufts Center for the Study of Drug Development, recently led an analysis of 9,737 protocols from 178 global pharmaceutical and biotechnology companies. The results were published in the July/August Tufts CSDD Impact Report.

“Despite high industry awareness [of] the adverse impact that protocol complexity has on clinical trial performance, we were surprised to observe rapid growth in all executional elements of the protocol across all phases of development,” Getz told us.

According to the report, drug makers doubled the number of countries and increased the number of investigative sites by 63% to support Phase III protocols between 2001-05 and 2011-15. Meanwhile, the mean number of patients per trial declined 18%.

“The volume and diversity of data being collected is contributing to high levels of delays and inefficiencies,” Getz explained.

The study also identified new areas affecting performance, including long investigative site initiation timelines and long data management cycle times, such as the time needed to build and lock study databases.

“The complexity of the scientific and executional elements of our protocols are expected to continue to rise,” Getz said, as studies increasingly target rare diseases and specific patient subpopulations, and rely on genetic data.

By the numbers

  • Based on numbers of distinct and total procedures, Phase I and II clinical trials are the most complex
  • Phase III trials have seen the highest increase in complexity during the past 10 years
  • The total number of endpoints rose 86% between 2001-05 and 2011-15
  • Procedures supporting these endpoints contributed “a much higher proportion” of data informing secondary, supplementary, tertiary, and exploratory endpoints

Getz explained that the industry should aim to optimize protocol design “to strike a balance between the science and feasible execution of the protocol.”

This could be achieved by removing select procedures that do not support primary or key secondary endpoints. Getz also suggested leveraging protocol authoring templates and modifying protocol authoring approaches to reduce avoidable amendments, as well as “taking more time” to coordinate data collection from external parties.

Additionally, Getz noted that spending more time upfront assessing and adjusting protocol design based on patient and professional input could improve participation and lower administrative burden.
