Pharma increasingly adopts computational tools

By Wai Lang Chu



The increasing adoption of computational biology tools in drug discovery reflects the industry's attempt to compensate for shrinking product pipelines, while also shortening the drug discovery process, particularly in toxicology and drug efficacy studies.

The advent of high-throughput screening (HTS) and ultra-high-throughput screening (uHTS) has generated a huge number of drug candidates, increasing the need for computational tools that can investigate ADME/Tox properties at a very early stage and so help decide which candidates should be pushed into clinical trials.

Frost and Sullivan's latest report details the increase in royalty and milestone payment agreements, which is strengthening strategic partnerships between computational biology tools vendors and drug discovery companies. This, in turn, is nurturing the faster adoption of these tools in drug discovery.

However, the report was quick to point out that adoption of these tools is still at an early stage. Pharmaceutical companies that invested heavily in computational tools after the Human Genome Project have yet to see tangible returns, so a natural scepticism about their efficacy persists.

"Use of computational biology tools eliminates false leads at the early stages of drug discovery," said Raghavendra Chitta, Frost & Sullivan's industry analyst.

"This helps cut down costs since the later stages are more expensive and time-consuming."

The FDA's interest in in-silico biology (model-based drug development) as a breakthrough in drug development knowledge management and decision-making has also bolstered the field's reputation.

As a testament to its abilities, the FDA's own scientists are using it, and are collaborating with others to refine quantitative clinical trial modelling, using simulation software to improve trial design and predict outcomes.

Computational biology companies are finding that they have to quantify their productivity increments through wet lab experiments to substantiate claims. What is needed is for computational biology companies to generate success stories by working on in-house compounds and taking them to their commercial phase.

The report also noted that increased uptake of computational biology tools will require qualified software developers trained in biology, chemistry, and the specific modelling and simulation methods needed to interpret data and improve the research process.

Companies will also have to deal with technical inertia among biologists, who find it difficult to represent complex biological systems as series of differential equations and prefer traditional methods instead.
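To make the modelling approach concrete: representing a biological process as differential equations typically means writing rate equations for each species and integrating them over time. The following is a minimal illustrative sketch only, assuming a hypothetical two-species model (mRNA synthesis/decay driving protein synthesis/decay) with made-up rate constants; it is not drawn from the report.

```python
# Illustrative sketch: a simple gene-expression process modelled as two ODEs,
#   dm/dt = k_m - d_m * m      (mRNA: constant synthesis, first-order decay)
#   dp/dt = k_p * m - d_p * p  (protein: synthesis from mRNA, first-order decay)
# integrated with a basic forward-Euler scheme. All parameters are hypothetical.

def simulate(k_m=2.0, d_m=0.5, k_p=1.0, d_p=0.1, dt=0.01, t_end=50.0):
    """Return (mRNA, protein) levels at t_end, starting from zero."""
    m, p = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dm = k_m - d_m * m
        dp = k_p * m - d_p * p
        m += dm * dt
        p += dp * dt
    return m, p

m, p = simulate()
# Analytically, the steady state is m* = k_m/d_m = 4 and p* = k_p*m*/d_p = 40;
# by t = 50 the simulation should be close to those values.
print(m, p)
```

Even this toy example hints at the inertia the report describes: a biologist must translate qualitative knowledge of a pathway into explicit rate laws and parameters before any simulation can run.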

"Computational biology works by integrating data from various sources to model a biological process," commented Chitta. "Although genomics has generated a huge deluge of information, it has also created a new problem of varying data formats incompatible with each other."

The increasing transfer of knowledge from the academic to commercial sector and the drive toward data standardization through the systems approach are likely to solve these challenges.

After a series of consolidations, these companies are looking for a single large technological platform that can satisfy a multitude of their research needs. Computational biology companies need to position themselves to meet these requirements in order to exploit this opportunity.

Findings of the latest Frost and Sullivan report, "World Computational Biology Markets," revealed that revenues in these markets totalled $60 million (€49 million) in 2004 and are projected to reach $751.8 million in 2011.
