With the brain being such a complex component of the human anatomy, it is perhaps unsurprising that diagnosing and treating the conditions that affect it is also incredibly complicated. Outsourcing-Pharma recently spoke with Jake Donoghue, CEO and co-founder of Beacon Biosignals, about how cracking open electroencephalogram (EEG) data with the use of machine learning (ML) can accelerate the development of precision medicine for neurological diseases.
OSP: Could you please tell us the ‘elevator presentation’ description of Beacon Biosignals—who you are, what you do, and what sets you apart from other companies operating in this same space?
JD: Beacon's machine learning platform for EEG enables and accelerates new treatments that transform the lives of patients with neurological, psychiatric, or sleep disorders. Novel machine learning algorithms, large clinical datasets, and advances in software engineering allow Beacon Biosignals to empower biopharma companies with unparalleled tools for efficacy monitoring, patient stratification, and clinical trial endpoints from brain data. Beacon’s platform gives insights into brain disease and captures drug effects in ways that have previously been impossible. Our approach and specific technology set us apart.
The other big differentiator, frankly, is Beacon’s unparalleled concentration of expertise. The number of M.D.s and Ph.D.s on the team, the years in clinical practice, the number of patients seen, and the breadth of experience applying targeted machine learning to biosignals is a combination that is rare in a startup.
OSP: Please tell us a bit about some of the historical challenges with the development of diagnostics and treatment regarding neurological conditions.
JD: Our brains are incredibly complex. What’s more, the tools used to diagnose and assess them are traditionally non-quantitative. It’s also often hard to get consensus among expert clinicians when they interpret the data that is available, making progress and treatment efficacy difficult to measure and define.
The heterogeneous manifestation of neurological disorders makes the process of getting to a diagnosis and then finding the most effective and appropriate treatment arduous and imprecise, especially if you aren’t using the right tools. Consequently, research into brain and neurological disorders is harder to replicate, which makes it more difficult to advance through Phase II and Phase III clinical trials. Layer on top of that the inherent challenge of actually reaching targets in the brain through drug delivery and the development of precision therapies, and you get a sense of this problem’s magnitude. Beacon’s work harnessing tools for neurology and psychiatry that are already applied in digital pathology and radiology aims to address these obstacles.
OSP: Could you please talk about the rise in the development of precision medicines for brain conditions—what are some of the technologies and practices that have made this progress possible?
JD: Oncology is a key precedent for the evolution of precision medicine. Starting around the time imatinib was approved for chronic myeloid leukemia (CML) in 2001 and the completion of the Human Genome Project in 2003, the past two decades in oncology have been characterized by a dramatic shift to therapies developed based on a deep molecular understanding of tumor biology. An explosion of targeted small-molecule therapies has more recently been accompanied by new approaches to unlocking the specificity and anti-tumor activity of the immune system through technologies such as immune checkpoint inhibitors and engineered cell therapies.
In neurology, we’re now taking the lessons learned and the biotechnologies from oncology and applying them to many of the most difficult-to-treat diseases of the central nervous system. Through increasingly widespread genetic sequencing we are improving our understanding of monogenic rare diseases such as pediatric epilepsy syndromes as well as polygenic risk for late-onset neurodegenerative and psychiatric disease.
We are also seeing significant advances in precision therapeutic technologies, such as antisense oligonucleotides that upregulate or downregulate specific disease-associated gene products. Gene-editing technology is also positioned to enable durable, cell-type-specific genetic interventions without interfering with other critical channel sub-types or relevant neural circuits.
OSP: How has the mountain of EEG data generated by these advances been put to use (if at all) until now, and how do you view this EEG data being put to use in brain/CNS diagnostics and therapeutics?
JD: With this rapid evolution in molecular technologies, a new gap has emerged: we now need to better understand neurophysiology as the missing link between changes at the molecular level and clinical outcomes that we care about as physicians.
First, real-time analysis of neural circuits, up to the level of the whole brain as an organ, will be critical in understanding how molecular mechanisms of disease play out against a complex array of developmental exposures and external influences. Precision technologies to interrogate this kind of neurophysiology will be critical in stratifying broad patient populations, such as those with Alzheimer's disease, to determine which patients are more or less likely to respond to a given therapy.
Second, precision neurodiagnostics will enable improved safety monitoring in a clinical trial setting. Third, the ability to interrogate neurophysiology enables improved PK/PD modeling that reflects the complex dynamics of the blood-brain barrier.
And finally, neurophysiologic efficacy endpoints will ultimately link molecular mechanisms to clinical outcomes, providing drug developers and regulators with a much more complete understanding of a novel therapy's benefit.
EEG has been around a long time and is well-established as the gold standard for interrogating neurophysiology in real time, reflecting the aggregate of millions of neurons activating. And if you can measure how neural activity changes, then you can understand how a drug might be influencing a patient’s system.
OSP: Specifically, how might advanced analytical tools like AI and ML be put to use?
JD: Consider a 24-hour EEG, full of millions of time points to analyze. Buried within may be second-long events of robust clinical or diagnostic interest, significant to measuring the effect of a new treatment. The sheer volume of data, replete with rich features about how a given patient’s brain is working, requires automated tools to make reviewing its full breadth possible. Furthermore, the high degree of inter-rater variability among individual epileptologists or sleep physicians reviewing any given EEG calls for machine learning tools to overcome that heterogeneity.
Machine learning brings replicable, quantitative tools that allow one to identify features and events within an EEG over very long time scales, something that would be infeasible for an individual doctor to do because of the sheer numbers. Machine learning also allows us to intake data from large numbers of patients participating in clinical trials in ways that have never been done before, and to be able to identify events with higher precision and reliability than we could previously. This enables us to ask more informed questions about what brain features better define these heterogeneous brain diseases. It gives us something on which to take action.
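To make the idea concrete, here is a minimal, hypothetical sketch of the kind of replicable, quantitative event detection described above: flagging candidate high-amplitude discharges in a long single-channel EEG trace. The function name, the robust z-score threshold, and the merging rule are all illustrative assumptions, not Beacon's actual algorithms, which are proprietary and far more sophisticated.

```python
import numpy as np

def detect_spike_candidates(signal, fs, threshold=6.0, min_gap_s=0.2):
    """Flag candidate high-amplitude events in a 1-D EEG trace.

    signal: raw samples (e.g., microvolts); fs: sampling rate in Hz.
    Returns a list of (start_s, end_s) candidate events.
    Illustrative only -- not a clinical-grade detector.
    """
    # Robust normalization (median / MAD) is less distorted by the
    # very transients we are trying to detect than mean / SD.
    med = np.median(signal)
    mad = np.median(np.abs(signal - med)) + 1e-12
    z = np.abs(signal - med) / (1.4826 * mad)

    above = np.flatnonzero(z > threshold)
    if above.size == 0:
        return []

    # Merge threshold crossings separated by less than min_gap_s
    # into single discrete events.
    events = []
    start = prev = above[0]
    for idx in above[1:]:
        if (idx - prev) / fs > min_gap_s:
            events.append((start / fs, (prev + 1) / fs))
            start = idx
        prev = idx
    events.append((start / fs, (prev + 1) / fs))
    return events
```

A detector like this runs identically over every hour of every recording, which is exactly the replicability that individual human review cannot offer over day-long studies.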
OSP: Please tell us about Beacon’s work in this area, and how your technology might contribute to precision medicine for brain conditions.
JD: Our machine learning platform for EEG biomarkers is accelerating the discovery and development of new treatments for patients with neurological, psychiatric, or sleep disorders. We use novel machine learning algorithms and large clinical datasets to transform how brain disorders are treated. Beacon's platform provides an architectural foundation for data-tailored neurobiomarker pipelines that progress new discoveries from development to targeted deployment into clinical trials and novel therapeutic areas.
We believe that the rise of machine learning and improvements in wearable sensor technologies have created a unique moment in time where EEG will become scalable in a way that was never possible with traditional hardware and expert-dependent interpretation. For example, counting the number of epileptiform discharges in a 24- or 36-hour EEG study is not a human-tractable problem. And that is for one patient, let alone several hundred studies from a proof-of-concept clinical trial or thousands of studies from a large registration trial.
Machine learning techniques, however, offer the potential to rapidly quantify overall spike burden and seizure burden at scale. In addition, models may be tuned to better reflect expert consensus than any one individual human. This is the kind of advance that we see as a critical tool to accelerate drug development through patient stratification, PK/PD studies, safety monitoring, and quantitative efficacy endpoints, and probably many other use cases that will be discovered along the way.
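Once events are detected automatically, rolling detections up into a trial-level metric is straightforward. The sketch below, an assumption of how such an aggregation might look rather than anything Beacon has described, normalizes per-study event counts into discharges per hour and summarizes them across a cohort.

```python
import numpy as np

def spike_burden_per_hour(event_times_s, duration_s):
    """Spike burden: detected discharges per hour of recording."""
    return len(event_times_s) / (duration_s / 3600.0)

def cohort_summary(studies):
    """Summarize spike burden across a cohort.

    studies: list of (event_times_s, duration_s) pairs, one per study.
    Returns basic cohort statistics on the per-hour burden.
    """
    burdens = np.array([spike_burden_per_hour(ev, dur) for ev, dur in studies])
    return {
        "mean": float(burdens.mean()),
        "median": float(np.median(burdens)),
        "max": float(burdens.max()),
    }
```

Because the same pipeline scores every study the same way, the resulting burden numbers are directly comparable between treatment arms, which is what makes them usable as quantitative trial endpoints.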
We already have identified novel electrophysiologic signatures functionally linked to primary cognitive endpoints in Alzheimer's disease and we're encouraged by the applications of our neurobiomarker platform toward heterogeneous psychiatric diseases such as schizophrenia and major depressive disorder. We think this is the dawn of a remarkable new era in treating psychiatric and neurological disorders.