The founder and chief medical officer of Omniscient Neurotechnology provided commentary on expanding the brain map, the use of advanced technology, and the need to familiarize clinicians with connectomics.
For years, clinicians have hypothesized that a better understanding of the brain map may lead to more targeted approaches to treating complex conditions. Ever since the first brain map was detailed by German neurologist Korbinian Brodmann in 1909, there have been limitations to what clinicians can and cannot verify. At times, locations differed between individuals, especially those with brain injury, and the model would often group multiple regions that performed distinct, unrelated functions.
Omniscient Neurotechnology is a company dedicated to uncovering more about human brain networks, how they interact, and how they impact day-to-day life. Composed of scientists, engineers, and veterans of the health tech industry, Omniscient aims to inform smarter decisions about the brain by making sense of complex neural data, using an approach called connectomics. Connectomics is the construction and analysis of a computer-generated map of the brain’s functional and structural connections.
In a new iteration of NeuroVoices, Michael Sughrue, MD, founder and chief medical officer of Omniscient, sat down to discuss the long-term outlook of connectomics and the use of big data approaches. He stressed that the time to integrate these approaches is now, noting that although it may take some time to adapt, they can open the door to more personalized, targeted therapeutics.
Michael Sughrue, MD: One of the things we try to do is focus on how to make big data make sense without dumbing it down. We always want to put things into classes 1, 2, 3, and 4, but I’m sorry, the brain just isn’t that way. We can’t oversimplify things, but at the same time human brains can’t handle four dimensions; that’s a struggle. Inevitably, where clinicians come into this is not necessarily doing the coding. But if we allow a bunch of machine learning scientists to go off into the wilderness and don’t guide the stuff they’re building, they’re going to make lasers that cut butter. Yes, it will cut butter, but it’s not solving a problem. Where we need to get to is making big data make sense. This means we don’t run away from complexity; we figure out how to tackle complexity.
Ultimately, you have to make some simplifying assumptions, or you can’t move. We can’t take all 20,000 genes in the genome and think about each of them for every patient; that’s not realistic. What we need to do is use tools to quickly point out which of those 20,000 genes matter for this patient and stop assuming that we can put people in cohorts. That’s just not realistic, it’s not how it works. We also need to stop talking about AI and how it’s going to be great, and start to do the difficult work of saying, what does it mean when we use adjunct tools to make decisions? We’ve been taught a certain way. I was taught to percuss the chest, but there was a limit to that. That was the best people could do 150 years ago. Right now, what we need to do is say, now that we have this type of stuff, what does it mean?
There’s going to be some uncertainty, just like when people first got MRIs and CAT scans. We didn’t know what size of epidural hematoma needed an operation. This is what clinical research is about. Until we start bringing it to the hospital and doing stuff with it, we’re never going to make progress. We’re going to spend all this time thinking, “One of these days we’re going to get going!” No, you have to get going. We’re doing it. This stuff is here. Ultimately, it’s going to take a lot of thoughtful people spending the time to say, what does it mean to do connectomic medicine? Because it’s here.
What does it mean when we start thinking about a patient with Alzheimer disease as a series of symptoms that we need to manage? Maybe aducanumab (Aduhelm; Biogen) isn’t going to turn everything around for everybody. What does it mean? Let’s say it is a miracle drug, who knows. Maybe it’s not the controversy we thought it would be, and we’ll look back and say, that was foolish. There used to be a controversy over endovascular clot retrieval for stroke, and now there’s no controversy.
If we find something that does work per se, and we “free” someone from a horribly demented state but don’t make them better, then what’s the point? We need to think about things like that. That’s going to require assessing the damage and figuring out how we put the pieces back together. If you get down to it, a stroke wouldn’t be a big deal if we could rehab you from it, or fix you, or have you get a procedure done, like if you had a retinal tear. We haven’t thought of that because we assumed the brain wasn’t plastic and we couldn’t fix things. Part of it was us being ignorant; we didn’t have the tools to study this and say, there’s plasticity in the brain. We know that now. How much? We don’t know. What’s the right way to do it? We don’t know. But all these things that used to be something you talked about in the resident room are getting serious. They’re not to the point where there are RCTs (randomized controlled trials) on these things, but we’re not that far away in things like BMIs (brain-machine interfaces). We know that’s coming in some form. We need to prepare ourselves to start thinking with advanced tools and thinking about things in terms of million-dimensional equations, because that’s the scale of the real problem we’re treating.
Transcript edited for clarity.