Tracking the Shift Toward Digital Health
A Stanford report highlights the rise in machine learning research in this rapidly evolving field.
From wearable trackers to mental health chatbots to AI-based cancer screening, digital health now supports clinicians in decision making, accurate diagnosis, disease treatment, and individualized healthcare delivery. The global pandemic could help propel the field further as demand for virtual and technology-enabled services like telemedicine and remote patient monitoring shifts from short-term needs to a new normal.
Now the Stanford Center for Digital Health has released a report that tracks and explores the evolution of digital health efforts at the university, analyzing both research trends and landmark studies from nearly 2,400 publications, surveys, and in-person interviews.
“We have seen an explosion in digital health — not just in productivity — but also in the number of people who have been working in this area,” says Mintu Turakhia, the center’s executive director. “Our goal was to define what we have at Stanford in terms of our landscape — who’s doing what, how it is being done, what problems are being solved, and what kinds of quantitative measures of productivity, success, and impact are used.”
Turakhia offers his insights on the report, what trends are shaping digital health, and how AI will impact healthcare.
The report looks at publications from the last 35 years and shows that almost 75% of all the digital health papers have been published in the last five years. How has digital health changed over the last three decades?
Digital health is a recent term, but the application of technology in healthcare — such as remote monitoring of implantable devices like pacemakers — has been going on since the 1990s, and in some cases, the 1980s. What we have seen recently is an explosion and a maturation of the technology in terms of the number of therapeutic areas and use cases. This coincides with wider internet availability, the rise of smartphones and intelligent computing, and the proliferation of mobile applications.
What are the biggest takeaways from this report? What has surprised you the most about these findings?
The report found cardiometabolic medicine and medical informatics to be the areas with the most publications, and AI to be the most trending technology in digital health, with applications ranging from medical imaging to risk assessment.
I think the biggest and surprising takeaway was [realizing] the vast amount of digital health-related work that’s being done across the university — not just at the School of Medicine. If you look at the most cited publications, they came from computer science and engineering, not just medicine. This speaks to the strength of the university in what it’s doing.
Where do you see AI in clinical applications of digital health?
Some of the biggest applications are in data-rich fields, including hematology-oncology, neurology, and imaging. AI is most strongly rooted in image classification. The seminal papers at Stanford include assessment of skin lesions and assessment of the retina — particularly for diabetic retinopathy.
What we’re starting to see is AI moving into messier, less structured data — looking at prediction, taking unstructured electronic health record data, and trying to tackle large enterprise problems in healthcare. I think that’s the next wave: AI looking at noisier, messier data.
What do you think are some of the biggest challenges for AI in digital health today?
AI works well for some things, like image detection. But the major challenges include implementation and adoption, payment models, scaling these tools across healthcare systems, getting institutions and clinicians to buy in, and figuring out where AI will sit — augmenting the intelligence of a clinician versus running as an independent diagnostic on its own. That is a graded process that will come through time with ongoing validation testing and regulatory approval.
Other challenges with AI include recognizing its limitations. Right now, AI doesn’t work where labels aren’t well defined, where the quality of the training data is poor, or where the problem inherently cannot be solved deterministically from the data you possess. I think we need to maintain the humility to understand where it works well, where it doesn’t, and how it complements what we already do well in healthcare.
What do you think the future is going to be for medical AI?
The way we take care of patients has already changed with AI. Where we’re moving is its implementation and integration into practice. We’re starting to see AI in workflow enhancement, in rapid triage of patients, in the optimization of resource deployment, and prioritization of healthcare. We’re seeing AI used for disease classification, for sensor data classification — such as whether you have a certain heart rhythm or not based on the data that comes from your watch. What we will get to is a framework where AI is used intelligently by clinicians and in a way that augments their abilities to make better, more accurate, and faster diagnoses. They will use AI to parse through large reams of medical information and clinical guidelines to get the data they need to come at the right answers and treatment pathways for their patients.
Over time, what we will see is increasing autonomy, where AI can work alongside a clinician with a higher level of independence. That progressive autonomy will come with rigorous and thoughtful development and testing for safety and efficacy.
How has digital health changed during the pandemic?
COVID-19 has forced us to rethink how we do research and coordinate clinical trials at a global level more quickly. Rather than having 10 small studies, you want one study with the number of patients those 10 small studies would have had. This way, you have more power, more impact, and more generalizability. The pandemic caused us to rethink how we operationalize clinical trials.
The second aspect that changed with the pandemic was that, when society shut down, we could no longer do study visits or enroll patients in person. A lot of that workflow became digitized and automated. For example, in the Apple Heart Study (which used data from an Apple Watch to identify irregular heart rhythms), we enrolled 419,000 people in eight months, all virtually, without a single physical front door. I think that will be the new normal going forward.
Originally published at https://hai.stanford.edu.