Yale Researchers Wield AI for Heart Health

Yale Public Health Magazine, Yale Public Health: Fall 2023
by Matt Kristoffersen


Artificial intelligence can help physicians spot heart disorders not seen by the human eye.

A team of Yale researchers in the Cardiovascular Data Science (CarDS) Lab has trained a machine learning algorithm to identify complex issues with cardiac testing, potentially allowing health care practitioners to provide care early enough to save lives. That same group of researchers has also adapted their tool to work in portable, wearable devices, creating an on-the-go, ever-vigilant monitor for heart health.

Their models draw from hundreds of thousands of heart tests from around the world and work on the most widely available heart diagnostic test globally: the electrocardiogram (ECG). Because the algorithm needs only photos of ECG printouts, said Rohan Khera, assistant professor of medicine (cardiovascular medicine) and of biostatistics (health informatics), most people can benefit from their work.

“This opens up the possibility to finally bring a screening tool for such disorders that affect up to one in 20 adults globally,” Khera said. “Their diagnosis is frequently delayed as advanced testing is either unavailable or only reserved for those with symptomatic disease. Now we can identify these patients with a simple web-based or smartphone application.”

What they have now is a “super reader” of these images, said Veer Sangha, a team member and a Rhodes Scholar. Sangha, Khera and their colleagues have created an algorithm that is skilled at “identifying signatures of LV systolic dysfunction, which the human eye cannot accurately decipher.”

Their efforts have become even more accessible since they trained the algorithm on the simpler ECGs recordable from smartwatches and other wearable technologies. Those portable sensors often generate "noisy" electrocardiograms that lack clarity. In a separate study, the researchers trained the algorithm on tests that they intentionally blurred to simulate recordings from wearables.

“This approach represents a novel strategy for the development of wearable-adapted tools from clinical ECG repositories,” the researchers said.
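To give a flavor of that strategy, here is a minimal, purely illustrative sketch of how a clean clinical ECG trace might be degraded to mimic a wearable recording. The function name, noise model, and parameters below are hypothetical assumptions for illustration, not the CarDS Lab's actual pipeline:

```python
import numpy as np

def simulate_wearable_ecg(clean_ecg, fs=500, noise_std=0.05, drift_amp=0.1, seed=0):
    """Degrade a clean ECG trace to mimic a noisy wearable recording.

    Adds Gaussian sensor noise and low-frequency baseline wander,
    two artifacts common in smartwatch-style ECGs. (Illustrative only.)
    """
    rng = np.random.default_rng(seed)
    t = np.arange(len(clean_ecg)) / fs
    noise = rng.normal(0.0, noise_std, size=clean_ecg.shape)  # random sensor noise
    wander = drift_amp * np.sin(2 * np.pi * 0.3 * t)          # slow baseline drift
    return clean_ecg + noise + wander

# Example: degrade a synthetic two-second stand-in for a clean trace at 500 Hz
fs = 500
t = np.arange(2 * fs) / fs
clean = np.sin(2 * np.pi * 1.2 * t)  # placeholder waveform, not a real ECG
noisy = simulate_wearable_ecg(clean, fs=fs)
```

Training on pairs like `clean` and `noisy` is the general idea behind adapting a model built from pristine clinical repositories so that it still performs on the messier signals wearables produce.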

The work has earned the American Heart Association's Elizabeth Barrett-Connor Research Award for Early Career Investigators, Yale's Blavatnik Fund for Innovation Award, and the Wilson Prize.
