AI for Good blog

AI cardiologist aces its first medical exam

Rima Arnaout wants to be clear: The AI she created to analyze heart scans, which easily outperformed human experts on its task, is not ready to replace cardiologists.

It was a limited task, she notes, just the first step in what a cardiologist does when evaluating an echocardiogram (the image produced by bouncing sound waves off the heart). “The best technique is still inside the head of the trained echocardiographer,” she says.

But with experimental artificial intelligence systems making such rapid progress in the medical realm, particularly on tasks involving medical images, Arnaout does see the potential for big changes in her profession.

Image courtesy of Rima Arnaout

And when her 10-year-old cousin expressed the desire to be a radiologist when she grows up, Arnaout had some clear advice: “I told her that she should learn to code,” she says with a laugh.

Arnaout, an assistant professor and practicing cardiologist at UC San Francisco, is keeping up with the times through her research in computational medicine; she published this new study in the journal npj Digital Medicine.

In the study, Arnaout and her colleagues used deep learning, specifically something called a convolutional neural network, to train an AI system that can classify echocardiograms according to the type of view shown.
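For the technically curious, a convolutional classifier for this kind of task can be sketched in a few dozen lines. The toy PyTorch example below sorts 60-by-80-pixel grayscale frames into 15 view categories, matching the input size and class count described in the study; the layer sizes, channel counts, and names are illustrative assumptions, not the architecture the researchers actually used.

import torch
import torch.nn as nn

class EchoViewNet(nn.Module):
    """Toy convolutional network for echocardiogram view classification.
    Input: 1-channel 60x80 grayscale frames. Output: logits over 15 views.
    Depth and channel counts here are illustrative, not from the study."""
    def __init__(self, num_views=15):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x60x80 -> 16x60x80
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x30x40
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x30x40
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x15x20
        )
        self.classifier = nn.Linear(32 * 15 * 20, num_views)

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)   # flatten all but the batch dimension
        return self.classifier(x)

model = EchoViewNet()
dummy = torch.randn(8, 1, 60, 80)  # a batch of 8 fake frames
print(model(dummy).shape)          # torch.Size([8, 15])

In practice such a network would be trained with a standard cross-entropy loss over labeled view examples; the sketch shows only the forward pass.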

This classification is a cardiologist’s first step when examining an image of the heart. Because the heart is such a complex structure—it’s an asymmetrical organ with four chambers, four valves, and blood constantly flowing in and out through several vessels—echocardiographers take videos from many different positions. When the doctors are ready to analyze those videos, they first have to figure out which view they’re looking at and which anatomical features they can see.

Typically the cardiologist would look at a relatively high-resolution video of the echocardiogram, showing a shifting image captured as the imaging tool was moved around the patient’s chest. But the AI had a much harder task: it was given still images taken from video clips, and the images were shrunk down to just 60 by 80 pixels each.
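Preparing inputs like these is itself a simple pipeline: pull still frames from a video clip, convert them to grayscale, and downscale them. Here is a rough sketch using OpenCV; the file name and sampling interval are placeholder assumptions, not details from the study.

import cv2

def sample_frames(video_path, size=(80, 60), every_n=10):
    """Pull still frames from an echo clip and shrink them to
    60x80 grayscale. The path and sampling interval are illustrative."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    i = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % every_n == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # cv2.resize takes (width, height), so (80, 60) -> 60x80 array
            frames.append(cv2.resize(gray, size))
        i += 1
    cap.release()
    return frames

frames = sample_frames("echo_clip.avi")  # hypothetical file name
print(len(frames), frames[0].shape if frames else None)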

Six echocardiogram images showing different views of the heart. Image: A. Madani et al.

The AI had to sort heart scan images into categories based on which view of the heart they presented.

When both the AI and expert cardiologists were asked to sort these tiny black-and-white images into 15 categories of views, the AI achieved an accuracy of 92 percent. The humans got only 79 percent correct. “These were excellent echocardiographers,” Arnaout says, “but it’s a hard task. We’re not used to seeing the images shrunken down and out of context.”
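Those accuracy figures are simply the fraction of test images assigned the correct view label. As a minimal illustration, with made-up labels standing in for the real test set:

import numpy as np

true_views = np.array([0, 3, 7, 14, 3, 7])    # hypothetical ground-truth view labels
model_preds = np.array([0, 3, 7, 14, 2, 7])   # hypothetical model predictions

accuracy = np.mean(model_preds == true_views)  # fraction labeled correctly
print(f"accuracy: {accuracy:.0%}")             # the AI scored 92%; the humans, 79%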

The AI performed only this first step in analyzing a heart image and making a diagnosis. A human cardiologist looks at many of these scans to examine more than 20 structures within the heart, then synthesizes that information to arrive at a conclusion.

Arnaout is now working on a new version of the technology that can take the next steps to identify different diseases and heart problems.

“A human echocardiographer can look at any heart, no matter what the defect, and figure out what’s going on,” Arnaout says. “I’m interested in building a platform that can do that.”

Even if she accomplishes her goal, though, she doesn’t think human cardiologists will be put out of their jobs. “As cardiologists, we read the images and then go see the patient,” she says. “So we’re both reading images and practicing medicine. I don’t think that second piece will be taken over so quickly.”

Eliza Strickland is a Senior Associate Editor at IEEE Spectrum.

This article first appeared in IEEE Spectrum. Read the original article here.