Forbes Insights

A decade ago, Eyal Gura, an Israeli tech executive, was traveling in a remote beach town when he had a scuba diving accident.

A technician took X-rays of Gura’s chest but couldn’t make heads or tails of the resulting images. “We had to wait a few days for the radiologist to come from the big city to diagnose what I had,” Gura says, a wait that delayed his treatment. “I asked: How come we don’t have a centralized reference database of all the X-rays for people like me so that I can just run a computer vision comparison against it and get my own sense of what’s going on in my body? That was the seed of the idea.”

The idea became Zebra Medical Vision, which transforms vast amounts of medical imaging data into actionable insights, allowing doctors to better detect diseases, tumors and fractures while giving patients more information about their health.

The technology interprets patient scans using algorithms trained on millions of past medical scans. Relying on machine learning, it can identify with a high degree of accuracy whether a patient has, say, a hairline fracture in her limb or a suspicious lump in her chest. And just as important, it can do so for as little as one dollar per scan.
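
To make the idea concrete, here is a minimal, purely illustrative sketch of that workflow: a classifier is fit on labeled historical scans and then scores a new one. It is not Zebra Medical Vision’s actual system; the data is synthetic, the model is a simple scikit-learn logistic regression, and in practice the training set would be millions of annotated X-rays or CT studies.

```python
# Illustrative sketch only: train a binary classifier on labeled "scans,"
# then score a newly acquired scan. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend scans: 64x64 grayscale images flattened into feature vectors,
# labeled 1 (finding present) or 0 (no finding).
n_scans, side = 500, 64
scans = rng.random((n_scans, side * side))
labels = rng.integers(0, 2, size=n_scans)

# Fit the classifier on the historical, labeled scans.
model = LogisticRegression(max_iter=1000)
model.fit(scans, labels)

# Score a new scan: the output is a probability of a finding,
# which a radiologist would then review and confirm.
new_scan = rng.random((1, side * side))
print(f"Probability of finding: {model.predict_proba(new_scan)[0, 1]:.2f}")
```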

Though Zebra is already used in hospitals around the world, encouraging wider adoption is the next great challenge. “When we started the company [in 2014], many clinicians were afraid of AI,” says Gura. “In the last year and a half, there’s been a complete mind shift. Now the statement in all the radiology conferences is that radiologists will not be replaced by AI — but they will be replaced by radiologists that are using AI.”

This distinction is key to understanding healthcare’s digital transformation. Machine learning, artificial intelligence, advanced imaging, genomics — these technologies are pushing medicine to new frontiers, but they are often doing so by augmenting, rather than supplanting, the doctor’s expertise. The question now is how doctor and machine can best work together, a question that innovators like Gura and the founders profiled below are answering.

“Sometimes algorithms can find something that the human eye cannot see, like bone density,” says Gura. “This is something that is easier for a machine to do, to count and quantify pixels, but it’s hard for the human eye to see beneath a certain threshold.”
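
The “count and quantify pixels” idea can be illustrated with a short sketch. This is not Zebra’s algorithm: it simply takes a CT slice expressed in Hounsfield units (HU) and reports the mean intensity and the share of voxels above a bone-like cutoff, the kind of sub-threshold measurement a human eye cannot make by looking. The 300 HU threshold and the synthetic slice are assumptions chosen for illustration.

```python
# Illustrative sketch only: quantify pixel intensities in a CT slice as a
# crude bone-density proxy. Not Zebra Medical Vision's actual method.
import numpy as np

def bone_density_proxy(ct_slice: np.ndarray, hu_threshold: float = 300.0) -> dict:
    """Return per-slice statistics that are hard to estimate by eye."""
    bone_mask = ct_slice >= hu_threshold              # voxels dense enough to look like bone
    return {
        "mean_hu": float(ct_slice.mean()),            # average density of the slice
        "bone_fraction": float(bone_mask.mean()),     # share of voxels above the cutoff
        "mean_bone_hu": float(ct_slice[bone_mask].mean()) if bone_mask.any() else 0.0,
    }

# Synthetic 512x512 slice standing in for real CT data.
rng = np.random.default_rng(1)
fake_slice = rng.normal(loc=50, scale=200, size=(512, 512))
print(bone_density_proxy(fake_slice))
```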

The machine can also work far faster, diagnosing in a fraction of a second what may take several minutes for a human to determine. That can make a big difference for a doctor charged with treating dozens of patients a day. And with its ability to spot issues like osteoporosis early, Zebra’s technology is about prevention as much as diagnosis. 

It took three years of cold-calling hospitals to assemble the requisite data set to train Zebra Medical Vision’s system, and in 2018 Gura’s company received its first FDA clearance, for an algorithm that detects coronary blockage. Since then, it has racked up further clearances to detect brain bleeds, assess chest X-rays and more.

For many of us, our next X-ray might be read by both a human and a machine, but the decision-making will remain, as it always has, firmly in human hands.