In an operating room, under the knife, a patient’s life is in the hands not just of the surgeon wielding the scalpel, but of an entire team of health-care providers. Some read vital signs or administer drugs, while others prep equipment and review the patient’s medical records. And in the split seconds when a patient’s condition slips from auspicious to grim, they have to react as a team as well. At such moments, wouldn’t it be nice if one of those team members happened to have instant recall of an entire medical reference library?
In the Cleveland Clinic’s simulated operating room for doctors-in-training, medical professors commonly stage just such scenarios, using a manikin and a scripted drama—of a patient spontaneously bleeding out, for instance. At that point, “it’s all about how the team interacts,” says John Jelovsek, an obstetrician and gynecologist at the clinic. They have to quickly collect and process the new information about the patient’s status and combine that with what they know of his or her medical history and case studies of similar patients. It’s a lot of information to remember, let alone recall in a critical moment, Jelovsek adds. That’s where Watson—the artificial intelligence computer system of Jeopardy game-show fame—comes in.
Last summer, doctors and programmers announced the latest endeavor in a long-standing partnership between the clinic and computer maker IBM: they’re working to one day bring the brainy computer into an operating theater near you. But first, Watson has to go to medical school.
Watson has been fed lots of medical references, studies, and textbooks, Jelovsek says, but it still has trouble translating colloquialisms about health into medical language. “The first thing that medical students learn when they come to medical school is the language of medicine,” he says, and Watson is no different. “For example, if a patient says they were throwing up last night, we would translate that to vomiting. But ‘throwing up’ to Watson is throwing a ball in the air.”
Indeed, “the biggest challenge [in artificial intelligence] remains writing computer analytics that can translate natural language,” says IBM computer scientist Eric Brown, research manager of the Watson team. To get around such computer comprehension challenges, professors at the clinic have been schooling Watson on medical lingo—using their own computers to access IBM servers that host the artificially intelligent computer system, and going case by medical case during the fall semester. “This is where they can bring their tremendous depth of medical expertise to the project,” Brown adds.
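To picture the kind of translation the professors are teaching, here is a deliberately simplified sketch of mapping colloquial phrases to clinical terms. The phrase table and function are purely illustrative; Watson’s actual natural-language analytics are far more sophisticated than a lookup table.

```python
# Toy sketch of colloquial-to-clinical normalization (illustrative only;
# not IBM's actual pipeline).
COLLOQUIAL_TO_CLINICAL = {
    "throwing up": "vomiting",
    "threw up": "vomited",
    "stomach ache": "abdominal pain",
    "short of breath": "dyspnea",
    "passed out": "syncope",
}

def normalize(utterance: str) -> str:
    """Replace known colloquialisms with clinical terminology."""
    text = utterance.lower()
    # Try longer phrases first so shorter ones don't clobber them.
    for phrase in sorted(COLLOQUIAL_TO_CLINICAL, key=len, reverse=True):
        text = text.replace(phrase, COLLOQUIAL_TO_CLINICAL[phrase])
    return text

print(normalize("The patient was throwing up last night"))
# the patient was vomiting last night
```

The hard part, of course, is everything a lookup table cannot do: resolving context so that “throwing up” a ball is not confused with a symptom, which is exactly the comprehension challenge Brown describes.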
When Watson starts grasping the language, it’ll move on to interacting with fellow medical students, interactions the team hopes will begin this spring. The end goal may be a new tool for physicians to use when they see patients or perform surgeries, Jelovsek says, but while Watson is in medical school, it also gives other students a new way to learn.
Once Watson is able to collect and translate medical information—from electronic medical records or case reports—the system will move on to connecting that information with reference materials to piece together diagnoses and treatment options, just as human medical students do. Jelovsek, Brown, and the rest of their team pounced on the opportunity to teach Watson while letting medical students use the system to practice solving their own case studies.
In a program prototype called Paths, Watson will feed students a written scenario describing a sick patient, listing symptoms and fragments of medical history. “As they progress through the scenario, they’re given more information,” Brown says. Though Watson has all the reference material to figure out the answer, it doesn’t know all of the connections yet. The students will have to solve the medical puzzle to create and reinforce those connections for Watson—for example, connections between a patient complaining of chest pains and someone having a heart attack. “The idea is to create a collaborative learning tool,” Brown says. And the more cases they practice on, the better they’ll both get at treating patients.
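One way to picture the “connections” the students reinforce—purely a sketch, not the Paths prototype itself—is a weighted graph between clinical findings and diagnoses, where each case the students solve strengthens an edge:

```python
from collections import defaultdict

# Hypothetical sketch: each solved case strengthens the association
# between a finding and the diagnosis the students confirmed.
edge_weights = defaultdict(float)

def reinforce(finding: str, diagnosis: str, amount: float = 1.0) -> None:
    """Strengthen the finding -> diagnosis connection."""
    edge_weights[(finding, diagnosis)] += amount

def rank_diagnoses(finding: str):
    """Return candidate diagnoses for a finding, strongest first."""
    candidates = [(d, w) for (f, d), w in edge_weights.items() if f == finding]
    return sorted(candidates, key=lambda dw: dw[1], reverse=True)

# Students solve two chest-pain cases as heart attack, one as reflux.
reinforce("chest pain", "myocardial infarction")
reinforce("chest pain", "myocardial infarction")
reinforce("chest pain", "gastroesophageal reflux")

print(rank_diagnoses("chest pain"))
# [('myocardial infarction', 2.0), ('gastroesophageal reflux', 1.0)]
```

In this toy model, practicing more cases literally sharpens the rankings—a crude stand-in for the collaborative learning loop Brown describes.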
The team expects that Watson will graduate from medical school in an average amount of time—4 to 5 years—before undergoing further development into a clinical tool. Though artificial intelligence will never replace the empathy of a doctor or the emotional interactions embedded in delivering medical care, Jelovsek says, having literature and medical references at one’s fingertips while treating a patient would be quite useful.
“Can it be smarter than humans? Probably not,” Jelovsek says. “All [Watson] has to do is be a little bit smarter, and then it’s extremely useful.”