Dr. Watson will see you now
By Lisa Manfield
May 13, 2011
IBM in February set out to prove once again that computers can beat humans at their own game. Watson, its latest showpiece supercomputer, took on two of Jeopardy’s biggest winners—Ken Jennings, who holds the record for most wins, and Brad Rutter, who has won the most money. It was a match reminiscent of 1997, when IBM’s Deep Blue supercomputer beat world chess champion Garry Kasparov. Since then, IBM has searched for ways to showcase technology’s power to impact lives.
Why Jeopardy? “Back in 2004, one of our executives noticed that at 7:00 pm in the restaurant he was in, everyone went to the bar to watch TV during Jennings’ run,” said Jennifer Chu-Carroll, a member of IBM’s Watson algorithm team. “One of our criteria is that it has to be a research project and it has to get the public’s attention.”
And get attention Watson did. Not only did its Jeopardy performance send ratings soaring for the veteran show, but it also got people talking about the potential of natural language processing. “We wanted to show that a computer can do something people think is smart and on the cutting edge,” Chu-Carroll said.
If you watched the show, however, you’ll know that Watson didn’t answer every question correctly. In fact, its accuracy rate was about 70 per cent. That’s because the process by which it answers questions produces a series of potential responses, each with a confidence rating. Getting to those responses means understanding the question, which requires making relational linkages between its various components. “It parses the question into names, places, dates and their relations,” Chu-Carroll said. “For example, if it’s a book by an author, it has to know there’s an author relation. So it needs to know what to search for to get the most relevant content, and it uses lots of scorers on the candidate answers. It puts out over 160 answers, but only the top three were shown with confidence ratings.”
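The flow Chu-Carroll describes (parse the question, generate candidate answers, apply many scorers, combine the scores into a confidence rating, and rank) can be sketched in miniature. This is a toy illustration only: the single keyword-overlap scorer and the weighting below are invented for the example, whereas Watson’s actual DeepQA pipeline combines hundreds of specialized scorers.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A candidate answer with its combined confidence score."""
    answer: str
    confidence: float = 0.0

def keyword_overlap_scorer(question: str, answer: str) -> float:
    """Toy scorer: fraction of question words that also appear in the answer.
    Watson's real scorers are far more sophisticated (relations, types, dates)."""
    q_words = set(question.lower().split())
    a_words = set(answer.lower().split())
    return len(q_words & a_words) / max(len(q_words), 1)

def rank_candidates(question, candidates, scorers, weights):
    """Run every scorer over every candidate, combine the scores with a
    weighted sum, and return the candidates sorted by confidence, best first."""
    ranked = []
    for answer in candidates:
        score = sum(w * s(question, answer) for s, w in zip(scorers, weights))
        ranked.append(Candidate(answer, round(score, 3)))
    return sorted(ranked, key=lambda c: c.confidence, reverse=True)
```

In this toy version the top few ranked candidates, with their confidence values, correspond to the top three answers Watson displayed on screen during the broadcast.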
It does all that in three seconds, before buzzing in to respond. And luckily, 70 per cent accuracy was enough to win a million dollars on Jeopardy, which IBM donated to charity. It’s also proving enough to revolutionize industries much less trivial than TV trivia games. First up: health care.
Watson goes to the doctor
With its ability to access 16 terabytes of data (200 million pages of content were loaded onto Watson for the Jeopardy match, using only a quarter of that space) and answer questions posed in natural language, Watson has great potential for applications in health care. “Watson is going to be able to help humans find the information they need from a large collection efficiently,” Chu-Carroll said. For doctors, in particular, it’s a pressing need. “There’s a long history of collecting information in medicine,” said Dr. Herbert Chase, a professor of clinical medicine in biomedical informatics at Columbia University’s College of Physicians and Surgeons. “There are electronic textbooks and PubMed, which has every article that’s ever been published, and original research. Watson has access to all of that in a fraction of a second.”
To assess doctors’ needs and adapt Watson accordingly, IBM has assembled a team of physicians including Dr. Chase to explore Watson’s potential in medical settings. “We took the same machine from Jeopardy and applied the algorithm to medical questions, and without any adjustments the machine did very well,” Dr. Chase said.
This makes Watson well suited to the exam room, where doctors must often take enormous amounts of information into account in order to make diagnoses. “At the end of the day I have 50 questions in my mind that I need to find answers to. And it’s not that easy to find the answers—it takes time,” Dr. Chase said. “So it can be a question and answer tool and the doctor will know what to do with that info.”
But how does that make Watson any different from, say, Google, which also has access to an abundance of information? “With Watson, you can describe the problem in English. There’s no need to pick the right keywords, unlike with a Web search,” Chu-Carroll said. “We also look at many dimensions of evidence: the patient’s medical history, are they prone to disease, symptoms the patient is describing, doctor findings, lab test results.”
In response to each question, Watson returns a list of ranked answers with supporting evidence, an important consideration for doctors. “We don’t need the doctor to have all the answers, we just need to give them the options,” Dr. Chase said. “The doctor will focus on the most likely diagnosis, but can’t lose sight of the least likely option. Watson’s output can be 10 or 15 things, and the doctor might say ‘thank you for reminding me of that.’”
With this kind of efficiency, Dr. Chase believes Watson has the potential to dramatically reduce the cost of diagnosis by identifying possibilities that would otherwise take a doctor months. “It implicitly suggests things for me to do, like order one simple blood test that can distinguish between conditions,” Dr. Chase said. “For example, 35 years ago I had a very challenging case. The patient’s muscles were failing and no one knew what was wrong with her. It took two or three months of incredible effort to figure out it was a rare cause of rickets. I wondered what Watson would do, so I plugged in the clues. The number two response was rickets. Number seven was the rare cause. So the diagnostic side is a compelling part of its business model.”
And Dr. Chase said Watson could also play a role in treatment. “Matching the optimal treatment requires crunching an incredible amount of data. We talk a lot about personal medicine but it’s easier said than done. Watson consults a genetic database because for each gene there could be seven drugs that would be useful, and Watson can match them up.”
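The gene-to-drug matching Dr. Chase describes amounts to a lookup-and-combine step over a genetic database. A minimal sketch can make that concrete; note that the gene and drug names below are invented placeholders, not real pharmacogenomic data, and the real task involves far messier evidence weighing.

```python
# Hypothetical gene-to-drug lookup table. In Dr. Chase's description, each
# gene may map to several potentially useful drugs; these entries are
# placeholders, not real pharmacogenomic associations.
GENE_DRUG_DB = {
    "GENE_A": ["drug_1", "drug_2"],
    "GENE_B": ["drug_3"],
    "GENE_C": ["drug_2", "drug_4"],
}

def candidate_treatments(patient_genes):
    """Collect the union of drugs associated with any of the patient's genes,
    deduplicated and sorted for a stable, reviewable list."""
    drugs = set()
    for gene in patient_genes:
        drugs.update(GENE_DRUG_DB.get(gene, []))
    return sorted(drugs)
```

As with the diagnostic output, the point is not that the machine picks the treatment, but that it narrows an enormous matching problem down to a short list a doctor can review.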
So will you be greeted by this electronic doctor’s assistant the next time you go for a checkup? It’s possible. IBM is working with several partners, including Nuance Communications, Columbia University Medical Center and the University of Maryland School of Medicine on an aggressive schedule, with plans to roll out Watson to the medical community in 18 to 24 months.
But there are still some critical issues to work out, according to Dr. Chase. “Technology has the potential to interfere, so we have to figure out how this fits into the workflow without harming the patient. Imagine being in emergency and playing with Watson as a patient crashes,” he said. “Our role at Columbia is to simulate the environment before we do it in real life. But an exam room could be the perfect place for Watson. I can imagine Watson sitting on a table with the patient, and the doctor speaking to it.”
Of course, even in the exam room Watson will never replace doctors in a Terminator-style rise of the machines. “It’s really a consultant offering opinions,” Chu-Carroll said. “It’s helping people do their job better.”
And for doctors, Watson will take a load off their minds. Literally. “There’s a lot of memorization involved in medicine, and if you’re memorizing, you’re not analyzing,” Dr. Chase said. “But I am mindful of the essential importance of the doctor-patient relationship. Only the doctor will know if the patient is fearful of something, or doesn’t have the money. The art of practicing medicine is figuring all that out.”