Tonight, all-time Jeopardy champions Ken Jennings and Brad Rutter will return to the show to play against a new competitor -- an IBM computer named Watson.
Although this contest is reminiscent of the 1997 match in which IBM's Deep Blue computer played chess against Garry Kasparov and won, winning Jeopardy is a more complex challenge than chess, which has well-defined rules and a finite number of moves that can be calculated and analyzed. Jeopardy's sometimes hard-to-understand clues contain puns and jokes -- e.g. "This trusted friend was the first non-dairy powdered creamer" (answer: What is Coffee Mate?) -- and require natural language understanding, an ability to determine context, and knowledge of a wide range of topics including popular culture, history and entertainment.
IBM engineers spent four years building, training and testing the artificial intelligence that went into Watson. PBS ran a fascinating Nova program last week on this project and other, earlier attempts to make computers approximate the human brain.
IBM engineers fed the computer millions of documents and thousands of questions and examples to build a knowledge base, then applied search technology, machine learning, rules and analytics to teach Watson all the information it would need for tonight's game. They tested the computer for months; that work will be put to the test in tonight's episode of Jeopardy.
Why are we writing about this on a bank technology website? Because IBM eventually plans to make Watson, or a computer like it, available to banks to solve complex problems like enterprise risk management, according to Shankar Ramamurthy, IBM's general manager of banking and financial markets. The 2,800 processors (IBM's Power Unix chips) in the computer, which is the size of 10 refrigerators, can run in parallel to solve complex problems the way they do on Jeopardy.
"We're trying to understand not just syntax but semantics of language and dealing with unstructured data and data that is rapidly changing," Ramamurthy says. "You can't program all the rules into a computer. It's a unique class of problem that lends itself to ingestion of massive amounts of information, discerning patterns using analytical techniques. Think about it as a cadre of experts, each of whom knows part of the human knowledge base, and you bring all those experts together. We take a query and parse it to these multiple expert modules, analyzing and discerning insights to respond within two or three seconds."
Today, Watson can only play Jeopardy; it wouldn't be able to answer questions about medicine or finance. "But the general concept of a self-learning machine with multiple expert modules that can process things in parallel on a real-time basis and determine which outcome has the highest probability of being right, underpins Watson," Ramamurthy says. "What if you had 50 or 100 of the best economists available to you when you look at a scenario, each of whom is an expert in a particular aspect, and what if each expert had the ability to access not just structured data but also unstructured data -- voice streams, video, etc. -- and what if you could add more experts as you learn and could bring them all together and respond to events that happen on a real-time basis? What could you do with that kind of capability?" A bank might use this capability to analyze risk, customer service or pricing problems, as well as apply it to algorithmic trading, Ramamurthy says.
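Ramamurthy's description -- fanning a query out to many expert modules in parallel, then keeping the answer with the highest probability of being right -- can be sketched in a few lines. This is only an illustration of the general pattern, not Watson's actual design; the expert functions and their confidence scores are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical "expert" modules: each examines the clue and returns its
# best guess together with a confidence score between 0 and 1.
def history_expert(clue):
    return ("What is Cremora?", 0.20)

def wordplay_expert(clue):
    # This expert happens to catch the pun on "trusted friend" / "mate".
    return ("What is Coffee Mate?", 0.90)

def trivia_expert(clue):
    return ("What is Coffee Mate?", 0.35)

def answer(clue, experts):
    """Send the clue to every expert in parallel and keep the
    hypothesis with the highest confidence."""
    with ThreadPoolExecutor() as pool:
        hypotheses = list(pool.map(lambda expert: expert(clue), experts))
    return max(hypotheses, key=lambda h: h[1])

clue = "This trusted friend was the first non-dairy powdered creamer"
best, confidence = answer(clue, [history_expert, wordplay_expert, trivia_expert])
# best is the guess from the most confident expert.
```

In a banking setting, the clue would be replaced by a risk scenario or a trade, and each "expert" by a model covering one aspect of the problem.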