The agreements with the Columbia University Medical Center and the University of Maryland School of Medicine will be the program's first real-world tests outside of the trivia game show and IBM's laboratories.
Watson, as IBM has dubbed the program, represents a breakthrough in the ability of computers to understand human language and scour massive databases to supply the most likely answer to questions. It's not always right; some of its errors in its "Jeopardy!" debut this week were amusingly off-base.
But it holds promise for doctors, hedge fund managers and other professionals in fields that need to sift through large amounts of data to answer questions.
Eliot Siegel, a professor at the Maryland university's medical school, said other artificial intelligence programs for hospitals have been slower and more limited in their responses than Watson promises to be. They have also been largely constrained by a physician's knowledge of a particular symptom or disease.
"In a busy medical practice, if you want help from the computer, you really don't have time to manually input all that information," he said.
Siegel says Watson could prove valuable one day in helping diagnose patients by scouring journals and other medical literature that physicians often don't have time to keep up with.
Yet the skills Watson showed in easily winning the three-day televised "Jeopardy!" tournament Wednesday also suggest shortcomings that have long perplexed artificial intelligence researchers and that IBM's researchers will have to fix before the software can be used on patients.
"What you want is a system that understands you're not playing a quiz game in medicine and there's not one answer you're looking for," Siegel said.
"In playing 'Jeopardy!', there is one correct answer. The challenge we have in medicine is we have multiple diagnoses and the information is sometimes true and sometimes not true and sometimes conflicting. The Watson team is going to need to make the transition to an environment in which it comes up with multiple hypotheses -- it will be a really interesting challenge for the team to be able to do that."
Siegel said it would likely be at least two years before Watson would be used on patients at his hospital. It will take that much time to train the program to understand electronic medical records, feed it information from medical literature, and test whether what it has learned leads to accurate analyses of patient symptoms.
He said he wasn't bothered by Watson's on-screen blunders; even highly trained medical professionals make dumb mistakes.
"I will take an assistant that is that fast and that powerful and that tireless any time," he said. "This is going to be something that 10 years from now will be a completely accepted way that we wind up practicing."
Watson could be a boon for IBM, the world's biggest computer services company, if it works as promised in the real world. IBM makes a mint on "analytics" software that helps companies mine their data and predict future trends, such as shopping patterns at a retailer.
Watson currently runs on 10 racks of IBM servers, but computing power generally doubles every two years, so the amount of hardware needed to run the same program will soon be significantly smaller. And the program can be tweaked to run slower, or scan less information, to make it easier to deploy in a business setting.
IBM hasn't disclosed prices for the commercial sale of Watson, nor details of the financial arrangements with the hospitals.