
IBM’s Watson Makes Jeopardy! Debut Against Human Champs


Feb 15, 2011

While it initially looked like a rout, the humans clawed their way back during the first round of Jeopardy! and forced IBM’s supercomputer to tie for the lead.

The first segment of the long-anticipated two-game Jeopardy! tournament pitting Watson against former champions Ken Jennings and Brad Rutter aired on Feb. 14. At the end of the episode, Watson and Rutter each had $5,000 on the board, while Jennings trailed with $2,000.

“I had a good feeling at the end of the first show,” said IBM’s David Ferrucci, the lead researcher and principal investigator on the Watson project, in a post-game analysis posted on the company’s A Smarter Planet blog. “I thought: Everybody will realize the computer is competitive,” he said.

The first day’s final scores told only part of the story. Rutter made the first selection and beat Watson to the buzzer on the opening clue. After that, Watson dominated the round up to the commercial break, buzzing in with 11 correct answers on 15 clues, including the Daily Double (it wagered $1,000 in the category Literary Character APB). The second half of the round opened with Watson at $5,200, Rutter at $1,000, and Jennings at $200, but Jennings and Rutter beat Watson to the buzzer several times as the round went on. Most of the wrong answers in the round were also Watson’s.

Viewers who expected the computer to get every question right were instead treated to several wrong answers from Watson in the second half. Unlike a human player, Watson cannot adjust its response to what other contestants say; it gives whatever it selected as its top answer during its initial processing of the clue. After Jennings incorrectly answered that the “20s” was the decade in which Oreo cookies were introduced, Watson followed with the same wrong answer, “1920s.”

"Watson is very bright, very fast, but he has some weird little moments,” Trebek said.

For more, read the eWeek article: IBM’s Watson Ties for Lead on Jeopardy but Makes Some Doozies.
