1. Tay (chatbot) - Wikipedia
2. In 2016, Microsoft’s Racist Chatbot Revealed the Dangers of …
3. Why Microsoft's 'Tay' AI bot went wrong - TechRepublic
4. Microsoft shuts down AI chatbot after it turned into a Nazi - CBS News
5. Twitter taught Microsoft’s AI chatbot to be a racist asshole in …
6. Learning from Tay’s introduction - The Official Microsoft Blog
7. Tay: Microsoft issues apology over racist chatbot fiasco
8. Microsoft and the learnings from its failed Tay artificial ... - ZDNET
9. Microsoft Created a Twitter Bot to Learn From Users. It Quickly …
10. Microsoft chatbot is taught to swear on Twitter - BBC News