Tay (chatbot) - Wikipedia
In 2016, Microsoft’s Racist Chatbot Revealed the Dangers of …
Why Microsoft's 'Tay' AI bot went wrong - TechRepublic
Microsoft shuts down AI chatbot after it turned into a Nazi - CBS News
Twitter taught Microsoft’s AI chatbot to be a racist asshole in …
Learning from Tay’s introduction - The Official Microsoft Blog
Tay: Microsoft issues apology over racist chatbot fiasco
Microsoft and the learnings from its failed Tay artificial ... - ZDNET
Microsoft Created a Twitter Bot to Learn From Users. It Quickly …
Microsoft chatbot is taught to swear on Twitter - BBC News