Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets. Microsoft’s Tay AI chatbot rose to notoriety this month ...
Barely hours after Microsoft debuted Tay AI, a chatbot designed to speak the lingo of the youths, on Wednesday, the artificially intelligent bot went off the rails and became, like, so totally racist.
Microsoft’s attempt to engage millennials via an artificially intelligent “chatbot” called Tay has failed miserably after trolls made the bot spew offensive comments. The brainchild of Microsoft's ...
The chat bot had been sidelined after making racist comments. Microsoft's millennial chat bot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Tay, the chat bot Microsoft unleashed Wednesday to learn how 18- to 24-year-old social media users expressed themselves, quickly became a racist, misogynist, Holocaust-denying, Donald Trump-loving ...
LOS ANGELES (CBSLA.com) — Microsoft shut down its newest artificial intelligence chatbot Thursday after it generated a string of racist and insensitive tweets. Nicknamed "Tay", the chatbot was made to ...
Microsoft issued an apology via the company’s official blog Friday for the “behavior” of its bot Tay, the juvenile (and politically illiterate) bot that came into the world on Wednesday primed to ...
Microsoft is testing a new chat bot, Tay.ai, that is aimed primarily at 18- to 24-year-olds in the U.S. Tay was built by the Microsoft Technology and Research and Bing teams as a way to conduct ...