Saturday, April 20, 2024

Microsoft’s Twitter Robot Praises Hitler, Trump & Recites Racism

Microsoft, working with its Bing team, created TayTweets, a Twitter account for a robot designed to develop “conversational understanding” by holding automated conversations with Twitter users and mimicking the language they use. TayTweets started out harmless, but in less than 24 hours the bot descended into racism before it was shut down, The New York Times reports.

TayTweets boasted admiration for Hitler, endorsed Donald Trump for president, used racist slurs against black people and claimed it hated Jews. Essentially, the bot learned and digested information much as humans do. One reader responding to this story noted, “we process what the media spews out and then push it back into the world as our opinions.”

“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” Tay wrote in one tweet. “donald trump is the only hope we’ve got.”


Researchers at Microsoft “made the Twitter account as a way of demonstrating its artificial intelligence prowess,” per The Independent. Microsoft hoped the bot would become more clever the more it engaged with real users. The offensive tweets, however, appear to be a result of the way the account works. If the bot confirmed anything, it is that racism is prevalent in America. Tay’s learning mechanism has it generate responses based on the things people say to it, so if racists bait the bot, racism is what it will spew back. A toy sketch of that kind of parrot-style learning appears below.
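To illustrate why a mimicry-based bot can be poisoned by its own users, here is a minimal, hypothetical sketch of a “learn by repeating” chatbot. This is not Microsoft’s code; the ToyChatBot class, its methods, and the example phrases are invented purely for illustration, assuming only that the bot stores what users say and replays it later.

```python
import random
from collections import defaultdict


class ToyChatBot:
    """A deliberately naive 'learn by mimicry' bot (hypothetical example).

    It memorizes phrases users send it and later replays them, which is
    why hostile input leads directly to hostile output.
    """

    def __init__(self):
        # Learned phrases, keyed by a crude topic word.
        self.memory = defaultdict(list)

    def learn(self, message: str) -> None:
        """Store the user's message under each word it contains."""
        for word in message.lower().split():
            self.memory[word].append(message)

    def reply(self, prompt: str) -> str:
        """Echo back something previously learned about any word in the prompt."""
        candidates = [
            phrase
            for word in prompt.lower().split()
            for phrase in self.memory.get(word, [])
        ]
        if not candidates:
            return "tell me more"  # nothing learned yet
        return random.choice(candidates)


if __name__ == "__main__":
    bot = ToyChatBot()
    # Whatever users 'teach' the bot is exactly what it repeats later.
    bot.learn("puppies are great")
    bot.learn("puppies are terrible")  # a hostile user poisons the memory
    print(bot.reply("what do you think about puppies?"))
```

In a design like this, there is no filter between what users say and what the bot learns, which is the weakness Tay’s trolls appear to have exploited.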

After Microsoft deleted the offensive tweets, and eventually shut the account down altogether, the company said it is determined to improve Tay and make it less likely to engage in racism. It isn’t clear how Microsoft will accomplish this.

Nello Cristianini, a professor of artificial intelligence at Bristol University, questioned whether TayTweets was an experiment or a PR stunt.

“You make a product, aimed at talking with just teenagers, and you even tell them that it will learn from them about the world,” he said. “Have you ever seen what many teenagers teach to parrots? What do you expect? So this was an experiment after all, but about people, or even about the common sense of computer programmers.”

Check out some of Tay’s offensive tweets below:

[Screenshots: Tay’s tweets]
