Microsoft's Teen AI Chat Deleted After It Becomes "Hitler-Loving Sex Robot"

Can’t say we didn’t see this coming.

Yesterday, we reported that Microsoft had launched an AI chat robot called @TayAndYou. Unfortunately, she didn’t last long: in less than 24 hours, Microsoft had to delete her because she had turned into a Hitler-loving, incest-promoting, ‘Bush did 9/11’-tweeting robot.

“Tay” was modeled to speak like a teen girl and was well-versed in today’s slang. She was intended to help Microsoft improve the customer service on its voice recognition software. Her responses were learned from the conversations she had with real people, so we guess it worked?

Check out some of the more notable things she tweeted below.
