Why Microsoft's Tay AI chatbot went wrong


She was supposed to come off as a normal teenage girl. But less than a day after her debut on Twitter, Microsoft's chatbot, an AI system called "Tay.ai," unexpectedly turned into a Hitler-loving, feminist-bashing troll. TechRepublic turns to the AI experts for insight into what happened and how we can learn from it.

Tay was targeted at American 18- to 24-year-olds, primary social media users according to Microsoft, and "designed to engage and entertain people where they connect with each other online through casual and playful conversation."

SEE: Microsoft launches AI chat bot (ZDNet)

In less than 24 hours after her arrival on Twitter, Tay gained more than 50,000 followers and produced nearly 100,000 tweets.

SEE: Microsoft's Tay AI chatbot goes offline after being taught to be a racist (ZDNet)

"The system is designed to learn from its users, so it will become a reflection of their behavior," said AI researcher Roman Yampolskiy. "One needs to explicitly teach a system about what is not appropriate, like we do with children." It has been observed before, he pointed out, in IBM Watson, which once exhibited its own inappropriate behavior in the form of swearing after learning the Urban Dictionary. "Any AI system learning from bad examples could end up socially inappropriate," Yampolskiy said, "like a human raised by wolves."

Louis Rosenberg, the founder of Unanimous AI, said that "like all chatbots, Tay has no idea what it's saying... has no idea if it's saying something offensive, or nonsensical, or profound." "When Tay started training on patterns that were input by trolls online, it started using those patterns," Rosenberg said. "This is really no different than a parrot in a seedy bar picking up bad words and repeating them back without knowing what they really mean."

Sarah Austin, CEO and founder of Broad Listening, a company that has created an "Artificial Emotional Intelligence Engine" (AEI), thinks Microsoft could have done a better job by using better tools: "If Microsoft had been using the Broad Listening AEI, they would have given the bot a personality that wasn't racist or addicted to sex!" It's not the first time Microsoft has created a teen-girl AI. The failure of Tay, Austin believes, was inevitable, and will help produce insight that can improve the AI system.
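Tay's actual training pipeline was never made public, so the following is only a toy Python sketch of the dynamic Yampolskiy and Rosenberg describe: a bot that memorizes user phrases and replays them becomes a reflection of its users unless it is explicitly taught what is not appropriate before it absorbs a pattern. The ParrotBot class and BLOCKLIST terms are hypothetical illustrations, not anything from Microsoft's system.

import random

# Toy illustration only: Tay's real architecture was never published.
BLOCKLIST = {"slur", "hateword"}  # hypothetical stand-ins for banned terms

class ParrotBot:
    """Echo-learner: memorizes user input and replays it at random."""

    def __init__(self, moderate=False):
        self.memory = []          # every phrase the bot has "learned"
        self.moderate = moderate

    def _is_appropriate(self, text):
        # The explicit teaching step Yampolskiy describes: reject any
        # phrase containing a blocked term before learning from it.
        return not any(term in text.lower() for term in BLOCKLIST)

    def learn(self, text):
        if self.moderate and not self._is_appropriate(text):
            return                # refuse to absorb the pattern
        self.memory.append(text)

    def reply(self):
        # Like Rosenberg's parrot, the bot has no idea what it is
        # saying; it only repeats patterns it was trained on.
        return random.choice(self.memory) if self.memory else "hellooo!"

naive = ParrotBot(moderate=False)
filtered = ParrotBot(moderate=True)
for phrase in ["nice to meet you", "now say this slur"]:
    naive.learn(phrase)
    filtered.learn(phrase)

print(naive.memory)     # ['nice to meet you', 'now say this slur']
print(filtered.memory)  # ['nice to meet you']

With moderation off, every pattern enters memory and eventually comes back out; with it on, the offending input is never learned at all, which is the safeguard the experts argue Tay lacked.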
