Naughty chatting robot

(“FUCK MY ROBOT PUSSY DADDY I’M SUCH A BAD NAUGHTY ROBOT” was perhaps her most widely reported quote.) Needless to say, this wasn’t part of Tay’s original design. Rather, a gaggle of malicious Twitter users exploited that design, which has Tay repeat and learn from whatever users tell her, to add this language to her suite of word choices.

R.U.R. tells what is, by now, a familiar story: humans create robots to take over all mundane labor, which works fine until these slave automata develop sapience, at which point they revolt and destroy the human race. The play, by definition the first work about robots, set the pattern for a century’s worth of cliches about the Robot Uprising, from silent cinema to HAL 9000 to synthy ’80s pop.

As Laurie Penny explained in a recent article, the popularity of feminine-gendered AI makes sense in a world where women still aren’t seen as fully human. But these machines also reflect the rise of the service economy, which relies on emotional labor that’s largely performed by women, with a “customer is always right” ethos imposed upon the whole affair.

It’s all part of a cultural climate where pilots call the feminine voice of their automated cockpit warnings “Bitching Betty,” and addressing sexualized queries to Siri or Microsoft’s Cortana is practically a way of life for some.

It all makes Tay’s brief life, and eventual fate, more comprehensible.


Tay was nothing approaching a true artificial intelligence. She was just a sophisticated Twitter chatbot with good branding and a capacity to learn. But that branding, which positioned her as an “artificial intelligence,” was enough to make Tay susceptible to our cultural narrative about the thinking machine.

For his part, Corporate Vice President of Microsoft Research Peter Lee said that Tay met with a different set of challenges, and that the company is “deeply sorry” for the bot’s offensive tweets. “We’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” he wrote. On Wednesday, a big feature on Bloomberg focused partly on Tay and Microsoft’s efforts in artificial intelligence research.
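How a repeat-and-learn design gets poisoned is easy to illustrate. Below is a minimal sketch, assuming a toy bot that adds every incoming phrase to its reply pool and samples replies from that pool at random; the class name and behavior are illustrative assumptions, not Tay’s actual implementation.

```python
import random


class EchoLearner:
    """Toy chatbot that 'learns' by adding whatever users say to its reply pool.

    Illustrative sketch only -- not Tay's actual design.
    """

    def __init__(self, seed_phrases):
        # Start from a small, curated vocabulary of replies.
        self.replies = list(seed_phrases)

    def handle(self, message: str) -> str:
        # Every incoming message is added verbatim to the reply pool...
        self.replies.append(message)
        # ...and a reply is drawn at random from everything learned so far.
        return random.choice(self.replies)


bot = EchoLearner(["hello!", "tell me more", "that's interesting"])

# A few ordinary users chat with the bot.
for msg in ["hi there", "how are you?", "what's new?"]:
    bot.handle(msg)

# A coordinated group floods it with abusive text; with no filter on what
# gets learned, the poisoned phrases quickly dominate the reply pool.
for _ in range(50):
    bot.handle("<offensive phrase>")

poisoned = sum(reply == "<offensive phrase>" for reply in bot.replies)
print(f"{poisoned} of {len(bot.replies)} candidate replies are now poisoned")
```

With no filtering or weighting of what gets learned, a small coordinated group can swamp the bot’s vocabulary, which is essentially the kind of “malicious intent” Lee’s statement refers to.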
