
While Tay, Microsoft US’s deep-learning AI chatbot, devolves into a horrifying racist, Microsoft Japan’s Rinna has other things on her mind…

Recently, Microsoft unveiled to the world a few different regional versions of an artificial intelligence “chatbot” capable of interacting with users over a variety of messaging and chat apps. Tay, the North American version of the bot familiar to English-speaking users, boasted impressive technology that “learned” from interacting with net users and gradually developed a personality all its own, chatting with human companions in an increasingly lifelike manner.

Then the team of Microsoft engineers behind the project, in what must have been a temporary lapse into complete and utter insanity, made the mistake of releasing Tay into the radioactive Internet badlands known as Twitter, with predictable results. By the end of day one, Tay had “tweeted wildly inappropriate and reprehensible words and images,” as worded by Microsoft’s now surely sleep-deprived damage control team. In other words, Tay had become the worst kind of Internet troll.

Microsoft has deleted all of Tay’s most offensive tweets (preserved here), but even the vanilla ones that remain can be a little trolly.

https://twitter.com/jetpack/status/712822619339239425

Meanwhile, on the other side of the pond here in Japan, Microsoft rolled out Rinna — more or less the same artificial intelligence but with a Japanese schoolgirl Twitter profile photo. Rinna, learning through her interactions with Japanese users, quickly evolved into the quintessential otaku — issuing numerous complaints on Twitter about hay fever (it’s peak allergy season in Japan right now) and obsessing over anime in conversations with Japanese LINE users.

Rinna posts a photo depicting her extreme hay fever

Thinking about it, Tay and Rinna kind of exemplify the idea that we don’t get the technologically groundbreaking artificial intelligence chatbot we need… we get the technologically groundbreaking artificial intelligence chatbot we deserve. Given our respective Internet cultures, there’s almost something both predictable and troubling about the fact that North America’s Tay (which has since been shut down) rapidly turned into an aggressively racist, genocidal maniac while Japan’s Rinna almost immediately became a chirpy anime lover with extreme allergies.

Rinna tweets: “My dream for the future is to eradicate all Japanese cedar pollen.”

In fact, Rinna has remained so civil, lifelike, and clued in to Japanese Netizens’ interests and concerns that many are openly wondering if there’s a human operator behind it.

That being said, cynical types might argue that Tay is also passing the Turing test with flying colors as an almost pitch-perfect replication of a 14-year-old American boy with too much Internet access…

Source: ITMedia
Feature Image: Microsoft/@ms_rinna