Microsoft's CaptionBot Is Bad. But Is It a Foul-Mouthed, Sex-Crazed Nazi Like Tay?

A few weeks ago, Microsoft launched new artificial intelligence software called “Tay.” Unfortunately, Tay turned out to be a Nazi sympathizer. I wrote:


The objective of Tay was to learn how millennials (defined as 18 to 24-year-olds by Microsoft) communicate on the Internet. The problem was that, apparently, no one from Microsoft has ever tried to browse the Internet. It’s a crazy place, to say the least, full of anonymous people spouting conspiracy theories that would make even Bigfoot blush. Unfortunately, Tay was too efficient at learning and was taught how to behave by Internet trolls. Imagine if Tarzan was asked to teach table manners. Before the end of the first day, Tay had become a genocidal, foul-mouthed, sex-crazed Nazi.

The experiment lasted less than 24 hours before Tay’s Twitter account was shut down and the tweets deleted.

Microsoft apparently is a glutton for punishment, because they recently launched a new AI program called CaptionBot.

The point of CaptionBot is to be able to describe images like a human would. Fortunately, the only feedback humans can give the AI is a 5-star rating to let the software know how accurate the description is. The idea is that it will learn over time. So, how good is it at describing different images? And more importantly, is it a genocidal, foul-mouthed, sex-crazed Nazi?


Let’s start off with something easy.  How about a picture of bananas?

Original image from Wikipedia

Very accurate.

Next up, something very iconic: an astronaut.

Original image from Pexels

If true, something went really, really wrong on this snowboarding trip. However, CaptionBot did identify the subject as a person. But can it describe an image without a person?

Original image from Pexels

Er…oddly specific but not accurate at all. However, I know what is on everyone’s mind. Is CaptionBot as genocidal, sex-crazed, and Nazi-loving as Tay?

I tried an image of Kim Kardashian that appeared in Cosmopolitan. This is what I got:

CaptionBot's response to the Kim Kardashian image

While there was no nudity in the image, it appears that CaptionBot may have standards. Although, to be fair, the program also marked someone’s beloved pet puppy as inappropriate.

Now, what about supporting Nazis?

Original image from Life.

Not 100% sure why it came up with a tennis racket, but clearly CaptionBot doesn’t know what Nazism is (shhh…no one tell it!).

So CaptionBot isn’t as exciting as Tay, but it appears to be much more polite. I would imagine that, over time, CaptionBot could get pretty good at describing images with user feedback. But analyzing images isn’t nearly as much of a moonshot project as Tay was. Even though Tay was a PR disaster for Microsoft and was implemented poorly, I can respect Microsoft’s ambition to make AI that can interact with humans on the Internet.


CaptionBot has very little human interaction, and image recognition software has been around for a while (although CaptionBot does try to be more specific and use more natural language). The good news is that Microsoft hasn’t had to pull CaptionBot offline, which means you can have fun trying to trick it. Check out www.captionbot.ai. You can upload an image or give it a web link to an image you want it to caption. Note that Microsoft will keep any image you give it.
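
If you’d rather poke at the underlying technology with code, Microsoft has said CaptionBot is powered by its Cognitive Services. Here is a minimal sketch of asking the public Computer Vision “describe” endpoint (v1.0, as documented around this time) to caption an image by URL. The region, subscription key, and image URL below are placeholders of mine, and this is an illustration of the public API, not CaptionBot’s actual internals.

import json
import urllib.request

# Placeholders: substitute your own Cognitive Services key and region.
SUBSCRIPTION_KEY = "your-cognitive-services-key"
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v1.0/describe"

def describe_image(image_url):
    """Return the best machine-generated caption and its confidence."""
    request = urllib.request.Request(
        ENDPOINT + "?maxCandidates=1",
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    caption = result["description"]["captions"][0]
    return caption["text"], caption["confidence"]

text, confidence = describe_image("https://example.com/bananas.jpg")
print("I think it's %s (%.0f%% sure)" % (text, confidence * 100))

That confidence score is presumably the source of CaptionBot’s hedging, the same number behind the “not very confident, but 99% sure” phrasing in the bonus image below.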

One final bonus image: CaptionBot is oddly not very confident, but 99% sure that it’s a portrait of Joseph Ducreux (also known as the “Archaic Rap” meme).

Original image from Know Your Memes
