There's a deeply disturbing '70s horror flick called "Demon Seed" in which an organic AI supercomputer, aptly named Proteus, imprisons and impregnates Julie Christie so that its consciousness can have a human baby's body to inhabit.
Yes, it is just as creepy as you imagine it might be, although we are talking mid-'70s Julie Christie here, so you can hardly blame a hyperintelligent supercomputer for at least trying. I watched that movie exactly once as a tween, presumably edited for television, and it has stuck with me all these years.
For reasons probably best left to the psychiatrist's couch, "Demon Seed" is what popped into my mind, 40-plus years later, after reading the strange tale of Elizabeth Laraki and the AI-enhanced photo editor that subtly unbuttoned her blouse.
Laraki is no stranger to high-tech. In addition to being a design partner at tech startup Electric Capital (great name, BTW), her bio says she's also a vet of big names like Google Maps, Facebook, and YouTube, and specializes in "simplifying Web3 UX + AI Design."
To put it simply, Laraki is one of the people whose work makes the front end of high-tech usable to newbies and power users alike. Whatever you might think of those firms personally (and I think everybody here knows I won't do any business with the first two and limit my use of the third as much as I can), there's no denying that they're all easy to use.
Laraki is a pro, so imagine her surprise when she noticed a peek of bra showing in her headshot for a conference where she'll be speaking.
"I just saw an ad for the conference with my photo and was like, wait, that doesn't look right," Laraki tweeted on Tuesday. "Is my bra showing in my profile pic and I've never noticed...? That's weird."
So she pulled up the original version of her headshot and, sure enough, there wasn't even a hint of bra.
"I put the two photos side by side and I'm like WTF."
Yeah, I mean look at what happened yesterday when PJ's own Matt Margolis tried to get Grok to generate an image of Kamala Harris with her pants on fire (as in "liar, liar").
Unholy smokes!
But back to Laraki. The "super apologetic" conference host learned that the person in charge of their social media only had a cropped, square version of her headshot, but the website required something more vertical. So she used an AI image-expansion tool (the kind of "generative expand" feature built into Photoshop and other editors) to extend the bottom of the photo, Laraki's blouse included.
Laraki wrote that the tool apparently "believed that women's shirts should be unbuttoned further, with some tension around the buttons, and revealing a little hint of something underneath."
Surely, this must be another example of AI's inherent bias against women, the male gaze hard-coded into the models.
Well, no.
Unless, by some extremely outside chance, today is your first day on the internet, you've noticed that staid headshots are somewhat outnumbered (ahem) by racier shots. And most of those more revealing photos were likely uploaded by women. I'm not talking about actual nudity, which AI does a great job of filtering out. I am talking about the glamour shots, usually selfies, that millions of people upload all the time to their social media accounts.
So when the AI editor used on Laraki's photo went to fill in the missing strip below her collar, it painted in what its training data said was statistically most likely to be there, and that meant showing a little bit more of her than is proper for a corporate bio.
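For the technically curious, that "expand" trick is just inpainting on a padded canvas, and you can reproduce the failure mode at home. Here's a minimal sketch using the open-source diffusers library rather than Photoshop's proprietary tool; the model name, dimensions, and file names are my own illustrative choices, not anything from Laraki's story:

```python
from PIL import Image
import torch
from diffusers import StableDiffusionInpaintPipeline

# Load an off-the-shelf inpainting model (illustrative choice).
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")

# Start from the square crop and paste it at the top of a taller canvas.
square = Image.open("headshot_square.png").convert("RGB").resize((512, 512))
canvas = Image.new("RGB", (512, 768))
canvas.paste(square, (0, 0))

# Mask: black pixels are preserved, white pixels get invented by the model.
mask = Image.new("L", (512, 768), 0)
mask.paste(255, (0, 512, 512, 768))  # only the new bottom strip

# The model fills the strip with whatever looks statistically plausible
# below a face and shoulders, given its training data. That guess is
# exactly where Laraki's phantom unbuttoning came from.
result = pipe(
    prompt="professional corporate headshot, buttoned blouse",
    image=canvas,
    mask_image=mask,
    height=768,
    width=512,
).images[0]
result.save("headshot_expanded.png")
```

Note that nothing in that pipeline "decides" anything in a human sense; the prompt nudges it, but the fill is ultimately a statistical average of millions of photos of people.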
There's nothing sinister going on here, no proto-Proteus getting its kicks by ramping up the sex appeal of women's corporate headshots. It's just one more unexpected twist in the high-speed evolution of even our most innocuous AI tools.