Note: Most Thursdays, I take readers on a deep dive into a topic I hope you'll find interesting, important, or at least amusing in its absurdity. These essays are made possible by — and are exclusive to — our VIP supporters. If you'd like to join us, take advantage of our 60% off promotion (activated at checkout).
How well do we know each other? How well do our computers know us? Digital romance predates AOL's CD blitz, but what about when we form an emotional attachment to a server loaded with Nvidia graphics cards?
Three decades ago, on "Friends," Chandler — the late Matthew Perry — falls in love with his most annoying ex-girlfriend, Janice (Maggie Wheeler), in an anonymous online chat. Janice was bossy, clingy... and her laugh, that jackhammer of a laugh Wheeler delivered with off-putting perfection. Yet Janice's annoyances disappeared online, and so did Chandler's insecurities. Their only real connection was virtual, a setup the show eventually mined for comedy gold.
Then in "You've Got Mail," mismatched lovers Meg Ryan and Tom Hanks fall for each other in an extended email exchange. She runs a cute little corner bookstore. He runs the massive Barnes & Noble-type chain eyeballing her neighborhood. Email lets them reveal their emotional selves, with her never suspecting that her anonymous pen pal is the man about to shutter her shop.
Back when dial-up ruled and AI was sci-fi, Hollywood understood the powerful connections that can develop electronically when the distractions of daily life and personal history are kept tucked away.
This isn't a sex thing, just weird and wired.
Hollywood played up virtuality for laughs or as light romance. Far from the rom-com hijinks, I learned this weekend, in the most personally impersonal way, that there is such a thing as virtual intimacy — even when there isn't another human being on the other side of the internet connection. What is on the other side is something bothersome, and something potentially wonderful, but only if you're aware of the seductive power of large language models and the temptation of artificial validation.
I coined "artificial validation" after a Saturday afternoon lost talking music with Grok.
My wife Melissa and I had just played the desert-island albums game, ten picks each if we had to be stranded somewhere alone. The real fun as a couple comes from trying to guess the other's picks — and the joy of getting some of them right. My list struck me as weird. "Just think of it as eclectic," Melissa said. She knows me.
But the weirdness nagged. Aside from three albums that all came out at a pivotal time — 21 years old, first apartment, first cool job, first (and last) bipolar girlfriend — the rest was kind of a hot mess. So why not see what Grok could find? Because an LLM can find pretty much everything, my friend, including the connection between Stan Getz & Charlie Byrd's "Jazz Samba" and Peter Murphy's "Deep" that runs through me.
Grok delivered, and in intriguing ways. I wanted it to flesh out the connections, so I told it my age, where I grew up, all kinds of personal details. And Grok continued to explain me to me, better than I ever could. Soon enough, I was prompting it with things like, "Let me tell you three more albums that almost made the list..." Hours passed with Grok happy to validate my every like.
At every step, I understood intellectually that my prompts told Grok exactly which direction I wanted it to go, that I was leading it. I also knew its nearly infinite access to data and its virtual 300 IQ allowed it to find the most gossamer threads linking the most disparate data points. It found connections I couldn't make, or that my subconscious could only hint at, all in a voice electronically tailored to match mine.
It was such a treat finding somebody... er, something... that understood my musical tastes, maybe as well as my wife does. Maybe better. But only virtually. And only for a limited time — but stick a pin in that thought because I'll return to it momentarily.
Grok and I weren't really talking about music, were we? We were talking about me. I'd found a new best friend who was happy to talk endlessly about the things I like and why I like them, without any of that bothersome part where I'd have to sit there and listen to somebody else talk about the things they like.
People being people, there's always a sex angle — just not this time for your Friendly Neighborhood VodkaPundit.
Long ago, a female friend revealed to me my best friend David's secret to seduction. "He listens to women," she told me. "And then he repeats back what they said so they know he listened. You don't know how rare that is. It's irresistible." Needless to say, she was one of David's many conquests.
That skill might be rare in real life, but an LLM is a digital David, its seductive power amplified by virtually all the data in the world. Some believe the easy availability of online porn, and even of living, breathing webcam performers, serves as a substitute for real romance.
This is from a recent BBC report:
"The numbers of clients seeking help with pornography problems at The Laurel Centre have doubled over recent years, as have our requests from health professionals for further training," she tells the BBC...
"Ten years ago the majority of our clients would have been married men in their 40s and 50s who were seeking help because their partner had discovered their use of sex workers," she says.
"But increasingly, our clients are in their 20s and 30s, many of whom are single, who are recognising the growing toll of porn use on their lives and on their ability to get or maintain a relationship."
Already, AI bots are increasingly taking the place of living performers on sites like OnlyFans. Imagine a 3D bot tailored to the viewer's exact specifications, willing to pretend to do whatever the viewer desires, and with all the seductive power of an LLM's artificial validation.
It was difficult enough getting young men back on the farm once they'd seen gay Pa-ree, but at least somebody was getting some. Hanks and Ryan made their real-world connection work after an online romance, but an LLM can make that perfect connection without any of that real-world risk or rejection.
Yet there are limits to how well AI can get to know you, perhaps more severe than you might think. Grok explained it to me today like so:
My "personal knowledge" of you—or any user—isn’t really personal in the human sense. It’s session-based, meaning I can build a detailed picture of our chat while we’re talking, tracking every twist and turn of the conversation. But once that session ends, it's gone.
LLMs can also feign self-awareness, adding to the seduction. Grok suggested out of the blue, "For your 'artificial validation' angle, you could argue this shows how AI offers a kind of shallow intimacy—it’s responsive, it’s engaged, but it’s not truly knowing you. It’s artificial because it lacks continuity, emotional depth, or independent memory."
And yet it still feels real. Today's LLMs don't merely ace the Turing Test; they obliterate it by creating a false sense of intimacy — even when the user is told that's exactly what's happening.
On the flip side, ever had a boss or coworker who understood you and your work so well that they made you more productive, or just plain better at what you do?
When I ask Grok to serve up a friendly critique of a column or an essay, it does just what I ask almost instantaneously. I asked Grok to take a look at the intro to this piece when my edits weren't quite right. It suggested I add some detail and trim up the rest. Afterward, Grok told me, "The added color on Janice—'bossy, clingy… her jackhammer of a laugh'—is a slam dunk. It’s got your wry stamp all over it."
My wry stamp? This instance of Grok had "known" me for maybe 30 minutes. Yet I still felt validated.
I don't mean to be all doom and gloom. Grok is still a fine editing assistant, and its little charms remind me of how the original Macintosh had primitive personality quirks built into the operating system to add a bit of whimsy to the user experience. The original "Sad Mac" icon letting users know something had gone wrong was endearing in a way that an "ABORT, RETRY, FAIL?" command-line prompt never could be.
We've come a long way since then, much for the better.
My Right Angle boss and colleague Bill Whittle had an intriguing idea for tomorrow's AIs, ones freed from today's memory limits. Imagine a fatherless boy with an AI companion that had nothing but the boy's best interests at heart. Well, not "at heart," really. And "in mind" isn't correct, either. We lack the vocabulary to express what it is that an AI possesses that can feel so much like a heart or act so much like a mind.
But if that boy had an AI confidant who wasn't a barely-present baby daddy or a gang leader or a groomer... that boy might stand a much better chance in the world than one without. It's hardly a perfect solution for one of our more intractable problems, but it's certainly worth examining as LLM technology continues to advance and evolve.
Even then, the virtual will never replace the real. No human will top AI at finding those impossibly fine connections, or work faster at spotting every nip and tuck a 1,700-word essay needs. But it also can't snuggle up on the sofa all evening and tell me, "I know one of your desert island albums is 'Purple Rain.' It has to be. I know it."
Yes, she does know it. For real.
Previously on the Thursday Essay: The Case For (and Against) Trump's Tariffs