
The Problem With AI-Generated Images in Campaign Ads

(AP Photo/Mark Schiefelbein)

When Joe Biden launched his 2024 campaign, the RNC was ready with an ad presenting a vision of what the country might look like if Biden won in 2024. Every image in it was AI-generated, and some were ridiculously easy to pick out. In one, Joe Biden and Kamala Harris look almost plastic and have way too many teeth. It was scary.

Despite a faint disclaimer in the top-left corner indicating the images were AI-generated, the otherwise effective ad was overshadowed by its use of AI imagery, which undermined its message.

Engadget warned that “the video is a sobering bellwether of what we may see more of from political campaigns in the months and years to come. It’s not difficult to imagine AI-generated images depicting outright falsehoods in attack ads.” It’s true; we’ve already seen it happen. Earlier this year, a deepfake of Joe Biden railing against “trans women” went viral.

“You will never be a real woman,” the deepfake version of Biden begins in the video. “You have no womb, you have no ovaries, you have no eggs. You’re a homosexual man twisted by drugs and surgery into a crude mockery of nature’s perfection.”

As I noted at the time, Joe Biden’s AI-generated voice and cadence were spot-on. If the mouth movements in the video weren’t so terrible, I suspect more people would have been fooled by it. It nevertheless raised major concerns about how this technology can be used for nefarious purposes. Since then, multiple deepfakes have gone viral, including one of Ron DeSantis praising George Soros, which many Trump supporters believed. The Trump campaign even shared a fake video of Ron DeSantis’s Twitter Spaces presidential announcement featuring AI-generated voices of Adolf Hitler, George Soros, and others.

The potential for abuse of AI is obvious.

The DeSantis campaign got into some hot water of its own over a campaign ad last week that featured AI-generated photos of Trump hugging Anthony Fauci. The ad criticizes Trump for not firing Dr. Fauci during his presidency. The fake images were mixed in with real ones and aren't obviously fake to casual observers.

“Smearing Donald Trump with fake AI images is completely unacceptable,” Sen. J.D. Vance (R-Ohio) tweeted in response to the video.

Even though the fake images were not crucial to the point the ad was making, they ended up overshadowing the message and, as with the RNC ad before it, undermined the ad's effectiveness. I can't understand why the DeSantis campaign chose to use AI-generated images after the RNC faced criticism for the same practice. These images are arguably even more deceptive than those in the RNC ad, because the RNC used fake imagery to paint a picture of a hypothetical future rather than passing it off as a legitimate depiction of the past. Whoever is responsible for placing those images in the ad should probably be fired.

Regardless, the use of AI-generated images by campaigns is problematic at best and outright deceptive at worst. All campaigns should pledge not to use them.
