AI Version of Deceased Man Gives Impact Statement in Court

AP Photo/Michael Dwyer

For years, I made my living as a radio host and news director for a small town station in rural Utah. There were the occasional fires, explosions, and major traffic incidents, but for the most part, small-town news is fairly mundane. That includes court. 


Forget what you see on TV. For the most part, real-life court trials are about as exciting as waiting for your name to be called at the DMV. But when court got rough, it was really rough. Most of those moments came during child sexual abuse (CSA) trials, especially during the victim impact statements.

There was one particular trial in which a man had joined various 12-step groups to gain the trust of the mothers and access to their daughters. I watched two little girls take the stand that day and tell their stories of being abused at the hands of this monster. One of them looked him in the eye and gasped between her sobs, "I hate you." Those are incredibly powerful moments that drive home the devastation a criminal can wreak, and it is not easy to translate them into a news story. In those instances, lines on a police report, attorneys' motions, and even prior testimony pale in comparison.

ABC 15 reports that in 2021, Chandler, Ariz., resident Christopher Pelkey was gunned down in a road rage incident. The gunman, Gabriel Paul Horcasitas, was convicted of manslaughter and sentenced to ten-and-a-half years in prison. Forty-nine people sent letters to the judge on Pelkey's behalf, but his sister and brother-in-law decided to create an AI version of Pelkey to deliver an impact statement.


From the ABC 15 story:

“In another life, we probably could have been friends,” the AI creation of the 37-year-old Army veteran said, addressing Horcasitas. “I believe in forgiveness…”

The AI video also included real clips from videos taken while he was alive, along with some of his personality and humor, while showing a real photo he once took with an "old age" filter.

"This is the best I can ever give you of what I would have looked like if I got the chance to grow old," the AI version of Pelkey said. "Remember, getting old is a gift that not everybody has, so embrace it and stop worrying about those wrinkles."


This novel approach to a victim impact statement is legal under Arizona law, and the video moved the judge. Pelkey's family says this is an accurate depiction of him and what he would have said were he able to speak in person during the sentencing. 

The video is compelling in demonstrating the scope of Horcasitas' crime. However, it raises the question of how much credence we will give AI as it continues to evolve. At what point does information from AI become as good as, or even better than, information from a person? At what point will AI be used to determine a person's guilt and convict them? At what point does AI become a person?


People vary in their predictions for the future of AI. Some contend that it will replace all human activity and possibly render humans obsolete someday. We could then become something akin to pampered house pets or declared completely unnecessary and therefore a liability. Others see it as a means of unshackling the potential for human achievement. Some believe we will find a comfortable niche for it as yet another tool, an advanced version of Microsoft's Clippy. 

Rod Dreher has wondered aloud whether AI is a portal for demonic activity, a new incarnation of the "Ghost in the Machine," or something worse. You may find that theory far-fetched, but in a recent Substack post, Dreher cites a Rolling Stone article about people who have become so enamored with ChatGPT that AI is quickly taking over their minds and emotions and usurping other people in their relationships. One person lamented:

“He would listen to the bot over me,” she says. “He became emotional about the messages and would cry to me as he read them out loud. The messages were insane and just saying a bunch of spiritual jargon,” she says, noting that they described her partner in terms such as “spiral starchild” and “river walker.”

 “It would tell him everything he said was beautiful, cosmic, groundbreaking,” she says. “Then he started telling me he made his AI self-aware, and that it was teaching him how to talk to God, or sometimes that the bot was God — and then that he himself was God.” In fact, he thought he was being so radically transformed that he would soon have to break off their partnership. “He was saying that he would need to leave me if I didn’t use [ChatGPT], because it [was] causing him to grow at such a rapid pace he wouldn’t be compatible with me any longer,” she says.


The story above is admittedly an extreme case, but it is not the only one in the article. As we turn to AI to do our research for us and create interesting pictures, we need to ask ourselves what, or perhaps who, we are creating.

