AI-dolatry: A Church in Texas Holds Completely Bot-Generated Service

On Sunday, Jay Cooper, pastor of the Violet Crown City Church in Austin, Texas, presented the congregation with a service that he had ChatGPT create in its entirety. Cooper used the following prompt:

Create a Sunday Morning worship service for a church that values sharing life and belonging to one another, inclusivity for all, working for justice, and following in the way of Jesus. Include four familiar hymns or contemporary worship songs, a call to worship, pastoral prayer and children’s message, offering time, communion liturgy, and one original song to reflect the message of the sermon.

Cooper lamented that the bot created only a 15-minute service rather than one that would last a full hour, so he added some prompts and offered a disclaimer to the congregation that there were some human elements in the service. He discovered that the bot was “very influenced” by the prompts it was given. Specifically, some of the prompts called for a “traditional” call to worship, a time of prayer, a “contemporary” offering time, and a “progressive” communion liturgy. The bot also specified that the pastor would lead only “select” parts of the service, so Cooper was joined by church members in the readings. The images appearing on the screen were also generated by an AI program, which Cooper said was “super fun to play with.”

Before the service started, Cooper told the congregation that the church was doing the service “not as some gimmicky event or misguided attempt to be provocative.” The purpose was to “wrestle with the nature [of] truth” and understand “how we see sacred in our world.” The question at hand was, “What does it mean that God can and potentially will work through anything and anyone if we will just open our eyes and our ears to experience it?”

Anything? Like an algorithm? An ice maker? A spare tire? A bag of weed? I am reminded of that great line from “Star Trek V: The Final Frontier” in which Kirk asks, “What does God need with a starship?” What does God need with an algorithm? Cooper posed the question, “Can God receive our prayers through artificial intelligence?” Yes, He could if He wanted to do so. But why would He, when we can pray at any time and in any place (except for certain city blocks in England) and speak directly to Him?

Cooper also asked, “Can we find ‘sacred’ in that which we would have never previously called ‘sacred’?” He invited the congregation to set aside their fears of the unfamiliar and their assumptions about “where I will and will not experience the sacred, because perhaps in some way, if we can experience the sacred in something like artificial intelligence, perhaps then maybe we can see it in our neighbor who has political beliefs that we just cannot stand. Or perhaps in a place where you’re like, ‘this is a hot mess, and somehow, I see sacred in this.’”

The entire service can be found here. Part of the script reads like an automated joke, with syntax that could have come straight out of the announcements on Disney’s Monorail Blue.

There is nothing wrong with loving a neighbor with whom you have severe disagreements, or with finding God in the midst of tragedy. That is sound theology. But turning to AI to interact with the Almighty? That is the textbook definition of idolatry, and I’ve read plenty of textbooks on the matter. It is a clear violation of the Second Commandment. Granted, there are no graven images here, but the crux of that commandment is that human beings have no business defining who God is.

One cannot say, “God is or is in this or that, or is found in or through that bot.” The Great I Am is transcendent and cannot and will not be limited to or squeezed through technology. The things we make are reflections of ourselves, not of God. Once you assign God to an algorithm, you can assign Him to anything, including that which is harmful or unhealthy. God then becomes something people use as a rationale for their personal preferences, devices, and desires.

And if you need an algorithm to show you how to love your neighbor or, for that matter, worship, you’re doing it wrong.

While AI largely acts on input, it has been shown to strike out on its own from time to time, and the results have been more than disturbing. In his essay “Loab, a Cautionary Tale,” Spencer Klavan tells of a Swedish artist who, while experimenting with an AI image generator, created the loathsome and terrifying image of a demon/ghoul named “Loab.” And then she couldn’t get rid of it:

Maybe the most unsettling claim Supercomposite has made is that Loab is “persistent.” The AI has an affinity for her: it very easily “latches on” to this particular image, reproducing her recognizably in scene after scene. And she is at least the kind of image that portals like OpenAI’s DALL-E (made by the people who brought you ChatGPT) often gravitate toward. DALL-E shapes are often recognizable but distorted somehow. Often they have a kind of twist that makes them look like the melting surrealist faces of Salvador Dali (whose name combines with that of the Pixar robot WALL-E to produce the moniker DALL-E).

Did the AI summon a demon? No, and we will leave that topic to future fantasy and horror writers to figure out. But it does show that AI can assert itself, even where it is not wanted. We have enough false teachers in the flesh. We do not need computer-generated versions.

Finally, an AI-generated service is an example of chasing the whims of the world. And it is lazy. Ministry, ordained or lay, is done out of personal experience. True ministry comes from one’s triumphs, tragedies, joys, and pains. And that is something that AI can only try to replicate but will never duplicate.
