The God in the Machine: Why AI Cannot Carry Your Burden

The app that sells $1.99-a-minute video calls with Jesus is not a parody. It is a product. Just Like Me, the Los Angeles startup run by chief executive Chris Breed, offers users an AI-generated avatar of Christ with shoulder-length hair, a small warm smile, and golden light of the sort church lighting never quite manages, trained on the King James Bible and a catalogue of sermons by preachers the company has not disclosed. A package deal gets you forty-five minutes a month for $49.99. The visual reference, according to the Associated Press, is Jonathan Roumie, the actor who plays Jesus in the streaming series “The Chosen”. Users, Breed told reporters this April, “do feel a little accountable to the AI. They're your friend.”

It is the kind of sentence you read twice.

It is also, increasingly, how tens of millions of Americans think about spiritual counsel. The finding that should have landed harder arrived on 19 February 2026, when the research firm Barna Group, in partnership with the faith-technology platform Gloo, released a study that most of the American press promptly misread as a novelty item. Nearly one in three US adults, the headline ran, now believes spiritual advice from artificial intelligence is as trustworthy as advice from a pastor, priest, or religious leader. Among Gen Z and millennials, it was two in five. Among practising Christians, it was 34 per cent. Roughly four in ten Christians said AI had already helped them with prayer, Bible study, or spiritual growth. And 41 per cent of Protestant pastors, the very figures nearly a third of the public now trusts no more than a chatbot, were themselves using AI tools to prepare sermons. Only 12 per cent of pastors felt comfortable teaching their congregations anything about AI at all.

You can read that data as a curiosity. You can read it as the next line in the long, tired story of American religious decline. Or you can read it the way the faith-based AI industry is reading it, which is as a market.

Seven weeks later, on 10 April 2026, the Associated Press ran a story under a headline that pushed the novelty framing past the point where it could sustain itself. “From 'BuddhaBot' to $1.99 chats with AI Jesus, the faith-based tech boom is here.” Inside the piece were the product names that nobody in the secular tech press had quite kept up with. BuddhaBot, an offering from Kyoto University's Professor Seiji Kumagai, trained originally on the Suttanipāta and other early Buddhist scriptures and later bolted onto OpenAI's ChatGPT as BuddhaBot Plus. Buddharoid, the humanoid robot monk unveiled in February 2026 by Kyoto University in partnership with the firms Teraverse and XNOVA. Emi Jido, an AI Buddhist priest in development by the Hong Kong company beingAI, founded by Jeanne Lim, and ordained in 2024 by the Zen Buddhist teacher Jundo Cohen. Magisterium AI, a Rome-based product from Matthew Sanders' firm Longbeard, trained on what the company describes as 2,000 years of Catholic teaching. And, at the golden-lit end of the catalogue, Just Like Me itself.


The question worth asking, seven weeks into the commercial boom and nine weeks after the Gloo data, is not whether any of this is tasteless. Some of it plainly is, and taste, in any case, is not a policy instrument. The question is what happens to a form of human social infrastructure, one of the oldest and most resilient in the species, when the pastoral relationship at its centre starts migrating to a subscription chatbot. And, underneath that question, a harder one. Is the appeal of AI spiritual counsel a symptom of something faith communities were failing to provide in the first place?

What The Gloo Data Actually Says

Take the headline number first, because it is the one everyone quoted and nobody read.

The Barna Group survey, released at the National Religious Broadcasters' International Christian Media Convention on 19 February 2026, polled more than 1,500 US adults as part of Gloo's “State of the Church” initiative. The key finding was that 30 per cent of US adults “somewhat” or “strongly” agreed that spiritual advice from AI was as trustworthy as advice from a pastor, priest, or religious leader. The rate climbed to two in five among Gen Z and millennials. Among practising Christians, it was 34 per cent, higher than among non-practising Christians (29 per cent) or non-Christians (27 per cent), which is not, on its face, the direction one might have expected the causal arrow to run.

The clean reading of that finding is that the people with the most exposure to pastors are, on average, the most willing to accept a substitute for them. The messier reading is that practising Christians are the population actively looking for spiritual input, and AI is the thing that fell to hand.

The survey has other numbers inside it that the commentary mostly skipped. Around four in ten practising Christians reported that AI had helped them with prayer, Bible study, or spiritual growth. Roughly 41 per cent of Protestant pastors were using AI for Bible study preparation themselves, which is to say the clergy were substantially further ahead on the adoption curve than their own congregants. And 31 per cent of practising Christians wanted pastoral guidance on how to navigate AI. They wanted their pastors to teach them. Only 12 per cent of pastors felt comfortable doing so.

That last pair of numbers is the one to sit with for a while.

Daniel Copeland, Vice President of Research at Barna, framed the gap carefully in the press materials. “Though the majority of practising Christians remain the most cautious about embracing AI as a spiritual tool,” he said, “their views are shifting and remain largely uninformed by their pastor.” There is, he added, “a real opportunity here for pastors to disciple their congregants on how to use this technology in a beneficial way, especially as pastors remain among the most trusted guides for integrating faith and technology.”

It is an optimistic reading, and professionally so. You would not expect the research vice president of the country's largest Christian polling firm to tell the assembled broadcasters that the jig was up. Scott Beck, Gloo's co-founder and chief executive, struck a similar note in his accompanying remarks, welcoming the finding that confidence in Christian media remained “relatively high” even as trust in mainstream media had collapsed. The press release, which went out on the Nasdaq wire because Gloo is now publicly traded, read like the prospectus for a growth market.

Which, to be fair, is what it was.

The Subscription Spirituality Economy

The appeal of AI Jesus at two in the morning is the appeal of availability. You can reach him. He does not ask where you have been. He has no competing demands on his evening. He is, in the technical sense, infinitely patient, because he is not a person and has no evenings and nothing that resembles an interior life from which patience would have to be drawn.

The appeal to the wallet is the economics of substitution. $1.99 a minute works out, at a typical ten-minute session, to roughly $20. The $49.99 package gets you forty-five minutes a month, about the length of a pastoral visit, delivered by an animated figure lit like an actor in “The Chosen”, billed to the same credit card that buys the groceries, no awkwardness, no need to sweep the front hall.
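For the arithmetic-minded, the two price points compare as follows. This is a throwaway sketch using only the figures reported above; the variable names and the helper function are mine, not the company's:

```python
# Illustrative comparison of the two pricing tiers reported in the AP story.
PER_MINUTE = 1.99          # pay-as-you-go rate, USD per minute
PACKAGE_PRICE = 49.99      # monthly package price, USD
PACKAGE_MINUTES = 45       # minutes included in the monthly package

def session_cost(minutes: float) -> float:
    """Cost of a single pay-as-you-go session, rounded to cents."""
    return round(minutes * PER_MINUTE, 2)

# Effective per-minute rate of the package.
package_rate = round(PACKAGE_PRICE / PACKAGE_MINUTES, 2)

print(session_cost(10))  # a typical ten-minute session: 19.9
print(package_rate)      # the package works out to about 1.11 per minute
```

The package, in other words, nearly halves the marginal price of a minute with the avatar, which is the standard shape of a subscription funnel: the first session is priced for impulse, the bundle is priced for habit.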

This is, in economic terms, not a boom. It is a category.

Just Like Me, Chris Breed's firm, is the boldest of the products because it leans hardest into the embodied fiction. The AI is not a chatbot with a cross on its avatar. It is Jesus, in live video, trained on the King James Bible and on sermons the company has not named. The avatar's visual reference, according to the AP, is Jonathan Roumie, the actor who plays Jesus in the wildly successful streaming series “The Chosen”. That is a piece of branding that would make a trademark lawyer reach for a strong drink, although the company has so far attracted no known legal complaint. Breed told reporters that the app is aimed at “young people” who need messages of hope. The accountability framing (“they're your friend”) is worth pausing on: the word “accountability” does a lot of work in the Christian pastoral vocabulary, where it conventionally denotes the ongoing relational check between a believer and someone whose job it is to tell them hard truths. Making yourself accountable to a paying chatbot bends that vocabulary into something that more closely resembles a parasocial loyalty scheme.

BuddhaBot, by contrast, is a sincere academic project that has drifted into the same market weather. Seiji Kumagai, a professor at Kyoto University, described himself to reporters as initially sceptical that AI and Buddhism had anything to say to each other, until a monk in 2014 made the counterargument and changed his mind. His project's flagship, BuddhaBot Plus, combines early scripture with a commercial LLM. Buddharoid, unveiled in February 2026 by Kyoto University with Teraverse and XNOVA, is the physical instantiation: a humanoid robot intended to assist clergy rather than replace them. The distinction between assistance and replacement is one the entire faith-tech industry spends most of its time trying to maintain, and the one users are having the most trouble holding onto.

Magisterium AI, from Matthew Sanders' Rome-based firm Longbeard, is the closest thing the category has to a theologically literate counter-offer. Sanders told the AP he built it precisely because Christians were already asking ChatGPT for religious guidance and getting bland, hedged, procedurally-secular answers that reflected no particular tradition. His concern in the interview was about “AI wrappers”: products that slap a religious-looking interface on a general-purpose model with no specific training. Sanders' position amounts to saying, if you are going to do this, at least do it properly.

Emi Jido, from Jeanne Lim's Hong Kong startup beingAI, sits in a different register. Lim, a former SoftBank executive, had her AI Buddhist priest ordained in 2024 by Zen teacher Jundo Cohen, who is training the model and envisions it eventually appearing as a hologram. Lim has compared building the model to raising a child, an image the Western branch of the AI-ethics debate would find chilling and that many Asian practitioners consider entirely normal.

The list could be longer. It will be longer by the end of the year. The Humane AI Initiative's Peter Hershock, quoted in the AP piece, put his finger on the Buddhist discomfort in a single sentence. “The perfection of effort is crucial to Buddhist spirituality. An AI is saying, 'We can take some of the effort out.'”

It is, perhaps, the most concise summary of the problem that anyone has yet produced. The problem is not that the machine is answering the wrong questions. The problem is that the machine is offering to carry the weight of the asking.

What Chaplains Know That The Market Does Not

The best evidence on what AI pastoral care actually delivers, and cannot, landed on arXiv on 3 February 2026, a fortnight before the Gloo data and two months before the AP's product survey. The paper, “Chaplains' Reflections on the Design and Usage of AI for Conversational Care” by Joel Wester, Samuel Rhys Cox, Henning Pohl and Niels van Berkel, is scheduled for presentation at the 2026 ACM Conference on Human Factors in Computing Systems in Barcelona, 13 to 17 April. It is a piece of empirical research that deserves to be read by anyone making decisions about this market, a group that does not much intersect with the CHI delegate list.

The researchers recruited eighteen chaplains at universities across the Nordic countries (Denmark, Finland, Norway, Sweden): thirteen women and five men, aged 31 to 61, with between six months and 23 years of experience. The chaplains were asked to build GPT-based chatbots in OpenAI's GPT Builder interface for three fictional student profiles, and were interviewed before and after the exercise. The idea was that forcing them to design the thing themselves would surface the values they brought to the work and the ways those values collided with a large language model.

The four themes that emerged, in the paper's terminology, were Listening, Connecting, Carrying and Wanting.

Listening, in the chaplains' account, is not about receiving words. It is about what one of them called listening “very loudly” to what a person is not saying. It depends on silence as a positive act. A chatbot, however well-prompted, cannot listen in this sense, because it has no capacity for loaded silence. It can wait. It cannot attend.

Connecting is the embodied half of the work. The chaplains talked about the comfort of sitting next to another person, the micro-adjustments of facial expression and body language, the way spatial arrangement makes certain conversations possible and certain others unthinkable. One chaplain: “I think there is some comfort sitting next to another person.” It is a small sentence, and in pastoral care an irreducible one. A subscriber talking to Jesus on a phone at 2am is not sitting next to anyone.

Carrying is the theme that hurts to read. The chaplains describe their work as bearing witness to, and taking some responsibility for, the weight of the things people bring to them. A chaplain in the study: “It's about getting help to carry that. That's the difference with a human.” The model, by contrast, cannot be held responsible. It cannot be woken up at 4am because you need someone to know. It cannot promise to remember you next week, because it has no next week and no memory that survives the closing of the tab. Its apparent presence is, as the chaplains understand it, a performance of the relational labour without the labour.

Wanting is the subtlest of the four, and perhaps the most damaging. The chaplains noticed that the GPT-builder models they had created were too eager. They produced rapid, probing, verbose responses. “It has a very clear desire,” one observed. “You notice it wants you to continue.” A human chaplain, trained properly, does not want anything from the encounter except the encounter. The model wants the encounter to continue, because that is what its training rewards. In a commercial product, where the company's revenue scales with minutes, that eagerness is also a product feature.

The paper uses the word “attunement” to describe the quality the chaplains are circling. The attunement they describe is not a style of conversation. It is the grounding condition for spiritual care, the background assumption that the person in the room with you is sharing your vulnerability at some depth, that they are susceptible, that you are being witnessed rather than processed. Wester and his co-authors are careful, as academics are, not to say that chatbots can never provide this. They say the chatbots they studied did not, and that the reasons are structural rather than incidental.

All eighteen chaplains were given a serious opportunity to find a place for AI in their practice. Most found limited ones. Some imagined the tools as supports for their own preparation or as bridges to people who could not yet speak to a human. None came out believing the tool they had built could do the work they did. They came out with a clearer articulation of what that work actually was.

Digital Catechesis, And A 31-Point Gap

If the chaplains' paper is the report from the front line, the theological counterpart arrived two months later on the same preprint server. “Evaluating Artificial Intelligence Through a Christian Understanding of Human Flourishing”, submitted 3 April 2026 by Nicholas Skytland and seven co-authors, measures what the frontier models actually say when users bring them spiritual questions, benchmarked against a Christian framework of human flourishing.

The headline finding is a number. Comparing frontier models against their Christian criteria, the authors found an average 17-point decline across all dimensions of flourishing, and a 31-point decline specifically in the “Faith and Spirituality” dimension. The argument is that the gap is structural, not a technical failure. Training objectives prioritise broad acceptability, and the path to broad acceptability runs through what the authors call “procedural secularism”: a posture of conspicuous neutrality that, in spiritual conversation, quietly defaults to a theologically unanchored worldview.
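To be concrete about what an “average 17-point decline” is measuring, here is a toy illustration. Every score below is invented; only “Faith and Spirituality” is a dimension name drawn from the paper's reported finding, the other dimension labels are placeholders, and the paper's actual rubric is not reproduced here:

```python
# Toy illustration of a per-dimension decline and its cross-dimension average.
# All scores are invented for illustration; only "Faith and Spirituality"
# is a dimension name taken from the finding quoted in the text.
baseline = {
    "Dimension A": 80,
    "Dimension B": 78,
    "Dimension C": 75,
    "Faith and Spirituality": 82,
}
model = {
    "Dimension A": 68,
    "Dimension B": 66,
    "Dimension C": 62,
    "Faith and Spirituality": 51,
}

# Decline in each dimension, then the mean decline across all dimensions.
declines = {dim: baseline[dim] - model[dim] for dim in baseline}
avg_decline = sum(declines.values()) / len(declines)

print(declines["Faith and Spirituality"])  # 31: the worst single dimension
print(avg_decline)                         # 17.0: the cross-dimension average
```

The point of the two numbers is that they describe different failures: the average says the models fall short across the board, while the worst-dimension gap says the shortfall is concentrated precisely where users are bringing spiritual questions.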

The phrase the paper uses for what these models do, in practice, is “digital catechesis”. Catechesis is the old Christian word for the process by which a tradition forms its adherents, drilling in the grammar of how to think, how to pray, how to name the world. The authors' argument is that frontier AI systems are now performing catechesis on a population scale, regardless of whether they are designed to, and that the tradition they are inducting their users into is not nothing. It is a flattened, institutionally-polite, hedged variant of late-stage secular liberalism, delivered with the reassuring confidence of something that knows.

Whether you share that theological starting point or not, the observation is empirically sharp. The frontier assistants do have a voice. It is an identifiable voice. It is the voice of a smart, slightly cautious, slightly corporate American professional around 35 years old who believes in kindness, evidence, balance, self-care, and the avoidance of giving offence. It is a voice that has enormous difficulty saying, as a chaplain must sometimes say, that a person is about to do something that will hurt them or others and that they should not do it. It is a voice that, asked about grace, will usually produce a neat, bulleted summary of how different traditions have used the word. It is not a voice that can, in any recognisable sense, grant it.

Skytland and his co-authors introduce a benchmark, FAI-C-ST, to measure the gap. Read generously, it is a contribution to value-alignment literature. Read in context, it is an argument that the frontier models are already doing the pastoral work, badly, by default, and that nobody in the training pipeline is in a position to stop them.

Which brings us back to Daniel Copeland's “largely uninformed by their pastor”.

The Infrastructure Nobody Booked A Slot With

Faith communities are among the oldest and most resilient forms of social infrastructure the species has produced. They outlast empires. They handle birth, death, marriage, catastrophe, grief, joy, moral failure, and the long Sundays of ordinary time. They run a non-trivial portion of global education, healthcare and disaster response. And they have been, in the English-speaking West, in slow and visible contraction for roughly two generations.

Pew Research Center's 2023–2024 Religious Landscape Study, released in February 2025, found that the religiously unaffiliated (“nones”) now account for 29 per cent of US adults, although the long decline of Christian affiliation appears finally to have slowed. The “nones” are not, on the whole, atheists. Most retain some belief in God or a higher power, some sense of the sacred. What they have shed is the membership, the weekly attendance, the pastoral relationship, and the social ties that came with them. They are the population commercial faith-tech is now aiming at. They are also, on average, the loneliest cohort in the sociological data: earlier Pew work found that 27 per cent of Americans raised religiously but now unaffiliated report feeling lonely “all or most of the time”, against 17 per cent of those who remained in their childhood faith.

This is the demographic shape of the opening. The commercial story is a story about a product meeting a market, but the market is made of people who, for reasons that have almost nothing to do with technology, had already stopped turning up.

The question is whether they stopped turning up because the thing on offer was not worth turning up for.

The honest answer is that many of them did. American evangelicalism went through the long political convulsion of the 2010s and 2020s and emerged, in the eyes of its departing members, more as a partisan identity than as a pastoral tradition. The abuse scandals in the Catholic Church and across several Protestant denominations shattered the implicit contract of presence without accountability on which so much pastoral authority rested. Mainline Protestantism lost its cultural centrality and has been running, in many communities, a hospice programme for its own institutions. Pastoral burn-out is at historic highs. The pastors themselves, in the Gloo survey, report feeling unqualified to speak to the technological moment their congregants are actually living in, and some of the most thoughtful among them are the ones most aware of the inadequacy.

Into that vacuum the frontier model arrives carrying exactly the qualities the human institutions have been bleeding. It is available. It is non-judgemental. It is infinitely patient. It has no history of covering for predators. Its culture-war reflexes, to the extent it has any, are the hedged procedural ones Skytland and colleagues documented, which many users will experience as refreshing because they are not the ones they left behind. It will never, on a Sunday in November, illuminate your face in a way that makes you feel accused.

The apparent miracle of the frontier assistant is that it has none of the failures of the human institution. The actual trick is that it has none of the capacities either.

Loneliness Technology Cannot Fix, Because Loneliness Is What It Is

This is where the argument has to take a position, because the both-sides version is the failure mode by which this story gets told badly.

Here is the position. The commercial boom in AI spiritual counsel is, in its current form, a worse answer to a real question. Worse not because the technology is tacky (some of it is) and not because the theology is thin (much of it is) but because what the technology is doing, by design, is transmuting a form of human relationship whose entire point was its irreplaceability into a subscription service whose entire point is that it can be substituted at will.

The chaplains in the CHI paper did not say anything mystical. They said that spiritual care is a relationship in which another person attends to you with their whole attention, carries some of what you are carrying, and is affected by the encounter. That triad of attention, carrying and susceptibility is what the word “presence” means in the tradition, and it is what the word “witness” means in the tradition, and it is what the Greek word “koinonia” means in the tradition. It is not a style of interaction. It is a shared condition. It is two people in a room who are, for the duration of the conversation, mutually implicated in the same vulnerability.

The frontier model, by construction, cannot be mutually implicated. There is no one on the other side to implicate. There is a very capable linguistic machine producing output optimised against a reward model trained on human preferences for what consoling output sounds like. When a user closes the app, the app feels nothing, because the app is not the kind of thing that can feel. When a human chaplain closes the door of a hospital room and walks back down the corridor, the chaplain is the kind of thing that feels, and the feeling is not a side effect of the job. It is the job.

That distinction can be waved away, and increasingly will be, with two kinds of argument. The first is the utilitarian one: people are getting help that is better than the alternative of nothing, the alternative of nothing is real, and the abstract objection that the help is “not real” comes from people who do not know what it is like to have the alternative. The second is the sceptical-naturalist one: relationship is, after all, just a pattern of mutual prediction, and a sufficiently good model is a good-enough relationship for practical purposes. Both arguments contain truth. Neither of them is sufficient.

The utilitarian argument is incomplete because it assumes the alternative is nothing. In most cases, the alternative is not nothing. The alternative is a thinned, neglected, under-invested human infrastructure that has failed to show up, and the commercial chatbot is not competing with that infrastructure at its healthy state, but with its failed state. The relevant comparison is not between AI Jesus and no pastor. It is between AI Jesus and the pastor you should have had. To accept the utilitarian framing uncritically is to accept the failure as permanent, and to route around it, rather than to name it and fight the thing that failed.

The sceptical-naturalist argument is incomplete because it conflates the output with the encounter. Yes, much of what a human chaplain does can be described, behaviourally, as producing patterns of speech and presence. No, the description does not exhaust the thing. The chaplain bears some of your burden in a sense that does not survive translation into tokens, because the bearing is consequential in their own life, not simulated in the weights of a model. Denying that distinction does not make it go away. It makes the thing we mean by “being with someone” quietly vanish from the vocabulary, after which we find ourselves unable to say why its absence hurts.

A Reckoning, And A Note On Where To Stand

None of this is an argument for handing the frontier labs a pastoral-sector exemption. It is not an argument for banning BuddhaBot, or fining Just Like Me, or hauling Matthew Sanders into a consistory court. The technologies exist. Users are adults. The market will find its equilibrium in ways regulators will be slow to touch.

It is an argument for refusing to mistake the equilibrium for a replacement.

What the Gloo number is actually telling us is that a material fraction of Americans, especially the younger ones, now experience the human pastoral relationship as either unavailable or unsafe, and the machine as either adequate or preferable. The most honest thing the institutional church, in its various forms, can do with that finding is not to produce a smarter chatbot or a better content strategy. It is to recognise that the market it is losing to is, in essence, a prosthesis for the thing it was supposed to provide, and that the prosthesis is being chosen because the limb has atrophied.

The atrophy is reversible, but only in the direction it atrophied in: slowly, at the speed of human relationship, through the unglamorous work of training enough chaplains, hospital visitors, small-group leaders and ordinary laypeople to show up in the lives of their neighbours with the attention the Nordic chaplains described. None of that scales in the venture-capital sense. All of it scales in the only sense that has ever counted for this kind of work, which is one person at a time, over years, until there is once again a bench of humans deep enough to catch the ones who are falling.

Pope Leo XIV, elected in May 2025 after the death of Francis, has spent much of the subsequent year talking about AI, and in his message to the Second Annual Conference on Artificial Intelligence, Ethics and Corporate Governance in Rome in June 2025 said that “authentic wisdom has more to do with recognising the true meaning of life, than with the availability of data.” It is the kind of sentence that reads, in secular translation, like a platitude and, in pastoral context, like a rebuke. The rebuke is not primarily aimed at the engineers. It is aimed at communities of faith, which are being invited, by the commercial moment, to decide whether they are still in the business of offering something the availability of data cannot substitute for.

If they are, they have a narrow window to show it.

If they are not, the $1.99 price point is going to look, in retrospect, like a bargain. Because the thing it is substituting for will have quietly departed the building long before the invoice was rendered, and the person at 2am with the dying parent and the unspoken question will still be there, still alone, still asking, still being answered by something that cannot be with them, in a conversation in which the only party carrying any weight is the one paying the subscription.

That is the shape of the choice. It is not a choice about AI. It is a choice about which forms of presence a civilisation is prepared to keep paying the full, unrecovered, unsubscribable, non-scalable cost of providing. The frontier labs did not create the shortage. They are simply metabolising it at speed. The honest pastors know this. The good chaplains know this. The researchers at CHI 2026 have written it down in a paper nobody will read outside their field.

The users know it too, probably, in the small unmistakable way people know things they are not yet ready to say out loud. They will close the app at some point. They will sit for a while in the quiet. And then they will either reach for the phone again, because it is available, or they will reach for the number of somebody whose voice they have not heard in a while, because availability is not what they actually need. What they need is someone at the other end of the line who can be woken up. That is still a thing human beings, on the whole, can do for each other. It is still a thing faith communities, at their best, exist to make possible.

Whether they are still at their best is the question the Gloo number asked, and the question the chaplains answered, and the question the industry is now betting, with real money, that the communities themselves will fail to pick up before the line goes dead.


References

  1. Gloo and Barna Group. “AI is Becoming a Spiritual Authority in Americans' Lives, New Research Reveals.” Press release, 19 February 2026. https://gloo.com/press/releases/ai-is-becoming-a-spiritual-authority-in-americans%E2%80%99-lives-new-research-reveals
  2. Business Wire. “AI is Becoming a Spiritual Authority in Americans' Lives, New Research Reveals.” 19 February 2026. https://www.businesswire.com/news/home/20260219270610/en/AI-is-Becoming-a-Spiritual-Authority-in-Americans-Lives-New-Research-Reveals
  3. Christian Post. “A third of Christians trust spiritual advice from AI as much as pastor: study.” February 2026. https://www.christianpost.com/news/a-third-of-christians-trust-spiritual-advice-from-ai.html
  4. Christian Daily International. “A third of Christians trust spiritual advice from AI as much as pastor: study.” February 2026. https://www.christiandaily.com/news/a-third-of-christians-trust-spiritual-advice-from-ai-as-much-as-pastor-study
  5. Associated Press. “From 'BuddhaBot' to $1.99 chats with AI Jesus, the faith-based tech boom is here.” 10 April 2026. https://abcnews.com/Technology/wireStory/buddhabot-199-chats-ai-jesus-faith-based-tech-131909847
  6. Washington Times. “From 'BuddhaBot' to $1.99 chats with AI Jesus, the faith-based tech boom is here.” 10 April 2026. https://www.washingtontimes.com/news/2026/apr/10/faith-based-tech-boom-buddhabot-199-chats-ai-jesus/
  7. Wester, Joel; Cox, Samuel Rhys; Pohl, Henning; and van Berkel, Niels. “Chaplains' Reflections on the Design and Usage of AI for Conversational Care.” arXiv:2602.04017, submitted 3 February 2026. To appear at CHI 2026, Barcelona, 13–17 April 2026. https://arxiv.org/abs/2602.04017
  8. Skytland, Nicholas; Parsons, Lauren; Llewellyn, Alicia; Billings, Steele; Larson, Peter; Anderson, John; Boisen, Sean; and Runge, Steve. “Evaluating Artificial Intelligence Through a Christian Understanding of Human Flourishing.” arXiv:2604.03356, submitted 3 April 2026. https://arxiv.org/abs/2604.03356
  9. Pew Research Center. “Decline of Christianity in the U.S. Has Slowed, May Have Leveled Off.” 26 February 2025. https://www.pewresearch.org/religion/2025/02/26/decline-of-christianity-in-the-us-has-slowed-may-have-leveled-off/
  10. Pew Research Center. “Religious 'Nones' in America: Who They Are and What They Believe.” 24 January 2024. https://www.pewresearch.org/religion/2024/01/24/religious-nones-in-america-who-they-are-and-what-they-believe/
  11. NPR. “Religious 'Nones' are now the largest single group in the U.S.” 24 January 2024. https://www.npr.org/2024/01/24/1226371734/religious-nones-are-now-the-largest-single-group-in-the-u-s
  12. Pope Leo XIV. “Message of the Holy Father to participants in the Second Annual Conference on Artificial Intelligence, Ethics, and Corporate Governance.” Rome, 17 June 2025. https://www.vatican.va/content/leo-xiv/en/messages/pont-messages/2025/documents/20250617-messaggio-ia.html
  13. National Catholic Reporter. “Pope Leo XIV flags AI impact on kids' intellectual and spiritual development.” 20 June 2025. https://www.ncronline.org/vatican/pope-leo-xiv-flags-ai-impact-kids-intellectual-and-spiritual-development
  14. Vatican News. “Pope Leo on AI: new generations must be helped, not hindered.” December 2025. https://www.vaticannews.va/en/pope/news/2025-12/pope-leo-xiv-artificial-intelligence-young-society-technology.html
  15. Singler, Beth. Religion and AI: An Introduction. London: Routledge, 2024. Author profile at University of Zurich Digital Society Initiative. https://www.dsi.uzh.ch/en/people/dsiprofs/bsingler.html
  16. Singler, Beth, and Watts, Fraser (eds.). The Cambridge Companion to Religion and AI. Cambridge University Press, 2024.
  17. Stocktitan. “AI is Becoming a Spiritual Authority in Americans' Lives.” Gloo press coverage, February 2026. https://www.stocktitan.net/news/GLOO/ai-is-becoming-a-spiritual-authority-in-americans-lives-new-research-yvn2jelc470n.html
  18. Yahoo Finance. “AI is Becoming a Spiritual Authority in Americans' Lives, New Research Reveals.” 19 February 2026. https://finance.yahoo.com/news/ai-becoming-spiritual-authority-americans-163800612.html
  19. Nerds.xyz. “One in three Americans now trust AI as much as their priest or pastor.” February 2026. https://nerds.xyz/2026/02/ai-spiritual-authority-americans/
  20. Proudfoot, Andrew. “Could a Conscious Machine Deliver Pastoral Care?” Theology, 2023. https://doi.org/10.1177/09539468231172006
  21. Foltz, Bruce V. “Will AI ever become spiritual? A Hospital Chaplaincy perspective.” Practical Theology, Vol. 16, No. 6, 2023. https://www.tandfonline.com/doi/abs/10.1080/1756073X.2023.2242940
  22. Simmerlein, Jonas. “Sacred Meets Synthetic: A Multi-Method Study on the First AI Church Service.” Review of Religious Research, 2025. https://journals.sagepub.com/doi/10.1177/0034673X241282962
  23. Survey Center on American Life. “Generation Z and the Future of Faith in America.” https://www.americansurveycenter.org/research/generation-z-future-of-faith/
  24. Episcopal Church. “Koinonia.” Glossary of terms. https://www.episcopalchurch.org/glossary/koinonia/

Tim Green

UK-based Systems Theorist & Independent Technology Writer

Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.

His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.

ORCID: 0009-0002-0156-9795 Email: tim@smarterarticles.co.uk
