[1] Earlier this year, I was asked to serve as a co-investigator on a research project aimed at demonstrating the effectiveness of spiritual care in helping emergency trauma nurses cope with moral injury in their work. As I am currently serving as a chaplain, I eagerly accepted the invitation. However, I soon learned there was one catch: the project leaders asserted that Artificial Intelligence-based (AI) chaplain avatars, rather than human chaplains, would be sufficient to deliver spiritual care. After other chaplains and I raised several concerns about this approach, the decision was made to suspend the project indefinitely.
[2] In September, the Christian Century published Prof. Danielle Tumminio Hansen’s reflection on using several AI spiritual counselor applications, or “apps,” during her hospitalization.[1] Her reflection, like the discussion that led to the suspension of the aforementioned research project, raises the question: Could AI ever replace human beings in providing pastoral and spiritual care?
What Is Pastoral Care?
[3] The introduction to Evangelical Lutheran Worship – Pastoral Care describes pastoral, or spiritual,[2] care as “…the church’s ministry of care…for people outside the primary worshipping assembly: in times of crisis, when confronting matters of sickness and health, in the time of dying and after death, and on many other occasions.”[3] Most people recall the practical applications of pastoral care when defining it: prayer, ritual, specialized worship, “a ministry of presence,”[4] and listening. However, pastoral care also includes exploring deeper questions of meaning, purpose, identity, and goodness in relation to the divine, the world, and oneself.
“There’s an app for that” – but is it any good?
[4] During her hospitalization, Tumminio Hansen posed two questions to the various spiritual apps she explored: whether her injury was a result of sin, and whether the app would pray for her. The responses to the first question ranged from sympathetic platitudes to recitations of the doctrinal stances of various religious traditions. The responses to the second question were more satisfactory, but they lacked specificity regarding what she was experiencing and feeling in her situation. In the research project I participated in, I raised several concerns. What faith tradition would the chaplain avatar represent? Even with evidence-based moral injury counseling methods included in the avatar’s development, how would the designers ensure it was free from religious bias, as mandated by the organization’s policies and the U.S. Constitution? Could the privilege of confidentiality that applies when speaking to a human chaplain be maintained when speaking with the chaplain avatar? What the other chaplains and I discovered is that the project designers could adequately address the technical questions of protecting confidentiality and tailoring the avatar’s responses to a nurse’s faith preferences; however, they could not address the more complex issues of individual morality and pluralistic belief systems.
[5] I do not share these examples to discredit AI platforms or the ethics associated with them. Rather, they demonstrate what AI actually does. While the functions AI performs are wide-ranging, spiritual care apps are generally built on large language models that quickly analyze large amounts of data to generate an output. Applied to pastoral care, these apps perform what is known as pattern recognition: sifting through the data they have access to and generating a response to a user’s question.
[6] Through pattern recognition, spiritual and pastoral care apps can provide responses that a human could not. Capable of accessing and analyzing a vast array of data, these apps can offer multiple perspectives on religious, spiritual, and moral struggles, as well as pathways of care and healing that were previously unconsidered. Said another way, AI can offer more expansive and diverse insights, prayers, and rituals for pastoral care and support.
[7] Yet AI pattern recognition tools are primarily predictive and lack any deep conceptual understanding of the present context.[5] In other words, an app cannot truly get to “know” a person or their situation; it merely makes assumptions based on the data it has access to. And what about the quality of that data? In pastoral care, an app may suggest forgiveness as a means of overcoming deep hurt, offering the user the therapeutic benefits associated with forgiveness. However, AI cannot weigh the moral and ethical considerations of forgiving in the user’s specific situation. In Tumminio Hansen’s experience, AI responded to questions of theodicy with a summary of what theodicy is and what various faith traditions say about it. AI cannot fully explore what the question means to the individual. In summary, AI gives the perception that it “knows” the user by predicting what “facts” or information the user wants in response. However, it does not “know” the user in the sense that it can assess how those responses will affect the user or others in the present situation.
Pastoral Care as Reducing Pain?
[8] Despite the limitations of AI in knowing a person deeply, the question remains whether its pattern-finding and generative processes might be helpful in pastoral care for those dealing with moral injury. Moral injury encompasses experiences of inner conflict in which one’s deeply held moral values and beliefs are violated, resulting in devastating suffering, including physical, psychological, social, and spiritual distress.[6] In the research project described at the beginning of this discussion, trauma unit nurses experienced moral injury when they made choices about providing life-saving care that violated their personal and professional moral codes. Another example of moral injury is the experience of sexual assault, where the survivor feels betrayed by someone they previously trusted. When unresolved, moral injury creates emotions of betrayal, shame, and guilt that result in profound pain resembling that of Post-Traumatic Stress Disorder (PTSD). However, moral injury differs from PTSD in that the violation of one’s moral code, whether by another or by oneself, carries with it an awareness of one’s moral compass.
[9] David Rodin explains,
But clearly it is possible that psychic responses which, from the perspective of psychological health, present as a disorder, from a moral perspective may potentially be an appropriate and therefore morally healthy reaction to the commission of a grave moral wrong. Far from a problem to be solved, moral injury may in certain circumstances be a necessary and appropriate form of internal moral regulation.[7]
Understood this way, moral injury is a helpful term for encompassing issues arising from one’s life experiences and relationships that pastoral and spiritual care routinely address: God’s goodness, suffering in the world, and the loss of identity, meaning, and purpose.
[10] With respect to pastoral care, the concept of moral injury underscores the importance of examining both the “moral” and the “injury.” Pastoral care that explores the deeper questions within one’s experience reveals not only its meaning but also the presence of a loving and just God and who we are in relation to God. The practices of pastoral care can aid individuals in exploring and finding answers to their questions and can also have a therapeutic effect on their physical, emotional, social, and spiritual well-being.
[11] However, pastoral care that focuses solely on alleviating the pain of the “injury” is transactional, commodified, and strictly therapeutic. This type of care pathologizes the human experience and the pain that arises from it. Pastoral care then becomes strictly about reducing pain, if not eliminating it altogether. While reducing pain is important, it requires no deeper understanding of one’s experience and context. Indeed, it actively avoids this critical component of pastoral care.
[12] Furthermore, the potential for spiritual abuse and trauma caused by AI-generated responses must also be considered. But ultimately, if pastoral care is solely about alleviating pain, then AI may be sufficient to provide solutions. If all we need is information, suggestions, and solutions for reducing pain – “there’s an app for that.”
Engaging Despair – A Collective Approach to Pastoral Care
[13] I’ve explored what AI does, including its capabilities and limitations. Revisiting the question of whether AI could replace human beings in providing pastoral and spiritual care, my answer is no. However, we must also acknowledge that AI is becoming an increasingly inevitable part of our daily lives. Therefore, we must ask: how should ordained leaders, laypersons, and faith communities who provide pastoral care prepare for a future with it?
[14] While a future of pastoral care that includes AI might be cause for despair, Martin Luther wrote that engagement with despair is “a godly and holy” thing.[8] Despair can invite us into theologically ethical reflection that involves both honesty and practical reasoning. We ought to reflect on the best practices and approaches for including AI in our tending to the spiritual needs of those we care for.
A Note for Pastoral Caregivers Engaging Despair over AI
[15] Like many others’, my concerns about AI sometimes manifest as fear, making me resistant and defensive when I learn people are using spiritual care apps. Instead, we can engage our despair over AI and see this as an opportunity for renewed, focused engagement. Pastoral caregivers can accompany those they care for as interpreters of the information and responses AI provides, responding to ethical considerations that affect their relationships with themselves, others, and the Triune God. In an era of increasing pluralism in people’s spiritual beliefs, AI apps can be used to provide a quick spiritual assessment – a practice commonly employed in clinical spiritual care – to align pastoral care with the individual’s identity and understanding. When individuals raise questions about other religious and theological perspectives, consulting AI could help caregivers who are not well-versed in those perspectives continue to provide care.
[16] As a chaplain, I’ve regularly heard people share frustration and hurt from past experiences with pastoral caregivers who were authoritative and interested chiefly in reinforcing the usefulness and primacy of their own faith tradition or theology. In chaplaincy, one goal of pastoral care is to support positive spiritual coping by helping people better understand their core beliefs and theological commitments in relation to their lived experience. Another goal is to represent our faith tradition in an ethical manner so that those in need can trust pastoral care and continue to seek it throughout their lives. If AI can assist pastoral caregivers in achieving these goals, then it is in our interest to consider how it can do so ethically.
A Note for Faith Communities Engaging the Despair of Those in Need
[17] As a chaplain, one unfortunate trend I encounter is the moral distress and injury people experience when faith communities fail to provide pastoral care in times of need. Furthermore, we live in a time of “therapeutic culture” in North America, in which science and evidence-based practice have shifted care to an already overwhelmed mental health care system.[9] As a pastor who has served multiple congregations over the course of 12 years, I’ve noticed that proximity to experiences of despair has become something to avoid rather than something to engage with and bear collectively.
[18] As a chaplain specializing in integrating pastoral care with psychiatric and psychological care, I recognize a distinct line between the types of care that faith communities and medical professionals provide. However, it has been a mistake for faith communities to conclude that all care needs therefore require professional, clinical care. For moral injury and the deeper questions of meaning, identity, and goodness, pastoral care has been shown to be just as effective, if not more so, especially when accompanied by clinical mental health care.[10] For faith communities, this presents a challenge. In this time of growing social, emotional, and spiritual need, is the church able to provide pastoral care, or will it be outsourced to AI? This is a question the church must grapple with.
[19] In conclusion, AI is already an integral part of society, and like previous technological advancements, it will continue to shape our lives, communities, and cultures. While it is crucial to reflect on and weigh the ethics of AI and to advocate for its just use in the world, we must not forget our own ethical commitments as God’s disciples and Christ’s church. Through a robust and thoughtful exploration of the ethics of pastoral care, faith leaders and communities can continue to meet the deep needs of God’s people in ways for which there is no “app for that.”
[1] Danielle Tumminio Hansen, “My Artificial Chaplains,” The Christian Century (2025), https://www.christiancentury.org/features/my-artificial-chaplains (accessed October 10, 2025).
[2] For the sake of consistency, I will use “pastoral” to refer to both pastoral and spiritual care.
[3] “Introduction,” Evangelical Lutheran Worship – Pastoral Care, (Minneapolis: Augsburg Fortress, 2009), 9.
[4] Ibid.
[5] The Atlantic Podcast, “Autocracy in America,” https://www.theatlantic.com/podcasts/archive/2025/09/ai-and-the-fight-between-democracy-and-autocracy/684095/ (accessed September 12, 2025).
[6] J. M. Pyne, J. Currier, K. D. Hinkson, et al., “Addressing Religious and Spiritual Diversity in Moral Injury Care: Five Perspectives,” Curr Treat Options Psych 10 (2023): 446–462.
[7] David Rodin, “The Ethics of Moral Injury,” Moral Injury in the Humanities, (Abingdon: Routledge, 2024), 75.
[8] Martin Luther, “Lectures on Galatians, 1535” Luther’s Works, Vol. 27: ed. Jaroslav Jan Pelikan, Hilton C. Oswald, and Helmut T. Lehmann, (Saint Louis: Concordia Publishing House, 1999), 73–74.
[9] J. L. Herman and F. W. Putnam, “Our Broken Mental Health Care System,” Psychology Today. (2023) https://www.psychologytoday.com/us/blog/mental-health-care-today/202310/our-broken-mental-health-care-system?msockid=38b277f1a5cc6d533dd26235a4656c3f (accessed October 16, 2025).
[10] B. Winfrey, “Integrating Spiritual Care Within Mental Health,” Journal of Human Services, 44(1), (2025), 57–73.


