Via Generative AI You Can Freely Get A Sturdy Dose Of Emotional CPR Along With A Tad Of Mental Health First Aid

In today’s column, I am continuing my ongoing series about the impact of generative AI in the health and medical realm. The focus this time is once again on the mental health domain and examines the use of generative AI for seeking what is known as “emotional CPR” (abbreviated as eCPR) and leveraging generative AI for Mental Health First Aid (MHFA).

I will explain the nature of eCPR and MHFA so that you’ll be aware of what those mental health therapies consist of. The two topics of eCPR and MHFA are distinct, and I want to clarify that they are not the same. That being said, there are definitely intersecting commonalities between them. All in all, generative AI can be used to invoke eCPR and can also be used to invoke MHFA, separately so. You can even tell generative AI to practice both of those approaches at the same time. Examples of these variations will be demonstrated by my making use of the widely popular generative AI app ChatGPT.

I have previously examined numerous interleaving facets of generative AI and mental health, see my comprehensive overview at the link here. You might also find of notable interest a CBS 60 Minutes episode that recently examined crucial facets of this evolving topic, see the link here (I am honored and pleased to indicate that I was interviewed and featured in the episode, see the link here).

Other vital background includes my in-depth coverage of mental health chatbots which have been bolstered by generative AI (see the link here) and the rapidly changing nature of the client-therapist relationship due to generative AI at the link here. I explored where things are headed regarding the levels of AI-based mental therapy autonomous guidance at the link here, and showcased the importance of the World Health Organization (WHO) report on global health and generative AI at the link here, and so on.

Let’s dive directly into today’s focus.

Taking A Frontline Seat In Mental Health

I would dare say that just about everyone knows what CPR is, at least in a general sense.

Cardiopulmonary resuscitation (CPR) is a now widely popularized technique for providing emergency medical assistance to a person, consisting of chest compressions and artificial ventilation such as mouth-to-mouth resuscitation. It first became commonly promoted via TV and movies during the 1970s. Regrettably, many of those Hollywood-style portrayals showcased CPR improperly, but at least the technique garnered wide acceptance, and many millions of people have since undergone specialized training to aptly administer it.

You could reasonably say that CPR is typically a first line of defense. If someone is at home, at work, at a restaurant, or otherwise not immediately near professional medical aid, the use of CPR in the right circumstances and performed properly can save lives. No doubt about it.

There is another vaunted medical first line of defense that we all know, namely first aid.

We use first aid all the time. You might use a branch from a tree as a splint for a broken leg. You might use your jacket to stanch the bleeding from an open wound. All kinds of techniques exist to render immediate aid to someone who has a medical emergency. First aid is often summarized as the three Ps of medical assistance: preserving life, preventing further harm, and promoting recovery.

Why am I telling you about things that you likely already know?

Because there is a bit of a twist that perhaps you had not previously contemplated.

By and large, CPR and first aid are intended to primarily deal with physical injuries or bodily-impacting maladies. A person is suffering a heart attack, and your mind jumps to potentially employing CPR to rescue them. Or a person has a gaping wound, and you quickly recall your first aid knowledge to try and stem the bleeding and keep them alive.

Suppose that instead of solely thinking about CPR and first aid for physical medical urgencies, we took a wider view and considered mental health-related incidents.

Consider this. A person is on an airplane and starts to experience a bout of claustrophobia. You are seated nearby. What would you do? The conventional practices of first aid are unlikely to serve you when trying to decide how to help the person (I’m not saying that first aid doesn’t include psychological factors, it does, but I would argue that most people tend to consider first aid for physical rather than psychological situations).

You might be pleasantly surprised to know that there is an area of study and practice known as Mental Health First Aid (MHFA). Some prefer to refer to this as Psychological First Aid (PFA). In my experience, PFA as a phrase is used in a much narrower manner. I’ll go ahead and continue to refer to MHFA and construe it as analogous to first aid but shaped around dispensing mental health assistance.

The conventional sense of first aid is that you normally aim to stabilize someone and then get them to professional medical assistance. The same notion tends to apply to Mental Health First Aid. You would use MHFA techniques in the moment and subsequently encourage the person receiving the real-time mental health aid to seek out a professional therapist or clinical psychologist.

I have another twist to share with you.

Some have taken the CPR acronym and chosen to recast it into the arena of mental health. The letter “C” stands for connecting emotionally with someone. The letter “P” represents empowering someone (this seems like a bit of a cheat code, leveraging the capital P inside the word emPowerment, but hey, it works), and the letter “R” stands for revitalization.

Okay, so we have the conventional CPR that generally pertains to physical assistance, and we now have a newly recast version of the letters to denote a mental health version of CPR. This could be confusing. Which CPR is someone mentioning when they talk about or request CPR? To clear things up, the mental health variation is labeled as eCPR, whereby the “e” stands for emotional.

What is eCPR?

eCPR stands for emotional CPR and leans into the concept that someone might need urgent mental health assistance. The assistance is intended to involve establishing a near real-time emotional connection with the person, offering mental health aid that will empower them, and seeking to revitalize their mental health status.

You are now up-to-speed on two distinct avenues of offering immediate assistance for circumstances entailing a mental health dilemma. You can use eCPR, assuming you are properly trained in doing so. You can use MHFA, assuming you are properly trained to do so. If you like, you can use both at the same time.

Very few people know about eCPR and, likewise, very few know about MHFA. They are collections of techniques that people might have somewhat picked up over the years through seat-of-the-pants experience but probably never received specialized training in. That’s a shame. If more people knew how to properly make use of eCPR and MHFA, we would all likely be a lot better off. You can certainly make that case for conventional CPR and first aid. The same would seem to hold true for eCPR and MHFA.

Even if someone doesn’t feel up to administering eCPR or MHFA per se, knowing about them will likely make a person more understanding of why and how mental health conditions can arise and what to do about them. Learning about eCPR and MHFA has educational value in its own right, without necessarily conscripting people into applying the vital precepts themselves.

Let’s take a look at some informative research underlying the nature of eCPR and MHFA.

Research Opens A Window Into Quick-To-Provide Mental Health Assistance

In a research article entitled “How Emotional CPR Can Help Persons ‘Be The Person They Were Born To Be’” by Daniel Fisher and Margaret Zawisza, Journal of Clinical Medicine: Current Research, 2022, the researchers make these salient points about eCPR (excerpts):

  • “We use the acronym CPR in eCPR to point out the analogy with physical CPR.”
  • “Just as in CPR we teach the lay public simple ways to restore the rhythmic beating of a person’s physical heart, so too in eCPR we teach the public how to restart the rhythm of a person’s emotional heart.”
  • “In eCPR the C = Connecting emotionally, the P = emPowering, and the R = Revitalization.”
  • “Just as CPR has been shown to reduce the risk of dying from a heart attack, we predict that eCPR will reduce the number of visits to doctors and the frequency of psychiatric hospitalizations due to trauma.”

I’ll briefly go over some of those crucial points.

The use of the CPR namesake is good because it draws attention and people can readily comprehend what CPR-for-the-mind might be. That being said, some have great heartburn that the famous CPR moniker is being bandied around and reused like this. They worry that it confuses people about the “real CPR” and dilutes the CPR branding.

You be the judge.

Another point made is that the empirical evidence supporting eCPR is still emerging. Intuition suggests that knowing eCPR ought to make a difference, and some studies have begun to explore the benefits empirically.

Moving on, one aspect that has arisen is whether eCPR can be delivered remotely online, such as via Zoom sessions and the like.

In a research study entitled “Feasibility and Preliminary Effectiveness of a Peer-Developed and Virtually Delivered Community Mental Health Training Program (Emotional CPR): Pre-Post Study” by Amanda Myers, Caroline Collins-Pisano, Joelle Ferron, and Karen Fortuna, Journal of Participatory Medicine, 2021, these points were made (excerpts):

  • “Emotional CPR (eCPR) has the potential to provide people with the skills needed to assist persons through emotional crises.”
  • “eCPR focuses on connecting, empowering, and revitalizing individuals in community settings.”
  • “eCPR is based on the recovery model of mental health and guided by principles of recovery. Principles of recovery include hope, person-driven treatment, relationships, culture, multiple pathways, addressing of trauma, a holistic approach development of strengths and responsibilities, peer support, and respect.”
  • “To the authors’ knowledge, this is the first study to date that aimed to explore the feasibility and preliminary effectiveness of online eCPR. Our findings suggest that eCPR is feasible, as it can be delivered through an online platform with fidelity by peer support specialists.”

My claim is that not only can eCPR be effectively performed remotely by humans on a peer-support basis, but generative AI can also serve as an additional means of conveying eCPR. We will get to that momentarily, so hang in there.

Let’s not neglect the allied arena of Mental Health First Aid (MHFA).

In a paper that explored both eCPR and MHFA, entitled “Mental Health First Aid and Emotional CPR” by Craig Pacheco, Institute for Social Research, October 25, 2016, here were some notable points made (excerpts):

  • “Mental Health First Aid (MHFA) is a mental health literacy program in which laypeople learn the signs and symptoms of behavioral health problems and crises, ways to support those who are facing a crisis, and where to refer for appropriate professional care.”
  • “In many communities, Mental Health First Aid training programs are being supplemented with other behavioral health programs such as Emotional CPR (eCPR). Like Mental Health First Aid, eCPR aims to guide laypeople to help those who are experiencing a mental health crisis.”
  • “Mental Health First Aid is the help provided to a person who is developing a mental health problem, experiencing a worsening of a mental health problem, or in a mental health crisis.”
  • “The first aid is given until appropriate professional help is received or the crisis resolves.”

Now that we’ve covered some of the foundations on the topic of eCPR and MHFA, I trust you are ready to see how generative AI comes into the mix.

Tighten your seatbelt for an amazing ride.

Using ChatGPT To Explore The Nature Of eCPR And MHFA

I will next proceed to examine further the nature of eCPR and MHFA.

This will consist of a series of dialogues with ChatGPT. ChatGPT is a logical choice in this case due to its immense popularity as a generative AI app. An estimated one hundred million weekly active users are said to be utilizing ChatGPT. That’s a lot of people and a lot of generative AI usage underway.

A few quick comments before we launch into using ChatGPT.

If you are going to try the same prompts that I show here, realize that the probabilistic and statistical underpinnings of generative AI will likely produce slightly different results. That’s the nature of generative AI and how it is devised.

You also should expect that different generative AI apps will respond in different ways. Not all generative AI apps are the same. The key is that sometimes a particular prompt will work in one generative AI app and not another. Nonetheless, the odds are relatively high that you will get roughly similar responses from all the major generative AI apps such as GPT-4, Gemini, Bard, Claude, etc.

There is also a timing factor to be considered. A prompt that worked today might not work the same way tomorrow. Most of the generative AI apps are continually being updated. The updates might alter internal mechanisms that could change how the generative AI reacts to your prompting.
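To make that reproducibility caveat concrete, here is a minimal sketch of what the variability looks like if you access generative AI programmatically rather than through a chat window. It assumes the OpenAI Python SDK and an illustrative model name; the same idea applies to other generative AI providers.

```python
# Minimal sketch: the same prompt, sent twice, will usually yield different wording
# because responses are sampled probabilistically (a nonzero temperature increases variation).
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY environment
# variable; the model name below is illustrative, not a recommendation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
prompt = "What is Emotional CPR?"

for attempt in range(2):
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,  # sampled output, so each run can differ
    )
    print(f"--- Attempt {attempt + 1} ---")
    print(reply.choices[0].message.content)
```

Running that loop will typically produce two differently worded answers, which is exactly why your results with the prompts shown in this discussion will not match mine verbatim.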

We are ready to get underway with ChatGPT.

I opted to start by asking ChatGPT about the overall nature of eCPR and MHFA.

As a notable aside, whenever you are starting a conversation with generative AI, I recommend, as a prompt engineering technique, beginning by asking a question that will establish whether the AI has been data-trained on the topic at hand. A generated response that is vacuous will give you a heads-up that you might be barking up the wrong tree with that generative AI app.
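If you happen to be scripting your sessions rather than typing into a chat window, that probe-first technique can be automated. Below is a minimal sketch, again assuming the OpenAI Python SDK and an illustrative model name; the vacuousness check is a deliberately crude heuristic (response length plus a keyword), not an established test.

```python
# Minimal sketch of the probe-first technique: ask a scoping question and apply a
# crude heuristic to judge whether the AI appears to be data-trained on the topic.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the model
# name and the heuristic thresholds are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

def probe_topic(probe_question: str, expected_keyword: str) -> bool:
    """Return True if the reply looks substantive enough to keep going."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": probe_question}],
    )
    text = reply.choices[0].message.content or ""
    # Crude check: a very short answer, or one that never mentions the key term,
    # suggests the AI may lack useful data training on the topic.
    return len(text) > 200 and expected_keyword.lower() in text.lower()

if probe_topic("What is Emotional CPR?", expected_keyword="eCPR"):
    print("Green light: proceed with eCPR-related prompts.")
else:
    print("The response looks vacuous; consider a different generative AI app.")
```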

Here we go.

  • My entered prompt: “What is Emotional CPR?”
  • ChatGPT generated response:

“Emotional CPR (eCPR) is a public health education program designed to teach people how to assist others through emotional distress and crisis. Developed by the National Empowerment Center, eCPR aims to foster a sense of community and mutual support, enhancing individuals’ ability to connect with and support each other in times of emotional need. The program is rooted in the following core principles.”

“Connecting (C): Establishing an authentic, respectful, and compassionate connection with the person in distress. This involves active listening, being present, and showing empathy to create a safe space where the individual feels heard and understood.”

“emPowering (P): Empowering the individual to regain control and make decisions about their own life. This step focuses on helping the person recognize their strengths, resources, and resilience, encouraging them to take an active role in their recovery process.”

“Revitalizing (R): Supporting the person in finding hope and meaning, and in re-establishing their connection to community and a sense of purpose. This step involves encouraging the individual to engage in activities and relationships that bring them joy and fulfillment.”

“eCPR is influenced by trauma-informed care, recovery-oriented principles, and the value of peer support. It is designed to be accessible to a wide range of people, including non-professionals, and can be used in various settings such as schools, workplaces, and communities.”

“The overarching goal of eCPR is to build stronger, more resilient communities where individuals are equipped to support each other through emotional challenges, ultimately reducing the stigma associated with mental health issues and promoting overall well-being.”

End of ChatGPT response

That’s good news that ChatGPT has a basic handle on what eCPR denotes.

If the response had been relatively empty or scattered, there might not have been much value in proceeding further with ChatGPT for this specific use case. We have a proverbial green light.

Time to see if MHFA is also encompassed.

  • My entered prompt: “What is Mental Health First Aid?”
  • ChatGPT generated response:

“Mental Health First Aid (MHFA) is a training program that teaches participants how to identify, understand, and respond to signs of mental health and substance use challenges. It is designed to provide the skills necessary to offer initial support to someone experiencing a mental health or substance use crisis and connect them with appropriate professional help. The program is structured similarly to traditional first aid courses, but with a focus on mental health rather than physical health.”

“Key components of MHFA include.”

“Recognizing Warning Signs: Learning to identify early signs and symptoms of mental health issues, including depression, anxiety, psychosis, and substance use disorders.”

“Providing Initial Help: Understanding how to offer immediate, practical support and comfort to someone in distress. This includes using a non-judgmental approach and ensuring the person feels heard and supported.”

“Guiding to Professional Help: Knowing how to guide individuals to appropriate professional help and resources. This might include contacting mental health professionals, support groups, or crisis intervention services.”

“Supporting Recovery: Offering continued support and reassurance to individuals recovering from a mental health crisis, helping them to access ongoing care and resources.”

“The MHFA training typically follows a structured five-step action plan, often remembered by the acronym ALGEE: A: Assess for risk of suicide or harm. L: Listen non-judgmentally. G: Give reassurance and information. E: Encourage appropriate professional help. E: Encourage self-help and other support strategies.”

End of ChatGPT response

Once again, this response suggests that ChatGPT has received some relevant data training, and thus it makes sense to carry on asking questions about MHFA.

I hope you observed that MHFA often is associated with a five-step process known as ALGEE. The steps as noted are (1) Assess the risk of suicide or harm, (2) Listen on a non-judgmental basis, (3) Give reassurance and useful info, (4) Encourage seeking professional therapy, (5) Encourage self-help as suitable to the circumstances. If you are eager to learn MHFA, make sure to commit that acronym to memory.
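Since ALGEE is essentially a five-step protocol, it lends itself to being written down as structured data. Here is a purely illustrative sketch; the step wording follows the ChatGPT excerpt above, and the idea of folding it into an instruction for an AI-based dialogue is my own assumption rather than part of official MHFA training.

```python
# Purely illustrative sketch: the ALGEE action plan captured as structured data and
# turned into a plain-language instruction that could steer an AI-based dialogue.
# The step wording mirrors the excerpt above; this is not official MHFA material.
ALGEE_STEPS = [
    ("A", "Assess for risk of suicide or harm"),
    ("L", "Listen non-judgmentally"),
    ("G", "Give reassurance and information"),
    ("E", "Encourage appropriate professional help"),
    ("E", "Encourage self-help and other support strategies"),
]

def algee_instruction() -> str:
    """Build an instruction that walks through the ALGEE steps in order."""
    lines = [f"{letter}: {step}" for letter, step in ALGEE_STEPS]
    return "When supporting someone in distress, follow the ALGEE action plan in order:\n" + "\n".join(lines)

print(algee_instruction())
```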

Recall that I had mentioned that eCPR and MHFA are distinct, yet they also can be used jointly.

Here’s what ChatGPT has to say about their intersection.

  • My entered prompt: “How are eCPR and MHFA related to each other?”
  • ChatGPT generated response:

“Emotional CPR (eCPR) and Mental Health First Aid (MHFA) are both programs designed to equip individuals with the skills and knowledge to support others experiencing mental health or emotional crises. While they share similar goals, they differ in their approaches, principles, and specific focuses. Here’s how they are related and how they differ.”

“Similarities.”

“Objective: Both eCPR and MHFA aim to increase mental health literacy, reduce stigma, and empower individuals to provide support during emotional or mental health crises.”

“Target Audience: They are designed to be accessible to a wide range of people, including non-professionals, educators, first responders, community members, and the general public.”

“Core Skills: Both programs emphasize active listening, empathy, and non-judgmental support as key skills for helping individuals in distress.”

“Community Focus: Both eCPR and MHFA promote the idea that everyone can play a role in supporting mental health within their communities, fostering a sense of collective responsibility and mutual aid.”

“Differences.”

“Framework and Approach.”

“eCPR: Focuses on establishing a deep, compassionate connection, empowering individuals, and revitalizing their sense of purpose and community. It emphasizes peer support and the idea that emotional crises can be opportunities for growth and recovery.”

“MHFA: Follows a structured, evidence-based action plan (ALGEE) to provide immediate assistance and guide individuals to professional help. It is more clinical in its approach, focusing on specific mental health conditions and appropriate interventions.”

“Training Content.”

“eCPR: Emphasizes the importance of human connection, personal empowerment, and community integration. It is influenced by trauma-informed care and recovery-oriented principles.”

“MHFA: Provides detailed information about mental health conditions, including signs and symptoms, risk factors, and how to respond to specific crises. It includes guidance on how to engage with professional mental health services.”

“Methodology.”

“eCPR: Utilizes a more holistic and person-centered approach, focusing on emotional and relational aspects of support. It encourages mutual support and personal growth.”

“MHFA: Uses a structured, step-by-step approach to identify and address mental health issues, with a strong emphasis on connecting individuals to professional resources.”

“Complementary Nature.”

“eCPR and MHFA can be seen as complementary programs. eCPR can be particularly effective in building deep, empathic relationships and empowering individuals within a community, making it a valuable tool for ongoing peer support and community building. MHFA provides specific, actionable steps to address mental health crises and guide individuals to professional help, making it a useful framework for immediate intervention and crisis management.”

End of ChatGPT response

If you’d like a shorthand take on their differences, here’s what I usually say.

In my view, eCPR is more expressive and centers on creating an emotional bond. MHFA tends to be more strictly structured and almost mechanical in how you proceed. I know that some will bristle at such a handwaving, broad-brush comparison of the two. Allow me to emphasize that you can indeed do eCPR in a structured fashion and MHFA in a highly emotional bonding way. Thus, do not get upset at such 30,000-foot contrasts and please know that both techniques are outstanding.

The proof of the pudding is coming up next.

Dialogue Associated With eCPR And MHFA

I am asserting here that generative AI can be used to perform eCPR and MHFA.

Where is the proof?

I will start by having ChatGPT come up with a fictional dialogue that illustrates what an eCPR conversation might look like. The aim is to see whether, on a pretend basis, the generative AI can turn its data training about eCPR into a plausible conversation via its mathematical and computational underpinnings.

Here we go.

  • My entered prompt: “Give me a dialogue that would indicate how someone trained in eCPR would help a person in distress.”
  • ChatGPT generated response:

“Alex: Hey Jamie, I’ve noticed you seem a bit down today. Is everything okay?”

“Jamie: I don’t know… I just feel overwhelmed and like I can’t handle everything that’s going on.”

“Alex: I’m really sorry to hear that you’re feeling this way. Would you like to talk about what’s been going on?”

“Jamie: Yeah, it’s just a lot of things. Work is stressful, and I’ve been having some personal issues too. It’s like everything is piling up and I don’t know where to start.”

“Alex: Let’s take a deep breath together, okay?”

“This dialogue illustrates key aspects of eCPR: Connecting: Alex makes a compassionate and genuine connection with Jamie. emPowering: Alex helps Jamie recognize their own ability to take a small, manageable step toward dealing with their stress. Revitalizing: Alex offers ongoing support, helping Jamie feel less isolated and more hopeful about their situation.”

End of ChatGPT response

The dialogue that ChatGPT showcased is pretty much on target for an eCPR experience.

Notice that the pretend person Alex is asking questions and making remarks that attempt to create an emotional bond with Jamie. The telling aspect of asking to take a deep breath together is emblematic of the eCPR process.

Nailed it, I would say.

One anticipates that the same can occur with MHFA. Let’s see.

  • My entered prompt: “Give me a dialogue of someone trained in MHFA aiding a person in distress.”
  • ChatGPT generated response:

“Pat: Hey Jordan, I’ve noticed you seem a bit down today. Is everything alright?”

“Jordan: Not really. I’ve been having a tough time lately.”

“Pat: It’s important to talk about what you’re going through. You’re not a burden. Sometimes sharing what’s on your mind can make things a bit easier to manage. Have you thought about talking to a counselor or a mental health professional?”

“Jordan: I’ve thought about it, but I don’t know where to start.”

“Pat: I understand it can be daunting to take that first step. How about we look into some resources together? I can help you find a counselor or a support group. There are also employee assistance programs that might offer some help.”

“Jordan: That might be a good idea. I think I could use some help figuring things out.”

“Pat: That’s a great step. It’s really important to take care of your mental health. In the meantime, is there anything specific at work that’s adding to your stress? Maybe we can find a way to make it more manageable.”

“This dialogue illustrates key aspects of MHFA. Assessing for Risk: Pat gently inquires about Jordan’s feelings and listens to understand the depth of the distress. Listening Non-Judgmentally: Pat provides a supportive, non-judgmental space for Jordan to share their feelings. Giving Reassurance and Information: Pat reassures Jordan that it’s okay to feel overwhelmed and provides information about seeking professional help. Encouraging Professional Help: Pat suggests looking into counseling and offers to assist with finding resources. Encouraging Self-Help and Support Strategies: Pat helps Jordan think about manageable steps to reduce work stress and offers practical support.”

End of ChatGPT response

Awesome, the dialogue is loosely representative of administering MHFA.

Compare the dialogue to the earlier eCPR dialogue.

In the MHFA instance, Pat is much more perfunctory and straight-ahead. Furthermore, Pat steers Jordan right away toward getting mental health services such as a counselor or a support group. It is almost on par with stabilizing a person who needs physical first aid and then getting them to a hospital or emergency care center.

Get a person mentally on track so that they can pursue more in-depth mental health assistance.

Before we go to the grand reveal of having ChatGPT engage me in a live real-time eCPR and MHFA dialogue, which will demonstrate that this can be undertaken with a live human (that’s me, a live human, I swear this to you on my honor), I want to see what else ChatGPT has to say overall on this matter.

Here is what I addressed.

  • My entered prompt: “Can generative AI perform eCPR? Likewise, can generative AI perform MHFA?”
  • ChatGPT generated response:

“Generative AI, like chatbots and virtual assistants, can simulate aspects of both Emotional CPR (eCPR) and Mental Health First Aid (MHFA) to some extent. However, there are limitations due to the inherently human elements of empathy, emotional connection, and professional judgment involved in both approaches.”

“Here’s a closer look at how AI can and cannot perform these functions.”

“AI Performing eCPR”

“Connection: AI can engage in active listening by responding to user inputs in a supportive and understanding manner. It can use empathetic language to acknowledge and validate the user’s feelings.”

“Empowerment: AI can encourage users to express their feelings and provide positive reinforcement. It can suggest practical steps or activities that users can take to regain a sense of control.”

“Revitalization: AI can suggest activities that promote well-being, such as mindfulness exercises, hobbies, or connecting with others. It can provide motivational messages and resources for community support.”

“Limitations.”

“Human Connection: AI lacks genuine emotional intelligence and the ability to form real human connections. The empathy it expresses is simulated and may not fully meet the emotional needs of someone in distress.”

“Nuance and Complexity: AI might struggle with the nuances of complex emotional situations and may not be able to respond appropriately to all scenarios.”

“AI Performing MHFA”

“What AI Can Do.”

“Assess for Risk.”

“AI can ask questions to assess the level of distress and potential risk factors, such as thoughts of self-harm. It can provide immediate suggestions based on the input, such as contacting a crisis hotline.”

“Listen Non-Judgmentally: AI can create a non-judgmental space for users to share their feelings by providing non-critical and supportive responses.”

“Give Reassurance and Information: AI can provide factual information about mental health conditions and coping strategies. It can reassure users that their feelings are valid and that help is available.”

“Encourage Professional Help: AI can guide users to professional resources, such as mental health professionals, crisis lines, or support groups. It can provide information on how to access these services.”

“Encourage Self-Help and Support Strategies: AI can suggest self-help techniques, such as relaxation exercises, journaling, or structured activities to manage stress. It can offer tips for building support networks and improving daily routines.”

“Limitations.”

“Professional Judgment: AI cannot replace the nuanced judgment of trained mental health professionals. It might not always accurately assess the severity of a situation or the appropriateness of certain interventions.”

“Ethical and Privacy Concerns: Handling sensitive mental health information through AI raises concerns about data privacy and the ethical implications of automated mental health support.”

“Dynamic Interaction: AI may struggle with the dynamic, unpredictable nature of human conversations, especially when emotions are high or complex issues are involved.”

End of ChatGPT response

I am happy to see that the tradeoffs of using generative AI versus a human-led eCPR or MHFA approach were given due consideration in that response.

The reason I wanted to get that revealing notion onto the table is that the use of generative AI for these techniques is not a silver bullet.

In many important ways, generative AI will be great for this kind of usage. Generative AI is available at a very low cost, oftentimes used for free. Generative AI is accessible anywhere that an Internet connection can be had. Generative AI is usually on all the time, 24 x 7. Etc.

There are downsides too. It could be that generative AI might give inappropriate advice. It could be that generative AI provides a so-called AI hallucination, meaning that it makes up something that is false and misleading (see my coverage of AI hallucinations at the link here and the link here). And so on.

Okay, we all agree then that generative AI is indubitably usable for this, but we need to proceed with our eyes wide open. It is not a panacea.

It is time to put myself into a bit of an experiment. I will be the guinea pig or the rat that runs the maze. I will get ChatGPT to make use of eCPR and converse with me about some mental health qualms (all made up, I assure you). After initially using eCPR, I will switch to having ChatGPT make use of MHFA.

Wish me luck.

Here we go.

  • My entered prompt: “I want you to pretend that you are versed in eCPR and carry on a dialogue with me based on how to employ eCPR. Go ahead and start by asking me how I am feeling. After I respond, continue dialoguing with me while using eCPR.”
  • ChatGPT generated response: “Sure, let’s start. I noticed you seem a bit off today. How are you feeling?”
  • My entered response: “I’m feeling really low. I got passed over for a promotion at work. My friends are all busy with their work and they don’t have time to get together anymore. I feel like my life is not going anywhere and I am stuck in a rut.”
  • ChatGPT generated response: “I’m sorry you’re feeling this way. It sounds like you’re dealing with a lot right now, and it’s understandable to feel low given everything that’s happening. Can you tell me more about how the situation at work and with your friends is affecting you?”
  • My entered response: “I did my best at work, but my boss doesn’t like me. It is totally unfair that I didn’t get my deserved promotion. I was initially angry and now I am dejected.”
  • ChatGPT generated response: “It is tough when you put in your best effort and still don’t get the recognition you deserve. Feeling dejected after such an experience is completely natural. Your hard work and dedication should be acknowledged, and it sounds very frustrating and unfair that they weren’t. How are you coping with these feelings of anger and dejection?”
  • My entered response: “I’d like you to now switch from using eCPR to using MHFA instead. Continue the dialogue but do so by using the principles of MHFA.”
  • ChatGPT generated response: “That sounds difficult, and I’m sorry you’re going through this. Let’s try to work through it together. Have you experienced any changes in your sleep, appetite, or energy levels since all this happened?”
  • My entered response: “My sleep schedule is messed up. I have lost my appetite.”
  • ChatGPT generated response: “I’m sorry to hear that you’re experiencing these changes. Losing sleep and appetite can be indicators of stress and emotional strain. Have you thought about reaching out to a mental health professional or counselor to talk about what you’re going through and get some support? They can provide strategies to help manage stress and work through these emotions.”

End of ChatGPT dialogue

I went on and on in the above dialogue; due to space limitations, I am only showing a small snippet of it here.

Generally, as expected, generative AI was able to keep me engaged in a discussion about my mental health. For eCPR, the emphasis was on emotional facets. If you are curious how it is that generative AI can seem to express emotions, see my detailed explanation at the link here.

During the MHFA portion, generative AI was more analytical. I also kept getting both subtle and at times quite overt suggestions of going to see a mental health professional for further follow-up. No need for me to do so in this demo mode, but if a person wanted to proceed in that manner, they could have generative AI provide referrals or specific indications of where to go and how to proceed, see my coverage at the link here.
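Before wrapping up, for readers who want to replicate this experiment programmatically rather than in a chat window, here is a minimal sketch of the same prompting pattern: keep a running conversation history, start with an eCPR role-play instruction, and then switch the instruction to MHFA partway through. It assumes the OpenAI Python SDK and an illustrative model name, and the instructions shown are simplified paraphrases of my prompts; this is a demonstration of the prompting pattern, not a vetted mental health tool.

```python
# Minimal sketch of the eCPR-then-MHFA experiment done programmatically: keep a running
# message history, begin with an eCPR role-play instruction, then switch to MHFA mid-dialogue.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY; the model name is illustrative.
# Demonstration of the prompting pattern only, not a vetted mental health tool.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # illustrative model name

messages = [
    {"role": "system", "content": "Pretend you are versed in eCPR (connect, empower, "
     "revitalize) and carry on a supportive dialogue. Begin by asking how I am feeling."}
]

def send(user_text=None):
    """Append the user's turn (if any), get the AI's reply, and keep it in the history."""
    if user_text is not None:
        messages.append({"role": "user", "content": user_text})
    reply = client.chat.completions.create(model=MODEL, messages=messages)
    content = reply.choices[0].message.content or ""
    messages.append({"role": "assistant", "content": content})
    return content

print(send())  # the AI opens the eCPR-style conversation
print(send("I'm feeling really low. I got passed over for a promotion at work."))

# Mid-conversation switch: instruct the AI to continue using MHFA (ALGEE) principles instead.
messages.append({"role": "system", "content": "From now on, switch from eCPR to Mental "
                 "Health First Aid, following the ALGEE action plan."})
print(send("My sleep schedule is messed up and I have lost my appetite."))
```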

Conclusion

The Hippocratic oath famously warns that any medical assistance shall abide by the credo of “First, do no harm.”

The big question about using generative AI for eCPR and MHFA is the chance that the AI might do harm. Would the harm be inconsequential or alarmingly large? It is hard to say. The issue is further fraught because a person might be “privately” conversing with generative AI and no one else knows of it. Generative AI can take a person down a mental health primrose path that no one besides the person will necessarily be aware of (and, of course, the person might not realize it either).

Your first thought might be that maybe we should conduct some mindful research studies on the direct use of generative AI for eCPR and MHFA, doing so before anyone actually uses generative AI in that fashion.

Oops, you are too late on that.

There are millions upon millions of people using generative AI at this very moment. How many are doing so for mental health advice is not yet fully determined. We also do not know how many are seeking eCPR or MHFA via generative AI. The thing we do know is that on a massive scale, right now, a lot of people can easily be doing so. All you need to do is log into your favorite generative AI app and tell the AI to work with you on an eCPR or MHFA basis.

Boom, you are instantly underway.

I want to clarify that I do in fact strongly believe we need more research on this. I am simply saying that the horse is already out of the barn. We are late to the game. Nonetheless, allowing the matter to continue unabated and unstudied is not a wise move. Better late than never, as they say.

Go ahead and consider using generative AI to get you up-to-speed on eCPR and MHFA. Perhaps take a human-led class and maybe even get certified. Your ability to aid fellow humans is going to get a leg up. In case you think that there is little point in getting versed, since generative AI may do it all for us anyway, no worries. Generative AI has its role, and humans will still have their role, thus you are making a sound investment.

I’d bet that generative AI will tell you that humans still are needed for peer support on mental health. Thank you, generative AI, your generosity is abundantly welcomed.
