Op-Ed #1: Leveraging AI to Navigate Difficult Conversations

Catie Cole · Published in b8125-fall2023 · Nov 10, 2023

We all know how it feels to be unable to think clearly or rationally about a high-stakes, emotional, or difficult conversation. When you are in the weeds and biased about a specific dilemma or dynamic, it can be very challenging to make rational decisions. We may turn to friends or a licensed therapist to help us navigate these situations, but friends may be unavailable or too involved, and therapy can be cost-prohibitive and difficult to schedule in a timely manner. Therapists also generally avoid providing concrete, direct guidance or recommendations (as they emphasize self-discovery and patient autonomy), yet there are specific scenarios where an external party’s opinion and advice would be particularly valuable: when you just need to be told how to rationally approach a dynamic or conversation that you are too connected to.

I believe this is where artificial intelligence (AI) can already be very helpful in its current form: supplementing and improving our mental health care by serving as a fast, free, and rational sounding board.

For example, let’s say one of your colleagues repeatedly talks over you in work meetings and seems to undermine your role. You feel offended and emotional about this and want to brainstorm specific ways to open up a conversation with him or her in a professional, productive manner. Or maybe you got a call from your daughter’s elementary school reporting that she has apparently been bullying her classmates. You are surprised and upset with her but want to approach the conversation in a calm, non-accusatory, and effective manner.

Enter ChatGPT. In dynamics like those described above, we can be too emotional and reactive to approach the necessary conversations appropriately and effectively. If you outline the relevant background context and provide any important details (including how you are feeling and what you specifically want to accomplish through the conversation with your coworker or daughter in these cases), AI can often suggest more objective, rational approaches than we would likely come up with on our own in the heat of the moment.
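For readers who prefer the API over the chat interface, here is a minimal sketch of what that kind of prompt might look like in practice, assuming OpenAI’s official Python SDK. The model name and the wording of the prompt are purely illustrative, not a prescription:

```python
# A minimal sketch of a "rational sounding board" prompt, assuming the
# official OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# set in the environment. Model name and prompt wording are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Structure the prompt the way the paragraph above suggests:
# background context, how you feel, and what you want the
# conversation to accomplish.
prompt = (
    "Background: a colleague repeatedly talks over me in meetings and "
    "seems to undermine my role.\n"
    "How I feel: offended and emotional, so I may not be objective.\n"
    "Goal: open a professional, productive one-on-one conversation.\n"
    "Please suggest a calm, rational way to start that conversation, "
    "including two or three specific opening lines."
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; any capable chat model works
    messages=[
        {"role": "system", "content": "You are a calm, objective advisor "
                                      "for difficult interpersonal conversations."},
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```

The same three-part structure (context, feelings, goal) works just as well typed directly into the ChatGPT interface; the code simply makes the recipe explicit.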

Critics often cite AI’s inability to truly empathize or understand human emotions as a primary limitation of the technology (despite OpenAI’s recent effort to humanize ChatGPT’s interactions by introducing a voice mode). But this absence of inherent emotional understanding can actually be a unique strength, particularly when we need unclouded, rational advice amid emotionally charged situations like the examples above.

While I fully recognize that AI is by no means a replacement for real mental health care or professional, human-provided therapy, it can still play a pivotal role in helping individuals navigate difficult conversations: a free ‘rational sounding board’ for those seeking objective input without the added complexities of human emotion. Users just need to be cautious and adept at crafting effective prompts, and competent at interpreting and contextualizing the AI’s responses.

Recognizing AI’s boundaries is crucial. AI cannot perceive our often-communicative body language or the subtleties of our tone. It also tends to oversimplify or generalize issues that, in reality, would require individualized and comprehensive treatment plans. AI should thus never be used for any sort of screening, diagnosis, or nuanced therapeutic need. Nor would I recommend seeking complex life advice from AI at this time, but soliciting its ostensibly rational perspective on specific messages or scenarios can be beneficial.

It is also, of course, important to recognize the potential ethical implications of using AI in personal decision-making: how reliance on AI for advice may ultimately affect our human judgment and our interpersonal relationships. I believe there is a way to preserve, and even enhance, the authenticity of our human nature and our relationships by using AI for advice in specific circumstances, though I recognize the inherent tension in that claim. The examples outlined above (a disrespectful colleague and a daughter’s surprising behavior) illustrate this: AI can guide you toward productive, effective communication strategies and dialogue that feel authentic to who you are and the messages you truly want to convey in the moment, even when your emotions make that difficult.

While AI does not offer the comfort or understanding that a therapist might, we can still harness this technology judiciously, as a supplemental tool: a fast, free, and rational sounding board that can provide clarity and objectivity when approaching complex conversations.
