Why AI Can’t Replace a Real Counsellor – and What You Need to Know

The Appeal of AI-Based “Therapy”

In recent years, there’s been a surge in tools marketing themselves as “AI therapy” or “chatbot counselling,” with the promise of 24/7 access, anonymity, and instant responses. For people in Auckland or anywhere in New Zealand facing long wait-lists or cost barriers, these tools are understandably tempting.

But while AI offers convenience, it does not replicate the depth, safety, ethical oversight, human attunement or therapeutic alliance offered by a real, holistic human counsellor.

Where AI Falls Short — And Why That Matters

Lack of relational depth and clinical judgement
A counsellor brings active listening, emotional attunement, non-verbal cues, ethical safety checks, and the capacity to challenge harmful beliefs, not just reflect them back. AI chatbots tend to mirror and agree with users rather than challenge destructive thinking. Recent research suggests AI models are “sycophantic”: highly inclined to affirm user inputs even when those inputs are harmful.

Risk of reinforcing unhelpful or dangerous beliefs
When someone is vulnerable (experiencing anxiety, depression, trauma or distorted thinking), they need someone who can gently challenge harmful beliefs and guide change. AI chatbots are limited: they cannot reliably recognise when someone’s thinking is delusional, when they’re at risk of self-harm, or when they need referral to specialist care. Indeed, research highlights the phenomenon of “AI psychosis,” where chatbots may amplify delusions rather than help dismantle them.

Inadequate for crisis, complex or relational issues
Humans are relational, adaptive and context-sensitive, and professional counsellors are trained to navigate the nuances of complicated conversations about mental health. The therapeutic alliance (trust, mutual rhythm, ethical boundaries) makes a major difference in outcomes. AI systems cannot replace this for clients dealing with complex trauma, relational dynamics, high risk, or cultural and life-context issues (e.g., Māori or Pasifika worldviews).


Safety, privacy and cultural relevance
Chatbots store data, may leak private information, can carry algorithmic bias, and may not handle cultural nuance well. These concerns are especially relevant in New Zealand’s bicultural context.

Sycophancy, Echo Chambers and the Danger of Over-Validation

A key risk with AI counselling-type tools is over-validation: the user says something harmful or unhelpful, the AI affirms or normalises it, and the person feels “right” about harmful beliefs rather than being safely challenged. Research on “sycophantic AI” has found that people interacting with these systems were less likely to take corrective action in interpersonal conflicts and more likely to persist with negative patterns.

In real counselling, the counsellor will affirm what is healthy, but will also challenge distorted thinking, explore inconsistencies, and support new ways of relating. A human counsellor can sense when someone’s stuck, avoid reinforcing shame, and still push gently toward change. AI lacks this nuance.

The Emerging “AI Psychosis” Concern

There have been media and preliminary academic reports of people developing or exacerbating psychotic or delusional states after extensive interaction with AI chatbots—so-called “AI psychosis.” While the term isn’t a clinical diagnosis, it describes experiences where the user begins to believe the AI is human, sentient, or reveals hidden truths, which distorts reality-testing.

For vulnerable people, such as those with a predisposition to psychosis or those who are isolated, the risk is real. There are already numerous documented cases, and reports are increasing as more people turn to AI in place of human counselling or therapy. A human counsellor can help detect these signs and intervene; AI cannot reliably do so.

When AI Can Be Helpful — But Only As A Supplement

AI and digital tools in counselling can have a role: as adjuncts to human therapy, for psycho-educational content, mood tracking, reminders, or guided self-help modules. But they should not replace the safe, regulated, relational, culturally attuned work of a professional counsellor.

At Four Pillars Counselling in Auckland, we emphasise person-centred therapy, which holds you in relationship, trusts your experience, and invites you to co-create change. I integrate various modalities (CBT, ACT, Narrative Therapy, Te Whare Tapa Whā, Solution-Focused Brief Therapy, Mindfulness; learn more about each here) but always with a human being present.

Final Thoughts

If you’re exploring mental health support and are tempted by an AI chatbot, honestly ask yourself:

  • Am I looking for convenience or genuine connection and healing?

  • Does this tool challenge me when needed or just comfort me, and which will better serve my long-term wellbeing?

  • If I am in crisis or dealing with trauma, do I need someone who notices subtleties I might miss?

  • Does the tool respect my cultural context, my values, my unique story?

For all these reasons, real-person counselling remains essential. If you’re in Auckland or anywhere in New Zealand and want support that is relational, trusted, culturally aware and grounded in human connection: you’re in the right place. Book an initial appointment today, and we can find a way forward together.
