Government Guidance for Counsellors Using AI Tools

What the MHRA update means for everyday UK practice

Over the past few years, digital tools claiming to support mental health have multiplied quickly. Many now describe themselves as using artificial intelligence. They promise insight, efficiency, personalisation, and safety. They often reassure users that data is protected and that their systems are secure.

For counsellors and psychotherapists, this creates a difficult landscape to navigate. It is not always clear what a tool is actually designed to do, how it works with sensitive information, or where professional responsibility sits when it is brought into therapeutic work. Marketing language can sound confident, even when the ethical and clinical implications are far from settled.

Many practitioners describe feeling both curious and cautious – interested in what AI tools might offer, while aware that they lack the time or training to evaluate them with confidence.

[Illustration: a counsellor sits thoughtfully in an armchair, a small brain-and-circuit icon in a thought bubble – curiosity and caution about using AI in counselling practice.]

“Many counsellors feel both curious and cautious – drawn to what AI tools promise, while unsure how to evaluate them ethically.”

Against that backdrop, counsellors have been asking the same quiet questions. How do I know whether an AI tool is appropriate for use in counselling? What responsibility do I carry if something goes wrong? How do I protect clients when the technology itself is opaque?

A recent update from the Medicines and Healthcare products Regulatory Agency (MHRA) offers an important step towards clarity. While the guidance is not written specifically for counsellors, it provides a clearer framework for thinking about what mental health apps and technologies are for, how safe they are, and how concerns should be raised. For practitioners looking for orientation rather than instruction, it helps to steady the ground.

What has changed for counsellors in the MHRA guidance?

The MHRA has updated its guidance on mental health apps and digital mental health technologies. The emphasis is on helping people understand a tool’s intended purpose, the evidence behind it, and the level of risk involved.

One of the most important clarifications is the distinction between general wellbeing tools and technologies that may function as medical devices. Apps or platforms that claim to diagnose, monitor, treat, or prevent mental health conditions may fall into a different regulatory category, even if they are presented in friendly or informal ways.

For counsellors, this matters because how an AI tool is used in practice can shift its meaning and its risk. A wellbeing app used independently by a member of the public is not the same as an AI-enabled tool being suggested, endorsed, or integrated within therapeutic work.

The guidance reinforces a central point. AI tools do not sit outside professional responsibility. They sit within it.

“AI tools do not sit outside professional responsibility. They sit within it.”

Why the MHRA guidance matters for counselling practice

Counsellors are not being asked to regulate technology or to become technical experts. However, the guidance makes clear that ethical duties do not disappear simply because a tool is digital or automated.

[Illustration: a counsellor using a laptop labelled “AI”, with thought bubbles asking “Purpose?”, “Evidence?” and “Data protection?” – questions about how safe AI tools are in counselling practice.]

At this point, many practitioners find themselves wondering how to know whether an AI tool is ethical to use in counselling practice. That question is not about innovation. It is about alignment with core professional values.

When a counsellor recommends or relies on an AI-enabled tool, a number of considerations follow naturally. What is the tool actually designed to do? Is there evidence that supports the claims being made? How is client data stored, processed, and protected? Could the tool quietly shape expectations or boundaries in ways that are not immediately obvious?

These are not new ethical questions. They are familiar concerns appearing in a new context.

Counsellor responsibility does not disappear when using AI tools

One of the risks of rapid technological change is the sense that responsibility becomes blurred. Systems suggest. Algorithms generate. Platforms reassure.

In practice, responsibility remains with the practitioner.

If something goes wrong, the question is not what the technology intended, but where professional responsibility sits. AI tools may support reflection, organisation, or pattern recognition, but they do not hold clinical accountability. They do not obtain informed consent. They do not make relational judgements. They do not repair harm.

Ethical judgement in the use of AI begins in the same place as all clinical judgement – in reflection, supervision, and attention to the therapeutic relationship.

“Ethical judgement about AI begins in the same place as all clinical judgement – in reflection, supervision, and attention to the therapeutic relationship.”

The MHRA guidance does not remove the need for professional judgement. If anything, it sharpens it.

Client data, confidentiality, and trust when counsellors use AI

For many counsellors, concerns about AI tools quickly turn to data. What is actually happening to client information when an AI app or platform is involved? Where is it stored? Who can access it? What happens if systems change, are sold, or fail?

These questions are not only legal. They are relational. Clients trust counsellors to protect their privacy and to be transparent about how information is handled. That trust can be undermined if a practitioner relies on assurances they do not fully understand.

The guidance encourages greater clarity about data use and safety, but it also highlights the need for counsellors to ask their own critical questions, rather than relying solely on developer claims.

“Client data is not just information. It is part of therapeutic trust.”

How counsellors apply MHRA guidance in day-to-day practice

Government guidance sets expectations at a high level. Counsellors then have to translate those expectations into everyday clinical choices.

This is often where practitioners feel the most pressure. Not because they are unwilling to engage with AI, but because the pace of change leaves little time to reflect. Tools appear quickly. Updates are frequent. Ethical implications emerge slowly.

For example, if an AI-enabled supervision or practice management platform begins suggesting themes, formulations, or intervention ideas based on session material, this raises important questions about what the tool is claiming to do and how its outputs are understood and used.

[Illustration: a counsellor in an armchair with a thoughtful expression and a question mark overhead – professional reflection and uncertainty about using AI in counselling.]

In practice, these decisions are rarely made alone. Supervision, peer dialogue, and organisational policies remain essential places to think through uncertainty, alongside emerging regulatory guidance.

A structured approach to critical thinking can help here. Rather than asking whether an AI tool is efficient or popular, the focus shifts to whether it supports client safety, autonomy, confidentiality, and therapeutic integrity.

How the counselling profession is responding to AI

It is also important to recognise that these concerns are not being held by individual practitioners alone. Across the counselling and psychotherapy professions, there is growing recognition that AI requires shared ethical attention.

The work of the Artificial Intelligence Expert Reference Group in Counselling and Psychotherapy reflects this wider conversation. By bringing together representatives from across the field, it acknowledges that navigating AI ethically is not just a technical task, but a professional and collective one.

This wider context matters. It reassures practitioners that uncertainty is not a personal failing, but a reasonable response to a rapidly changing landscape.

Keeping therapeutic values at the centre of AI use in counselling

The MHRA guidance does not tell counsellors what AI tools to use, or whether to use them at all. It does not replace ethical frameworks, supervision, or reflective practice.

What it does is offer a clearer starting point. It helps distinguish marketing claims from a tool’s intended purpose. It reinforces the importance of safety and evidence. It reminds us that responsibility does not vanish when technology is introduced into therapeutic work.

For counsellors, the task is not to keep up with every new tool, but to remain anchored in professional values. To ask careful questions. To slow decisions that feel rushed. To notice when convenience begins to outweigh care.

[Illustration: a therapist weighing a tick in one hand and a gavel in the other – ethical checks when choosing AI tools in counselling practice.]

AI will continue to evolve. Guidance will continue to develop. The heart of counselling practice, however, remains steady.

“Guidance will evolve. Tools will change. Professional values remain steady.”

And that steadiness is still our most reliable guide.

Transparency note
This article was written and reviewed by human contributors. ChatGPT 5.2 was used as a supportive tool to assist with formatting, layout clarity, and language refinement. All content, interpretations, and ethical positions were created and checked by the authors.
