MHRA issues new UK guidance on mental health apps

More of us reach for mental health apps when we’re stressed, anxious or waiting for support. On 27 January 2026, the UK medicines regulator, the Medicines and Healthcare products Regulatory Agency (MHRA), published new plain‑English advice, built with NHS England’s MindEd learning programme, to help the public, parents, carers and professionals choose and use these tools with confidence.

From mood trackers to virtual reality therapies, digital tools are now common alongside NHS and community care. The MHRA says the quality varies, so we all need a simple way to tell what’s safe, what’s effective and what’s right for a particular person. This guidance is designed to help you ask better questions and to know what to do if something doesn’t feel right.

MHRA Chair Professor Anthony Harnden explains that people should expect tools to be safe, effective and grounded in evidence, and that digital support can sit alongside, not replace, professional care. If you’re struggling with your mental health, seek help from trained professionals; apps can complement a plan, but they are not a substitute.

Think of this as a five‑minute check before you download or recommend anything. We walk through five quick questions you can apply to any app or technology.

Start with the claim. Is the tool offering general wellbeing support, or does it say it can diagnose, treat or manage a mental health condition? The second group makes medical claims and should explain those claims clearly and back them with evidence you can see.

Look at who it’s for. An app built for adults may not suit teenagers or children. Age guidance and the intended user group should be obvious. If you teach or parent, pause if the developer can’t show a version designed for the age group you support.

Ask for evidence. Trustworthy products explain how they were tested or evaluated, for example through a clinical study or a real‑world service evaluation. Be cautious of bold promises without details on methods, results or independent review. NICE’s Dr Nick Crabb notes that robust evidence helps the right tools reach people faster while protecting value for the taxpayer.

Check your data trail. These tools often collect sensitive information about mood, sleep, voice or location. You should be able to find, quickly, how your data is stored, who can see it, where it may be shared, and how you can delete it. If the privacy information is hard to find or confusing, that’s a warning sign.

Look for regulation when medical claims are made. Some digital mental health technologies are classed as medical devices. Those should meet UK safety standards and display a CE or UKCA mark. You can look up a product on the MHRA’s public register. Not every tool is a medical device; some are wellbeing or lifestyle products. That doesn’t automatically make them unsafe, but they may not have been through the same checks.

If a regulated tool causes harm or distress, report it to the MHRA through the Yellow Card scheme so action can be taken to protect others. In Scotland, healthcare professionals should use the Incident Reporting and Investigation Centre. If you or someone else is at immediate risk, contact NHS services or emergency services.

The new MindEd resources use short animations and real‑world examples to show what safe, well‑evidenced digital mental health technologies look like in practice. They also explain, step by step, how to raise concerns via Yellow Card. The materials are designed for everyone who uses or recommends these tools: the public, parents and carers, teachers, nurses, GPs and mental health practitioners.

The MHRA developed the programme with NHS England’s MindEd and support from Wellcome, as part of a wider project to improve the safe and effective use of digital mental health technologies. Since 2023, MHRA and the National Institute for Health and Care Excellence (NICE) have been working on proportionate regulation and evaluation in this fast‑moving area, alongside people with lived experience, clinicians, developers and international partners.

Wellcome’s Matthew Brown says we need better evidence about what works, for whom and in which contexts, while keeping regulators, developers and healthcare providers working together. Mind’s Stephen Buckley welcomes resources that help people recognise what good support looks like. The British Psychological Society’s Dr Roman Raczka reminds us that AI should support, not replace, human‑led care, because empathy and real connection matter. Northern Ireland Health Minister Mike Nesbitt calls this an important step for health literacy and informed choice.

What this means for you: if you’re a student, ask a trusted adult or clinician to check the five points with you before you rely on an app. If you’re a teacher or pastoral lead, use the MHRA checklist before signposting tools to pupils or families, and record your decision‑making. If you’re a clinician, integrate these questions into consultations so choices are shared and transparent. If you’re a parent or carer, try the app yourself first and review privacy settings together.

Picture a sixth form tutor considering an anxiety app for exam season. They check the claim and see it offers breathing exercises rather than diagnosis. They confirm it’s designed for 16–18‑year‑olds, read a summary of a small evaluation study, review a clear privacy policy, and note there’s no medical claim so no CE or UKCA mark is expected. They signpost it with caveats, remind students it doesn’t replace clinical care, and explain how to report concerns if anything feels off.

The takeaway is simple: slow down for five checks, choose tools that are transparent and evidence‑based, and speak up if something isn’t right. That’s how we keep digital mental health support safe, fair and effective for everyone.