
AI and mental health risks in the workplace 

Skills Lab Team
Published: 07 May 2026

Over one in three adults use AI chatbots for mental health support, but is the technology ready?

Workplace mental health is now recognised as a shared responsibility between individuals and organisations. Research shows that around one in four people experience mental health issues each year, with workplace stress a leading cause.

Employees are now turning to technology to manage stress, workload and daily challenges, with studies suggesting around 88% of employees use AI, often without formal guidance.

AI offers accessibility and immediacy, but raises concerns about dependency, accuracy and human connection.

This article examines the balance between AI’s role in mental health support and the need for organisations to prioritise human-first approaches as AI becomes more prevalent at work.

How could AI tools help our mental health?  

AI chatbots aren’t therapists; they’re tools built on large language models, trained on millions of conversations and texts.

Natural language processing helps the programme interpret what you’re telling it and predict the most helpful response. That’s why you must be specific about what you need.

Sometimes, you need to feel heard rather than fix a problem, and AI tools are available instantly without judgment. For many people, that alone can lower the barrier to opening up.

When overwhelmed, employees may struggle to organise their thoughts or identify the root of their stress. AI prompts reflection, asks clarifying questions and offers frameworks for clear thinking.

An employee facing workload pressure might use AI to break down tasks, prioritise or reframe challenges in a manageable way. This creates a sense of control, often the first step in reducing anxiety.

Effectiveness depends on use. AI can guide but can’t understand human emotion as people can.

What are the risks in using AI for mental health support?  

AI brings advantages but introduces risks that organisations and individuals must address.

Lyndsey Regan, FDM People Team Senior Manager and FDM Mental Health First Aider, believes, “AI cannot read, understand or replace empathy or feelings, making it a risky way to support employees who may be at a crisis point or need immediate intervention.”

Here are a few examples of risks that employees face when using AI for mental health: 

Accuracy and reliability

AI responses are based on patterns in data, not lived experience or professional judgement. As a result, the advice it generates may be generic, unsupported by real research and even potentially harmful.

Data privacy and trust

Surveys show 90% of employees are concerned about how AI tools use their data, especially sensitive information. Organisations must have strict guidelines in place for using business-approved AI tools and ensure employees understand the risks and limitations of these systems.

Connection and workplace relationships

Another consideration is how AI might shape the way we interact with others at work. If it becomes the first place we turn for support or problem-solving, opportunities for collaboration and informal conversation can begin to decrease.

Reduced interaction can affect the connections that underpin a positive workplace. Everyday exchanges, such as asking for input, sharing ideas or working through challenges together, help build strong teams.

What does over-reliance on AI actually look like?  

To better recognise and prevent these risks, it helps to understand what over-reliance on AI actually looks like in the workplace.

Gradual dependency 

Dependency on AI develops gradually, often starting with convenience.

Initially, an employee may use AI occasionally for guidance. Over time, as its usefulness and accessibility become clear, it can become routine.

Shifts in behaviour and interaction 

An employee might turn to AI instead of a colleague when overwhelmed. They may seek AI reassurance instead of addressing issues with a manager.

Increased reliance on AI can reduce interaction, despite evidence that strong workplace relationships drive wellbeing and engagement.

Reduced critical thinking 

If AI becomes the authority and employees accept responses without question, critical thinking and self-awareness can diminish.

Recognising these patterns early is crucial. The goal is to ensure AI remains one tool among many, not to discourage its use.

What role do mental health first aiders play in an AI-driven workplace? 

As AI becomes more embedded in the workplace, the role of Mental Health First Aiders (MHFAs) grows even more important.

They provide something that AI cannot: genuine human connection, empathy and understanding. Mental Health First Aiders are trained to listen, support and guide individuals towards appropriate resources.

In an AI-driven environment, they also play a key role in maintaining balance. While AI offers immediate guidance, Mental Health First Aiders help individuals interpret their experiences with the sensitivity and nuance a tool cannot provide.

Lyndsey Regan shares, “Mental health is just as important as physical health and can be impacted by workplace pressures. MHFAs are trained to recognise early signs that an employee may be struggling, offer reassurance and guide them to professional support.”

They also help build a culture where mental health is openly discussed and supported. This is especially important to prevent over-reliance on AI.

Advice for people who are using AI to support their mental health 

AI works best when used intentionally and balanced with other support.

  • Use AI as a prompt to organise thoughts and explore different perspectives, not to replace your judgement or others’ input.
  • Be aware of your AI use. If you rely on it for every concern or decision, consider stepping back.
  • Prioritise human connection. Conversations with colleagues, managers or Mental Health First Aiders offer depth AI can’t replicate.
  • Respect boundaries. Avoid sharing sensitive information and understand how your data is used.
  • Seek professional support when needed. Early intervention improves outcomes, and consulting a qualified professional remains effective for ongoing challenges.

Lyndsey believes in, “Putting the human element before the technology. This is so important when dealing with mental health, as there is no one course of treatment fit for all. It is quite often an ongoing journey that responds to different treatments and support at different times.”

As organisations adopt AI, the focus should remain on enhancing, not replacing, the human experience at work. A human-first approach ensures technology supports wellbeing without diminishing the relationships and support systems that truly make a difference.


How FDM can support 

We are proud to be an equal opportunities employer, ensuring we give everyone the chance to pursue their dream careers in business and technology. We’re committed to breaking the stigma of mental health in the workplace to ensure a safe working environment where everyone is able to get the support they need. If our consultants are struggling with their mental health at work, our Mental Health First Aiders are available to lend a helping hand and guide them towards additional, professional support.

Learn more about FDM’s commitment to diversity and inclusion.
