Wellbeing washing: Are workplace mental health apps doing more harm than good?

Dec 19, 2024


Author: Tanmoy Goswami, Lab expert

Founding editor: Sanity by Tanmoy & lived-experience advocate

Abusive work cultures, ruthless profit-seeking, rampant job insecurity… workers’ mental health challenges are complex, and myriad factors take a toll.

In response, a new trend is emerging: employers are offering digital mental health tools—online therapists and coaches, chatbots, and apps—to build ‘happier’ and ‘more resilient’ workplaces.

But researchers worry that many of these tools make unsubstantiated and misleading claims, and pose serious risks to sensitive user data. And no app can be a magic fix for deep-rooted systemic problems.

Workers’ experience with mental health apps

In 2022, Suparna’s* employer, a multinational with billions of dollars in yearly profits, rolled out a personal wellness chatbot for its staff. It was meant to answer employees’ questions and nudge them to take breaks, practice breathing exercises, or drink water. The company heavily marketed the bot—but employees showed little enthusiasm for it. Much of the advice the bot gave could be easily Googled. Employees were also spooked by the lack of transparency around how it handled their personal information.

“People do not want their bosses to know about their mental health struggles,” Suparna says. “If someone asks the bot a serious mental health question and not just for tips for sleeping better, they don’t know where the data is stored or who sees it.”

Soon after this failed experiment, the company began mass layoffs. The chatbot now exists as part of a support toolkit for fired employees. “It’s like saying, ‘we know we are causing all this stress, but here’s a bot you can play with,’” Suparna says.

Similarly, Leanna’s organization has a history of “arbitrarily putting employees in performance improvement plans [PIPs] citing poor performance, so they can fire you without paying severance.” Employees have access to a platform where they can speak to mental health professionals for free. “I have never used it,” Leanna says. “I don’t know anyone who has.”

Anita is an HR manager who has implemented mental health apps in multiple organizations. “One of them hated any diversity, inclusion, equity, and belonging initiatives. Another one seemed all fluffy, yet had no [BIPOC leaders]. In fact some leaders would advocate hiring from Asia-Pacific and Africa because these were cheap labor markets. And yet these companies wanted to be seen as ‘pushing [the cause of] mental health.’”

The digital mental health boom

For this piece I interviewed over a dozen employees across different sectors and geographies. These conversations pointed to a troubling trend: Even as insensitive management, ruthless profit-seeking, and rampant job insecurity batter workers, employers are rushing to adopt technology as a quick fix for their distress. At stake: an estimated $1 trillion in productivity lost annually to poor employee mental health.

So far this year, the tech sector alone has shed more than 235,000 jobs, adding to the nearly half a million jobs lost in 2023. Companies have fired people over email and Zoom calls, and escorted employees out of office buildings without warning. Meanwhile, according to one survey, 72% of large companies in the US added virtual behavioral healthcare or telehealth tools for their workers. More than two-thirds added or enhanced employee assistance programs (EAP) such as access to mental health apps.

The digital mental health market is worth an estimated $20.1 billion and is expected to nearly triple to $55.82 billion by 2030. In fast-growing markets such as India, which has some of the world’s highest employee burnout rates, enterprise sales are the strongest source of growth for digital mental health startups. In the first half of 2024, investors poured $682 million into the sector, making it the top-funded category within digital health.

Is it wellbeing washing?

To be sure, technology can be a force multiplier in mental healthcare. Digital tools promise personalized and convenient access to support and can destigmatize help-seeking. In the workplace, they claim to bring greater reach and engagement than traditional EAP services and are cost-effective and easily scalable, says Anita.

However, employers often buy into these tools just to “do something” about a complex challenge that they don’t make the effort to really understand, says Sam, who has worked with EAP providers. Most employers claim that they want a more resilient and happier workplace to attract young talent. “But often, all they ask is, ‘can you help us boost productivity?’” adds Smriti Joshi, chief psychologist at the AI-based mental health platform Wysa.

These are classic signs of wellbeing washing—pretending to care about employees while shirking real reform. They raise urgent questions about the hype around digital mental health tools in the workplace: Do they actually work? What damage can they cause in the hands of uninformed or exploitative employers? And how can employers get it right?

Unsubstantiated and misleading claims

There’s no reliable data on the number of mental health apps out there; one estimate from 2021 put it at over 20,000. Their rapid mushrooming thwarts rigorous research into their claims. As such, they could be marketed to employers and the public based on “very limited or no evidence of effectiveness,” using “unsubstantiated or even misleading claims,” according to researchers led by Lyndsay Krisher of the Colorado School of Public Health.

App makers can evade scrutiny by claiming that they are in the loosely regulated ‘wellness’ business. Many apps that use the ‘mental health’ label employ coaches rather than licensed clinicians. The majority of employees who shared their stories for this piece said that even when a platform does include therapists, it’s difficult to verify their credentials. One of them said that even though their employer operates globally, the mental health platform it has enlisted offers only Western therapists, who do not understand the cultural context of users from the rest of the world. Another said their app promised services such as diet consultations but didn’t actually offer them. Yet another complained about poorly designed user interfaces.

Almost all the employees said that their employers did not consult them before enlisting these platforms, nor did they have a feedback mechanism where they could raise their concerns or suggest improvements. Several employees said that they were put off by this top-down approach and the gap between what’s advertised and what’s really available. They junked the tools altogether, making the entire exercise a waste of employers’ resources.

Data privacy risks

In 2022, privacy watchdog Mozilla Foundation revealed that most mental health apps were “exceptionally creepy.” Even though they dealt with highly sensitive issues—such as depression, anxiety, suicidal thoughts, or domestic violence—they routinely shared data, allowed weak passwords, targeted vulnerable users with personalized ads, and featured vague and poorly written privacy policies.

Many apps don’t follow strong data security protocols, making them prime targets for cyber attackers on the hunt for valuable personal health data. In advanced markets such as the US and EU, regulations like HIPAA and GDPR offer some data protection to users. But vast swathes of the world lack such safeguards.

Still, almost all employees interviewed for this piece said that their employer did not educate them about these risks. Worse, some employers expect app makers to hand over employees’ personal data.

Joshi of Wysa, which allows users to opt out of personal data collection and delete their data, says they have lost business because they refused such requests. “People use Wysa because it lets you remain anonymous. It’s a safe space where you can vent about anything, including difficult bosses or colleagues. Unless an individual is at risk of harm, no one has the right to know about their exchanges with Wysa.”

Even when personal data is anonymized, it can be reidentified, meaning that an employee’s mental health condition could potentially be weaponized against them. Employers can use this data to gauge productivity and discriminate among employees while distributing financial rewards, says Daniel Aranki of the UC Berkeley School of Information.

How to implement workplace mental health apps—the right way

To begin with, employers need to define the returns that they can legitimately expect from these tools. For instance, researchers John Torous and Elena Rodriguez-Villa offer a framework for cost-benefit analysis that balances the value employers can gain from these tools with measurable, sustained improvements in employees’ mental health.

Joshi says Wysa’s most committed clients measure engagement with the app, user growth, and improvements in employees’ wellbeing rather than balance sheet gains. They also don’t pry into employees’ personal data.

One employee I spoke with said that her employer-provided app was fun to use and helped her create a self-care routine; another said it gave him the comfort of knowing he could access support anytime. Common to both was a culture of trust, respect, and open communication in their workplaces. They believed that their leaders truly cared about their wellbeing and weren’t just paying lip service to it.

Technology can be a useful supplement in an already healthy workplace. But, as Suparna puts it, when employers use tech as window dressing out of FOMO or to cover up broken systems, “People see through it.”

5 parameters for evaluating digital mental health tools

You can evaluate an app’s features and policies on independent third-party platforms such as MindApps, created by digital psychiatry experts. Krisher and her research team at the Colorado School of Public Health outline five parameters to help decision-makers.

1. Accessibility

What are the cost barriers to adoption? Do all employees own a smartphone on which they can use all features of the app? Can those with unreliable internet access use it offline? Are features available in multiple languages and accessible to people with disabilities?

2. Privacy and security

No app should cause harm, including through the release of confidential health information.

3. Clinical foundation

Are the app’s claimed benefits backed by credible research and user feedback?

4. Engagement style

Are employees free to decide whether or not to use the app, without coercion?

5. Therapeutic goal

Does the app help users connect to clinical care and treatment if needed?

*Names have been changed to protect identities.

Photo: Welcome to the Jungle
