What your mental health app reveals to big tech

Osman

When we book a session with a mental health professional, there is a high degree of trust built into the interaction. We’re paying someone to listen to some of our deepest and darkest secrets, and privacy is expected as a matter of ethics and legality. Whatever you say in that room should never leave it, with only rare exceptions.

That’s one of the fundamental principles underpinning the profession. But does it extend to the online realm too?

Apps that offer guidance or put users in touch with professionals help democratize access to treatment for depression, substance abuse, anxiety, and much more. We’re no longer restricted to professionals—and the fees they dictate—in our immediate physical location.

Individuals who are uncomfortable discussing their problems in person might be more willing to turn to counselors over apps. In some ways, online services offer a greater sense of privacy, but the reality can be quite different.


Interest in mental health has exploded this year, thanks to the emergence of Covid-19, which has brought a struggling economy, long stretches spent indoors, and far fewer chances to see friends and family. And apps offering help have reported increased sales.

But can we trust them with our privacy and data?

Not so confidential: Mental health apps track you

A 2019 study critically investigated 36 top-ranking apps for depression and smoking cessation on the iOS and Google Play stores. Of the 36 apps, 33 were caught transmitting data to third-party services, but only 12 disclosed this transparently in their privacy policies. All the third-party entities processing this data were linked to Google or Facebook, despite both companies requiring developers to explicitly disclose such data sharing in their privacy policies.

Another study widened the net to include 56 apps specifically for mental health. What the researchers found was even more egregious: Apps frequently requested permissions they didn’t need and encouraged users to voluntarily share their information with online communities. Twenty-five of the apps investigated had no privacy policy at all, keeping users in the dark about how and when personal information was collected or whether it was shared with third parties.

An in-depth analysis of the prominent mental-health app BetterHelp, which had over half a million downloads in 2019 alone, exposed just how murky this industry can be. BetterHelp has an attractive model: For a flat rate of 40 USD per week, users can text, call, or video chat with a licensed counselor. It has run flashy marketing campaigns, signing on NBA players and YouTube influencers who call for destigmatizing mental-health treatment.

BetterHelp claims to encrypt the content of communications between patient and counselor, but it tracks a wealth of other data. For example, its signup process asks for users’ gender, age, and sexual orientation, as well as whether they’ve had suicidal thoughts.

The analysis found that the app was relaying metadata to “dozens of third parties,” including Facebook, Google, Snapchat, and Pinterest. Facebook was informed every time a user opened the app and how often they attended counselor sessions. While the specific contents of patient-counselor communication were not disclosed, the metadata sent to Facebook included what time of day users attended sessions, their location, and the amount of time they spent in the app.
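To make that concrete, here is a purely illustrative sketch of the kind of event a third-party analytics SDK might receive when a user opens such an app. Every field name and value below is invented for this sketch; it is not BetterHelp’s or Facebook’s actual schema.

# Hypothetical analytics event, for illustration only. The field names
# are invented and do not reflect any real SDK's schema.
app_open_event = {
    "event_name": "app_open",
    "app_id": "com.example.counseling",  # which app was opened
    "device_ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID, linkable across apps
    "timestamp": "2020-03-12T02:14:07Z",  # when the session happened
    "approx_location": {"lat": 40.75, "lon": -73.99},  # coarse location
    "sessions_last_7_days": 5,  # how often the user comes back
}
# None of these fields contains a word of what was said in therapy, yet
# together they reveal that a specific device uses a counseling app,
# how often, and at what hours.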

Some of the data was anonymized, but the fact is that anonymized data sets aren’t really anonymous: They can often be linked back to individual profiles and used to keep tabs on user behavior.
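To see why, consider the classic linkage attack: if an “anonymized” record keeps quasi-identifiers such as ZIP code, age, and gender, it can be matched against a public dataset that does carry names. The sketch below, using entirely made-up data rather than any app’s actual records, shows how trivial the join can be.

# Minimal sketch of a linkage attack. All data here is invented.
anonymized_sessions = [
    {"zip": "10001", "age": 34, "gender": "F", "session_time": "02:14"},
    {"zip": "94107", "age": 29, "gender": "M", "session_time": "23:40"},
]

public_records = [  # e.g., a voter roll or a leaked profile dump
    {"name": "Jane Doe", "zip": "10001", "age": 34, "gender": "F"},
]

def reidentify(sessions, records):
    # Yield a (name, session_time) pair whenever the quasi-identifiers match.
    for s in sessions:
        for r in records:
            if (s["zip"], s["age"], s["gender"]) == (r["zip"], r["age"], r["gender"]):
                yield r["name"], s["session_time"]

for name, when in reidentify(anonymized_sessions, public_records):
    print(f"{name} attended a counseling session at {when}")

With just three overlapping attributes, the supposedly anonymous session log resolves to a named individual.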

What’s the solution?

Formal healthcare and psychiatric services are subject to strict regulatory standards, and the processing of patient data in hospitals and primary care centers is governed by regulations such as HIPAA. But while the FDA and other government agencies hold sway over doctors, clinics, and hospitals, these safeguards become nebulous in online app marketplaces.

The app industry is perversely incentivized to build in fewer privacy protections: The more data points an app can skim and feed to third-party tracking platforms, the better its targeting and advertising. And free mental health apps, which can only monetize through invasive advertising and data tracking, are the ones most likely to omit a privacy policy or quietly embed tracking code.

It’s not just data tracking that’s a problem with mental-health apps. To surge to the top of app stores, developers often make outsized claims about their apps’ effectiveness and scientific backing. A 2019 study revealed that only half of the apps claiming to follow scientific methods actually did, while approximately one-third cited techniques for which no supporting evidence could be found.

App developers aren’t going to be the ones driving more attention to privacy and security issues. Change has to come from the gatekeepers of the platforms they build on or from wider industry regulators. That in itself presents a conundrum: Should Facebook be regulated by the Securities and Exchange Commission or the Food and Drug Administration? What about Google?

Unless tech companies rapidly improve their enforcement of user privacy, it stands to reason that the status quo will not change. Healthcare regulation is robust, but it simply lacks influence over online spaces.

