Chatting with an AI-powered therapist after a hard day at the office is convenient, but it’s not necessarily confidential. We’re still in the Wild West when it comes to protecting your privacy on mental health apps.
If you’re not careful, you could be signing your rights away the moment you tap the download button. New research from Mozilla revealed how mental health apps treat personal data, and many failed the test.
*Privacy Not Included
The *Privacy Not Included initiative was launched by Mozilla in 2017. The label aims to give people the knowledge they need to choose products that best protect their privacy. Since the initiative started, over 100 apps and 300 internet-connected devices have been reviewed.
The bad news? Many apps target vulnerable users with predatory ads, share your data and have garbage-quality privacy policies.
1 step forward, 2 steps back
Mozilla’s 2022 investigation revealed that mental health apps were “worse than any other product category” in terms of security and privacy. Oy vey. They gave 29 of the 32 studied apps the *Privacy Not Included label.
This year, the Mozilla team found 40% of the apps were even worse than last year. However, a third of them did improve. Only 19 of the 32 apps in the 2023 investigation received the *Privacy Not Included label.
Mozilla’s most wanted
Which mental health apps should you steer clear of? Mozilla warns that these are some of the worst options:
- Replika: My AI Friend
Haven’t heard of them? Keep an eye on these well-known apps, which require up-front questionnaires before they show you their privacy policies:
Fortunately, some mental health apps are taking privacy seriously and making positive changes. Here’s who got Mozilla’s thumbs-up:
- Youper won “most improved app” in 2023.
- Woebot gives users full access to their data.
- Modern Health won’t share or sell your info.
Before downloading any app, don’t forget to read the terms of service — carefully. Speaking of apps, head here to learn how to stop them from tracking your every move.