A review of mental health and prayer apps has concluded that they offer worse privacy and security protections than any other category of app.
Mozilla’s investigation of 32 mental health and prayer apps, including Talkspace, Better Help, Calm, and Glorify, found that 28 raised strong concerns over user data management, while 25 failed to meet Mozilla’s minimum security standards, such as requiring strong passwords and managing security updates and vulnerabilities.
Despite dealing with sensitive issues — depression, anxiety, suicidal thoughts, domestic violence, eating disorders and PTSD — these apps routinely share data, allow weak passwords, target vulnerable users with personalized ads, and feature vague and poorly written privacy policies. Some harvest additional data from third-party platforms such as Facebook, from elsewhere on users’ phones or from data brokers.
“The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data,” says Jen Caltrider, Mozilla’s ‘Privacy Not Included’ lead.
“Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”
The six worst offenders, according to Mozilla, are Better Help, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. Youper, Pray.com and Woebot were found to be sharing personal information with third parties, while Talkspace even collects chat transcripts.
Meanwhile, at least eight apps allowed weak passwords ranging from “1” to “11111111”.
The only two apps found to be protecting data responsibly were PTSD Coach, an app created by the US Department of Veterans Affairs, and the AI chatbot Wysa.
Mozilla warns that parents of kids and teens should be particularly wary, as many mental health and prayer apps target this market.
“When teens share information on these apps, it could be leaked, hacked, or used to target them with personalized ads and marketing for years to come,” it says.
“Hundreds of millions of dollars are being invested in these apps despite their flaws. In some cases, they operate like data-sucking machines with a mental health app veneer,” says Mozilla researcher Misha Rykov. “In other words, a wolf in sheep’s clothing.”
Update: Pray.com gives no information on what data it collects, where from, and how it is used, but says it “is not in the business of selling its customers’ personal data”. It adds: “Pray.com remains focused on delivering the best digital faith experience and leaving a legacy of helping others. This includes providing a safe and secure community for its customers as well as stepping out as a leader in the future of web3, crypto and NFT technology. This will help further strengthen privacy and IP ownership while decreasing censorship in the market. Pray.com stands committed to providing a safe and secure environment for its customers and looks forward to serving them in new ways as it embraces the technologies of the future.”
Talkspace says Mozilla’s report lacks context. A spokesperson adds: “We have one of the most comprehensive privacy policies in the industry, and it is misleading to assert we collect user data or chat transcripts for anything other than the provision of treatment.”
Youper denies selling personal information, and says it only shares it with users’ consent, adding: “Messages between users and their medical providers are encrypted in transit. All users’ health information used by health providers is recorded in an Electronic Health Record, which follows the HIPAA standard to protect privacy.”