As a category, mental health apps have worse privacy protections for users than most other types of apps, according to a new analysis by researchers at Mozilla. Prayer apps also had poor privacy standards, the team found.
“The vast majority of mental health and prayer apps are exceptionally scary,” Jen Caltrider, lead of Mozilla’s *Privacy Not Included guide, said in a statement. “They track, share, and capitalize on users’ most intimate personal thoughts and feelings, such as moods, mental state, and biometric data.”
In the latest iteration of the guide, the team analyzed 32 mental health and prayer apps. Of those, 29 received a “privacy not included” warning label, indicating that the team had concerns about how the app managed user data. The apps are designed for sensitive issues such as mental health conditions, yet collect large amounts of personal data under vague privacy policies, the team said in the statement. Most apps also had poor security practices, letting users create accounts with weak passwords despite holding deeply personal information.
The apps with the worst practices, according to Mozilla, are Better Help, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. The AI chatbot Woebot, for example, says it collects information about users from third parties and shares user information for advertising purposes. Therapy provider Talkspace collects user chat transcripts.
The Mozilla team said it contacted the companies behind these apps multiple times to ask about their policies, but only three responded.
In-person, traditional mental health care can be difficult for many people to access – most therapists have long waiting lists, and navigating insurance and costs can be a major barrier to care. The problem worsened during the COVID-19 pandemic as more and more people needed care. Mental health apps stepped in to fill that gap by making resources more accessible and readily available. But that approach can come with a privacy tradeoff, the report shows.
“They operate like data-sucking machines with a mental health app veneer,” Mozilla researcher Misha Rykov said in a statement. “In other words: a wolf in sheep’s clothing.”