Mental Health Apps
First Evaluation: August 2020
Latest Update: December 2020
Mental health apps growing in popularity: App-based mental health counseling is a large and growing industry. These apps accounted for $587.9 million in revenue in 2018 and are expected to generate up to $3.9 billion annually by 2027, according to some estimates. Owing to greater awareness of the importance of mental health, U.S. and Canadian consumers made up the majority of the global market in 2018.
Increased anxiety and tumult, harms not evenly distributed: Recent events, including the global coronavirus pandemic, the resulting economic crisis, and large-scale protests related to the Black Lives Matter movement, have spotlighted rising mental health-related harms among marginalized and vulnerable populations. Increased anxiety and upheaval cause both physical and psychological symptoms and can be very distressing. Mental health applications collect sensitive information that can create damaging, irreversible impacts on individuals if shared with third parties, including social stigmatization and additional barriers to future opportunities. These applications can collect data on topics such as anxiety disorders, depression, bipolar disorders, eating disorders, and post-traumatic stress disorders. Finally, as the Center for American Progress has reported, the pandemic has exposed disparities in the U.S. mental health system: people with mental health disabilities face “disproportionately high rates of poverty”, “housing and employment discrimination”, and criminalization.
Data leaks: In addition, there are documented data leaks involving mental health applications. Investigative journalists have raised concerns about excessive data sharing, arguing that apps generally make money either by selling subscriptions to services or by selling data. Privacy concerns about mental health apps have underscored the need for improved regulation of these apps, which are marketed to people with anxiety, autism, and depression. Other research has shown that “the majority of the top-ranked mental health apps for depression and smoking cessation” share user data without disclosing the practice in their privacy policies.
Data Testing: First, we ran a static analysis of each Android application. We also worked with AppCensus, a company that analyzes app behavior for privacy and security issues, to perform an automated analysis of the apps. This research process involved an inspection of the following items:
Design analysis (UX + UI): Second, the user experience and user interface design analysis involved a thorough manual review of all of the user-facing elements of the applications. More specifically, the purpose of this work was to:
Policy review: Third, the team reviewed the privacy policy and terms of service documents of the applications.
CR sent a letter on December 17, 2020 to seven companies that operate mental health applications: BetterHelp, Moodpath, Sanity & Self, Talkspace, Wysa, Youper and 7 Cups. Based on a review of the policies of these applications, CR is urging the companies to raise the standard of privacy and transparency for their services.
We provided the four recommendations below to the companies:
Recommendation 1: Clearly explain procedures used for de-identification of data used for research. Identifiable data should not be shared except at the consumer’s direction.
We advocate for companies to provide greater clarity about data sharing for research, especially around how they define “anonymized data.” Companies should be explicit about the processes they use to de-identify data. We highlight this to help prevent people from being re-identified. Mental health applications collect sensitive information that can create damaging, irreversible impacts on individuals if shared with third parties, including social stigmatization and additional barriers to future opportunities.
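To illustrate why these details matter, below is a minimal, purely hypothetical Kotlin sketch (not any company's actual procedure) of one common step: replacing a direct identifier with a salted hash and dropping free-text fields. On its own this is only pseudonymization, not full de-identification, which is exactly why companies should spell out the complete process they rely on.

```kotlin
import java.security.MessageDigest
import java.util.UUID

// Hypothetical record types for illustration only; the field names are assumptions.
data class RawEntry(val email: String, val moodScore: Int, val note: String)
data class ResearchEntry(val pseudonym: String, val moodScore: Int)

// Replace the direct identifier (email) with a salted SHA-256 hash and drop the
// free-text note. This is pseudonymization, one small piece of a real
// de-identification process; linkage and inference risks still remain.
fun pseudonymize(entry: RawEntry, salt: String): ResearchEntry {
    val digest = MessageDigest.getInstance("SHA-256")
    val hash = digest.digest((salt + entry.email).toByteArray())
        .joinToString("") { "%02x".format(it) }
    return ResearchEntry(pseudonym = hash, moodScore = entry.moodScore)
}

fun main() {
    val salt = UUID.randomUUID().toString() // a per-dataset secret salt (assumption)
    val shared = pseudonymize(RawEntry("user@example.com", 4, "felt anxious today"), salt)
    println(shared) // ResearchEntry(pseudonym=..., moodScore=4)
}
```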
Recommendation 2: Provide clear and contextually-appropriate explanations of how user-provided data will be used, so users are aware of potential consequences before they share.
Companies should not overwhelm people with superfluous information or choices. Wherever possible, app default settings should protect users' privacy, so users do not have to worry about managing this on their own. However, if there are choices to be made or information someone should be aware of, they should be presented in a clear and straightforward way.
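As a purely illustrative sketch (the setting names are our own, not taken from any of these apps), privacy-protective defaults can be expressed directly in code: optional sharing stays off until the user explicitly opts in, and the opt-in screen is where the contextual explanation belongs.

```kotlin
// Hypothetical settings model for illustration; names and fields are assumptions.
data class PrivacySettings(
    // The most protective values are the defaults; nothing is shared until the
    // user makes an explicit, informed choice.
    val shareDataForResearch: Boolean = false,
    val personalizedAds: Boolean = false,
    val crashReportsOnly: Boolean = true,
)

// An explicit opt-in returns updated settings; the calling screen is responsible
// for explaining, in context, what research sharing means before this is invoked.
fun optInToResearchSharing(current: PrivacySettings): PrivacySettings =
    current.copy(shareDataForResearch = true)

fun main() {
    val defaults = PrivacySettings()
    println(defaults)                          // sharing is off by default
    println(optInToResearchSharing(defaults))  // changes only by explicit user action
}
```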
Recommendation 3: Adhere to platform guidelines that are in place to protect people’s privacy.
App developers should ensure that their apps meet the guidelines laid out in Android developer documentation, such as Best Practices for Unique Identifiers, which recommends avoiding the use of identifiers like the Android ID (SSAID). App developers should also make sure that the libraries (SDKs) they embed within their apps meet their own expectations for data collection, and that they are configured accordingly.
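For example, the Android guidance referenced above generally steers developers toward an app-scoped, resettable identifier rather than reading the SSAID. The following Kotlin sketch shows that approach; the helper and preference names are our own, not taken from any particular app.

```kotlin
import android.content.Context
import java.util.UUID

// Instead of reading Settings.Secure.ANDROID_ID (the SSAID), generate a random
// UUID the first time it is needed and keep it in app-private SharedPreferences.
// It identifies only this install of this app and is gone once the user clears
// app data or uninstalls.
object InstallId {
    private const val PREFS = "install_id_prefs" // assumed name
    private const val KEY = "install_id"         // assumed name

    fun get(context: Context): String {
        val prefs = context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
        return prefs.getString(KEY, null) ?: UUID.randomUUID().toString().also {
            prefs.edit().putString(KEY, it).apply()
        }
    }
}
```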
Recommendation 4: Transparently disclose the service providers that receive data when people use your apps.
We recommend that companies be more transparent in their privacy policies about the service providers that receive data. Although it is not legally required or common practice in the U.S. to list every service provider or institution receiving data, we recommend that companies proactively disclose this information.
We published a full research study that provides additional information about our investigation. We also published a letter to the companies, a 2-page brief and a blog post.