Mental health apps have become increasingly popular over the past few years, especially with the rise of telemedicine during the coronavirus pandemic.
However, that popularity has come at a cost to data privacy.
“Data is incredibly lucrative in the digital space,” Darrell West, a senior fellow at the Brookings Institution, told Yahoo Finance. “That's how companies make money. Many large companies derive a significant portion of their revenue from advertising. People want to target advertising to specific people or specific issues. So if you have a mental health condition related to depression, there could be companies that want to market their medicine to people who are suffering that way.”
In 2023, the Federal Trade Commission (FTC) ordered BetterHelp, a mental health platform owned by Teladoc (TDOC), to pay $7.8 million to consumers after the company shared their mental health data with Facebook (META) and Snapchat (SNAP) for advertising purposes, despite having promised to keep that information private.
Telemedicine startup Cerebral admitted last year to releasing sensitive patient information to companies including Google (GOOG, GOOGL), Meta, TikTok, and other third-party advertisers. That information included patients' names, dates of birth, insurance information, and their responses to a mental health self-assessment taken through the app.
And mental health app Talkspace's privacy policy explicitly states that the company "may use" inferences about users, drawn from a registration survey asking about "gender identity, sexual orientation, whether you feel depressed, etc.," for "marketing purposes, including customized advertising."
Mental health apps' policies 'looked like a money grab'
Overall, only two of the 27 mental health apps evaluated met Mozilla's privacy and security standards in 2023, according to the Mozilla Foundation's Privacy Not Included online buyer's guide: PTSD Coach, a free self-help app created by the U.S. Department of Veterans Affairs, and Wysa, an app that offers both an AI chatbot and chat sessions with live therapists.
Mozilla began evaluating these apps in 2022 after their popularity skyrocketed during the height of the coronavirus pandemic.
“We were concerned that companies weren't prioritizing privacy in places where privacy was supposed to be a top priority,” Privacy Not Included program director Jen Kaltrider told Yahoo Finance.
The vast majority of apps failed to meet Mozilla's privacy and security standards in both 2022 and 2023, due to how they handle user data, manage security, protect personal information, or use artificial intelligence.
“It felt like they were taking advantage of people who were in a bad situation and extorting money from them, which was really disgusting,” Kaltrider said.
Telemedicine is a growing industry
According to a report by Grand View Research, the global telemedicine market was valued at approximately $101.2 billion in 2023 and is projected to grow at a compound annual growth rate of 24.3% from 2024 to 2030.
North America holds the largest share of that market at 46.3%, but other regions are rapidly adopting telemedicine as well.
Mental health apps are also expected to grow significantly between 2024 and 2030: Grand View Research valued the global mental health apps market at $6.2 billion in 2023 and estimates it will grow at a compound annual growth rate of 15.2%.
That growth also means the exposure of personal data has increased significantly. A December 2022 study of 578 mental health apps published in the Journal of the American Medical Association found that 44% shared the data they collected with third parties.
“I've been on both sides of the fence,” Diane O'Connell, attorney and president of Sorting It Out, told Yahoo Finance. “On the one hand, [mental health apps] really have provided greater access to mental and physical health care. But there's also the concern that personal health information could be hacked.”
People who use one of these mental health apps to seek help for depression or anxiety may start seeing ads for antidepressants, even if they've never expressed any interest in taking medication.
Legal loopholes
Data brokers leveraging mental health data is nothing new. According to a February 2023 report from Duke University, of the 37 data brokers that researchers asked about mental health data, 26 responded, and 11 were "ultimately willing and able to sell the requested mental health data."
That's also completely legal.
The Health Insurance Portability and Accountability Act (HIPAA) was signed into law by President Clinton in 1996 as a way to "strike a balance between protecting the privacy of people seeking care and healing while allowing important uses of information." It is currently considered the primary medical privacy law in the United States.
However, not all entities are bound by HIPAA, including many mental health apps. According to HIPAA Journal, the law applies to "most healthcare workers, most health insurance providers, and employers who sponsor or co-sponsor their employees' health insurance plans." Organizations not required to comply with HIPAA include life insurance companies, most schools and school districts, many state agencies, most law enforcement agencies, and many local governments.
“HIPAA only applies to conversations and information shared between doctors and patients,” Kaltrider said. “On many of these [mental health] apps, you're not considered a patient in the same way. I think Talkspace is a great example of [how] once you become a client of Talkspace, a different privacy policy applies to your interactions than before you became a client. They have it because when you're a client, you're building a relationship with an actual therapist, not a coach.”
This is a common occurrence with talk therapy apps, Kaltrider explained, adding that HIPAA “doesn't cover the vast majority of content that many people share on mental health apps.”
“People don't understand that these apps aren't covered by [HIPAA], since it only covers communication with healthcare providers,” Kaltrider said. And even when HIPAA does apply to conversations between you and your therapist, metadata collected about your appointment time or the app you use to make the video call may not be subject to the law.
HIPAA protection also depends on the type of provider you are speaking with. A licensed therapist is considered a medical professional; an emotional coach, professional coach, or volunteer is not.
The “Napster argument”
Another legal loophole available to data brokers and mental health app providers is found in the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, which extended HIPAA guidelines to those considered "business associates of a covered entity," according to HIPAA Journal.
After Congress passed the Affordable Care Act, many private equity firms bought medical practices and hospital networks, O'Connell said. Because the investors behind these M&A transactions were not in the healthcare industry, they weren't considered business associates (such as billing companies, health care companies, or health insurers) under HIPAA, so the HITECH Act didn't apply, O'Connell explained.
“There's been some confusion about how you can exchange data in a merger and acquisition transaction when you're not actually allowed to exchange personal health information with the company that's trying to acquire you,” O'Connell said.
That's where the terms of use come into play. O'Connell called this the “Napster argument,” alluding to the former peer-to-peer file-sharing network that was permanently shut down in 2001 following multiple lawsuits related to music piracy.
“Napster wasn't stealing music, it was just creating a platform for people to share it,” O'Connell said. “So you come up with different arguments for how the regulations don't apply, and you create a fact pattern that fits your story until someone takes you to court and a judge decides.”
The main problem, West said, is that the U.S. doesn't have a national privacy law, meaning "there aren't a lot of regulations governing behavior in this area, so there's a wide range of companies. Some of them take privacy very seriously, while others don't."
“We're not against mental health apps,” West said. “The virtues are many. Not having to physically go to a doctor's office brings medical services to a wider range of people.”
“We just want to make sure people are aware of the risks and have better protections built in,” West added. “People should be aware of the privacy practices of the specific apps they use. You need to do your research and make sure the app has the protections each patient wants.”
—
Adriana Belmonte is a reporter and editor covering politics and health policy at Yahoo Finance. You can follow her on Twitter @adrianambells or reach her at adriana@yahoofinance.com.