How mental health apps are yet to fulfil the promise of user data privacy

Usage of mental health applications reportedly rose 54.6% between 2019 and 2021

According to Grand View Research, the international mental health application market has been valued at $7.09 billion for 2024

With the onset of the COVID-19 pandemic, digital technology began to leave its mark on healthcare. Now, over three years into the post-pandemic world, that influence is believed to have become more prominent, especially around mental health. Usage of mental health applications has reportedly trended upward over the last couple of years. The concern, however, lies in the alleged inability of these applications to maintain data privacy. “I believe the pandemic has driven patients to opt for virtual consultations, reshaping their expectations of face-to-face interactions. When we navigate mental health applications, we seemingly begin a journey with a commitment from these companies to ensure that their privacy policies are in line with our understanding. The prospect of data breaches or sales, coupled with the identification of individuals, can carry consequences in a society that stigmatises those seeking resources for mental health protection,” Somdutta Singh, founder and CEO of Assiduus Global Inc, an AI-powered e-commerce accelerator, told FE TransformX.

Media reports indicate that usage of mental health applications rose 54.6% between 2019 and 2021. The pandemic not only drove up downloads of mental health applications but also underlined the need to address mental health problems. While these are understood to be positive signs, concerns have been raised about the data-security vulnerabilities of these applications. The National Library of Medicine, a biomedical library, has stated that data privacy problems associated with mental health applications include insecure cryptographic implementations, disclosure of sensitive information in logs and web requests, and unnecessary permissions. The library also noted that mental health applications lack proper defences to ensure protection against detectability, linkability, and identifiability. Market research has shown that unprotected mental health applications can be subjected to cyberattacks, pointing to a lack of security awareness among mHealth developers. A report published by Mozilla, a software developer, examined the privacy policies of 32 mental health applications and labelled 22 of them with a “privacy not included” warning, citing drawbacks such as lack of user control over data, absence of data-protection measures, and non-compliance with Mozilla’s Minimum Security Standards.

“I think prioritising privacy in mental health applications is important, especially given the sensitive nature of the data involved and the current digital revolution. Protecting user data with security measures and ensuring confidentiality is considered crucial for building trust and ensuring the ethical handling of personal health information. As we continue to integrate technology into healthcare, maintaining high standards of data privacy and security is believed to be non-negotiable. Drafting specific privacy legislation for mental health data is considered important. Such laws would provide a framework for handling sensitive mental health information, ensuring it is treated with care and confidentiality,” said Tarun Gupta, co-founder of Lissun, a tech-enabled mental and emotional health startup.

In 2023, cases of data infringement by mental health applications became evident. In March 2023, the Federal Trade Commission ordered BetterHelp, a mental health platform, to pay $7.8 million in compensation to its customers for sharing their personal information for advertising purposes with platforms such as Snapchat, Pinterest, Meta, and Criteo, without consumers’ consent. Beyond sharing personal data for advertising, mental health applications also reportedly channel this information into developing their services. Cerebral, a telehealth startup, admitted to sharing personal information belonging to more than 3.1 million users with companies such as TikTok, Google, Meta, and other third-party platforms. The data included birth dates, insurance details, patients’ names, and responses to mental health questionnaires. Market experts believe that despite the data privacy concerns, the market can still progress if policies are drafted to bring mental health applications under the same bracket as traditional health platforms. According to Grand View Research, a market intelligence firm, the international mental health application market was valued at $7.09 billion in 2024 and is poised to reach $15.42 billion by 2029, at a 16.82% compound annual growth rate (CAGR) over 2024-29.

Moreover, the synergy between artificial intelligence (AI) and mental health applications is expected to help ensure data privacy alongside ethical practices in mental health treatment. With mental health treatment no longer understood to be restricted to urban areas, implementing digital policies to protect users’ data is considered imperative. “In the future, mental health apps are likely to harness emerging technologies, such as augmented reality, to create immersive experiences for mental health issues. AI advancements may lead to more sophisticated mood and emotion analysis, enabling apps to provide personalised and adaptive interventions. The integration of biofeedback mechanisms could offer users real-time insights into their physiological responses. Additionally, a shift towards collaborative and community-driven mental health platforms may foster peer support networks. As ethical considerations gain prominence, developers are expected to prioritise transparency in AI decision-making processes and implement measures for user consent and data protection,” concluded Gorav Gupta, psychiatrist and co-founder of Emoneeds, a mental health subscription service.

This article was first uploaded on January 30, 2024, at 8:00 am.