'How private is your mental health app data? An empirical study of mental health app privacy policies and practices' by Lisa Parker, Vanessa Halter, Tanya Karliychuk and Quinn Grundy, (2019) 64 International Journal of Law and Psychiatry, comments:
Digital mental health services are increasingly endorsed by governments and health professionals as a low-cost, accessible alternative or adjunct to face-to-face therapy. App users may suffer loss of personal privacy due to security breaches or common data-sharing practices between app developers and third parties. Loss of privacy around personal health data may harm an individual's reputation or health. The purpose of this project was to identify salient consumer issues related to privacy in the mental health app market and to inform advocacy efforts towards promoting consumer interests. We conducted a critical content analysis of promotional (advertising) materials for prominent mental health apps in selected dominant English-speaking markets in late 2016-early 2017, updated in 2018. We identified 61 prominent mental health apps, 56 of which were still available in 2018. Apps frequently requested permission to access elements of the user's mobile device, including requesting so-called ‘dangerous’ permissions. Many apps encouraged users to share their own data with an online community. Nearly half of the apps (25/61, 41%) did not have a privacy policy to inform users about how and when personal information would be collected, retained or shared with third parties, despite this being a standard recommendation of privacy regulations. We consider that the app industry pays insufficient attention to protecting the privacy of mental health app users. We advocate for increased monitoring and enforcement of privacy principles and practices in mental health apps and the mobile ecosystem more broadly. We also suggest a re-framing of regulatory attention that places consumer interests at the centre of guidance.