British SMS counselling service shared crisis conversations with researchers
A British SMS counselling service for people in mental health crises has passed millions of messages to third parties, despite an earlier promise on its website that was subsequently removed. This was revealed by research from the British weekly newspaper The Observer.
People can contact the counselling service Shout via text message if they are struggling with problems such as suicidal thoughts, self-harm, abuse or bullying. They will then receive a response from trained volunteers. The service promises confidentiality. According to the Observer, it is the largest service of its kind in the UK. It was founded by Prince William, among others.
As the newspaper reported over the weekend, more than 10.8 million messages from 271,445 consultations between February 2018 and April 2020 were used in a project at Imperial College London. Among the project’s aims was to develop tools that use artificial intelligence to predict suicidal thoughts. Messages from children younger than 13 were also used.
FAQ amended
The counselling service’s FAQ does say “anonymised and aggregated” data could be shared with partners for research purposes. However, the FAQ had originally also asserted that individual conversations would “never be shared”. The Observer reports this section was removed in spring 2021, after the period studied by Imperial College.
Details such as names, phone numbers and places of residence were removed from these conversations, according to Shout. The service claims this makes it “highly unlikely” that the data could be linked to individuals. However, the study used full conversations, which contained details about the problems of the individuals concerned. The Observer reports that one aim of the study was to extract personal information about those seeking help, such as their age and gender, from the messages, both manually and automatically.
Shout has existed since 2019 and was founded by Prince William, his wife Kate Middleton, and Prince Harry and Meghan Markle. The Duke and Duchess of Cambridge’s Royal Foundation has also invested £3 million in the project. The counselling service is operated by the foundation Mental Health Innovations. The foundation pointed out that users agree to its privacy terms: when they first contact the organisation, they are sent a link to the relevant information. Users can also read this later and have their data deleted if they wish. According to the Observer, the privacy policy was also changed last year to allow the data to be used for “a better understanding of mental health in the UK in general”.
“False promise”
Phil Booth of medConfidential, an organisation that campaigns for the protection of health data, criticised Shout, saying people in a crisis situation could not be expected to understand that their conversations would be used for any purpose other than to help them. Shout had made a “misleading” and “plainly false” promise.
A woman who sought help from Shout for anxiety and an eating disorder in 2019 and 2020 also told the Observer she had not known her messages could be used for research. “When you’re in crisis, you don’t think, ‘Is this information going to be used for research?’ You can spin it to sound good, but it feels like your vulnerability is being exploited.”
Criticism also came from the digital rights organisation Foxglove. Co-founder Cori Crider said the use of the messages for a study raised “serious ethical questions”. “If they first say in their FAQ that one-on-one conversations won’t be shared and then go on to train artificial intelligence on hundreds of thousands of full conversations, they have disregarded the feelings and expectations of vulnerable people.” Trust is particularly important in the health sector, she said, especially when it comes to mental health; a lack of trust can deter desperate people from seeking help.
Imperial College said the study “fully complied” with “rigorous ethical review guidelines”. Dr Sandra Leaton Gray of University College London, however, objected that parental permission should have been obtained before using messages from minors.
Data protection authority investigates the incident
A similar case recently came to light in the USA: at the end of January, the magazine Politico reported that the crisis helpline Crisis Text Line had passed on data from conversations to a company that develops customer-service software. Shout is an international partner of Crisis Text Line. After the collaboration became public, Crisis Text Line ended it. According to the Observer, British users were assured that their data was not affected.
Mental Health Innovations works not only with researchers but also with companies. However, no data from the counselling sessions has been passed on to them, the foundation explained. Victoria Hornby, chief executive of Mental Health Innovations, said Shout’s datasets were “unique” because they contained information about people in their own words; this could be of great benefit to research.
The case is now before the UK’s data protection authority, the Information Commissioner’s Office (ICO), which is currently investigating Shout and Mental Health Innovations’ handling of user data. A spokeswoman said, “When handling people’s health data, particularly children’s, organisations need to be particularly careful and put safeguards in place to ensure their data is not used or shared in ways they would not expect. We will evaluate the information and make enquiries into this matter.” (js)