A very good read from a respected source!
REPORT. Preventable yet pervasive: the prevalence and characteristics of harmful content, including suicide and self-harm material, on Instagram, TikTok and Pinterest.
THE METRO. Molly Russell’s dad says social media firms ‘still pushing out harmful content’
Sam Corbishley
Wednesday 29 Nov 2023 7:09 am
Social media algorithms are still ‘pushing out harmful content to literally millions of young people’ six years after schoolgirl Molly Russell ended her own life, her father has said.
Ian Russell said the findings of a report by the suicide prevention charity set up in his daughter’s memory ‘must now be seen as a fundamental systemic failure that will continue to cost young lives’.
Its release comes in the week Molly, who took her own life aged 14 in November 2017 after viewing suicide and other harmful content online, would have celebrated her 21st birthday.
Mr Russell, chair of trustees at the Molly Rose Foundation, said: ‘It’s saddening to see the horrifying scale of online harm and how little has changed on social media platforms since Molly’s death.’
The charity said it had found harmful content prevalent, and at scale, on Instagram, TikTok and Pinterest.
It added that on TikTok, some of the most viewed posts that reference suicide, self-harm and highly depressive content have been viewed and liked over one million times.
Last September, a coroner ruled schoolgirl Molly, from Harrow, north-west London, died from ‘an act of self-harm while suffering from depression and the negative effects of online content’.
The charity’s report, created in partnership with data-for-good organisation The Bright Initiative, saw the Foundation collect and analyse data from 1,181 of the most engaged-with posts on Instagram and TikTok that used well-known hashtags around suicide, self-harm and depression.
It warns that there is a clear and persistent problem with readily available harmful content, because many of the harmful posts it analysed were also being recommended by the platforms' algorithms.
The report noted that while its concerns around hashtags were mainly focused on Instagram and TikTok, its concerns around algorithmic recommendations also applied to Pinterest.
The Molly Rose Foundation said it was concerned that the design and operation of social media platforms was sharply increasing the risk profile for some young people because of the ease with which they could find large amounts of potentially harmful content by searching for hashtags or by being recommended content along a similar theme.
It said platforms were also failing to adequately assess the risks posed by features which enable users to find similarly-themed posts, and claimed that commercial pressures were increasing the risk as sites compete to grab the attention of younger users and keep them scrolling through their feed.
Mr Russell said: ‘The longer tech companies fail to address the preventable harm they cause, the more inexcusable it becomes. Six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives.
‘Just as Molly was overwhelmed by the volume of the dangerous content that bombarded her, we’ve found evidence of algorithms pushing out harmful content to literally millions of young people.
‘This must stop. It is increasingly hard to see the actions of tech companies as anything other than a conscious commercial decision to allow harmful content to achieve astronomical reach, while overlooking the misery that is monetised with harmful posts being saved and potentially “binge watched” in their tens of thousands.’
Mr Russell added that the findings highlighted how important the new Online Safety Act was, and that new online safety regulator Ofcom needed to be ‘bold’ in how it held social media platforms to account under the new laws.
‘Our findings show the scale of the challenge facing Ofcom and underline the need for them to establish bold and ambitious regulation that delivers stronger safety standards and protects young lives,’ he said.
In response to the report, Technology Secretary Michelle Donelan said it was ‘despicable and indefensible’ that social media firms were ‘still turning a blind eye to the scale of horrendous suicide and self-harm content’.
‘This is one of the reasons why the Online Safety Act includes numerous measures to protect both adults and children from such content,’ she said.
‘Ofcom are consulting on the illegal harms duties the Act will bring in so that social media companies are aware of what is expected of them, meaning laws can be implemented as quickly as possible.
‘These companies must not wait and instead should act now, to ensure we don’t see more tragic stories such as Molly’s, and I will be raising this directly with them in a meeting soon.’
A company spokesperson for Meta, which owns Instagram, said: ‘We want teens to have safe, age-appropriate experiences on Instagram, and have worked closely with experts to develop our approach to suicide and self-harm content, which aims to strike the important balance between preventing people seeing sensitive content while giving people space to talk about their own experiences and find support.
‘We’ve built more than 30 tools to support teens and families, including our Sensitive Content Control, which limits the type of content teens are recommended.
‘We continue to look for more ways we can help teens have age-appropriate experiences online and will be announcing further developments designed to make it even harder for teens to discover potentially sensitive content soon.’
A Pinterest spokesperson said: ‘Pinterest is committed to creating a safe platform for everyone.
‘We are constantly updating our policies and enforcement practices around self-harm content, including blocking sensitive search terms and evolving our machine learning models so that this content is detected and removed as quickly as possible.’
A TikTok spokesperson said: ‘Content that promotes self-harm or suicide is prohibited on TikTok and, as the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.
‘We continually invest in ways to diversify recommendations, block harmful search terms, and provide access to the Samaritans for anyone who needs support.’
THE STANDARD. Molly Russell’s father says tech firms still failing to stop the spread of suicide content online.
A new report from the Molly Rose Foundation says social media firms are failing to stop the promotion of self-harm and suicide content.
By LYDIA CHANTLER-HICKS
30 November 2023
The father of Molly Russell, who took her own life as a teenager six years ago, says tech platforms are still not doing enough to protect young people online.
Molly, from Harrow in north-west London, ended her life at the age of 14, after viewing suicide and other harmful content online.
In the week that would have marked her 21st birthday, suicide prevention charity the Molly Rose Foundation, set up in her honour, has published a report saying it had found harmful content prevalent on Instagram, TikTok and Pinterest.
Social media platforms suffer from significant, fundamental system failings in handling self-harm and suicide content, according to the report.
The charity said that on TikTok, some of the most viewed posts that reference suicide, self-harm and highly depressive content have been viewed and liked over one million times.
Ian Russell, Molly’s father and chair of trustees at the Molly Rose Foundation, said: “This week, when we should be celebrating Molly’s 21st birthday, it’s saddening to see the horrifying scale of online harm and how little has changed on social media platforms since Molly’s death.
“The longer tech companies fail to address the preventable harm they cause, the more inexcusable it becomes. Six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives.
“Just as Molly was overwhelmed by the volume of the dangerous content that bombarded her, we’ve found evidence of algorithms pushing out harmful content to literally millions of young people.
“This must stop. It is increasingly hard to see the actions of tech companies as anything other than a conscious commercial decision to allow harmful content to achieve astronomical reach, while overlooking the misery that is monetised with harmful posts being saved and potentially ‘binge watched’ in their tens of thousands.”
Wednesday’s report was created in partnership with data-for-good organisation The Bright Initiative, and saw the Foundation collect and analyse data from 1,181 of the most engaged-with posts on Instagram and TikTok that used well-known hashtags around suicide, self-harm and depression.
The report warns there is a clear and persistent problem with readily available and harmful content because many of the harmful posts it analysed were also being recommended by a platform’s algorithms.
The Molly Rose Foundation said it was concerned that the design and operation of social media platforms was sharply increasing the risk profile for some young people because of the ease with which they could find large amounts of potentially harmful content by searching for hashtags or by being recommended content along a similar theme.
The report noted that while its concerns around hashtags were mainly focused on Instagram and TikTok, its concerns around algorithmic recommendations also applied to Pinterest.
It said platforms were also failing to adequately assess the risks posed by features which enable users to find similarly-themed posts, and claimed that commercial pressures were increasing the risk as sites compete to grab the attention of younger users and keep them scrolling through their feed.
Mr Russell added that the findings highlighted how important the new Online Safety Act was, and that new online safety regulator Ofcom needed to be “bold” in how it held social media platforms to account under the new laws.
“Our findings show the scale of the challenge facing Ofcom and underline the need for them to establish bold and ambitious regulation that delivers stronger safety standards and protects young lives,” he said.
In response to the report, Technology Secretary Michelle Donelan said it was “despicable and indefensible” that social media firms were “still turning a blind eye to the scale of horrendous suicide and self-harm content”.
“This is one of the reasons why the Online Safety Act includes numerous measures to protect both adults and children from such content,” she said.
“Ofcom are consulting on the illegal harms duties the Act will bring in so that social media companies are aware of what is expected of them, meaning laws can be implemented as quickly as possible.
“These companies must not wait and instead should act now, to ensure we don’t see more tragic stories such as Molly’s, and I will be raising this directly with them in a meeting soon.”
A company spokesperson for Meta, which owns Instagram, said: “We want teens to have safe, age-appropriate experiences on Instagram, and have worked closely with experts to develop our approach to suicide and self-harm content, which aims to strike the important balance between preventing people seeing sensitive content while giving people space to talk about their own experiences and find support.
“We’ve built more than 30 tools to support teens and families, including our Sensitive Content Control, which limits the type of content teens are recommended.
“We continue to look for more ways we can help teens have age-appropriate experiences online and will be announcing further developments designed to make it even harder for teens to discover potentially sensitive content soon.”
A Pinterest spokesperson said: “Pinterest is committed to creating a safe platform for everyone.
“We are constantly updating our policies and enforcement practices around self-harm content, including blocking sensitive search terms and evolving our machine learning models so that this content is detected and removed as quickly as possible.”
A TikTok spokesperson said: “Content that promotes self-harm or suicide is prohibited on TikTok and, as the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.
“We continually invest in ways to diversify recommendations, block harmful search terms, and provide access to the Samaritans for anyone who needs support.”
If you’re struggling and need to talk, the Samaritans operate a free helpline open 24/7 on 116 123. Alternatively, you can email jo@samaritans.org or visit their site to find your local branch.