Facebook has said it is “deeply sorry” after it emerged a teenager who took her own life had viewed disturbing content about suicide on social media.
Molly Russell, 14, died in 2017. Her father Ian says he believes Instagram “helped kill my daughter”.
Facebook, which owns Instagram, said graphic content which sensationalises self-harm and suicide “has no place on our platform”.
Advertisers have also raised concerns over ads being next to such posts.
According to a BBC investigation, adverts for some UK high street brands are appearing alongside graphic content about self-harm, depression and suicide on the social media app.
Instagram says adverts are not targeted to appear next to certain videos or content.
Mr Russell earlier told the BBC how after his daughter died, the family began to look at the Instagram accounts she had been following from people who were depressed, self-harming or suicidal.
“Some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter.”
On Wednesday, Facebook executive Steve Hatch responded, saying: “The first thing I’d like to say is just what a difficult story it was to read and I, like anyone, was deeply upset.
“I’m deeply sorry for how this must have been such a devastating event for their family.”
When confronted with print-outs of Instagram posts showing graphic photos of self-harm, he said: “We’d have to make sure that we look at these and ensure that those are taken down if they are against our policies.
“If people are posting in order to seek help and in order to seek support from communities, the experts in this area tell us that is a valuable thing for them to do. It can help with recovery, it can help with support.
“If it’s there to sensationalise and glamourise, of course it has no place on our platform, it shouldn’t be on our platform. And if we need to work harder to make sure it isn’t on our platform then we certainly will.”
Adverts next to graphic posts
Separately a BBC investigation found that some of the brands whose ads appeared next to disturbing images and videos include Dune, Marks and Spencer, the Post Office and the British Heart Foundation charity.
They were all unaware of the problem, said they would never deliberately advertise next to such content and were committed to working with social media companies to tackle the issue.
ISBA – the trade body for advertisers – has raised concerns about adverts appearing alongside Instagram posts.
Phil Smith, the head of ISBA, said: “Brands do not want to see their advertising appearing in this context.
“What we need is an independent oversight body funded by the industry, potentially international in scope, which stops the platforms marking their own homework and that can give confidence to the public, the politicians and the advertisers that content is being properly independently moderated.”
Molly’s father said: “The truth is that the internet is making money out of other people’s misery and it shouldn’t be.
“I mean that’s just dreadful, that’s immoral – and it’s not taking enough steps to prevent that – it’s not taking enough steps to safeguard young people’s lives.”
Asked how brands can trust Facebook and Instagram, Mr Hatch said companies “want to make sure that we’re living up to the responsibilities that they have of us and I think we can always improve”.
“But there are areas where we’ve made significant amounts of investment, huge amounts of focus on trying to get this right. But it is recognised that this is a complex area.”
Footwear retailer Dune said it was deeply shocked and saddened by the issue and would never deliberately advertise alongside such content, while Marks and Spencer said it would be “seeking additional assurances from Instagram”.
The Post Office said it would “never target ads based on inappropriate or harmful content” and the British Heart Foundation said “we will be asking Instagram to act swiftly to prevent such content from being so easily accessible, shared and to protect people from viewing it”.
‘Ads based on interests’
Instagram said: “We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and work hard to remove it.
“However, for many young people, discussing their mental health journey or connecting with others who have battled similar issues, is an important part of their recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
Responding to concern over the placement of adverts, it said: “Ads on Instagram are not targeted to appear next to certain videos or content.
“Ads people see are based on interests, not the content you see above and below those ads.”
A spokesman for Prime Minister Theresa May called Molly’s death a “tragic case”, adding that she had made clear social media companies had “a responsibility to regulate content on their platforms” and needed to “step up and address these concerns”.
It comes after suicide prevention minister Jackie Doyle-Price announced that the government was aiming to reduce suicides by at least 10% by 2020 – in part by working “collaboratively” with social media and tech companies.