Social media companies should have a legal “duty of care” to children, the UK Parliament’s Science and Technology Committee has recommended.
The group’s report said social media had helped facilitate bullying and online grooming of children.
It called for a regulator to be established to take “enforcement action” if apps broke the law.
A number of privacy and data scandals have led to calls for social media companies to be regulated.
Facebook’s new vice-president Sir Nick Clegg has already pledged to do “whatever it takes” to make the company’s platforms safer for young people.
Sir Nick was responding to the case of 14-year-old Molly Russell, who took her own life after viewing distressing self-harm images on Instagram in 2017.
Her father said he believed Facebook-owned Instagram was partly responsible for his daughter’s death.
What is the report about?
The committee has been looking at the effect of social media and so-called screen time on the health of children.
During its inquiry, it surveyed more than 3,000 young people.
It said social media had positive attributes, but also led to:
- damaged sleep patterns
- body image issues
- online grooming
It said these risks had existed before social media, but the popularity of apps had helped “facilitate” problems, especially child abuse.
The report, titled Impact of Social Media and Screen-Use on Young People’s Health, was released on Thursday.
It was produced by a cross-party group of MPs including Liberal Democrat Norman Lamb, Conservative Damien Moore, Labour’s Liz Kendall and the SNP’s Carol Monaghan.
What does the report recommend?
The report said social media companies should have a formal legal duty of care to their users. Its other recommendations included:
- establishing a regulator to minimise the harms of social media and take enforcement action
- making sure rules and policies are consistent across video-sharing sites, social networks and search engines
- a new partnership between the government, tech companies and law enforcement to tackle child exploitation online
Despite the Cambridge Analytica data scandal, in which a university researcher gave the Facebook data of millions of people to a political consultancy, the report also said social networks should share more data with researchers.
It argued that this could help websites identify those at risk and improve safety measures.
The government is expected to publish a White Paper about online harms in the coming months, and the committee urged it to use this as an opportunity to act.
Appetite for regulation
The Royal Society for Public Health (RSPH) said it “welcomed” the report. Its director of external affairs, Duncan Stephenson, gave evidence to the committee.
He said in a statement: “We fully support the select committee calls to help researchers and others to better understand the long-term effects… to our wellbeing.”
On Wednesday, England’s Children’s Commissioner, Anne Longfield, published her own open letter calling on social media firms to take more responsibility for protecting children from disturbing content.
She urged them to back a legal duty of care obligation, and finance a digital ombudsman to act as an independent party between young people and the tech giants.
Andy Burrows, from the children’s charity NSPCC, said in a statement: “For far too long social networks have been allowed to operate in a Wild West environment and put children at unacceptable risk.
“The government now has a crucial opportunity to set out a comprehensive plan to protect children online.
“This must include an independent statutory regulator with enforcement powers, that can impose strong sanctions on platforms that fail to keep children safe.”