Ofcom granted new powers to regulate user-generated content platforms
Ofcom has been granted new powers to force social media firms to respond to harmful content posted on their platforms.
Until now, Facebook, TikTok, YouTube and Snapchat have all self-regulated, defending their own rules on the removal of unacceptable content.
However, critics believe that in order to keep people safe, independent rules are required.
The move extends the remit of Ofcom, which was previously responsible for regulating television and radio broadcasters and handling complaints made against them.
It comes as a result of the Online Harms consultation, which considered the responses of 2,500 participants.
The regulations are directed at bodies with user-generated content, notably Facebook, Twitter, Snapchat, YouTube and TikTok.
The death of Molly Russell, who took her own life after she viewed graphic content on Instagram, has furthered calls for social media to regulate the content broadcast on their platforms.
As part of government plans for a new legal duty of care, the extent of the new powers will be announced later today.
As well as removing harmful content outright, including violence, child abuse, terrorism and cyber-bullying, platforms will also be expected to "minimise the risks" of such content appearing at all.
Baroness Nicky Morgan, the digital secretary, said: "There are many platforms who ideally would not have wanted regulation, but I think that's changing.
"I think they understand now that actually regulation is coming."
The NSPCC, the children's charity, has applauded the decision. A spokesperson said: "Too many times social media companies have said: 'We don't like the idea of children being abused on our sites, we'll do something, leave it to us.'
"Thirteen self-regulatory attempts to keep children safe online have failed.
"Statutory regulation is essential."
The penalties Ofcom will be able to impose have not yet been announced.