UK Social Media Platforms Will Now Be Held Accountable for Harmful Content
The regulator Ofcom will protect users from damaging content on social media platforms.
The media regulator Ofcom has been given the authority to oversee and regulate potentially harmful or illegal content on social media across the UK, the BBC reported on Wednesday.
Ofcom will now have the power to hold social media platforms accountable and protect people from cyberbullying, graphic violence, terrorism, and child abuse.
In addition to requiring platforms like Facebook and Twitter to remove harmful and illegal content, Ofcom will work to reduce the risk of such content appearing online in the first place. Until now, the government-approved regulator has overseen only broadcast media, not internet safety.
Social media platforms have long been self-regulating and have resisted outside oversight of their content; as a result, harmful and damaging content has been allowed to remain on their sites.
"There are many platforms who ideally would not have wanted regulation, but I think that's changing," Digital and Culture Secretary Nicky Morgan told the BBC.
The move comes nearly a year after the UK proposed a series of online safety laws. Former Prime Minister Theresa May vowed at the time to ensure internet user safety by fining social media companies and holding CEOs personally accountable for allowing harmful content to be distributed across their platforms.
The measure will increase online protections for children in the UK, helping to safeguard them from predatory abuse and trafficking. Ofcom will work to remove child sexual abuse material, including images of children who have been trafficked.
Online child sexual abuse can include verbal and physical harassment, grooming, and even the live-streaming of abuse, all of which can cause lasting harm, leaving many survivors with depression, anxiety, and addiction.
The National Society for the Prevention of Cruelty to Children (NSPCC) praised Ofcom’s new responsibilities, claiming that social media platforms’ previous attempts to self-regulate have been largely unsuccessful.
"Too many times social media companies have said: 'We don't like the idea of children being abused on our sites, we'll do something, leave it to us,'" said NSPCC Chief Executive Peter Wanless. "Statutory regulation is essential."
The UK government also wants to revive age verification requirements for certain websites to restrict access to pornography and prevent children from being targeted on sites displaying adult content.