The UK government has brought back a revamped Online Safety Bill with significant changes.
But what do these changes mean?
At its core are clear protections for children and newly strengthened laws against the encouragement of self-harm and the non-consensual distribution of intimate images.
The big step forward in this Bill is that social media companies that do not follow their own terms and conditions will now face fines of up to 10% of their global turnover. For companies like Meta, Twitter, and TikTok, that could mean billions of pounds in fines.
And this is significant because platforms’ rules cover all the things we care about: racism, misogynist abuse, eating disorder content, self-harm and more. The problem is they don’t enforce them.
We see this all the time in our research:
When Instagram failed to act on 9 out of 10 reports of misogynistic abuse, it faced no consequences. Under the Bill, it could now be fined for failing to consistently enforce its standards against such hate.
When Twitter failed to act on 99% of anti-LGBTQ+ hate reported to them, they could now face fines for their failures.
When our researchers reported hundreds of racist anti-Jewish posts to social media firms using their own user reporting tools, 84% were not acted upon. Those firms could now face fines for their failure to act.
CCDH has been campaigning for platforms to enforce their rules since our inception.
The Online Safety Bill would do just that. Our mission has always been to increase the economic, social, and political costs for people who spread hate and misinformation and the platforms that enable them. This Bill is a massive step forward to that end.
The focus now has to be on getting this Bill back to Parliament and making it as strong as possible, with powers for the regulator, Ofcom, to check that platforms are keeping their promises and to hold them accountable when they fail.