Tech giants to be held ‘legally responsible’ for harmful online content
The tough words came from Michelle Donelan MP, secretary of state for digital, culture, media and sport, in an open letter to parents and guardians that said the onus for protecting children and young people online sat “squarely on the technology companies’ shoulders”.
Donelan also stressed that children would be protected from material that wasn’t “strictly illegal or banned by platforms”.
In her letter, Donelan wrote: “Under this bill, TikTok, Snapchat, Facebook, Instagram and other social media companies will finally be held legally responsible for the content on their sites. They will be forced to protect their users, or face billion-pound fines and the potential blocking of their sites in the UK.”
The long-awaited Online Safety Bill, which is still working its way through Parliament, will shift the responsibility away from parents having to change their child’s computer settings or apply filters to shield them from harmful content.
Instead, technology companies must build these protections into their platforms or face “severe” legal consequences.
“I know how worrying it can be as a parent or guardian trying to protect your child online, when you cannot always see who they are talking to or what sites or apps they are visiting,” she added.
“So I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders.”
The bill will protect children and young people by:
• Removing illegal content, including child sexual abuse and terrorist content
• Protecting children from harmful and inappropriate content, from cyberbullying and pornography to posts that encourage eating disorders or depict violence
• Putting legal duties on social media companies to enforce their own age limits – which on almost every platform are set at age 13, yet are rarely enforced
• Making tech companies use age checking measures to protect children from inappropriate content
• Making posts that encourage self-harm illegal for the first time – both for children and adults
• Ensuring more transparency on the risks and dangers posed to children on the largest platforms, including by making tech companies publish risk assessments.
Safeguards for adults
Although the strongest protections in the bill are to protect children and young people, safeguards for adults have also been included.
Adults will be protected from posts that are illegal and from content prohibited by the social media companies in their own terms and conditions, and they will be given more control over the content they see in their own social media feeds.
Beyond illegal content
Donelan added: “We are not just shielding younger users from illegal content. We are also protecting them from material that is not strictly illegal or banned by platforms, but that can cause serious trauma nevertheless.
“We have already seen too many innocent childhoods destroyed by this kind of content, and I am determined to put these vital protections for your children and loved ones into law as quickly as possible.”
Earlier in the year, financial fraud on social media and dating apps was added to the bill to help tackle romance fraud, where people are manipulated into sending money to fake identities.
The inclusion would also tackle fake investment opportunities posted by users, but it was criticised because fraud via advertising, emails or cloned websites was omitted.
Next steps for the bill
The new laws are progressing through Parliament. The government is working closely with the regulator Ofcom to lay the groundwork for the laws to be enforced once they are introduced.
The government said it will take a phased approach to bringing in the Online Safety Bill’s duties as Ofcom’s powers come into force.
It added that Ofcom will initially focus on illegal content, to address the most serious harms as soon as possible.