TikTok, Snapchat and Twitch could face fines under strict Ofcom regulation


Ofcom has issued new guidance to video-sharing platforms (VSPs) including TikTok, Snapchat, and Twitch about how they must protect their users from harmful online content.

The communications regulator has the power to levy fines of up to £250,000 if the companies don’t properly tackle content relating to terrorism, child abuse, or racism.

In the most serious cases of breaching the ten measures set out in the legislation, a video-sharing service could be suspended or restricted in the UK.

Ofcom says that 70% of users report having been exposed to harm online. Hateful content is the most widely encountered, reported by almost a third of users (32%), while more than a fifth (21%) have come across racist content.

Image: Ofcom says most people using video platforms have experienced harm

The rules apply only to video content, so they won't cover racist comments or still images, for instance.

Nor do they cover on-demand services such as Amazon Prime or Now TV, which are regulated separately; only platforms that allow users to upload and share videos fall under the regime.

The new guidance stems from European legislation that became British law after Brexit, but it will be superseded by the UK's own Online Safety Bill, which is currently being scrutinised by parliament.


But the legacy from the EU legislation means that the biggest platforms for sharing video content, YouTube and Facebook, still fall under Irish jurisdiction.

Unlike in its work with broadcast content, the regulator won't assess individual videos; instead it will focus on the processes and mechanisms that VSPs have in place.

Dame Melanie Dawes, the chief executive at Ofcom, said: “Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them.

Image: The rules only apply to video content

“The platforms where these videos are shared now have a legal duty to take steps to protect their users.

“So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”

To comply with the rules, VSPs must:

Provide clear rules around uploading content, ensuring users know uploading videos “relating to terrorism, child sexual abuse material or racism is a criminal offence”;

Have easy reporting and complaint processes “that allow users to flag harmful videos easily” and “signpost how quickly they will respond, and be open about any action taken”;

Restrict access to pornographic websites using "robust age verification…to protect under-18s from accessing such material".
