The United Kingdom Introduces The Online Safety Bill

Felix Yim 17/03/2022
The UK government is set to introduce its long-awaited Online Safety Bill in Parliament on Thursday.

New measures include tougher and quicker criminal sanctions for tech bosses and new criminal offences for falsifying and destroying data.

Internet users in the UK are one step closer to a safer online environment as the government's new world-leading online safety laws are brought before Parliament today.

The bill is intended to tackle a wide range of harmful online content, such as cyber-bullying, pornography and material promoting self-harm.

Social media platforms could be fined or blocked if they fail to remove harmful content, and their bosses could be imprisoned for non-compliance.

The bill's regulator, Ofcom, will have the power to request information from companies. Executives who fail to comply could face up to two years in prison, with these criminal sanctions taking effect within two months of the bill becoming law.

Senior managers would also be criminally liable if they destroyed evidence, failed to attend an Ofcom interview, provided false information, or otherwise obstructed the regulator when it entered their offices.

Any firm breaching the rules would face a fine of up to 10% of its turnover, while non-compliant websites could be blocked entirely.

"Tech firms haven't been held to account when harm, abuse and criminal behaviour have run riot on their platforms," said Nadine Dorries, Secretary of State for Digital, Culture, Media and Sport.

One of the new aspects of the bill is the introduction of a "right to appeal" for people who feel their social media posts have been taken down unfairly.

Big social media companies will be required to assess the risks of legal but harmful content to adults arising on their services, set out how they will deal with it, and enforce those terms consistently. Here are the key recommendations:

- All sites hosting harmful content, including pornography and gambling, should have duties to stop children from accessing them.

- Social media algorithms should not promote hatred and violence.

- Tech companies should appoint a safety controller.

- Individual users should be able to complain to an ombudsman when platforms do not meet their obligations.

- Scams and frauds - such as fake adverts - should be covered.

Social media platforms will also have a new legal duty to prevent paid-for fraudulent adverts appearing on their services.

Felix Yim

Tech Expert

Felix is the founder of Society of Speed, an automotive journal covering the unique lifestyle of supercar owners. Alongside automotive journalism, Felix recently graduated from university with a finance degree and enjoys helping students and other young founders grow their projects. 
