Big social media companies will be required to assess the risks of legal but harmful content to adults that could arise on their services, set out how they will deal with it, and enforce those terms consistently. Here are the key recommendations:
- All sites hosting harmful content, including pornography and gambling, should have a duty to stop children accessing them.
- Social media algorithms should not promote self-harm, hatred and violence.
- Tech companies should appoint a safety controller.
- Individual users should be able to complain to an ombudsman when platforms fail to meet their obligations.
- Scams and fraud - such as fake adverts - should be covered by the bill.
Social media platforms will also have a new legal duty to prevent paid-for fraudulent adverts appearing on their services.