All Things Legislation: The Trust & Safety Laws of 2022

December 12, 2022

This past year has been a big one in the Trust & Safety sphere. With new legislation proposed, passed, and enacted, there have been major changes in the laws surrounding the safety and security of users and platforms in every corner of the globe. To keep readers up to date, ActiveFence has compiled some of the most significant legislative developments of the past year.

Europe

Digital Services Act

The landmark legislation in Europe this past year was the Digital Services Act (DSA), which was first proposed in 2020, passed in 2022, and entered into force in November.

The DSA establishes platform responsibilities surrounding harmful content, with transparency reporting obligations to ensure clear accountability. It also seeks to standardize mechanisms for reporting harmful content in order to protect users’ fundamental rights. These obligations, which apply to platforms operating in the European Union, are intended to counter the spread of disinformation, prevent the sale of illegal goods or services, prevent the proliferation of child sexual abuse material and other illegal content, limit targeted advertisements, make key data accessible for research, and increase transparency, among other aims. Rules include obligations related to content takedowns, advertising, and algorithmic processes.

Platforms are required to implement appeal mechanisms, cooperate with trusted flaggers, and maintain transparency with users about algorithmic recommendations. For what the EU terms Very Large Online Platforms (VLOPs), obligations also include crisis response protocols, external and independent audits, and data sharing with authorities and researchers.

Dive into our in-depth DSA report to learn more about what platforms need to know.

EU Code Against Disinformation

The 2022 Code of Practice on Disinformation, which will be enforced starting December 16, 2022, outlines a risk mitigation strategy aiming to combat disinformation, ensure transparency in on-platform advertising, promote safeguards for the integrity of services, and empower users to flag and report violations. It applies to VLOPs, as outlined in the DSA, and requires twice-yearly reporting to a specialist task force. The code, in conjunction with the DSA and other pieces of legislation, will be part of a broader European framework on internet safety.

For more information, see our full report on the EU code.

TERREG

The EU’s TERREG legislation, which took effect in June 2022, aims to prevent the dissemination of terrorist content online through a range of obligations. Among them, platforms are required to meet a tight, one-hour deadline to remove or disable violative terrorism-related content after receiving a removal notice from a national authority. They must also supplement technological moderation tools with human moderators, produce transparency reports on their terrorism-related Trust & Safety efforts, and follow the duties of care laid out in the law. Non-compliance may result in fines of up to 4% of a company’s global turnover.

See ActiveFence’s one-pager on TERREG for more information.

UK Online Safety Bill

The United Kingdom’s Online Safety Bill, which, as of publication, is still being reviewed in the House of Commons, aims to be a defining piece of legislation in the field of Trust & Safety. The first bill to apply explicitly and exclusively to platforms hosting user-generated content, it requires companies to uphold a “duty of care” in order to maintain user safety. It obligates platforms to conduct risk assessments, specifically with regard to illegal content on platforms, risks of harm to children, and – separately – risks of harm to adults. The results of those risk assessments may necessitate new or adjusted platform policies in order to maintain compliance with the OSB.

For more information, ActiveFence’s legal team produced an in-depth report on the Online Safety Bill.

Recommendation On Impacts of Digital Technologies on Freedom of Expression

In April 2022, the Council of Europe set forth the Recommendation on the Impacts of Digital Technologies on Freedom of Expression, which outlines suggested guidelines for the council’s 46 member states, as well as for platforms, media organizations, and other private entities operating within them. It lays out various suggestions regarding content moderation tactics, assessments, transparency, and digital infrastructure design, among many other areas. While not legally binding, it has been adopted by all of the council’s member states.

South America

Resolution 23.714

Just ahead of Brazil’s presidential election in October 2022, the country’s electoral authorities unanimously approved a resolution that sets out to curtail election-related disinformation. It gives the Superior Electoral Court the ability to hold platforms accountable for violative content that threatens the electoral process and levies fines on companies that fail to comply with timed takedown orders, among other obligations.

Read ActiveFence’s report on this new disinformation regulation.

North America

Age-Appropriate Design Code Act

California’s Age-Appropriate Design Code Act (ACDA) was passed in the summer of 2022 and will take effect in 2024. A state-level law, it aims to protect children by mitigating risks associated with product design, data collection, privacy rights, and more. For-profit platforms that collect personal information and conduct business in California while meeting specific thresholds will be held accountable to the ACDA’s requirements; violators will face financial penalties.

Check out our guest blog on this child safety act by ActiveFence’s General Counsel.

Online Algorithm Transparency Act

In June 2022, a bill was introduced in the Canadian House of Commons in an effort to ensure that algorithms do not use personal information as a means of discriminating against individuals or groups. While it has not advanced in the House since its introduction, the Online Algorithm Transparency Act seeks to require transparency not only with regard to the basis of algorithms, but also with regard to content moderation and related enforcement decisions.

Asia

Expanded Anti-Trafficking in Persons Act of 2022

Bolstering the country’s defenses against abuses of online platforms for human trafficking, the Philippines in October 2022 passed the Expanded Anti-Trafficking in Persons Act of 2022. The law holds internet intermediaries responsible for gross negligence with regard to offenses occurring on their platforms, including violations related to trafficking, forced prostitution, organ sale, and child laundering, among others. Violations and non-compliance will carry financial penalties.

For more information on this legislation, see ActiveFence’s concise report on this new anti-trafficking act.

Online Safety (Miscellaneous Amendments) Bill

The Singapore Parliament in November 2022 passed the new Online Safety Bill, which tackles various aspects of safeguarding users in the face of digital harms and risks. It applies broadly to online communication services, a nearly all-encompassing term, and intends to regulate violative content, holding service providers accountable for its removal. Expected to take effect in 2023, the bill takes into consideration the unique threats to children as well as adults online, and will levy fines against platforms found in violation of its rules and requirements.

Africa

Industry Guidelines for Child Online Protection and Safety in Kenya, 2021

In March of 2022, Kenya set forth the Draft Industry Guidelines for Child Online Protection and Safety. Part of the country’s five-year plan to tackle online child sexual exploitation and abuse, the guidelines compel internet intermediaries and platforms to design digital spaces with children’s safety in mind, recommend industry-wide guidance on the detection and mitigation of threats, develop transparency mechanisms, and lay out reporting requirements, among other measures.

Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries

Nigeria’s National Information Technology Development Agency released the Code of Practice for Interactive Computer Service Platforms and Internet Intermediaries in June 2022, setting forth a number of guidelines and requirements for platforms. It mandates the takedown of content deemed violative under Nigerian law within 24 hours, the annual filing of compliance reports, and the completion of risk assessments and compulsory disclosures.

Oceania

Online Safety Act

Passed in 2021 and in effect since January 2022, Australia’s Online Safety Act expands the country’s online safety laws and the technologies in place to protect users. It holds service providers, such as UGC platforms, more responsible for the content that users post and requires industry-wide development of codes to regulate illegal and restricted content. Like the UK’s Online Safety Bill, Australia’s Online Safety Act addresses protections for children and adults separately, while also considering safety for all users, regardless of age.

Harmful Digital Communications Act

An amendment was added to New Zealand’s Harmful Digital Communications Act in March 2022, focusing specifically on the unauthorized posting of intimate visual recordings. While the legislation doesn’t specifically mention online platforms, it does outline the government’s ability to demand that offending content be disabled or taken down.

 

Interested in learning more about online safety regulations? Access our Trust & Safety Compliance Center and stay compliant.
