The Case for Transparency Reports

Transparency reports detail online platforms’ content moderation and data-sharing activities. Offering many benefits, transparency reporting is becoming increasingly common among both small and large online services.

These reports give platforms an opportunity to earn users’ trust by openly disclosing their activities relating to user privacy, freedom of expression, and safety – the things users care about most. However, with no standardization, there is little clarity around what a transparency report should look like. And for many companies, whether it is worthwhile to allocate resources to this costly and time-consuming undertaking remains an open question.

Here, we explain what a transparency report is, the laws that require publishing the reports, and how companies can begin the reporting process.

The History

The history of transparency reports dates back to 2010, when Google began to disclose information regarding digital governance and enforcement measures. Google essentially wanted “to provide hard evidence of how laws and policies affect access to information online.” In the following years, Twitter (2012), Microsoft (2013), Apple (2013), Yahoo (2013), and Facebook (2013) all published their first transparency reports. At the time, these reports only shared data on government requests for user information and content takedowns. Since then, however, they have expanded to include companies’ internal content moderation metrics. In 2015, Etsy published the first policy enforcement report. In 2018, the Santa Clara Principles promoted industry-wide content moderation reporting and set guidelines for transparency and accountability.

Today, transparency reports are far more common. According to Access Now, a digital rights advocacy organization, 88 companies released transparency reports in 2021. With new regulations coming into play and an increasingly critical public, this number is expected to grow.

So, What is a Transparency Report?

As mentioned, transparency reports vary in the information they share, though all connect back to transparency about company activities. We define transparency reports in our Trust & Safety Glossary as:

“Voluntary reports that provide transparency into the ways that data is handled and moderation decisions are made. This report communicates key metrics that can include: the volume, type and region of detected content, how that content was detected, what actions were taken, the volume of and reactions to appeals, and changes over time – among other metrics.”

Transparency Report Content

Initially, as mentioned above, transparency reports shared only law enforcement requests for customer data and demands to remove content. This addressed public concern over data privacy and government interference at the time, as it still does today. As the trend grew, companies began to share their own content and platform enforcement measures in addition to government-related data. Often, the content of a transparency report sheds a positive light on a company: it showcases how a platform cares about its users by sharing the actions taken to protect privacy, enforce policies that protect users, and respond to user appeals.

Legal Requirements for Publishing Transparency Reports

Transparency reports aren’t only about positive public perception; regulators worldwide are also beginning to require them. The recent passage of the Digital Services Act, which mandates transparency reports even for smaller online services, signals a wave of regulation likely to include similar obligations.

A few countries already require this. For example, Germany’s Network Enforcement Act (NetzDG), passed in 2017, requires large social media companies to publish biannual reports on how they handle complaints about unlawful content. TERREG, the EU’s regulation on terrorist content, which came into effect in June 2022, requires platforms to produce annual transparency reports to demonstrate compliance.

To learn more about the regulations that require transparency reports and their reporting obligations, download our guide to transparency reports.

To learn more about the DSA’s requirements, download our guide here.

Challenges & Solutions

When publishing reports, companies should be aware of the following challenges.

Allocating resources

The challenge: Publishing transparency reports requires Trust & Safety teams to implement processes for tracking content moderation efforts, which can be expensive and time-consuming. Additional resources are needed for the design, launch, maintenance, and expansion of reports.

The solution: Teams should already be compiling relevant data in order to track Trust & Safety KPIs. Learn more about measuring Trust & Safety here. Furthermore, it is important to remember that users do not necessarily expect a platform to publish transparency reports, which gives companies room to start small and scale up at their own pace.

Lack of Standardization

The challenge: Currently, there is no single way to classify and report platform activities. What constitutes meaningful transparency isn’t always clear, so different platforms report different metrics based on their own understanding of what it means to be transparent. This lack of consistency and standardization makes it challenging for companies looking to begin reporting on their activities.

The solution: The lack of standardization should not dissuade companies; it can even be advantageous. It leaves room for innovation and for developing new standards that make transparency more meaningful.

Potential for backlash

The challenge: In rare cases, companies have received backlash for under-reporting. That is, companies receive criticism for omitting certain information and are accused of only sharing data that shows them in a positive light. 

The solution: Typically, the companies that receive backlash for their reports are high-visibility platforms; such criticism is unlikely for small- to medium-sized platforms. Still, to avoid this risk, platforms should strive for full transparency and provide as much information as possible. Reports should also be clear enough for users to understand that proper systems and processes are in place, increasing trust in a platform. Another practice worth implementing is real-time transparency: when a user’s account or content is suspended or removed, the user receives a clear explanation of the violation and can appeal. This, too, contributes to building trust with platform audiences.

The Benefits of Transparency Reports

Transparency reporting offers online platforms many benefits, the most obvious of which is what it signals – trustworthiness.

Public Perception

Transparency reports signal that a company is sincere about being open, safe, fair, and honest. They are a tool for communicating company values, showcasing a platform’s commitment and efforts to keep its service safe and fair. By building or rebuilding trust, reports strengthen relationships with users, reassuring them that their communications, online presence, and representations of who they are remain safe.

Holding governments accountable

Transparency reports reveal how governments interact with user data and the demands they place on platforms. This holds governments accountable for respecting user privacy, something many users care about deeply.

Influence on public policy

Often overlooked, transparency reports can also contribute to public policy discourse. By reporting on extremist activity, propaganda, disinformation, and more, they add context to conversations about how to combat these issues. Policymakers, think tanks, governments, and platforms can all benefit from this data. Being transparent about these issues also creates opportunities for the industry to collaborate on solutions.

Next Steps

Companies should plan their next steps in implementing transparency reports to gain users’ trust, demonstrate transparency, and meet potential requirements. To do so, Trust & Safety teams should:

  1. Understand relevant legal obligations for reporting 
  2. Obtain buy-in from the rest of the company for cooperation and support throughout the process
  3. If required by law, determine the frequency and details necessary to report and what resources will be needed. In the case of no legal obligations, evaluate what your team can realistically handle 
  4. Implement procedures to track and record relevant data for reporting
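For step 4, much of the work is turning raw moderation-action logs into the headline metrics a report publishes. Below is a minimal sketch of what that aggregation might look like; the `ModerationAction` record and its field names are illustrative assumptions, not a standard schema, and a real pipeline would draw on a platform's own data model.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record of a single moderation action; the fields are
# illustrative, not a standard transparency-reporting schema.
@dataclass
class ModerationAction:
    policy: str       # policy violated, e.g. "spam", "harassment"
    detection: str    # how it was found: "automated" or "user_report"
    action: str       # enforcement taken, e.g. "removed", "restricted"
    appealed: bool    # did the user appeal?
    reinstated: bool  # was the action overturned on appeal?

def summarize(actions: list[ModerationAction]) -> dict:
    """Aggregate raw action logs into the kind of headline metrics a
    transparency report might publish (volume, type, detection method,
    appeal counts and outcomes)."""
    appeals = [a for a in actions if a.appealed]
    return {
        "total_actions": len(actions),
        "by_policy": dict(Counter(a.policy for a in actions)),
        "by_detection": dict(Counter(a.detection for a in actions)),
        "appeals": len(appeals),
        "reinstatements": sum(a.reinstated for a in appeals),
    }

# Example: three logged actions rolled up into report metrics.
log = [
    ModerationAction("spam", "automated", "removed", False, False),
    ModerationAction("harassment", "user_report", "removed", True, True),
    ModerationAction("spam", "automated", "removed", True, False),
]
print(summarize(log))
```

Running the summary over each reporting period (and storing the results) gives a team a consistent baseline it can start small with and expand as its report matures.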

As we’ve learned, transparency reports shed light on how companies moderate content and interact with government entities. They are a win-win for online platforms: they meet regulatory obligations while giving platform communities visibility, increasing openness and trust with users.

To learn more about the Digital Services Act’s requirements, download our legislative review.