During a disinformation campaign, social media platforms delete inauthentic accounts and posts, only to find them quickly replaced by replicas. This ActiveFence report investigates the role of file sharing platforms in organizing agents of disinformation – download the complete report below.
The term disinformation, or ‘dezinformatsiya’, was coined by Joseph Stalin, Premier of the Soviet Union. The official 1952 Soviet dictionary defined it as the “dissemination (in the press, on the radio, etc.) of false reports intended to mislead public opinion”. Since then, state and non-state actors have used disinformation campaigns against many population groups to damage rivals and achieve political aims.
Today, “fake news” is never far from the news agenda. Since 2016, we have seen coordinated interference via information operations in the UK Brexit referendum and US Presidential election that year, the 2017 French Presidential election, and the 2018 Taiwanese local elections, to name a few. As last year’s US Presidential campaign reached its peak, so too did disinformation regarding COVID-19 and the vaccines created to counter it. Conspiracy theories about the integrity of the US elections were commonplace and remain persistent, while a significant level of distrust has been implanted in societies around the world.
The Spreading of Disinformation
Disinformation campaigns build credible counter-narratives to prevailing understandings in order to convince innocent users that false positions are true, or at least plausible. An example of successful disinformation is the QAnon conspiracy theory, which spread out of 4chan beginning in October 2017.
The QAnon narrative held that President Donald Trump was waging a secret war against Satan-worshipping pedophiles in media, politics, and Hollywood. This story has been completely discredited. Yet, according to an IPSOS poll conducted in December 2020, just 47% of US adults surveyed were able to correctly identify this outrageous story as false. This result was not accidental: the narrative was created and deliberately spread by QAnon disinformation agents.
The success of QAnon’s disinformation campaign was the result of a continuous war of attrition fought against the public via social media platforms. To be effective, those spreading disinformation must share a large volume of false content consistently, to permeate the consciousness of innocent users, while also generating new accounts quickly to replace those blocked by platform moderators.
Preventing the proliferation of such disinformation requires a full perspective on these harmful organizations and how they operate.
While the common perception is that disinformation is spread by bad actors using social media platforms, in reality a broader ecosystem and infrastructure provide operational support. This infrastructure includes messaging platforms for group coordination and file sharing platforms that ensure group resilience and message amplification. It is to these file sharing platforms that we turn.
The Role of File Sharing Platforms in Information Operations
To understand how best to counter these disinformation and psy-ops campaigns, ActiveFence investigated the workings of a number of disinformation organizations. We mapped their structure, tools, and presence across multiple platforms. Analyzing two case studies, detailed in the downloadable report, we will show how disinformation campaigns rely on file sharing platforms to organize effectively and counter moderation efforts.
Material Preparation and Organization
The sophistication of many disinformation campaigns points to centralized planning and resource sharing. ActiveFence locates and studies organizations engaged in spreading disinformation to learn more. One example is QPatriots, a Telegram group of conspiracy theorists and supporters of former President Trump with almost 13,000 members. This group worked to assist the President’s reelection campaign and to attack his enemies.
In our mapping of the organization, we found that at its heart was a bank of shareable information and memes, hosted on the file sharing site Mega.nz. The Mega account contained 3.4GB of data, with 12,500 itemized memes arranged by topic and by targets such as Joe Biden, Facebook, Twitter, and the Clintons. The file sharing platform allowed the group to gather user-generated content in one place, where it could be shared by all users simultaneously. With a single command, every member of the group could find a specific set of material and direct it at the target. This strategy is not unique to QPatriots: ActiveFence has identified similar organizational strategies in use by other bad actor groups propagating conspiracy theories and false information.
While the storage and sharing of materials were essential to meeting their goals, ActiveFence also investigated how disinformation networks regenerate deleted accounts and return to social media platforms so quickly after removal by moderators. Studying a number of organizations, we found that their core users exist on multiple platforms at once, serving as backups of one another, so that if one account is deleted it can be regenerated quickly and easily. One example, detailed in the report below, is a QAnon-supporting account that existed across 17 platforms, including four accounts on file sharing platforms (each created on the same day) with identical user information. These accounts were used to store Q-Drops (messages from ‘Q’, the fictional character at the heart of the QAnon conspiracy) and other shared files for posting, to provide links to the other accounts in the network, and to act as cross-platform backups should any one account be lost.
Disinformation campaigns are games of repetition: false information is propagated until users find it difficult to know what is true and what is false. To succeed, organizers must be able to flood social networks with messaging about their narratives. To prevent this guerrilla warfare, platform moderators must be able to stop the influx of coordinated inauthentic material in the first place. File sharing platforms offer an important resource to disinformation campaign organizers, both as an organizational tool and as a means of ensuring campaign continuity in the face of platform moderation.
To combat these organizations effectively as a whole, a cross-platform approach is required. ActiveFence investigates harmful actors, mapping their operations across the internet to find and provide all the information necessary to combat them.