
The Online Battlefield: Digital Warrior Tactics on Mainstream Social Media

Far-right digital groups are engaged in disinformation campaigns waged across multiple platforms, on both the clear and dark web. ActiveFence tracks these organizations, maps their behavior and studies their tactics to protect platforms from attack. Download the complete report below for a more in-depth analysis. 

Background

The 2020 US Presidential Election was held against the backdrop of the COVID-19 pandemic, a public angered by shuttered businesses and high unemployment, and a culture war raging over the very essence of the USA. Into this fractious context, groups launched strategic campaigns to aid Donald Trump’s re-election, aimed, in their words, at “Bringing sight to the blind, one red pill at a time” (to “red pill” someone means to wake them up to the “dark truths” about society).

The Online Battlefield

Over the past four years, as the political arena moved from the controlled set of the nightly news to the unfiltered feeds of social media, discourse has been democratized. This digital revolution has given each citizen a voice with which to speak and to question those in power. However, society now faces its unintended consequences: extreme actors seizing control of democratic tools to spread dangerous disinformation at scale.

Many far-right groups and conspiracy theorists believe themselves to be in a battle with “the establishment” and seek to build a conservative order, freed from “elite” interests. To fight this information war, organizations seek to mobilize as many accounts as possible. These groups have adopted a new strategy of compartmentalization and attack: organizing away from conventional social media platforms, and returning to launch coordinated campaigns across mainstream sites to sway public opinion.


The alternative social network, Parler, is often used in tandem with mainstream platforms like Twitter to spread disinformation

Digital Warrior Groups

The first group covered in this report was established in June 2020 and claims to have over 40K members. This right-wing group is engaged in digital warfare against mainstream news and the Democratic Party and aims to build a ‘digital army’ to “red pill” Democrat voters. The group was heavily promoted by leading figures such as former US National Security Advisor Michael Flynn on Twitter and Parler (to a combined 1.2M followers) as well as on YouTube. Flynn lent the group his professional respectability and platform, sharing its posts and materials with his followers.

The second group is a pro-Trump action group that operates similarly to the first. The leaders of this organization push their followers to engage with and disseminate its political messaging, using personal social media accounts to broadcast pre-prepared political campaigns. Subjects range from allegations against Black Lives Matter protests, to QAnon “Deep State” conspiracy theories, to COVID-19 disinformation, particularly about Hydroxychloroquine, a discredited treatment for the disease that became a wedge issue in the United States in 2020.

Disinformation promoted by members of a digital warrior group

Building & Deploying a Network of Far-Right Digital Soldiers

Both organizations use a strategy that mimics the bot campaigns of previous disinformation attacks: flooding social media with false information from multiple accounts. Instead of bots, however, they cultivate and co-opt authentic accounts into inauthentic agents that spread propaganda and misinformation widely. Both organize away from the mainstream platforms, so that the proliferation of content appears spontaneous and uncoordinated, evading detection by moderators.

Using a steady supply of pre-prepared memes and articles, group members work to erode trust in established narratives among users of mainstream social media platforms. After grooming susceptible users, these individuals invite new initiates to meet on unregulated platforms (such as Telegram and Parler, among others), where they are further radicalized in private training groups. There, recruits are supplied with the ammunition and know-how to convert others and run operations; once trained, they are sent out to join local state groups, building the network.

Each day in the run-up to the November election, the campaign leaders of the first group would post politically charged, misleading and often false “attack content” to Parler for mainstream accounts to share. The second group organized similarly from its website, where pre-prepared Tweets with links were generated daily for members to copy, paste and post from their own personal accounts. The material consisted of images and political memes, links to articles, and pre-written posts focusing on allegations of voter fraud, pedophilia within the Democratic Party, and other material designed to delegitimize the Democratic campaign.

Parler is among the most popular platforms used by Digital Warriors

Group Actions

One of the groups covered in this report produced a library of content and achieved enough traction to gain national US media attention. A post alleging that Joe Biden was seen holding the hand of the KKK Grand Wizard, captioned “Biden with Grand Wizard Of KKK. So who again is playing you, lying to you, using you for the votes, Creators of the KKK, opposed civil rights of blacks. Yup that’s the Democratic party”, entered the mainstream US news cycle. The post was designed to deflect accusations of racism on the right of American politics and to suppress African-American enthusiasm to vote. The allegation was false.

 

By dropping doses of truth into social media platforms and using a daily targeted strategy. It’s really quite simple, but incredibly effective. With sufficient repetition, our goal is [to] get Democrats to question what they’ve been told.
- Group Leader

 

The sustained power of the disinformation campaigns covered in this report was exemplified by a new study from the Digital Innovation and Democracy Initiative, which found that between October and December there were 1.2 billion US engagements with sites providing false and manipulative content on a single mainstream social media platform. In the same quarter on Twitter, almost one-third (47 million) of all links shared by verified accounts in the US were to sites hosting deceptive content.

The effect cannot be overstated; there are real-world consequences. Through the sheer volume of posts to social media, the QAnon conspiracy moved from a fringe phenomenon to a political force that could not be ignored. One QAnon follower, Anthony Comello, is on trial in New York for the murder of mafia boss Francesco Cali, whom Comello believed to be a member of the ‘Deep State’ controlling the USA. Meanwhile, the “stolen election” conspiracy pushed by former President Donald Trump, and promoted by the groups covered here among others, led to rioters storming the US Capitol Building to prevent the Electoral College votes for President from being counted. Control of the conversation has led to control of events.

Reacting

Platforms are increasingly being held accountable for the materials they host. In addition, the conversion of mainstream users into political assets poses a real risk and a complex challenge to Trust and Safety managers. Because the accounts sharing disinformation are not bots but authentic users amplifying a network’s agenda, they are difficult for individual moderators to identify as inauthentic. The result, if unchecked, is the proliferation of health and electoral disinformation, alongside growing calls to violence.

The compartmentalization of far-right digital warriors’ activities across various social networks requires a cross-platform perspective to identify and counter the threat. ActiveFence offers the perspective and intelligence capability to locate the sources of these attacks and to protect platform users from this content.

ActiveFence continuously works to track dangerous organizations worldwide and to identify disinformation campaigns. Our unique intelligence capabilities allow us to track harmful groups across platforms. In doing so, we can locate material designed to spread disinformation at its origin point. By finding the source of the materials, and the individuals responsible for creating them, we are able to block this material before it is posted to mainstream platforms and to identify the accounts that will seek to carry out future attacks.

For more details and examples, receive the complete whitepaper via email.
