In 2024, 57 national elections will take place throughout the world, a historic event that will not happen again until 2048. The world will be watching as voters head to the polls to elect the next US president, and as major elections take place in the EU, India, Mexico, and potentially in the UK, Russia, and Ukraine, all in the same year.
Source: Anchor Change Election Cycle Calendar
Just as the convergence of these events is unprecedented, so is their anticipated impact on the geopolitical landscape. For Trust & Safety teams, the risks surrounding election integrity are higher than ever. Specifically, this sequence of elections presents a prime opportunity for threat actors to spread misinformation across platforms while exploiting GenAI tools to amplify reach.
With these risks in mind, here are ten elections that content moderators should follow in 2024.
Similar to the past two US presidential elections in 2016 and 2020, the upcoming 2024 elections are expected to pose significant online and real-world risks. The January 6, 2021 unrest at the US Capitol further underscores the potential for false and misleading claims to result in violence, especially when those narratives are amplified by prominent figures.
With more than a year to the election, false narratives designed to undermine trust in the electoral process and instill doubt in the US political system are already spreading online. Extremists are expected to intensify their efforts to mobilize around flashpoint cultural issues such as race, immigration, and LGBTQ+ rights. There remains a further risk of violence perpetrated by individuals radicalized by far-right, nationalist, and/or racist narratives online.
Domestic actors, including PR and communications firms, political campaigns, and conspiracy theorists are also very likely to use organic and inauthentic methods to spread mis/disinformation during the election. Foreign actors, especially Russia, China, and Iran, are almost certain to conduct election-related influence operations designed to sow confusion, increase social tensions, and undermine confidence in institutions.
Read more about election-related misinformation narratives ahead of the US presidential election.
One of the most important national votes in 2024 will take place in India, the world’s largest democracy. India’s culturally diverse population presents a distinct set of risks, as various online actors aim to create tension and incite social unrest. The precedent for election-related violence and growing concerns about the health of India’s democracy will make the upcoming election critical for Trust & Safety teams.
Information operations and disinformation in India are often designed to amplify existing social divisions and fuel intercommunal violence. Of equal importance, Russia and China both have significant geopolitical interests in the region and engage in foreign influence operations in India.
Harmful content in India often takes the form of anti-Muslim conspiracy theories, false and misleading claims about politicians, hate speech, and incitement to violence. Common tactics include deceptive practices such as coordinated behavior and spoofing international media websites.
Learn how prominent narratives that gained momentum before the May 2023 Karnataka state election can offer valuable insights into the emerging trends ahead of India’s upcoming national elections.
Ukraine’s 2024 elections are uncertain due to a martial law prohibition of elections during times of war. Nonetheless, Russian actors are expected to attempt to interfere in the vote as a byproduct of the ongoing conflict between Kyiv and Moscow. Ukrainian political parties and PR companies are also known to use bot farms, trolls, and bloggers to spread misinformation about rivals.
There is a strong possibility that social media accounts operated by soldiers with large followings could also be deployed for this purpose.
Misinformation can include fake narratives, images, videos, documents, polls, and newspaper headlines, including compromising material known as "kompromat." A further threat comes from extremist elements that are active in Ukraine and spread hate speech and promote violence targeting religious and ethnic minorities.
With over 200 million voters registered to participate, Indonesia’s February 2024 election will be the world’s largest direct presidential election. This election holds great significance as Indonesia stands as the world’s third-largest democracy.
Indonesia has a recent history of social unrest and violence during election cycles, particularly after voting concludes. Riots have erupted in reaction to, and been inflamed by, accusations of electoral fraud. Defense Minister Prabowo Subianto, who has previously been accused of stoking communal tensions through his alliances with conservative Islamists and of sharing unfounded claims of voter fraud, is seeking the presidency for the third time.
Political actors on all sides have increasingly and systematically used loose networks of so-called “cyber troops” to manipulate public opinion online. Hoaxes, false information, and inflammatory content are commonly spread on social media during elections. False and misleading claims often target candidates’ Islamic piety, allege communist ties, or stoke anti-Chinese sentiment.
Following reports of widespread irregularities throughout the 2018 election, Bangladesh continues to face increasing international pressure to hold free and fair elections. The government of Bangladesh has responded to such pressure by imposing restrictions on the media and limiting open political debate online while simultaneously allowing channels spreading pro-government misinformation to thrive.
The spread of anti-minority narratives online, primarily targeting Christians and Hindus, has fueled regular incidents of social unrest and violence. Radical clerics and extremist Islamist terror cells in Bangladesh also continue to disseminate content promoting or glorifying violence online.
The outcome of Taiwan’s 2024 presidential election will have considerable implications that extend far beyond its borders. The election is set to take place amid rising tensions with China, which has stepped up rhetoric and military activity around Taiwan. The opposition Kuomintang (KMT), which is perceived as more Beijing-friendly, has warned that a vote for the ruling Democratic Progressive Party (DPP) could lead to war. At the same time, the DPP has framed the election as a choice between democracy and autocracy.
Chinese state media, pro-Beijing influencers, and fake accounts will likely spread mis/disinformation targeting the DPP and other perceived pro-independence candidates. Beijing’s information operations are likely to include both sophisticated and unsophisticated means, including AI-generated videos promoting pro-CCP narratives, fake news created by content farms, and dedicated messaging clusters such as the pro-China “Spamouflage” network that operate at scale. Beijing’s allies, especially Russia, are also likely to share and amplify pro-CCP narratives via their own networks.
In Russia, President Vladimir Putin is expected to declare his bid for a fifth presidential term, having secured changes that allow him to remain in office until 2036. Should President Putin proceed with a new bid, Kremlin spokesperson Dmitry Peskov predicts he will win more than 90% of the vote in 2024.
Despite the authoritarian nature of Putin’s government, the mirage of democratically held elections is an important legitimation tool for the Kremlin. The vote appears likely to occur without a resolution to the war in Ukraine, increasing the incentive for pro- and anti-Putin actors to influence the information environment.
Many Russians use VPNs to circumvent restrictions on Western social media platforms, which means pro- and anti-Kremlin groups will likely seek to influence voters on these platforms.
Russian politicians, Russian state-linked media, and pro-Russian propagandists have repeatedly shared hate speech, dehumanizing language, and calls to violence targeting Ukrainians. Russian state media and poorly governed local social media apps often amplify or tolerate xenophobic, racist, and homophobic content shared by far-right users globally.
With women nominated as the leading candidates of the major coalitions, Mexico is poised to elect its first female president in 2024. Factors such as the country's recent controversial decision to overhaul its widely lauded electoral watchdog, the National Electoral Institute, add a new element of uncertainty to a vote likely to be marred by misinformation and violence both online and in the real world.
Threat actors across the political spectrum are likely to use PR groups, coordinated networks, bots, and other authentic and inauthentic methods to spread propaganda and denigrate rival candidates. Trust & Safety teams should also anticipate the widespread use of fraudulent polls, documents, and newspaper headlines based on previous elections.
Criminal groups use violence and threats to pressure voters and politicians and disseminate graphic content and propaganda videos on social media platforms. Dangerous and graphic content circulated heavily throughout the 2021 midterm elections, during which at least 250 politically motivated killings occurred.
The European Parliament’s 2024 elections will result in shifts in both size and composition due to the addition of 15 new Members of the European Parliament (MEPs). Parliamentary elections will take place amid internal divisions over migration and Ukraine, the growing influence of the far-right in Europe, and stricter regulations on harmful and misleading content online.
Far-right, far-left, nationalist, and Eurosceptic parties participating in the election are almost certain to be significant sources of mis/disinformation, especially related to migration, climate, the war in Ukraine, LGBTQ+ and gender rights, as well as European integration.
Russian influence operations are likely to disseminate pro-Kremlin and anti-Western narratives through various channels and actors, including local intermediaries and the establishment of local think tanks. China will likely use a mix of overt and covert tactics, including clusters of bots and amplifiers on social media platforms as well as deep avatars, to promote pro-Beijing content and sow discord.
The timing of the UK general election is yet to be finalized; it must be held no later than January 2025 but could take place as early as December 2024. Polling suggests the ruling Conservative (Tory) Party, which has been the primary governing party in the UK since 2010, is on course to lose power to the Labour Party amid discontent over the cost-of-living crisis and other grievances.
During the 2019 general elections, some of the main sources of disinformation were political candidates and parties themselves, a dynamic likely to be repeated in the upcoming vote. Malicious actors, including far-right groups, have worked to spread misinformation and mobilize protests around sensitive issues such as LGBTQ+ rights, migration, identity, and religion.
Foreign actors are almost certain to engage in targeted influence operations. Russian disinformation campaigns during UK elections have been a persistent issue in recent years, including during the 2019 general election when Russian-linked networks spread false information and amplified divisive narratives. China has also engaged in sophisticated efforts to shape public opinion, including cultivating relationships with British politicians and using state-run media outlets, influencers, and inauthentic accounts to share pro-Beijing narratives and sometimes sow discord in domestic affairs.
While Trust & Safety teams regularly deal with a steady stream of evolving risks to platform integrity and user safety, election integrity in 2024 will bring a heightened focus on new threats to the forefront. Ahead of these high-profile elections, it is critical for teams to take an agile approach to geopolitical risks. This involves encouraging users to freely share thoughts and opinions while also proactively mitigating the spread of misinformation.
ActiveFence analyzes thousands of sources of misinformation across platforms, scanning millions of items daily, in over 20 geographies and 100 languages to surface emerging trends. Combining our automated technology with an intelligence-based approach, we access the web’s deepest sources where misinformation and dangerous trends originate. Sign up for our webinar and learn how your platform can implement early detection and intelligence insights to mitigate election risks before crises develop.