How Gaming Platforms Can Ensure Safe Spaces for Women

March 16, 2022

Women now make up almost half of all gamers, diversifying what was previously a predominantly male space. Unfortunately, the rise in female players has been accompanied by a rise in discrimination. In this article, we tackle how platforms can counter these abuses.

Statistics of Gamers Online

By 2025, almost one third of the world's population will be playing video games, and 41% of them will be women. Though women make up nearly half of all gamers, 77% of them have experienced gender-based discrimination while gaming. In absolute numbers, that 77% amounts to over 823 million people.

From sexism and bullying to threats and abuse, there has recently been an outcry from women speaking out about harassment. Many companies have made efforts to change the face of gaming as more women join the trend. In the last year, 18% of new games launched featured female characters. Women in Games, an organization advocating for the reimagination of the gaming industry, is addressing gaming culture from the perspectives of the workforce, the product, and the player community. However, efforts to tackle the abuse itself must improve.

While the gaming industry clearly faces a systemic challenge, every challenge presents an opportunity. In this blog, we explore how gaming platforms can create a safe, inclusive environment that empowers women.

Securing Gaming From Abuse

1. Tapping Into Harmful Communities

Many harmful groups stem from communities ranging from men's rights activists to incel communities. Groups from within these communities are active in the broader gaming ecosystem and either seek to maintain male dominance of gaming spaces or campaign to remove women from gaming entirely. The goals of these communities run naturally contrary to those of gaming companies, which seek to build large and inclusive social spaces.

To create safe social spaces for women, gaming companies must tap into the off-platform communities where threat actors who target vulnerable users can be found. In these spaces, threat actors identify themselves, discuss current and future targets, and share trending abusive content.

Detecting harmful online groups connected to gaming equips Trust & Safety teams with the knowledge needed to identify future targets of abuse, both on and off the platform, as well as dangerous trending behaviors. This intelligence enables platforms to protect users from toxic activity that would otherwise damage brand reputation among vulnerable users and weaken user retention.
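As an illustration, consider how trending abusive phrases might be surfaced from chatter collected off-platform. This is a minimal sketch, not a description of any specific vendor's pipeline: the data source, tokenization, and n-gram approach are all simplifying assumptions, and a production system would rely on trained classifiers and a real collection layer.

```python
from collections import Counter

def trending_phrases(posts: list[str], n: int = 2, top_k: int = 5) -> list[tuple[str, int]]:
    """Most frequent n-grams across recently collected posts,
    used here as a crude signal of emerging abusive campaigns."""
    counts = Counter()
    for post in posts:
        words = post.lower().split()
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1  # count each n-gram
    return counts.most_common(top_k)
```

Analysts could run this over each day's collected posts and watch for phrases that spike, then trace those phrases back to the communities producing them.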

2. Assess the Ecosystem: Identify Cross-platform Abuse

While countering organic on-platform abuse is a complex challenge, it can be addressed by improving the atmosphere for female users across online games. Shifting what is perceived to be acceptable can reduce the recurrence rate of harmful incidents in general. This requires Trust & Safety teams to work across platforms to secure their user base.

Just as with other hate groups, misogynistic communities meet in online groups and forums to coordinate and plan abusive activity directed at women in gaming. Access to cross-platform threat actor chatter enables in-game moderators to identify where abuse is likely to occur within their environment and to take measures to reduce it. In addition, visibility into both the harmful action on-platform and its off-platform coordination can help Trust & Safety teams identify connections between harmful users and recognize who should be flagged for temporary or permanent banning.
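To make this concrete, here is a minimal sketch of how off-platform mentions might be matched against on-platform accounts to build a review queue. Every name and data structure here is hypothetical, and real matching would need to handle aliases and impersonation rather than exact handle matches.

```python
from dataclasses import dataclass

@dataclass
class OffPlatformMention:
    handle: str   # username observed in off-platform chatter
    source: str   # forum or channel where it appeared
    context: str  # snippet describing planned or past abuse

def flag_cross_platform_matches(
    mentions: list[OffPlatformMention],
    on_platform_users: set[str],
) -> list[dict]:
    """Return on-platform accounts that also appear in threat-actor chatter."""
    flagged = []
    for m in mentions:
        if m.handle.lower() in on_platform_users:
            flagged.append({
                "user": m.handle,
                "evidence_source": m.source,
                "evidence": m.context,
                "recommended_action": "queue for human review",  # not an automatic ban
            })
    return flagged

# Toy usage:
mentions = [OffPlatformMention("raider_99", "hate-forum-x", "organizing a raid on Friday")]
print(flag_cross_platform_matches(mentions, {"raider_99", "friendly_player"}))
```

Note that matches are routed to human review rather than automatic enforcement, since handle collisions across platforms are common.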

In-game misogynistic abuse can take the form of large-scale audio harassment of female players in multiplayer games, but it can also present itself as sexual harassment of both female avatars and their operators. 

3. Test Community Health

Detect on-platform risks by understanding wider community dynamics

The sheer number of in-game interactions taking place at every moment renders effective observation increasingly difficult. However, an efficient way to gain an accurate reading of the climate is to assess communications in the off-platform communities built around specific games, such as forums, messaging platforms, and websites. This is essential to understanding the health of the community and identifying where enhanced moderation efforts are required.
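One way to distill such an assessment into a trackable signal is a simple community-health score, sketched below. The lexicon-matching approach and the threshold are placeholders only; production systems would typically use trained classifiers rather than keyword lists.

```python
ABUSE_LEXICON = {"fake gamer girl", "get back to the kitchen"}  # illustrative terms only

def community_health_score(posts: list[str]) -> float:
    """Fraction of posts with no lexicon match (1.0 = fully healthy)."""
    if not posts:
        return 1.0
    flagged = sum(any(term in post.lower() for term in ABUSE_LEXICON) for post in posts)
    return 1.0 - flagged / len(posts)

def needs_enhanced_moderation(posts: list[str], threshold: float = 0.95) -> bool:
    # Below the (hypothetical) threshold, route the community
    # to enhanced moderation review.
    return community_health_score(posts) < threshold
```

Tracking this score per game-specific forum over time highlights exactly where moderation resources should be concentrated.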

Detecting organic abuse taking place within game-specific forums can also help Trust & Safety teams identify potential perpetrators of violative on-platform abuse, as well as flag at-risk users. Taking such proactive steps will not only improve the environment surrounding a game in general, but will also be crucial to building safe, inclusive spaces for all players.

4. Prevent Marginalizing Community Activity

Beyond the games themselves, off-platform communities also empower livestreamers who broadcast their gameplay across a variety of platforms. Misogynistic communities target the livestreams of female gamers for abusive activity and conduct digital raids against their accounts. These female gamers' broadcasts are frequently spammed with sexually explicit and derogatory messages, while fake nudes are weaponized to humiliate women who are active online.

As raiding events are coordinated in advance, platforms with access to off-platform intelligence can put proactive measures in place to safeguard their users. These could include preventing harmful content from being shared, or increasing the moderation resources assigned to a specific livestream.
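Even without advance intelligence, a raid in progress has a recognizable signature: a burst of near-duplicate messages from newly created accounts. The sketch below, with hypothetical thresholds and message fields, shows one such heuristic that could trigger protections like slow mode or extra moderator coverage.

```python
import time
from collections import Counter
from dataclasses import dataclass

@dataclass
class ChatMessage:
    user_account_age_days: float  # age of the sending account
    text: str
    timestamp: float              # Unix time the message was sent

def detect_raid(messages: list[ChatMessage],
                window_secs: float = 60.0,
                new_account_days: float = 2.0,
                min_duplicates: int = 10) -> bool:
    """True if many near-identical messages from new accounts
    arrived within the last window_secs."""
    now = time.time()
    recent = [m for m in messages
              if now - m.timestamp <= window_secs
              and m.user_account_age_days <= new_account_days]
    counts = Counter(m.text.strip().lower() for m in recent)
    return any(c >= min_duplicates for c in counts.values())
```

A detection like this would not ban anyone on its own; it would simply raise the stream's defenses while human moderators assess the situation.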

5. Identify Damaging Copies

The reputations of games and gaming platforms are not only jeopardized by harassing behavior on and off the platform. The sharing of modified versions of games, or bespoke creations sold and distributed through forums and special-interest groups, jeopardizes platforms as well.

ActiveFence has identified multiple modified versions of mainstream games that allow players to subject women to acts of sexual violence. These altered games display the branding and design of the originals. Other detected mods have removed female characters and female voices from the gameplay. This corruption of the gameplay not only seriously infringes on copyright, but also risks normalizing misogynistic behavior in a game's main online ecosystem.
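A first-pass detection for such impersonating copies could flag titles found on third-party forums that closely resemble an official title without matching it exactly. The sketch below uses the standard-library difflib for fuzzy matching; the catalog and the 0.8 similarity cutoff are hypothetical starting points, and real brand-protection work would also compare assets, binaries, and store metadata.

```python
from difflib import SequenceMatcher

OFFICIAL_TITLES = {"example quest online"}  # placeholder for a real game catalog

def flag_suspect_mods(found_titles: list[str], cutoff: float = 0.8) -> list[str]:
    """Flag titles suspiciously similar to, but not exactly, an official title."""
    suspects = []
    for title in found_titles:
        t = title.lower()
        for official in OFFICIAL_TITLES:
            ratio = SequenceMatcher(None, t, official).ratio()
            if cutoff <= ratio < 1.0:  # similar, but not the real thing
                suspects.append(title)
                break
    return suspects

print(flag_suspect_mods(["Example Quest 0nline"]))  # flags the lookalike title
```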

Awareness of these modified and bespoke harmful games enables companies to take action to prevent damage to their reputations.

Empower One, Empower All

When a social space becomes hostile to one group, everyone suffers. Places where women are intimidated are often also unwelcoming to other minority groups. Harmful activity encourages more unpleasant behavior, and if one group is driven from the public realm, others will follow.

The cost of failing to counter gender-based in-game discrimination is felt in the retention of players of all kinds. Left unchecked, this toxic behavior will ultimately harm a game or platform's potential revenue.

However, the converse is also true. An estimated 823 million female gamers have been subjected to gender-based discrimination in the gaming ecosystem. Companies that build secure communities that are accepting, welcoming, and diverse can tap into this enormous, and growing, pool of users.

 
