How Are Pedophiles Hiding In Plain Sight on Platforms?

November 7, 2022

It’s widely believed that pedophiles engage in illegal online activities only in the darkest corners of the internet. However, this isn’t strictly true: these predators exploit websites and platforms of all sorts, including major social media networks, to consume and disseminate child sexual abuse material (CSAM), exchange information with other pedophiles, and establish communication with potential victims. They’re able to operate in the open because they speak in their own codes and use euphemistic language that the uninitiated cannot detect. They are, in essence, hiding in plain sight.

It’s important to note two facts about this phenomenon. First, the way pedophiles communicate with each other online isn’t what it used to be. Especially on mainstream platforms, child predators don’t simply send one another explicit text and imagery, as these items would be flagged by AI or content moderators. To get around this, they’ve developed their own language that allows them to communicate with one another without raising eyebrows among Trust & Safety teams. Second, there isn’t one singular group of predators; hundreds, if not thousands, of subcommunities exist around the world, each with its own language, dialect, and variations, which makes detecting violative content that much more difficult. It’s not just that pedophiles use innocuous-sounding language to fly under the radar – it’s that they use so many different languages.

What Are Trust & Safety Teams Up Against?

Trust & Safety teams are tasked with creating content policies, detecting violations, and carrying out enforcement, but what happens when content that breaks the rules isn’t so easy to see? By developing their own codes, pedophiles can evade detection by content moderators. Trust & Safety teams are, in effect, looking for an enemy they can’t see.

Pedophiles have adopted codewords, built on varying guiding principles, that allow them to communicate directly with one another while evading moderation teams and to describe the type of CSAM they are interested in or are sharing. One common theme is taking the first two letters of the phrase ‘child pornography’ and applying them to different phrases to denote CSAM: by using phrases that start with the letters C and P, pedophiles are able to communicate with each other while going undetected. They’ll also often refer to historical figures or cultures suspected of pedophilia, to the names of victims of pedophilia, or to owners of CSAM websites, forums, and groups who have been arrested. Further, predator communities across the globe employ terminology in their own languages, with references unique to their own cultures, making detection even more difficult.
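To make the detection challenge concrete, here is a minimal sketch in Python of the kind of heuristic a moderation pipeline might use to surface phrases that follow the C-and-P initials pattern. Everything in it, from the regex to the function name, is an illustrative assumption rather than a description of any real system, and its benign false positives show why such signals can only route content to human reviewers.

```python
import re

# A minimal, hypothetical heuristic: flag two-word phrases whose words
# begin with "c" and "p", echoing the 'child pornography' initials
# pattern described above. Pattern, names, and examples are illustrative.
CP_BIGRAM = re.compile(r"\b[Cc]\w*\s+[Pp]\w*\b")

def flag_for_review(message: str) -> list[str]:
    """Return candidate phrases to queue for human review.

    This is a deliberately noisy signal: it routes content to
    moderators and must never serve as a standalone verdict.
    """
    return CP_BIGRAM.findall(message)

if __name__ == "__main__":
    for text in ["selling used car parts", "check private channel", "hello world"]:
        hits = flag_for_review(text)
        # Benign phrases ("car parts") match too, which is exactly why
        # heuristics like this only feed a review queue, not enforcement.
        print(f"{text!r} -> {hits or 'no match'}")
```

A signal like this would, at best, be one weak feature among many; relied on alone, it would drown moderators in matches on ordinary language.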

The Importance of Understanding Secret Communication Among Pedophiles

The key to identifying this content is ground-breaking, dedicated research into the distinctive terminology and circumvention techniques used by pedophiles. To best leverage this expertise, Trust & Safety teams must be agile and equipped with sophisticated technologies, and must apply the insights gleaned from that research to their content moderation policies and practices.
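As a hypothetical sketch of how such research might feed moderation practice, the Python below models a researcher-curated, multilingual lexicon that scores messages for human review. The entry fields, placeholder terms, weights, and threshold are all assumptions made for illustration; no real codewords appear here.

```python
from dataclasses import dataclass

# Illustrative lexicon structure for researcher-curated terminology.
# All entries are placeholders; no real codewords are reproduced here.
@dataclass(frozen=True)
class LexiconEntry:
    term: str      # surface form documented by researchers
    language: str  # community or dialect it was observed in
    weight: float  # researcher-assigned confidence in the term

LEXICON = [
    LexiconEntry("placeholder-term-a", "en", 0.9),
    LexiconEntry("placeholder-term-b", "pt", 0.6),
    LexiconEntry("placeholder-term-c", "ru", 0.7),
]

REVIEW_THRESHOLD = 1.0  # assumed tuning parameter

def review_score(message: str) -> float:
    """Sum the weights of lexicon terms found in a message."""
    text = message.lower()
    return sum(e.weight for e in LEXICON if e.term in text)

def needs_human_review(message: str) -> bool:
    """Route the message to moderators when matches cross the threshold."""
    return review_score(message) >= REVIEW_THRESHOLD
```

The operational appeal of a structure like this is agility: when researchers document a new term or dialect, updating the lexicon updates detection immediately, with no model retraining required.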

With the number of people using the internet constantly growing, and with minors spending ever more time online, the danger will only increase as such activities continue to permeate the web and evade traditional algorithmic detection mechanisms.

In the absence of effective moderation, and as pedophiles invent new techniques to evade detection mechanisms, social media platforms will increasingly constitute a serious danger to children. Adequate moderation that keeps pace with these evolving tactics must be a priority for platforms: as parents grow more intolerant of the risks those platforms pose to their children, they become less likely to let their kids use them at all. Only by learning the ins and outs of pedophile tactics, including linguistic choices in a variety of languages, can violative content be effectively detected and removed.

Want to learn more about the threats facing your platform? Find out how new trends in misinformation, hate speech, terrorism, child abuse, and human exploitation are shaping the Trust & Safety industry this year, and what your platform can do to ensure online safety.
