
Art as a Pedophile Tool

March 15, 2023

Online pedophile groups have adopted a new method of coordinated communication. This allows them to actively expand their network and sexualize minors on the clear web without detection. By developing their working knowledge of online pedophile culture, Trust & Safety teams will be better equipped to identify dangerous and concealed activity on their platforms.

This article is based on the research of Maya Lahav, ActiveFence’s Senior Researcher on Human Exploitation and Child Safety.

You can’t solve a problem you don’t understand.

The presence of active and growing pedophile networks on platforms of all sizes threatens vulnerable users and communal safety. The entrenchment of this behavior lays the foundation for criminal activity, jeopardizing platform integrity and compliance with legislative and regulatory demands.

To ensure online child safety, Trust & Safety teams must identify and counter dangerous behavior, supported by policy that reflects and responds to these challenges. However, as threat actors evolve their behavior, platforms and their users are exposed to harmful new threats.

Online pedophile activity frequently presents itself as the sharing of illicit material (visual recordings or depictions of child sexual abuse) and graphic commentary that sexualizes minors.

A new threat emerges: the pedophile exploitation of art

ActiveFence’s Child Safety team has conducted an intensive review of online pedophile communities in over thirty national ecosystems, and we have identified a new covert organizational tactic across many mainstream platforms.

Pedophiles are building new communities by sharing often-legitimate art depicting or referencing children. Seemingly innocuous interactions are, in reality, coded gestures that signal shared underlying objectives. Threat actors engage in carefully guarded, coded discussions of this material from an ‘artistic standpoint,’ or at other times remain mute, limiting engagement by disabling the comments and likes features to avoid attention.

Throughout these communities, discussions revolve around various topics, such as locating nude or semi-nude depictions of children in cinema and television or sharing images of Classical Greek painted pottery showing adult men with boys. “Art analysis” incorporates contemporary and historical representations of children in cartoons, cinema, or even text. The conversations are contrived, with pedophile community moderators forbidding the sharing of traditional child sexual abuse material (CSAM), or even overtly sexualizing language, as part of their online activity.

Instead, online threads in forums and hidden messaging groups revolve around how the depicted children appear healthy and strong, including praise for the quality of the artistic composition (e.g., ‘the shape of the leg,’ ‘the balance of light on the body’) or for the conveyance of historical attitudes toward youth.

These communities not only analyze the art; they also discuss the artists. Our work has uncovered indexes of artists believed by pedophiles to be, or to have been, sympathetic to their cause. The names of these artists (and their works) serve as secret keywords that pedophiles can use publicly without drawing outside attention.

What is the purpose of this behavior?

Despite initial outward appearances, these are not innocent conversations about art or history.

The primary purpose of this activity is to engage with sexualization of minors (SOM) content without doing anything illegal. Through these interactions, art produced by legitimate artists is transformed into harmful content for malicious use and consumption; because the material itself is legal, it is not flagged and can be located on the clear net. Pedophiles have found a loophole, creating a grey area that Trust & Safety teams must tackle.

For example, members of the BoyLover pedophile community discuss ancient Greek vase decorations showing boys and men, images that could appear in any history book or academic course. But the discussions posted on public platforms are loaded with inferences and obscurities related to their pedophile community’s praise of pederasty—a Classical Greek educational-sexual relationship between an adult and a boy—to strengthen their claims that contemporary society has arbitrarily condemned such ‘romantic’ relationships.

Elsewhere, members of other pedophile communities share images of nude sculptures that depict youth. For example, pedophiles in one thread created a list of monuments that decorate public parks in a specific geographical area.
Each image is symbolic and serves as a flag to attract other pedophiles.

Several communities use this clandestine tactic to evade platform detection and prosecution. One website we assessed warned its users not to share sexual materials, whether image-based or textual, and to restrict public conversations to the art itself in order to avoid police action. Its operators claimed that, following British police action, they had moved their operations out of the UK.

What are the consequences?

If pedophile communities gather online to publicly discuss depictions of minors without overtly sexualizing them, is it dangerous? The short answer is yes.

This content serves as a signal as much as stimulation. It identifies the presence of a community of similarly minded individuals and provides a locus for communal growth. Pedophiles may not be talking publicly, but they are, in fact, making connections and forming relationships with one another. And they are doing so on the open internet, in close proximity to children.

While their public communication may not be regarded as an initial threat, we often see a dangerous next step: the websites and groups hosting this child-focused artistic material frequently offer premium membership.

It is in the premium, members-only areas where more sexual content depicting children is shared. Users cannot access these areas without personal approval from a moderator or another community member’s recommendation. Premium membership is often monetized, with several clear net entities offering members the option of purchasing illegal sexualized recordings of real minors.

What can be done?

Hosting this violative sexual content, cloaked in the non-sexual discussion of art, can foster a growing market for the trade and distribution of CSAM. The fact that this activity occurs on the clear web can implicate reputable companies.

This activity lives on sites hosted by mainstream website-building platforms and on social media and instant messaging platforms. Given the covert facade, this harmful activity can be easily overlooked. However, it must be prioritized by Trust & Safety teams working to ensure child safety.

Using deep threat intelligence, Trust & Safety teams can better understand this violative activity, and be better equipped with the tools they need to decipher the dog whistle messages being sent.

The solution is not restricting art, but rather recognizing the importance of research and taking proactive steps to detect behavioral patterns among threat actor communities online. Understanding the malicious intent behind these communities’ activities is a necessary step toward identifying online harm. By recognizing newly adopted pedophile keywords, symbols, and other behavioral patterns, moderators can proactively detect these harmful groups and disrupt their communities, blocking offending domains and flagging accounts engaged in this illicit activity.


Want to learn more about the threats facing your platform? Find out how new trends in misinformation, hate speech, terrorism, child abuse, and human exploitation are shaping the Trust & Safety industry this year and what your platform can do to ensure online safety.