
Transnational Child Exploitation

Legal Divergence & Exposure

CSAM produced in Japan is sought out and collected by child predators, and is shared and sold on mainstream platforms. ActiveFence investigates this behavior, identifying new keywords from dark web chatter.

Background

Child sexual abuse material (CSAM) is proliferating at an alarming rate. According to Internet Watch, reports of child sexual abuse online grew from 1 million in 2010 to around 17 million in 2018, and those 17 million reports related to almost 70 million images and videos of CSAM. As this threat continues to grow, national legislators are racing to keep up, drafting new legislation that governs platforms that host user-generated content and are therefore at risk of abuse. For their part, platforms are responding by developing ever-stronger guidelines and terms of use, discussed and laid out in the first installment of the ActiveFence Policy Series, which focuses on Child Safety.


The UK is making waves in internet law with the Online Safety Bill 2021

A Changing Legal Context

The most significant forthcoming development is the UK’s Online Safety Bill 2021. If passed, it will establish a new precedent in internet legislation: a duty of care from platform to user. Just as hotels must install working fire alarms and sprinklers to detect and combat danger to their visiting guests, so too must online platforms protect their users by identifying and denying access to all CSAM from within the borders of the UK as it is uploaded. The EU is evaluating similar proposals, with consideration scheduled for late 2021.

The new CSAM requirements, if passed, will affect all platforms accessible within the EU and UK (roughly 10% of the world’s population). This presents a challenge: platforms must develop and implement both the technology and the processes to identify CSAM as it is legally defined in the jurisdiction where the material is accessed, and that definition varies between countries.

Junior Idol – ジュニアアイドル

Junior Idol is a genre of “soft-core pornography” focused mainly on girls under the age of 15, who are photographed and recorded on video dressed in underwear, swimsuits, or seifuku (“schoolgirl uniform”). The genre became popular in Japan in the mid-1990s under the title chidol (a portmanteau of child and idol). Although the photography sexualizes the girls, it also emphasizes their youth. Writing in Japan Today, Patrick Galbraith best described the genre when he said, “Junior idols sell raw innocence—a major commodity today”.

Junior Idol sits uneasily in a legal grey area because of Japanese law’s strict definition of child pornography, which requires that an image be designed “to arouse or stimulate the viewer’s sexual desire” and show a child naked, with genitals exposed, or engaged in a sexual act. Since Junior Idol content shows children clothed, with genitals covered and no sexual act in progress, the material usually does not meet the Japanese threshold of child pornography. However, in Europe, Australia, New Zealand and North America, where broader definitions are in use, Junior Idol materials would be classified as child sexual abuse material. In the USA, for example, child pornography is defined as any visual depiction of sexually explicit conduct involving a minor (anyone under the age of 18). The EU extends this definition to also include those who appear to be minors.

Due to this legal divergence between Japan and Western countries, child predators outside of Japan who have become familiar with the Junior Idol genre can purchase its content from mainstream e-commerce platforms operating in Japan. Consequently, many photos and videos from these CSAM productions are in circulation throughout the world. Moderators need specific knowledge to know to look for, and then identify, Junior Idol content on platforms accessible in Western countries. Beyond the genre itself, they must also understand its cultural signifiers and the relevant keywords used to find associated materials.

Where gaps in cultural knowledge sit on legal fault lines, companies can find themselves at risk. In this context, they can accidentally host illegal content on their servers for months or years without detection. In the coming era of internet regulation, ignorance will not be a defense.


Broader Legal Divergences

The divergence in legal definitions of CSAM is not simply a Japan/West split; there are also significant differences between many Western countries. In North America, for example, Canadian definitions of CSAM are far broader than those in the USA. Canada not only prohibits the recording of sexual material involving those aged under 18 (as the US does), but also outlaws “any written material, visual representation, or audio recording that advocates or counsels sexual activity with a person under the age of 18 years”. This wording led, in March 2021, to a Canadian man being arrested and charged with accessing, possessing and transmitting child pornography. The material did not depict real children but was animated child pornography, and it would have been legal in the US. The same distinction saw a US citizen charged in 2011 after crossing the US/Canada border in possession of manga featuring illustrations of minors engaged in sexual acts.

In our exclusive report, available for download below, we focus on the risks that internationally produced child pornography and CSAM-related merchandise present to e-commerce platforms. The risk is not limited to e-commerce, however; it exists wherever user-generated content can be uploaded, particularly on social media and video-sharing platforms. We also provide new keywords related to Junior Idols to assist trust and safety professionals in their work of making the internet a safer place.

For more details and examples, download the complete whitepaper.

