Business Ethics & Corporate Crime Research Universidade de São Paulo

The terminology you use informs your whistleblower: Why Adult Pornography Websites should not refer to CSAM as CSAM

Image retrieved from: Xerpa

Author: Carolina Christofoletti

Link in original: Click here

Legislators should call it CSAM (Child Sexual Abuse Material) only insofar as conduct such as lewd exhibitions and the production of indecent images of children is also criminalized as Child Sexual Abuse. That is a matter of legal integration.

That said, in order not to produce an under-inclusive legal provision, the legislative shift from the Child Pornography terminology to Child Sexual Abuse Material must be accompanied by an integrated review of the legislation as a whole. This, however, is a criminal law problem and, as such, one that should be discussed from a criminal law point of view.

The point I want to discuss today is a more society-related one. Briefly, I am writing this because I have concluded that Child Sexual Abuse Material (CSAM) may be an adequate term for some platforms and for theoretical discussions.

But when it comes to more practical scenarios, there is one kind of place where the CSAM terminology can be not only inadequate but a real risk to the platform's internal reporting channels: adult pornography websites.

I am well aware that, in recent years, some adult pornography platforms have rewritten their terms (some now have a so-called CSAM Policy) to better fit the ethical eyes of the market. But before analysing whether this change is adequate, we must understand exactly what, in each context, the child pornography or Child Sexual Abuse Material (CSAM) terminology is meant to achieve.

Legal practitioners use the CSAM terminology to broaden the set of criminal materials. Adult pornography websites, however, use the very same concept to tell their users what kind of content they are expected to report. And that is exactly where the problem arises.

When we talk about adult pornography websites, we are usually talking, as a matter of fact, about:

1)   Places where the threshold of tolerance for defining "abuse" is usually high, since the videos uploaded to those platforms represent a broad set of "sexual fantasies" (some of them also classifiable as paraphilias, such as sadism).

2)   Places where any suspicion of a 'consented act' can obscure the conclusion that the material, though involving a minor, is abusive.

It follows that the mere mention of "Child Sexual Abuse" (which, in the eyes of a pornography consumer, probably means Category A material – penetrative sexual activity, sadism and sexual activity with animals) as the linguistic reference for the content whose upload is prohibited and whose removal is mandatory can have a disastrous effect on the platform's reporting channels themselves.

Under this careless naming, Child Sexual Abuse Material classified as Category B (non-penetrative sexual activity) and Category C (other indecent images) risks not only being constantly uploaded as a wide set of new, technologically undetectable material, but also being hardly ever reported.

On a platform where pornography is not seen as abusive per se, placing child sexual material outside the very 'pornography' category the platform operates with (a category that also includes indecent images and non-penetrative acts) leaves its audience under the false assumption that the illegal material in question (Child Sexual Abuse Material) is somehow radically different from the 'adult material' the platform normally hosts. As a consequence, Child Sexual Abuse Material will have to meet a very extreme standard before it is finally reported to the platform.

I have already commented elsewhere on the importance of analysing the data that reaches adult pornography platforms' external channels. A disproportionately high number of non-CSAM reports arriving at CSAM hotlines, like a disproportionately high share of Category A material found by platform users, would after all be a clear indication that compliance teams still have a lot to do on those platforms.

A proper compliance program for the pornography industry should start by assessing, first of all, how its audience understands what the platform writes in its Terms of Service, that is, what Child Sexual Abuse Material is and what the word "abuse" means in the context of an adult pornography website.

Ethical CSAM terminology is important, but it is worth nothing if its collateral effect is to silence, by the force of philosophical doubt, the very whistleblowers who should be helping the platform identify this kind of illegal content on its channels.

No, we do not need the audience of adult pornography websites to ask where pornography ends and where abuse begins, for EVERY CHILD PORNOGRAPHY MATERIAL IS CHILD SEXUAL ABUSE MATERIAL. The only thing we expect from the audience of adult pornography websites is that they report material involving children and adolescents, if they ever find it.

Calling it child pornography again would, in an adult pornography environment, bring the terminological association back and restore the sense that any sexual content involving someone under 18 years old is to be reported without further question.

Child Sexual Abuse Material reporting provisions should be written in a way that makes it clearer than ever that CSAM is not, in any case, a concept that lies in the eye of the beholder. And adult pornography websites are a place where this point is especially important.

Something to think about.