Business Ethics & Corporate Crime Research Universidade de São Paulo

Where non-accuracy data could turn into something (very) odd: Electronic Service Providers CSAM Reports and Pornography Websites

Image retrieved from: Wallpaper Abyss

Author: Carolina Christofoletti

Link in original: Click here

There is something that absolutely needs to be written. Why does PornHub, in its first Transparency Report, which claims to be ‘proud’ of its cooperation with NCMEC in reporting Child Sexual Abuse Material (CSAM) found on the platform, fail to disclose its own metrics regarding the 4,171 unique images reported to NCMEC?

This is, to say the least, unconventional – especially considering that this first Transparency Report emerges precisely in the rubble of a very serious scandal related to child sexual abuse material. One piece of data that seems relevant to me, though it remains hidden, is precisely how much material was found by moderators and how much was found by the public on adult pornography websites.

And I will explain why: this is exactly the data that has the power to reverse the invincibility of the “effective compliance” argument, especially considering the special risk environment in which adult pornography websites operate.

For legal reasons (e.g. knowingly accessing CSAM is a criminal offence in some countries) and behavioural reasons (close the page and have nothing more to do with it), the chance that an ethically minded visitor to an adult pornography site will report Child Sexual Abuse Material carrying explicitly illegal titles, or even go to check it, is, after all, very low. The explicitness of the environment itself makes the visit hard to explain, and the platform’s zero-tolerance policy hard to trust.

As such, any compliance program tailored to an adult pornography site should be aimed precisely at reports made not by Artificial Intelligence but by the public. From an internal channel point of view, this non-reporting behaviour is what the pornography industry should be dealing with.

Considering that it is precisely on these pornography platforms, where no keyword crawlers tend to be in operation, that child sexual abuse material is most easily hidden, one could conclude that, in terms of technology, enforcement should be aimed at new, as yet unknown Child Sexual Abuse Material, for which those platforms remain an open door. Even if the internal channel data is kept hidden, it is essential to analyse it internally, for example to identify similarities between keywords, hashtags or other markers (a minimal sketch of what such an analysis could look like follows below). This would make the question “Where to look” much easier to answer.
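As a purely illustrative sketch of the kind of internal analysis meant here, the following Python snippet groups near-duplicate tags attached to reported items. The tag values and the idea that the internal channel exposes keywords or hashtags per report are assumptions for illustration, not a description of any platform’s actual tooling.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical placeholder tags attached to internally reported items
# (real keywords/hashtags would come from the platform's own report channel).
reported_tags = ["tag_alpha", "tag-alpha", "tagalpha2", "beta_label", "unrelated"]

def similarity(a: str, b: str) -> float:
    """Character-level similarity between two tags, from 0.0 to 1.0."""
    return SequenceMatcher(None, a, b).ratio()

# Pairs of tags that look like variants of each other point to clusters
# worth reviewing first -- one possible answer to the "where to look" question.
THRESHOLD = 0.8
for a, b in combinations(reported_tags, 2):
    score = similarity(a, b)
    if score >= THRESHOLD:
        print(f"possible variant pair: {a!r} ~ {b!r} (similarity {score:.2f})")
```

Even such a crude comparison would already surface labels that are deliberately misspelled or lightly disguised, which is precisely the kind of hidden pattern an internal analysis of the report channel could reveal.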

When it comes to adult pornography distribution platforms, the reporting data (in this case, hosted by NCMEC) is of interest primarily and especially because of the high chance of its incompleteness. This is perhaps the central point of this article.

Given that the independence of some adult pornography websites’ internal report channels is not verifiable, and that the internal channels’ data on CSAM is not made transparent (as Twitter and Facebook do, two platforms deemed ‘more insecure’ than PornHub in a far-too-fallacious argument made by PornHub itself), it is possible that the reports are being sorted out… for convenience.

This is why I keep insisting on the qualitative component of this data. It would, after all, be too strange if it turned out that, on a platform of highly explicit content, the only child sexual abuse materials found were merely ‘indecent’ or, in the end, simply discarded as not illegal.

To the reporting channels, then, the question is: how accurate are the reports that come directly from the pornography industry, and do they match the level of new, previously unseen material one would expect from this industry?

To think about