Business Ethics & Corporate Crime Research Universidade de São Paulo

Lost somewhere in the deep, unindexed and uncrawlable world: How hard it is to find the “Report it” command

Author: Carolina Christofoletti

Link in original: Click here

A real case study of an adult pornography website referred to, in this article, as “Platform X”

I have recently been dedicating my time to studying and analysing the compliance policies in force on adult pornography websites for managing the risk of Child Sexual Abuse Material (CSAM) being uploaded to their channels. I chose Platform X as the empirical model for my case study. On Platform X, two documents interest me the most: its CSAM Policy and its internal report channel (the Content Removal Request form).

The CSAM Policy was extremely hard to find amid the thousands of news articles covering recent scandals. It took me roughly 15 minutes of digging (not a short time for someone used to looking for such documents) through Platform X’s intimidating website (its front page is full of hardcore videos) before I found it under “Trust & Safety Policies”. The Content Removal Request was easier to find, displayed at the bottom of every page in the support and help section.

Once I had found those pages, I decided to check how they appeared to search engines, so I conducted a reverse search on Google, my (and most people’s) default search engine. First, I pasted the link to the page I was on (Platform X’s CSAM Policy) into Google. Google answered with “Your search didn’t match any documents”. I then removed the link and typed “Platform X Child Sexual Abuse Material Policy”. Google returned 1,220,000 results, once again mostly newspaper articles. Even on the third page of those 1,220,000 results, I was unable to find the Policy I was looking for. Perhaps it was a ranking problem (newspapers are far more popular than any website policy), but the fact remains that the document was extremely hard to find; as I said, I only found it later, directly on the platform. When I typed “Content Removal Request”, by contrast, the removal link was ranked as my first result on Google. The same happened when I pasted that link directly into Google.

More than a curiosity, this seemed like relevant data to me: why was the website’s Child Sexual Abuse Material Policy not indexed? Curiously, even though the second case concerns a removal form, that form does not contain a single mention of the need to report material involving minors (no matter what) found on the platform. After all, such material is in any case being used for sexual purposes simply by being hosted on these platforms. Illegal or not, it must be immediately removed as a matter of corporate ethics, even before it is a matter of law.

I then changed the search engine to Bing, and the result was the same. Unlike Google, Bing displayed a series of other websites’ policies for my link search (including Discord’s Terms of Service), but the policy I was looking for was still out of sight. I then typed “Platform X Child Sexual Abuse Material Policy”, to see whether I could find the link that way. The results were much the same: while I could find plenty of newspaper mentions that the policies had been updated, I had no idea where this “new policy” was written. When I switched my searches to the Content Removal Request link, and also tried typing just those three words into Bing, I was luckier: the link to the form was ranked first in my search results.

Why is this CSAM Policy not indexed by search engines, if it is meant precisely to inform anyone wanting to report such material to the platform? This Policy, made to make the platform look “compliant” to third-party eyes, is not even linked from the Content Removal form. Why not? Curiously, this CSAM Policy is also where the references to external report channels and the request to contact law enforcement appear. Why is a document of such relevance not indexed?
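There are a few common technical reasons a page stays invisible to search engines: crawling of its path may be disallowed in the site’s robots.txt, the page may carry a “noindex” robots directive, or it may simply be unlinked and never submitted for indexing. I do not know which of these applies to Platform X; as a rough sketch (function names and the example URLs below are my own, hypothetical illustrations), the first two signals can be checked like this:

```python
import re
from urllib import robotparser


def has_noindex(html: str) -> bool:
    """True if the HTML carries a <meta name="robots"> noindex directive,
    which asks search engines not to list the page in their results."""
    return bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        html, re.I))


def crawl_allowed(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """True if the given robots.txt text allows the given URL to be
    crawled by the given user agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)


# Hypothetical example: a policy page blocked both ways.
page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
robots = "User-agent: *\nDisallow: /trust-and-safety/"

print(has_noindex(page))
print(crawl_allowed(robots, "https://example-platform.com/trust-and-safety/csam-policy"))
```

Either signal alone is enough to keep a policy page out of search results, which would produce exactly the “Your search didn’t match any documents” behaviour described above.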

The case becomes even more interesting if we consider that flagging a video on Platform X requires formal registration with the platform, and that the Content Removal Request form, which (in principle and in formal terms, though not in practical ones) allows a complaint to be filed anonymously, makes no reference to the possibility of reporting Child Sexual Abuse Material.

From the point of view of someone about to file a CSAM report directly with the platform, this is relevant for a series of reasons.

First, if one does not find the policies directly and visibly on the platform, one will “Google” and “Bing” them with the same lack of success I had. The Terms of Service, which are linked from the platform’s form and where the term “minors” appears, are only available in English.

I searched for “CSAM Policy” because I knew what the document was called, and even so I could not find it through Google or Bing. But people reporting CSAM on an adult pornography website do not necessarily know the terminology. Following their common sense, they are probably expecting the platform to display, with clear rules and easy-to-use report forms, a warning that NO content featuring minors (teenagers and children) is considered pornography and that, for that and other purposes, it will be immediately removed by the platform.

The reason is that, by putting the keywords the public actually knows together in a single phrase, a whistleblower who wants to report CSAM found on the platform and who looks for the removal form or for the platform’s CSAM Policy directly on Google (because the website’s buttons are hard to find, because they are afraid of clicking on something wrong on the platform, or for any other reason) would have a better chance of finding the platform’s report channel from an external link.

This is one solution: working with language and crawler logic.

The other is, whatever the page is called (here, CSAM Policy), to make it accessible and reachable for anyone trying to find the report form through a search engine. That means, for example, indexing it so that, for people typing “Report Platform X child pornography” into Google, the CSAM Policy (urging the whistleblower to report, in compliance with the platform’s policy), the Content Removal Request form and the flagging procedure (simply “report it”, not discussed here because of the registration requirement) are promptly displayed.

This is another solution: working with search-engine indexing.
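Concretely, making such a page discoverable usually means sending search engines the opposite signals from those discussed above: allowing its path in robots.txt, omitting any noindex directive, and listing it in the site’s sitemap so crawlers find it even when few pages link to it. A minimal sketch of such a sitemap entry (all URLs here are placeholders of my own invention, not Platform X’s real ones):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical policy and report pages that should be indexed -->
  <url>
    <loc>https://example-platform.com/trust-and-safety/csam-policy</loc>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>https://example-platform.com/content-removal-request</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

A sitemap submitted through the major search engines’ webmaster tools does not guarantee ranking, but it removes the most basic obstacle: the crawler not knowing the page exists.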

If report channels are not easy to find and reporting guidance is hidden somewhere far from Apple’s three-click model, those channels and policies, though they exist, are “non-existent” (cosmetic) from a compliance point of view. The legal risk remains as high as before, and that is something adult pornography websites should, above all, think about.