Business Ethics & Corporate Crime Research Universidade de São Paulo

Social Media Platforms and the URLs Reported to and Confirmed by Hotlines

Retrieved from: Positivo Tecnologia

Author: Carolina Christofoletti


I would like to dwell for a few minutes on the question of CSAM links.

According to Facebook’s Safety Team:

“We have also taken steps across our apps to make the broader internet safer for children. This includes running PhotoDNA on links shared on all our apps from other internet sites and their associated content to detect known child exploitation housed elsewhere on the internet.” – Facebook Blog, 11 June 2020, Facebook Joins Industry Effort to Fight Child Exploitation Online

Considering that CSAM links tend to be highly volatile, opening links to check whether any known illegal content is hosted behind them seems to be a great idea if the purpose is to keep it from spreading through Facebook's platform channels (a sketch of such a check follows the questions below). For the premise to be true, some points need to be clarified:

* Is this PhotoDNA search being conducted across all links shared on the platform, or only on some links? If only some links deserve a deeper examination, how are they chosen?

* A reverse search on CSAM links through platforms such as Facebook, Instagram and others may help clarify what role those platforms played in the distribution of that specific material and, with further metrics, in CSAM dissemination outside Facebook as a whole.
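To make the mechanics concrete: the check Facebook describes presumably fetches the content behind a shared link and compares it against a database of known material. PhotoDNA itself is a proprietary perceptual-hash technology available only to vetted partners, so the sketch below is only an illustration of the pipeline; `KNOWN_HASHES` and `photodna_like_hash` are hypothetical stand-ins, and a real system would use a robust perceptual hash rather than a cryptographic one so that re-encoded copies still match.

```python
# Minimal sketch of the link-checking pipeline described above.
# KNOWN_HASHES and photodna_like_hash() are hypothetical stand-ins
# for an industry hash list and the proprietary PhotoDNA function.
import hashlib
import urllib.request

KNOWN_HASHES = {"<hash of known illegal image>"}  # hypothetical hash list

def photodna_like_hash(data: bytes) -> str:
    # Stand-in only: a real deployment uses a perceptual hash,
    # not SHA-256, so near-duplicates of known imagery still match.
    return hashlib.sha256(data).hexdigest()

def check_shared_link(url: str) -> bool:
    """Fetch the linked content and compare it against known hashes."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        content = resp.read()
    return photodna_like_hash(content) in KNOWN_HASHES
```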

It is rational to believe that, if platforms such as Google, Facebook and others are well monitored (compared to some much darker Internet corners) against known (and also machine-learnable) CSAM imagery, the number of external CSAM links actually shared on the platform might be far higher than the amount of CSAM imagery that is found. And for a simple reason: links are, at present, easier to conceal.
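One way to see why links conceal so well: the same destination can hide behind shorteners, redirects and decorative query parameters, so a naive string comparison against a reported URL fails. A minimal normalization sketch follows, assuming the link is fetched inside a sandboxed crawler; nothing here is Facebook's actual pipeline.

```python
# Hypothetical normalization step: reduce a shared link to a canonical
# form before comparing it against a list of reported URLs.
from urllib.parse import urlsplit, urlunsplit
import urllib.request

def resolve_redirects(url: str) -> str:
    """Follow HTTP redirects (e.g. URL shorteners) to the final address."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.geturl()

def canonicalize(url: str) -> str:
    """Return a comparable canonical form of the URL."""
    parts = urlsplit(resolve_redirects(url))
    # Drop the query string and fragment, lower-case the host.
    return urlunsplit((parts.scheme, parts.netloc.lower(), parts.path, "", ""))
```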

Facebook claims, in their Community Standards, that any post of links related to CSAM material constitutes a violation of their policy. This link data remains, meanwhile, hidden from their Transparency Reports. About how many external CSAM links are we talking?

Whether the ISPs were already notified, whether those links are no longer accessible, or whether the illegal content was (temporarily) removed or not, the fact is that Facebook should be genuinely interested in crawling those links inside its platforms and removing them.
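A sketch of what that re-crawl could look like, assuming a hotline feed of reported URLs; `find_posts_containing` and `remove_post` are hypothetical platform-side helpers, since no public Facebook API exposes such a sweep.

```python
# Sweep the platform for posts still carrying already-reported links.
# `reported_urls` would come from a hotline feed; the two callables
# are hypothetical platform-side helpers.
def sweep_reported_links(reported_urls, find_posts_containing, remove_post):
    """Remove every post still carrying a reported URL; log what was found."""
    removed = []
    for url in reported_urls:
        for post in find_posts_containing(url):
            remove_post(post)             # take the post down
            removed.append((url, post))   # keep a record for investigators
    return removed
```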

* First, because it signals that, somehow and at a certain time, Facebook was in fact used for sharing external (perhaps Deep Web-hosted) CSAM material. In some cases, it could also indicate that the platform was the very place where the illegal assembly around that and other CSA materials was born. Mapping the emergence of those networks is the first step to disrupting them (a sketch of such a mapping follows this list).

* Second, because known links could lead to other in-Facebook hosted pages with still more CSAM-related material that has lived, until now, undetected by Facebook's CSAM imagery technology.

* Third, because it can serve as a future indicator of "where to look" for algorithms if, after mapping it, Facebook Research Labs start looking for patterns across this sharing mechanic.
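Under the assumption that a platform can export which account shared which external link and when, the mapping from the first point can be sketched as a bipartite sharing graph; `shares` is a hypothetical export of (account_id, url, timestamp) rows, and dating a component by its oldest edge is only a first approximation of when a network "was born".

```python
# Build a bipartite graph of accounts and the external links they shared,
# then date each connected component by its earliest sighting.
import networkx as nx

def build_sharing_graph(shares):
    """shares: iterable of (account_id, url, timestamp) rows."""
    g = nx.Graph()
    for account, url, ts in shares:
        g.add_node(account, kind="account")
        g.add_node(url, kind="url")
        if g.has_edge(account, url):
            # Keep the earliest sighting so emergence can be dated.
            g[account][url]["first_seen"] = min(g[account][url]["first_seen"], ts)
        else:
            g.add_edge(account, url, first_seen=ts)
    return g

def emergence_dates(g):
    """Yield (first_seen, members) for each candidate sharing network."""
    for component in nx.connected_components(g):
        edges = g.subgraph(component).edges(data="first_seen")
        yield min(ts for _, _, ts in edges), sorted(component, key=str)
```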

I am afraid that such a thing as a "CSAM link database", in contrast to image databases, does not exist. At least not when we are talking about Industry.

But it could and should exist. Especially to address cases where external links remain, for whatever reason, active long after being reported to the appropriate channels, and whose URLs remain intact on social media platforms.
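A minimal sketch of what such a database could record, using sqlite3; the schema and field names are illustrative assumptions, not an existing industry standard. Storing a hash of the URL rather than the URL itself would keep the list shareable without circulating the addresses themselves.

```python
# Hypothetical minimal schema for a shared link database.
import sqlite3

conn = sqlite3.connect("csam_links.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS reported_links (
    url_hash      TEXT PRIMARY KEY,  -- hash, so the raw URL need not circulate
    reported_at   TEXT NOT NULL,     -- when a hotline confirmed the report
    last_checked  TEXT,              -- when the content was last verified
    still_active  INTEGER,           -- 1 if the material was still reachable
    reporting_org TEXT               -- which hotline confirmed it
)
""")
conn.commit()
```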

This is an extraordinary point on this immense Internet battlefield where CSAM hotlines and Industry could, in the near future, cooperate.

After all, de-indexing external CSAM links on highly crowded platforms remains a crucial step to stop their sharing and downloading, especially (but not only) in cases where ISP action is delayed or simply non-existent.

Something to think about…