Business Ethics & Corporate Crime Research Universidade de São Paulo

What about the CSAM Megalinks?

Image Retrieved from: Canal Tech

Author: Carolina Christofoletti


For any researcher looking for missing data on Child Sexual Abuse Material (CSAM) compliance policies, court documents are a very informative source, and John Doe v. Twitter is no exception.

In cases where victims face the power of multinational digital corporations, lawyers usually take the (much appreciated) care of conducting some kind of open-source investigation into the platforms before filing any documents with the courts. Even though this may seem trivial at first, it is precisely where you recognize a good lawyer. From a legal point of view, it changes everything whether we are talking about a huge non-compliance issue that applies across the platform, or about an unfortunate case where things simply went wrong.

Without entering into further legal details, I would like to point out a very particular issue that comes to light in three pages of this court document (pages 17-20 of the Complaint): the Megalinks.

If you do not know what Megalinks are, I will explain briefly. Megalinks are part of the so-called peer-to-peer networks (here is an interesting article to read about it), meaning that the files are hosted nowhere but on the network peers' computers. Unlike Twitter, where the platform can remove the file and, for the platform at least, the file is gone, peer-to-peer networks escalate things to a level that should make anyone working on the Trust & Safety and Legal teams of social media platforms care: every time you download a file through a "magnet link", you become part of the network, and, as a way of "improving data-sharing speed", the network conscripts your computer into also being a sharer of this very same material.
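To make the point concrete, here is a minimal Python sketch (my own illustration, not taken from the court document) of what a magnet link actually contains. The key observation is structural: a magnet link identifies a file only by a content hash, and names no hosting server, so there is no single host to send a takedown notice to. The example link below is a dummy placeholder, not a real one.

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(link: str) -> dict:
    """Extract the content hash, display name and trackers from a magnet link.

    Note what is absent: a server address for the file itself. The file lives
    only on the peers, which is why removing it from one place removes nothing.
    """
    parsed = urlparse(link)
    if parsed.scheme != "magnet":
        raise ValueError("not a magnet link")
    params = parse_qs(parsed.query)
    return {
        "content_hash": params.get("xt", [None])[0],  # "exact topic": urn:btih:<info-hash>
        "display_name": params.get("dn", [None])[0],
        "trackers": params.get("tr", []),             # where peers are found, not files
    }

# Dummy, non-functional example link for illustration only:
info = parse_magnet(
    "magnet:?xt=urn:btih:0000000000000000000000000000000000000000&dn=example"
)
```

Everything a platform can see in such a link is metadata; the payload itself never touches the platform's servers.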

The network expands, and the complexity of fighting it expands with it. Even though things are comparatively easy to investigate there (see Europol's Peer-to-Police initiative; the Tor Project itself has mentioned that torrents, another P2P network, are a very bad idea for privacy), every time someone clicks such a link, removing the file from the Internet becomes more difficult.

Because the extent of the problem is, in this case, much bigger than the hosting of a single file directly on the platform's servers, social media platforms should be in urgent contact with the "crawlers" in order to identify these files for removal in proper time. Hashtag blocking would also be another point of disruption, as we will see further on.

Even though the claim, like the problem itself, seems innovative, it is not. The point is that, while social media platforms have known about this from copyright disputes for at least two years, the silence about Child Sexual Abuse Material (CSAM) circulating in this very format is, to me, remarkable.

The claims in John Doe v. Twitter that I highlight are two: first, the hashtags that make explicit reference to very well-known forms of trading/sharing such material, namely Dropbox (cyberlockers) and Megalinks (peer-to-peer), including a send-to-receive reference (s2r). As shown by the screenshots added to the court document, Twitter's search mechanism suggests dangerous combinations such as location + CSAM-reference hashtags.
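The hashtag blocking mentioned above could, in principle, sit directly in the search-suggestion pipeline. The sketch below is my own illustration of the idea, not Twitter's actual system, and the blocklist entries are deliberately fictional placeholders: any suggestion containing a blocklisted hashtag, alone or in combination with other terms, is simply suppressed.

```python
# Hypothetical blocklist of CSAM-reference hashtags (placeholder values only).
BLOCKED_HASHTAGS = {"#examplebadtag1", "#examplebadtag2"}

def should_suppress(query_terms: list[str]) -> bool:
    """Return True if any hashtag in the suggested query is on the blocklist.

    This catches the dangerous combinations too (e.g. a location term plus a
    blocklisted hashtag), because a single blocklisted tag is enough to suppress.
    """
    return any(
        term.lower() in BLOCKED_HASHTAGS
        for term in query_terms
        if term.startswith("#")
    )

# A suggestion combining an innocuous term with a blocklisted tag is suppressed:
suppressed = should_suppress(["london", "#examplebadtag1"])
allowed = should_suppress(["london", "#holiday"])
```

The design choice is intentional: suppression is triggered by membership, not by exact-phrase matching, so attackers cannot evade it just by padding the query with harmless words.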

Even though copyright holders have already developed ways of "monitoring" what happens to those links, I have never, despite researching this topic for about four years, seen any mention of an industry effort to address these very same links while they are being shared through the platforms. That does not necessarily mean such an effort does not exist. It means, simply, that if such a document exists, it has somehow escaped my diligent searches.

But there is something I have already seen: those very same policies applied to copyright. In 2019, Facebook removed all its links to The Pirate Bay which was, to put it simply, a torrent search engine for pirated books, films and more. That year, Facebook displayed, to anyone attempting to share a Pirate Bay link, including in its chats, the following message: "You can't share this link. Your post couldn't be shared, because this link goes against our Community Standards."

In 2021, Facebook is still letting known CSAM hashes and known torrents enter its platform, only to remove them later. Why? And not only Facebook: notice-and-takedown policies apply across the industry for CSAM files. Curiously, though, not for copyright, where sharing is blocked at the moment of upload. We might be even more surprised if we started to measure, comparing those two complaint channels, the speed at which reports are assessed.
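The upload-time blocking that already exists for copyright could work the same way for known CSAM hashes. Here is a minimal sketch of the idea, under loud assumptions: it uses a plain SHA-256 exact match for illustration, whereas production systems rely on perceptual hashing (e.g. PhotoDNA) to survive re-encoding, and the blocklist entry below is just the hash of an empty file, not real data.

```python
import hashlib

# Hypothetical blocklist of known hashes. The single entry here is simply
# SHA-256 of the empty byte string, used as a harmless stand-in.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def block_at_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash and must be rejected
    before it is ever stored, rather than removed after a report."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

# An upload matching the blocklist is rejected; anything else passes through.
rejected = block_at_upload(b"")
accepted = block_at_upload(b"some other file content")
```

The point of the sketch is the placement of the check, not its sophistication: the comparison happens before the file reaches the platform, which is exactly the difference between the copyright regime and the notice-and-takedown regime the article describes.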

Peer-to-peer CSAM files remain a hidden risk for social media platforms, whose Trust & Safety policies are still fixated on images. It is time to give URLs the attention they deserve.

A short commentary, something to think about.