Business Ethics & Corporate Crime Research Universidade de São Paulo

Sunsetting a CSAM investigation: The lost intelligence of non-CSAM data removed by default


Authors: Carolina Christofoletti, Jorge Barreto and Hericson dos Santos

Link to the original: Click here

As soon as Facebook finds Child Sexual Abuse Material (CSAM) hosted on its platforms, whether through an Artificial Intelligence system or through human review (be it moderators or public reports), Facebook bans not only the files found to violate its Terms of Service and Policies but also the whole account that was sharing them.

Facebook’s rules are clear: “If we determine that you have clearly, seriously, or repeatedly breached our Terms or Policies, including in particular our Community Standards, we may suspend or permanently disable access to your account.” (see Item 2 by clicking here). In a nutshell, Facebook will remove not only the questioned file but the entire account through which it was posted.

The problem, as one might guess, arises when CSAM comes to Facebook’s knowledge first and reaches local and foreign Law Enforcement Authorities (LEA) only later, maybe 3 months, maybe 6 months, maybe more than a year afterwards.

There are two scenarios. One: local and foreign Law Enforcement Agencies working targets through NCMEC reports, where useful information has already been lost. Two: local and foreign Law Enforcement Agencies who find, forensically, CSAM files sourced from Facebook and who have no idea how those files were being shared on Facebook, or whether the target was using Facebook to “clean”, forensically, a file’s creation trace by making it look like a mere download.

If they do not have this data, checking whether and where the file ever appeared on Facebook is almost impossible, because “checking operation hashes against Facebook-found hashes” is not part of Facebook’s Law Enforcement Cooperation Request. Such a request must identify the requested records with particularity, including the specific data categories requested and date limitations, and must include: “The email address, phone number (+XXXXXXXXXX), user ID number (http://www.facebook.com/profile.php?id=1000000XXXXXXXX) or username (http://www.facebook.com/username) of the Facebook profile.” (You can read it on Facebook’s official page here.)
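To make the missing capability concrete, here is a minimal sketch, in Python, of the membership question an investigator would like to ask the platform. Everything in it is hypothetical: the hash value, the file path and the use of SHA-256 are stand-ins (real platforms rely on perceptual hashes such as PhotoDNA, which this sketch does not implement).

```python
import hashlib

# Hypothetical set of hashes already flagged on the platform. The value
# below is a placeholder, not a real CSAM hash.
PLATFORM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file seized in an operation."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# The question an investigator would like to put to the platform directly:
# has this seized file ever appeared there? (path is hypothetical)
if sha256_of("seized_file.jpg") in PLATFORM_HASHES:
    print("Match: the seized file has appeared on the platform")
```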

A hard case, especially if social media platforms are cleaning, by default, the non-CSAM content of CSAM-compromised accounts.

In an oversharing era, the Open Source Intelligence (OSINT) paradise of those compromised accounts is simply lost with a click, the CSAM file being the only thing that, due to U.S. rules, is saved and reported (in this case, through NCMEC). What does our target look like? How old is he? Which restaurant did he visit? Is he married? Do we see the victim somewhere there?

But if Facebook removes an account for violating its CSAM Terms of Service (together with a bunch of other relevant material, which could include, for example, non-CSAM photos of the criminal with his victim), would it not also be prudent to report it to national Law Enforcement Agencies (or foreign ones, according to the data available) in addition to reporting it to NCMEC, and to suspend (freeze) the account provisionally, rather than removing it, until an LEA agent comes to see what may be of investigative value?

Translating: in CSAM-related cases, would it not be adequate if those accounts were preserved by default, with proper and sequential LEA communication under the applicable law, rather than waiting for cooperation agreements to work speedily and for LEA agents to rush those requests?

Remember that, at present, “Facebook do not retain data for law enforcement purposes unless we receive a valid preservation request before a user has deleted that content from our service.” (You can read it on Facebook’s official page here.) Keep this highlighted information, because it will be of further use.

There is also something we must solve, and immediately: Facebook’s CSAM databases sit with NCMEC, which leads to the odd situation where local Law Enforcement Agencies are sometimes handling an NCMEC report (e.g., same file, another IP) held by their federal partners (whose databases do not communicate with theirs) without knowing it, and maybe without ever knowing it, because the local police came first and the databases are centralized in the hands of federal agencies.

This “reverse search” does not exist at present, even though such integrated databases would be of great value.
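For illustration, a hypothetical sketch of the “reverse search” we have in mind: one hash, fanned out across every database at once. The database names and the lookup interface are invented for the example.

```python
from typing import Dict, Set

# Invented stand-ins for hash databases that, today, do not communicate;
# each would be populated from the respective agency's own records.
DATABASES: Dict[str, Set[str]] = {
    "ncmec": set(),
    "federal_partner": set(),
    "local_police": set(),
}

def reverse_search(file_hash: str) -> Dict[str, bool]:
    """Ask every database whether it has ever seen this hash."""
    return {name: file_hash in hashes for name, hashes in DATABASES.items()}

# A single hit anywhere tells the local investigator the file has a
# history elsewhere, even if the other agency never called.
hits = reverse_search("deadbeef" * 8)
print([name for name, seen in hits.items() if seen])
```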

Account preservation is expected to be a great Law Enforcement resource, if agencies ever come to be “faster than Facebook” in identifying the material, and if the data request, with its legal grounds, does not itself cause the account to be immediately deleted for Terms of Service violations, as mentioned in Facebook’s Terms of Service. LEAs are to be seen as trusted CSAM “reporters”, and their data request is supposed to trigger that same “bombing” effect.

Talking about non-crossed data: if we consider that criminals are intelligent enough to properly manage their social media fingerprints, it also seems relevant to have, somewhere (where data access allows), a map of cases where identical CSAM collections were being shared on the same or even on other platforms (even if the files only come to be hashed later, when this information will surface), even if by different profiles, with different IPs and e-mails. That could also mean: we are talking about the same individual, or related ones.
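To make the “same collection, different profiles” idea concrete, here is a minimal sketch with invented hash sets: it scores the overlap between two profiles’ hashed collections, a high score being a hint, never proof, that the same individual is behind both.

```python
def collection_overlap(a: set, b: set) -> float:
    """Jaccard similarity between two hashed collections (0.0 to 1.0)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Invented hash sets from two profiles, possibly on different platforms,
# registered with different IPs and e-mails.
profile_x = {"hash_1", "hash_2", "hash_3", "hash_4"}
profile_y = {"hash_2", "hash_3", "hash_4", "hash_5"}

# A large overlap between collections flags the pair for closer scrutiny.
if collection_overlap(profile_x, profile_y) > 0.5:
    print("Collections largely overlap: possibly the same individual")
```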

And what about accounts created with the same identification data, surviving only a very few hours and voluntarily removed as fast as they were created: is that not also important data to map? Would this data somehow make sense if Law Enforcement Agencies later discovered that the suspect in the case was sharing brand-new material, hosted on the platform itself (out of tech’s eyes) or pointing to its hosting somewhere else? It seems so.
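As a last illustrative sketch, with invented records and an arbitrary threshold: grouping accounts by their identification data and flagging identities that repeatedly create accounts surviving only a few hours.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Invented account records: (account_id, identification_data, created, removed).
RECORDS = [
    ("acc_1", "same.id@mail.example", datetime(2021, 5, 1, 10), datetime(2021, 5, 1, 13)),
    ("acc_2", "same.id@mail.example", datetime(2021, 5, 3, 22), datetime(2021, 5, 4, 1)),
    ("acc_3", "other.id@mail.example", datetime(2021, 5, 2, 9), datetime(2021, 6, 2, 9)),
]

SHORT_LIVED = timedelta(hours=6)  # arbitrary illustrative threshold

short_lived_by_identity = defaultdict(list)
for account_id, identity, created, removed in RECORDS:
    if removed - created <= SHORT_LIVED:
        short_lived_by_identity[identity].append(account_id)

# Identities that repeatedly spawn accounts living only a few hours are
# exactly the pattern worth mapping.
for identity, accounts in short_lived_by_identity.items():
    if len(accounts) > 1:
        print(identity, "->", accounts)
```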

Think about it.

Authors: Carolina Christofoletti, Hericson dos Santos and Jorge Barreto, within the Internet Governance and Crimes against Children (USP Brasil) Working Group.