Business Ethics & Corporate Crime Research Universidade de São Paulo

The Mephistophelian pact of end-to-end encrypted platforms in the fight against CSAM

Anton Kaulbach, Faust and Mephisto. Oil on canvas, 80 x 65 cm. Signed upper right.

Author: Carolina Christofoletti

Link in original: Click here

Very recently, I wrote an article about the WhatsApp and Telegram case (which you can read here) and another about the missed intelligence on WhatsApp (which you can read here). The point I wanted you to dwell on for a while is that Law Enforcement Cooperation Guides exist and should work, as long as Law Enforcement Agencies come to the problem first.

And the issue arises, first, when we try to transpose, without any further adaptation, Trust & Safety and Law Enforcement procedures from non-encrypted platforms directly to their end-to-end encrypted counterparts. Legal scholars working with Comparative Law are familiar with the "functional equivalent" problem.

The argument is one based on the asymmetry of information. Who knows better what is going on, judging by the data each side can see: WhatsApp and Telegram, or Law Enforcement with its public reports (I am writing from Brazil, where "infiltration" procedures are quite different)? The first, for sure.

So far as I know, there is no such thing as a duty to inform Law Enforcement Agencies when a WhatsApp group is removed for "violations of Child Sexual Abuse Material Community Policies". WhatsApp should report it to the National Center for Missing and Exploited Children (NCMEC), but NCMEC's Internet Service Provider Reports data does not contain a single mention of WhatsApp (even though its siblings, Facebook and Instagram, do appear). Nor is there anything of the kind for Telegram (you can read about it here).

What happens there might be even trickier than what one sees at first sight: every time a report goes through a report button that is managed by the platform itself, without any further accountability (neither Telegram nor WhatsApp publishes transparency reports), the platform is free to handle and present that data however it wishes, is it not?

Law Enforcement agents are (or should be expected to be) trained, with legal protection to infiltrate those sordid places, and, as such, they never press (or should never press) the "Report" button. The public, however, is not trained, and they do (and really should) press it. The issue arises when we start asking WhatsApp and Telegram what they do with those public reports: remove the content if you want, but copy it all into a spreadsheet and send it immediately to law enforcement agencies.

But do you agree with me that, if platforms simply remove it and things are encrypted (meaning that Telegram and WhatsApp are deleting things without any idea of what they are or in what amount), they are burning evidence?

Indeed.

Not only is everything being burned, but the groups are being banned and not the users. How could they ban users, after all, if one cannot access communications in anything end-to-end encrypted? Have you ever stopped to look at the country codes and area codes of the telephone numbers there? At some level, you may agree with me that criminals know each other. If one stops to map this for a while, I am pretty sure one can figure out which group is the 2.0 version of a removed one, as the sketch below illustrates.
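To make that mapping idea concrete, here is a minimal sketch, in Python, of how a platform could flag the likely successor of a removed group by comparing member sets. It assumes access to group-membership metadata (phone numbers), which sits outside the end-to-end encrypted content layer; all function names, thresholds, and numbers here are hypothetical illustrations, not any platform's actual tooling.

```python
# Minimal sketch: flag likely "2.0" successors of a removed group by
# member overlap. Assumes the platform can see group-membership
# metadata (phone numbers) even when message content is end-to-end
# encrypted. All identifiers and numbers below are hypothetical.

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two member sets (0.0 = disjoint, 1.0 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def likely_successors(removed_members: set[str],
                      active_groups: dict[str, set[str]],
                      threshold: float = 0.5) -> list[tuple[str, float]]:
    """Rank active groups by how much their membership overlaps
    with the membership of a removed group."""
    scores = [(group_id, jaccard(removed_members, members))
              for group_id, members in active_groups.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# Hypothetical example: the removed group's members reassemble elsewhere.
removed = {"+55 11 9xxxx-0001", "+55 11 9xxxx-0002", "+55 21 9xxxx-0003"}
active = {
    "group_a": {"+55 11 9xxxx-0001", "+55 11 9xxxx-0002", "+55 21 9xxxx-0003"},
    "group_b": {"+44 20 xxxx-0009"},
}
print(likely_successors(removed, active))  # [('group_a', 1.0)]
```

The same overlap logic extends naturally to the country and area codes mentioned above: a successor group tends to reassemble from the same pool of numbers as the one that was removed.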

Those Trust & Safety teams should, therefore, at some level, work inside the police.

Forget end-to-end platforms for a while and let us consider the hypothesis where the public gives up reporting directly to platforms and goes, instead, either to the public "Review" section of the App Store to try to boycott the app, to a consumer-complaint platform (as has happened with Reclame Aqui), or to the police.

When they go to public places to make what should be confidential complaints, they expose routes to the material, which is why those complaint platforms should also be intensely moderated for any reference to Child Sexual Abuse Material. Consider that, and let us move to someone who, facing CSAM on an end-to-end encrypted platform, decided not to report it to Facebook (which holds a far larger amount of personal data than the police could ever dream of having) but to Law Enforcement Authorities: suppose they try to reach Brazil's Disque 100 hotline.

Not everyone reporting CSAM went there to check and confirm that it really was such material. Their willingness to report stops a moment before, unless they have actually faced the material themselves. What they send to CSAM hotlines and police hotlines is something like a "please go there and see what is going on" request.

As international colleagues understand it, there is no such thing as a regulated "Proactive Search" in Brazil. I will not go into those details on this occasion. What I want, on this occasion, is for everyone (including the platforms themselves) to start taking end-to-end encrypted reports seriously.

Law Enforcement Cooperation procedures are not, and shall not be, for the reasons mentioned above, the same as their non-encrypted counterparts. The request for improved communication goes to both sides. Police, you need an open channel with those platforms because, if they ban you or the group, you lose the investigation. Platforms, you must allow someone from a Law Enforcement Agency to see, immediately, what is going on. You control, after all, the "join group" requests.

Think about it.