Business Ethics & Corporate Crime Research Universidade de São Paulo

An Essay about Blindness: Instagram’s CSAM Reporting Policies

Image retrieved from: QuotesGram

Author: Carolina Christofoletti

Link in original: Click here

Do you know how Dark Web whistleblowers differ from their Open Web colleagues? Whoever enters the Dark Web knows, because of the imaginary criminal scenario the media has built in their heads, that if they ever find underage material there, it is probably Child Sexual Abuse Material (CSAM).

Law enforcement and platforms, however, forgot to tell whistleblowers that criminal content also appears on the Open Web, on platforms they use every day such as LinkedIn, Facebook and YouTube. When whistleblowers find CSAM on the Open Web, they tend to think that they are either imagining things, that is, seeing something that is not there (the literature calls it the paedophilic gaze), or that they are too intolerant to be on that platform.

After all, YouTube tells me it removes 99.4% of its violating content automatically through technology, and so do Instagram and Facebook, so I must really be imagining things. More than that, if I open Facebook’s or YouTube’s pages, their Platform Policies explain very well how good and how infallible they are. The tech giants! Why should I not trust them? They must be superbly well equipped!

My lawyer friend… this does not sound odd to you, does it? I hope not. As I have mentioned elsewhere, someone with a sharp legal eye will recognize what I call here double immunity. And double immunity only works, as in game theory, if everyone agrees on premises that point in the same direction. The difference from game theory here is that one side of the premise is clear and open… to public scrutiny. So, first of all: stop with this claim of perfection. It is legally harmful, including to the very problem you are trying to solve.

In some places, such as Instagram, the hosting platform keeps, somewhere in your account settings, a record of everything you have ever reported to the platform during your whole Instagram life (Account >> Settings >> Help >> Support Requests >> Reports, freely translated from German). Keep in mind that report buttons are not “user-friendly things”, nor is this tracking a matter of transparency. This is called legal design, and I intend to discuss it elsewhere, for it is a long subject. For now, I will only leave you a preview question to think about: when you press a button or submit a direct form, your report belongs only to the platform, which can later claim it never received it.

I am choosing to talk about the Instagram case because things have changed there in a “curious” way. Instagram has very recently added a new, strange functionality that appears nowhere in its Terms of Service and sits, dangerously, in a more than controversial category: Nudity or Sexual Activity.

Once you report it, and until we-do-not-know-when, Instagram blocks the content for you (replacing it with a blur and a crossed-out eye icon), and you become, strangely enough, unable to report that very same post again. Instagram does not want duplication, but I am afraid that, along with duplication, it also throws away escalation and any view of the criminal creativity involved in inserting criminal content into its platform. This is a shot in Instagram’s own foot. Why?

Curiously, if you have ever had to report some kind of Child Sexual Abuse Material to Instagram, you will promptly remember that this very broad category is, in fact, exactly where the “Report CSAM” button sits (Report >> Nudity or Sexual Activity >> Involves a child).

There was no way to write this article for you without testing it, so: yes! Posts reported as CSAM are also blocked from further reports, and they stay on the platform, for everyone!

Instagram, this is sheer craziness. If a piece of content has been reported as CSAM, why are you blocking it only for the whistleblower, when you should be blocking it for the whole Instagram community, at least until its first check? More than that, why are you blocking your whistleblower forever, even if the report proves accurate?

You lose escalation, and you lose sight of the criminal mechanics, my Tech Giant!
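
To make concrete what “losing escalation” means, here is a minimal, purely hypothetical sketch. None of the class names, thresholds or outcome labels come from Instagram; they are assumptions made only to contrast a report queue that silently blocks repeat reports with one that keeps them as an escalation signal:

```python
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class ReportQueue:
    """Hypothetical model of two ways a platform could treat repeat reports."""
    drop_duplicates: bool = True                      # the behaviour described above
    seen: set = field(default_factory=set)            # (reporter, post) pairs already filed
    counts: dict = field(default_factory=lambda: defaultdict(int))

    def file_report(self, reporter_id: str, post_id: str) -> str:
        key = (reporter_id, post_id)
        if self.drop_duplicates and key in self.seen:
            return "blocked"                          # the whistleblower can never re-report
        self.seen.add(key)
        self.counts[post_id] += 1                     # escalation signal: reports piling up
        if self.counts[post_id] >= 3:                 # invented threshold, illustration only
            return "escalated_to_priority_review"
        return "queued"


queue = ReportQueue(drop_duplicates=True)
print(queue.file_report("whistleblower_1", "post_42"))  # queued
print(queue.file_report("whistleblower_1", "post_42"))  # blocked: the repeat signal is discarded
```

With drop_duplicates=True, a persistent whistleblower contributes exactly one data point, no matter how often the content resurfaces in front of them; the counting variant at least preserves the signal that something keeps being flagged.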

Ideally, Instagram should let you appeal (as it lets you appeal when it removes any of your own content for violating its Terms of Service), but, curiously, for Terms of Service violation reports the appeal option does not exist.

For whoever set this catastrophic policy at Instagram, a security warning: the system has a gap, for the content is still accessible when your whistleblower opens it not through Instagram’s app but through its web version, where the forensic opportunity can be further explored, also and mainly, by extracting the URL and sending it to an external channel.

For whistleblowers, a warning: report it first through the app interface. If Instagram blurs it away, say thanks for having read this article, extract the URL and take it to the external channel (CSAM hotlines, if that is what we are talking about).

When you report something as serious as CSAM on Instagram, Instagram answers you with something like this: “We understand that this decision (not to remove it) may be upsetting for you, but there are some things you can do to avoid seeing, in the future, content that you find disturbing… You can block the disturbing account you have just reported, you can filter comments and emoji, etc.” (freely translated from German).

Stop making fools of your whistleblowers: they are not.

The turning point is here. Pay attention to what is happening: through a “non-matching report”, Instagram is advising me to use a filtered version of its app so that I do not see that kind of content. As a user, that is not what I want from Instagram. I do not want to use a filtered version and enter the bubble of a parallel world where Trust and Safety policies exist only in my head. No. Especially because those filters do not come to Instagram users by design, and, above my head, the parallel world of unfiltered things keeps existing.

Instead of silencing whoever reports content, by prohibiting them from reporting it again whenever the report comes back as “non-matching”, I want Instagram to stop allowing illegal content such as CSAM to enter its platform. And the way to do that starts, first of all, with the decision to take whistleblowers seriously.

“Thank you for your report. We really appreciate your taking the time to report this to Instagram. At this time, we have decided not to remove the post, because we do not see it as a violation of our Terms of Service. If you disagree, you can send it once more to our moderators for review, or report it to an external channel (link). Instagram is working to make the platform a better and safer place for everyone, and we need you to keep reporting.”

 

It seems trivial, but this is an example of a free-of-charge paragraph that could save platforms the millions of dollars they stand to lose for having an internal reporting channel that does not work. Amazing, isn’t it?

Before that very bad piece of advice, which seems designed to alienate its own whistleblowers, Instagram used to display a message saying something like: “This content/account does not violate our Terms of Service. Please read our Terms here”. On that occasion, it referred you to the Terms of Service, as if to tell you: read them carefully before reporting something of the kind again.

We all know, or should know, that Instagram runs every reported piece of content through a prior check by its AI systems, and that human review comes only at a later stage, if and when something appears to be, in fact, a violation of the platform’s Terms of Service.
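
As a thought experiment only, that triage can be reduced to a few lines which make the failure mode visible: if the automated check scores a report below its threshold, no human ever sees it, and the reporter is simply sent back to their feed filters. Every name and number below is my own assumption, not Instagram’s actual system:

```python
def triage_report(report: dict, classifier_score: float, review_threshold: float = 0.8) -> str:
    """Hypothetical AI-first triage. The score, the threshold and the outcome labels
    are all invented for illustration; only the ordering (machine check first,
    human review later) comes from the text above."""
    if classifier_score >= review_threshold:
        return "send_to_human_review"
    # A false negative ends here: the report is closed automatically and the
    # reporter receives the "you can filter your feed" style of answer.
    return "auto_close_and_suggest_filters"


print(triage_report({"post_id": "post_42", "category": "involves_a_child"},
                    classifier_score=0.35))
# -> auto_close_and_suggest_filters
```

For a category as grave as CSAM, the argument of this essay is that the second branch should at least route to an appeal or an external channel, never to a feed filter.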

With this way of working, Instagram has a hard time with reported accounts. If an account is being reported, that means, first, that you need to figure out what exactly is going on, and where. You are lost in the jungle with a magnifying glass, and the lion roars just behind you.

If all you have is a “report it” button, without any space for the reporter to add a comment, the chances that you will find out what is going on just because someone pressed a button on your platform are really low. Especially when illegal things are mixed in with legal ones that have nothing to do with them.
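
Here, with invented field names, is what that difference looks like at the data level: a bare button hands the platform little more than a pointer and a category, while a single optional free-text field already tells a reviewer where to start looking. This is a sketch of the argument, not of any real reporting API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ButtonOnlyReport:
    """What a bare 'report it' button hands the reviewer: a pointer and a category."""
    reporter_id: str
    target_id: str     # the post or account being reported
    category: str      # e.g. "nudity_or_sexual_activity > involves_a_child"


@dataclass
class ContextualReport(ButtonOnlyReport):
    """The same report, plus the one field that tells a reviewer where to start looking."""
    reporter_comment: Optional[str] = None   # free text: what was seen, and where


report = ContextualReport(
    reporter_id="whistleblower_1",
    target_id="account_99",
    category="nudity_or_sexual_activity > involves_a_child",
    reporter_comment="The violating material is in the story highlights, not in the main feed.",
)
print(report.reporter_comment)
```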

Please, think about it.