Business Ethics & Corporate Crime Research, Universidade de São Paulo

Classifying images in the real world, the objective character of pictorial things, and why consent does not and should not matter when we are talking about pornography

Author: Carolina Christofoletti

Commentary on: Stop Internet Sexual Exploitation Act

 

This legislation will apply to all online platforms that host pornography. This legislation does not intend to modify existing law under the Communications Decency Act, Section 230. Key components of the legislation:

  • Require platforms hosting pornography to, within two weeks of enactment: a) Require any user uploading a video to the platform to verify their identity

Commentary:

If we google how age verification works nowadays, we will find that risk management is being done in a very questionable way: self-disclosure forms.

It is absolutely clear that, for anyone wanting to upload illegal content, the trick is pretty simple: provide fake data. The same goes for a fake identity.
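To see how thin this protection is, here is a minimal sketch of a self-disclosure age gate (the function name and parameters are hypothetical, for illustration only). Everything it checks is the date the user chooses to type; nothing binds that date to a real person.

```python
from datetime import date

def passes_age_gate(declared_birthdate: date, minimum_age: int = 18) -> bool:
    """A self-disclosure age gate: it validates only the claimed date.

    Nothing here ties declared_birthdate to a real identity, so a user
    intent on uploading illegal content simply types a fake one.
    """
    today = date.today()
    age = today.year - declared_birthdate.year - (
        (today.month, today.day) < (declared_birthdate.month, declared_birthdate.day)
    )
    return age >= minimum_age

# A fabricated birthdate passes exactly as well as a real one:
assert passes_age_gate(date(1970, 1, 1))
```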

Bear in mind that, in the pornography industry, the only real identity check or age verification is done not by corporations but by image classifiers and the police, that is, only after the illegality has already come to their eyes.

Since the literature keeps showing us what kind of psychological harm can derive from appearing in a pornography scene (with scholars already discussing, for example, victims’ right not to be identified in sexual exploitation material), we should bear in mind the size of the trauma if the identity provided here is not fake but real.

  • Require any user uploading a video to the platform to also upload a signed consent form from every individual appearing in the video.

Commentary:

Does a signed consent form solve anything? The first point to be questioned, considering that this Act proposal emerges in a context in which we are facing illegal content, is whether this signed consent is valid at all.

From a civil law point of view, if the victim is under coercion (including psychological coercion), then this “contractual delegation of personal rights” is void.

Finally, since we are talking about someone’s image, this consent is revocable at any time, which means that the consent form is worthless from a legal point of view.

Worst of all, if sexual exploitation material was previously being shared without victims’ consent, this “sign here for uploading” requirement would create a repulsive psychological threat to victims, one that, because of this unannounced requirement, would now be created by the platforms themselves.

  • Creates a private right of action against an uploader who uploads a pornographic image without the consent of an individual featured in the image.

Commentary:

Wouldn’t a right of compulsory removal against the platform hosting the content be much more practical and efficient in stopping image circulation than compelling victims to sue the uploaders in court?

If the intent of the proposal is to make adult websites more responsible for the content that they are hosting, setting liability parameters on third parties will not help, but rather make things more difficult.

We now know where pornography corporations sit; they can be sued. If we move liability to third-party uploaders, we run the risk that pornography victims will never find a defendant for their suits.

  • Require platforms hosting pornography to include a notice or banner on the website instructing how an individual can request removal of a video they have not consented to being uploaded on the platform.

Commentary:

If someone’s consent to featuring in a pornography video is revocable at any time, removal under these circumstances should not need to be requested but merely carried out. If the pornography industry is to be regulated properly, we should see two kinds of analysis here:

a) Legal content, actionable at any time by its performers.

b) Illegal content, actionable at any time by anyone, where consent to the upload is irrelevant to the expected result: immediate removal. A report button should also be created here, with even more urgency.

That means, for example, that we should not be discussing whether victims consented to the upload of their rape images. We should not be discussing whether the people featured in extreme content on adult websites are faking a crime scene or not. Objectively, the image is to be classified as one of coercion and, for that reason, could not be consented to at all: not in its real-world scene (we keep criminal law philosophy alive), not in its production, and not in its sharing.

Though brought together by an intersecting objectivity, sexual crimes and images of sexual crimes are independent. If consent is relevant to some sexual crimes, it is negligible when we are talking about images.

  • Prohibit video downloads from these platforms, to be in place within three months of enactment of this legislation.

Commentary:

Though a download prohibition may protect copyrights, it cannot prevent content from being shared outside the platform through other means (think, for example, of recording the screen with a camera).

Especially when we are talking about illegal content (consider, for example, that downloading Child Sexual Abuse Material is per se a crime), we should ask whether this is a real target or whether, rather, things already work differently.

  • Require platforms hosting pornography to offer a 24-hour hotline staffed by the platform. Individuals who contact the hotline can request removal of a video that has been distributed without their consent.
  • Require removal of flagged videos as quickly as possible, but not to exceed 2 hours.

Commentary:

If the 2-hour time limit can exceed the human capacity of some platforms, given the high volume of content that may be actionable within that window, content suspension could be another, more targeted, solution to the removal problem.

That means that, as long as no human has reviewed the content, it cannot return to the platform… “Temporarily out of order” may be a much more practical solution for timely action on suspicious content.
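A minimal sketch of what such a suspend-first workflow could look like (all names here are hypothetical, for illustration only): flagged content goes offline immediately and automatically, and only a human decision can bring it back or remove it for good.

```python
from enum import Enum, auto

class State(Enum):
    LIVE = auto()
    SUSPENDED = auto()  # "temporarily out of order", pending human review
    REMOVED = auto()

class FlaggedContent:
    """Hypothetical moderation record for a piece of flagged content."""

    def __init__(self, content_id: str) -> None:
        self.content_id = content_id
        self.state = State.LIVE

    def flag(self) -> None:
        # Suspension is automatic and immediate, so the content is
        # already offline long before any 2-hour deadline expires.
        if self.state is State.LIVE:
            self.state = State.SUSPENDED

    def human_review(self, is_illegal: bool) -> None:
        # Only a human decision moves content out of suspension:
        # it is either removed for good or restored to the platform.
        if self.state is not State.SUSPENDED:
            raise ValueError("only suspended content can be reviewed")
        self.state = State.REMOVED if is_illegal else State.LIVE
```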

  • Require platforms to use software to block a video from being reuploaded after its removal. The platforms must have this software in place within six months of enactment of this legislation.

Commentary:

That means hashing images and joining already existing hash databases, which should be set up at an international level.

Curiously, and only regarding Child Sexual Abuse Material, I have come across only one adult platform that was using hashing technology to avoid the risk of previously known child sexual abuse material being shared on its domains. I have no information about anything like that also working for adult content.
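In outline, the mechanism is simple; here is a minimal sketch, with hypothetical names, of an upload check against a removal database. One loud assumption: production systems (PhotoDNA-style databases) use perceptual hashes that survive re-encoding and cropping, whereas the exact SHA-256 match below only catches byte-identical re-uploads.

```python
import hashlib

# Hypothetical store of fingerprints of previously removed content.
removed_fingerprints: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the content's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def register_removal(data: bytes) -> None:
    """On removal, remember the content's fingerprint."""
    removed_fingerprints.add(fingerprint(data))

def is_known_reupload(data: bytes) -> bool:
    """Check an incoming upload against the removal database."""
    return fingerprint(data) in removed_fingerprints
```

The legislative requirement would amount to running something like is_known_reupload on every new upload and rejecting matches before they go live; the hard part is not the lookup but building and sharing the database internationally.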

  • Directs the Federal Trade Commission to enforce violations of these requirements.

Commentary:

Ok.

  • Creates a database of individuals that have indicated they do not consent. The database must be checked against before new content can be uploaded to the platforms.

Commentary:

Sharing, publicly or between private parties, a list of people who have not consented to the sharing of material in which they are featured is the perfect scenario for sextortion. We already know how viable that scam is!

And what should we expect of those databases? For them to work, one of these premises must be confirmed: a) either the identity in the database is real, in which case sextortion is a real threat; b) or the victim’s identity is also real, in which case we just need to check where the threatening material is.

There is no need for a database of individuals administered by the pornography industry!

  • Instructs the Department of Justice to promulgate rules on where this database should be housed, and determine how to connect these victims with services, to include counselling and casework.
  • Failure to comply with this requirement will result in a civil penalty to the platform, with proceeds going towards victims services.

Commentary:

Since compliance measures regarding this database proposal rely on data sharing between government agencies and the pornography industry, we are back to the previous point.

It may be that victims do not want to receive at their homes a letter saying that, as their images were found and government bodies know it, someone will be working on their cases… maybe some day in the future.

Since platforms are most of the time legal persons and, depending on where these rules are set, a corporation’s reasonable steps to avoid hosting illegal content can shield it from criminal liability (the intention parameter) and maybe also from civil liability (the expected standard of conduct), leaving things as they already are is what we can expect from enforcing those parameters.

Think about it. Regulating content might be more urgent than regulating how it can be laundered.