Business Ethics & Corporate Crime Research Universidade de São Paulo

Users' intentions related to Child Sexual Abuse Material (CSAM) in social networks

Author: Braian Arroyo


A recent post on Facebook's research blog, Understanding the intentions of Child Sexual Abuse Material (CSAM) sharers – Facebook Research (fb.com), analyses the intentions of users who share child sexual abuse material (CSAM) on its platform, drawing on an investigation by its child safety team. The work included interviews with world experts on sexual exploitation, including the National Center for Missing and Exploited Children (NCMEC), in order to make Facebook's interventions more effective by differentiating users who share images out of a sexual interest in children and adolescents from those who share CSAM intending to be funny (memes) or to express outrage.

Why are these differences important to Facebook?

  • It will provide broader context to improve reporting to NCMEC and law enforcement, as well as more effective triage to identify children who are currently being abused.
  • It will allow the development of targeted interventions to prevent the exchange of these images.
  • It will help distinguish people who share CSAM out of anger or indignation from those who share it for sexual gratification.

 Research

The research identified a number of key themes around the types of engagement people may have with child sexual abuse material, such as the intersections between online and offline offences and the spectrum of offending engagement (from browsing to the production of images, videos, etc.).

Facebook, like many companies in the digital world, analyses its platforms using behavioural-signal techniques. The taxonomy used in this research is influenced by Lanning's work from 2010.

“His research outlined a number of categories of people who engage with harmful behaviour or content that involves children. Lanning divided those who offend children into two key groups: preferential sex offenders and situational sex offenders.”

Lanning classified the situational offender as someone who “does not typically have compulsive-paraphilic sexual preferences, including a preference for children. However, they can have sex with children for a variety of and sometimes complex reasons.” A preferential sex offender, according to Lanning, has “definite sexual inclinations” toward children, such as pedophilia.

 Facebook research taxonomy

Building on this work, Facebook began running tests to understand the intentions of those who share CSAM.

According to the report, the prevalence of this content is very low, meaning that shares and views are very infrequent. Even so, when Facebook detects this type of violation, regardless of context or motivation, it removes the content and reports it to NCMEC.

For this purpose, the team analysed 200 reports sent to NCMEC for containing CSAM, developing indicators of user behaviour and intention and testing them on the platform together with the data science team.

The article makes explicit that these are only early tests meant to improve understanding before it is applied at scale. The results suggest that 75% of the analysed accounts did not share the material with malicious intent (that is, with the intent to harm a child or adolescent), but for other reasons, such as indignation or attempted humour.

Definitions and examples of the user classifications identified during the company's investigation are listed below.

 Malicious users

Preferential offenders: People whose motivation is based on an inherent, underlying sexual interest in children (i.e. pedophiles/hebephiles). They are sexually interested only in children.

Example: A user is connected with several minors and coerces them into producing CSAM by threatening to share sexual material already obtained from them.

Commercial offenders: People who facilitate child sexual abuse for financial gain. They profit from the creation of CSAM and may not have a sexual interest in children.

Example: A parent who makes their child available for abuse via a live broadcast in exchange for payment.

Situational offenders: People who take advantage of available situations and opportunities to interact with CSAM and minors. They may be morally indiscriminate or interested in many paraphilic subjects, with CSAM being one part of that.

Example: A user who approaches other users to request sexual images (of adults and children); if a child shares images, they will engage with those images and with the child.

 Non-malicious users

Unwitting offenders: A broad category of people who may have no intention of harming the children and adolescents depicted in CSAM images, but who share this content out of humour, outrage, or ignorance.

Example: The user shares a CSAM meme of a child’s genitals being bitten by an animal because they think it is funny.

Non-exploitative minor users: Children and adolescents engaging in developmentally normal behaviours that, while technically illegal or against policy, are not inherently exploitative, though they do carry risks.

Example: Two 16-year-olds who send sexual images to each other (sexting). They have known each other since school and are currently in a relationship.

 “Risky” situational offenders: Individuals who habitually consume and share adult sexual content, and who come into contact with CSAM and share this material as part of this behavior, potentially without being aware of the age of the subjects in the images they have received or shared.

Example: A user receives an image or video depicting a 17-year-old and is not aware that the content is CSAM. They reshare it in a group where people share adult sexual content.


Image 1: taxonomy of users determined by Facebook

Research carried out by the Online Child Safety Research Team of the Grooming Prevention Public Policy Institute, Honorable Chamber of Deputies of the Province of Buenos Aires, Argentina

Taking Facebook's material on the intentions of users who share CSAM on its platform as a reference, this team analysed data from the last five (5) reports sent to Facebook in which groups of users involved with CSAM were reported.

The reports included groups and their members, although not all users had a direct relationship with CSAM. A recurring practice was observed in the posts: many users use the platform as a bridge to obtain or share child sexual abuse material. To do so, they make posts or comments aimed at migrating the conversation to instant messaging apps such as WhatsApp, Telegram or Facebook Messenger groups, sharing a URL that leaves any Facebook user, whether girl, boy, adolescent or adult, one click away from these offenders.

Among the recurring practices, some posts use their own code built around the letters CP (child pornography), generating Spanish variants such as “carne y papas” (meat and potatoes), “caldo de posho” (chicken soup), “costal de papas” (sack of potatoes) and “código postal” (postal code), to mention only a few. Below are some screenshots taken by our researchers during this exercise, followed by a simple sketch of how such code phrases could be flagged.


Image 2: taxonomy determined by the IPPPG – HCD Buenos Aires


Image 3: Examples of existing cases in the network
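Purely as an illustration, the following minimal Python sketch shows how region-specific code phrases like those above could be surfaced for human review. The phrase list is taken from the examples in this document and is intentionally incomplete; real moderation pipelines rely on much larger, regularly updated lists and many additional signals, and the function names and matching rules here are assumptions made for this example only.

```python
import re
import unicodedata

# Illustrative and intentionally incomplete list of CP-derived code phrases
# observed in this exercise (see the examples above).
CODE_PHRASES = [
    "carne y papas",
    "caldo de posho",
    "costal de papas",
    "codigo postal",
]

def normalize(text: str) -> str:
    """Lowercase and strip accents so that 'Código Postal' matches 'codigo postal'."""
    decomposed = unicodedata.normalize("NFKD", text)
    stripped = "".join(ch for ch in decomposed if not unicodedata.combining(ch))
    return stripped.lower()

def flag_for_review(post_text: str) -> list[str]:
    """Return the code phrases found in a post so it can be escalated to human reviewers."""
    normalized = normalize(post_text)
    return [
        phrase
        for phrase in CODE_PHRASES
        # Word boundaries avoid flagging substrings of unrelated words.
        if re.search(rf"\b{re.escape(phrase)}\b", normalized)
    ]

if __name__ == "__main__":
    print(flag_for_review("grupo nuevo de caldo de posho, link en el primer comentario"))
    # -> ['caldo de posho']
```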

We have also identified users who post in groups that are not intended for sharing CSAM, using them as a means of reaching other users interested in this type of material. To do so, they generate invitation URLs for WhatsApp groups, among other applications.

Making use of the taxonomic groups that Facebook differentiated above, we can identify a variant within the “malicious users” classification, which we could define as situational-link criminals: people who knowingly consume and share this material, using the Facebook platform as a means of interacting with other users before migrating to private instant messaging systems, where they explicitly exchange CSAM.

This document is confidential in nature, prepared for the Facebook team in order to point out a variant that exists within the platform and that could be analysed to detect users who perform this manoeuvre and generate the corresponding reports to NCMEC.

In its child safety policies, Facebook states that it removes profiles and content that explicitly sexualize children and adolescents. This team understands that non-explicit content, which does not depict child nudity, is more difficult to detect. The images alone might not pose an immediate danger, but analysing behaviours, accompanying texts and the keywords used by users in each country and region can help determine which CSAM-related content should be removed.
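As a sketch of the kind of combined analysis described here, the fragment below (building on the phrase matcher shown earlier) scores a post using several of the signals discussed in this document: the presence of a regional code phrase, a link that migrates users to an instant messaging group, and whether the post appears in an unrelated group. The weights, threshold and link patterns are assumptions made for this illustration, not Facebook's actual criteria.

```python
import re
from dataclasses import dataclass

# Invite-link hosts for messaging apps mentioned in this document (WhatsApp, Telegram).
INVITE_LINK_PATTERN = re.compile(
    r"https?://(chat\.whatsapp\.com|t\.me)/\S+", re.IGNORECASE
)

# Hypothetical weights and threshold, chosen only for illustration.
WEIGHTS = {"code_phrase": 2.0, "invite_link": 2.0, "unrelated_group": 1.0}
REVIEW_THRESHOLD = 3.0

@dataclass
class PostSignals:
    text: str                 # the post or comment body
    has_code_phrase: bool     # e.g. result of the phrase matcher sketched earlier
    group_is_unrelated: bool  # post appears in a group whose topic is unrelated

def review_score(post: PostSignals) -> float:
    """Combine text and context signals into a priority score for human review."""
    score = 0.0
    if post.has_code_phrase:
        score += WEIGHTS["code_phrase"]
    if INVITE_LINK_PATTERN.search(post.text):
        score += WEIGHTS["invite_link"]
    # Posts steering users off-platform from unrelated groups match the
    # "bridge" behaviour described above.
    if post.group_is_unrelated:
        score += WEIGHTS["unrelated_group"]
    return score

def needs_human_review(post: PostSignals) -> bool:
    """Escalate to trained reviewers; no automated removal decision is made here."""
    return review_score(post) >= REVIEW_THRESHOLD
```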

Finally, it should be made clear that this method of operating is used across all social networks, video platforms and any other online space that lets users interact, recognize one another, organize and, with a single click on a link, immediately migrate to instant messaging services. For this reason, building automated detection mechanisms is increasingly complex, and it will become a great challenge, one that will require trained and qualified operators who can help detect and stop the spread of CSAM.