Business Ethics & Corporate Crime Research Universidade de São Paulo

Inside the Battle for Internet Global Morals: Instagram Kids – A Commentary (Part 1)

Retrieved from: Technoblog

Author: Carolina Christofoletti


 

If platforms’ minimum-age requirements are currently being circumvented by parents, siblings or relatives who formally hold accounts that are, in fact, used by children, then creating a Kids’ Version of popular apps such as the image-based Instagram may be an adequate solution for better protecting children in the online environment.

At least, it seems a more realistic solution than popping up easy-to-close safety banners on the screen every time a connection request arrives from someone who has triggered the platform’s mysterious Safety Alert (for criticism, see this article).

We already know, by force of empirical fact, that children are using not only Instagram but social media apps as a whole. Instagram itself, together with other apps such as TikTok and YouTube, even publishes a Parents’ Guide attesting to this state of affairs.

And social media platforms are right when, rather than denying the facts, they confront the problem as it really presents itself: what should be done about the children who, despite the age requirements, are still using YouTube, Instagram, Facebook, TikTok and so on?

Social media addiction is already there; no one denies it. But given the present numbers, there is something that might be even more concerning and urgent than social media addiction: child safety.

What we talk about when we talk about Instagram

According to Facebook’s 2020 Transparency Report, Instagram acted (see the Instagram Community Guidelines) on the following between October and December 2020 (Q4):

  • 11.M pieces of Adult Nudity and Sexual Activity content
  • 5.0 M pieces of Bullying and Harassment content
  • 800.0 K pieces of Child Nudity and Sexual Exploitation of Children content
  • 308.0 K pieces of Organized Hate content
  • 355.1 K pieces of Terrorism content
  • 6.6 M pieces of Hate Speech content
  • 3.4 M pieces of Suicide and Self-Injury content
  • 5.6 M pieces of Violent and Graphic content
  • 70.2 K pieces of content related to Firearms
  • 1.4 M pieces of content related to Drugs

All of this content was found after it was posted.

Under Instagram’s 13+ model, posts are not approved before being published, though they may be moderated later on.

Little is known about the flagging procedures. Nor is there any clue as to how long harmful content survives on the platform, or how well hidden it is.

Age-Checking technology

Whilst age-verification technology could help when children create their own accounts, the solution is void once we realize that most children using those platforms do so through accounts that were created by their parents and that will survive until the end of their digital lives.

Children will not stop using certain apps just because, according to the platform’s terms of service, they should not hold an account there. And in times of pandemic homework and closed schools, platforms should bear in mind that parents may be willing to help their children find something to keep themselves occupied amid the pandemic boredom.

The fact is that, without a Kids’ Version of the app, children are now being exposed to the same content that adults are. Though children are present, the platform obeys a 13+ design.

Content suggestion algorithm

Since children are using a parallel account created by loved ones, the algorithm they are exposed to is the same one running on the 13+ version. Even though content suggestions can be personalized according to the child’s interests, a few wrong clicks or a single known keyword is all it takes to move from a cartoon to a pornography page. Nothing prevents kids from accessing escort models’ pages, child modelling agencies, eating-disorder pages and other kinds of (potentially or actually) harmful content directly from their supposedly 13+ accounts.
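To make the risk concrete: whatever age-aware filtering a recommender might apply depends entirely on the age the account declares. A minimal sketch in Python, with entirely hypothetical names (Item, audience labels, recommend_for) since Instagram’s real ranking system is not public, shows how such a filter never engages on a parent-created account registered as an adult:

```python
# A minimal sketch of an age-aware recommendation filter.
# All names here are hypothetical; Instagram's real system is not public.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    audience: str  # "kids", "teen" or "adult" -- an assumed rating label

CATALOG = [
    Item("Cartoon compilation", "kids"),
    Item("Fitness model page", "adult"),
    Item("Homework tips", "kids"),
]

def recommend_for(age: int, candidates: list[Item]) -> list[Item]:
    """Drop candidates whose rating exceeds what the account's age allows."""
    if age < 13:
        allowed = {"kids"}
    elif age < 18:
        allowed = {"kids", "teen"}
    else:
        allowed = {"kids", "teen", "adult"}
    return [item for item in candidates if item.audience in allowed]

# On a parent-created account registered as an adult, the filter never
# engages -- whoever is actually holding the phone:
print([i.title for i in recommend_for(35, CATALOG)])  # everything
print([i.title for i in recommend_for(9, CATALOG)])   # only "kids"-rated items
```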

New Child Safety Features

Implementing new safety features for 13-to-18-year-old users on an adult platform could have been an ideal solution, were it not for the fact that it was built far away from reality. Why should we suppose that, rather than spinning a roulette on the birth year, children really enter their exact date of birth when registering… and in the age of data protection? More obvious than ever: if the account was ‘age-validated’ by the parents, the child safety features are, in fact, offline.

How much do parents know about the importance of setting the correct birthdate when registering their children’s accounts? The answer becomes clearer when we ask, in parallel, how many accounts are in fact registered as such (13 to 18 years old).

A Proposed Design: What could Instagram Kids look like?

Instagram Kids is meant to be a fully updated version of Instagram 13+. That means not only that we should expect a more kid-friendly design of things such as the red Report, Block and Restrict buttons, but also that Instagram 13+ moderation becomes less complex: needs suddenly become easily identifiable.

With age-verification technology working, Instagram for Kids is meant to be a universe completely separate from its 13+ counterpart. Adults, rather than sending follow requests to children (which the children themselves would then have to approve and moderate), would simply not be allowed there.
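What ‘not allowed there’ could mean in practice is a hard gate at the platform layer, rather than a decision pushed onto the child. A minimal sketch, assuming hypothetical account types and a hypothetical send_follow_request function (nothing here is Instagram’s real API):

```python
# A minimal sketch of the proposed separation: adult accounts simply cannot
# send follow requests into the Kids universe. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    world: str  # "kids" or "13plus", fixed by age verification at sign-up

def send_follow_request(sender: Account, target: Account) -> bool:
    """Reject cross-world requests by design instead of asking the child."""
    if target.world == "kids" and sender.world != "kids":
        return False  # refused at the platform layer; no banner, no child decision
    return True       # same-world requests proceed to normal moderation

adult = Account("@some_adult", "13plus")
child = Account("@some_kid", "kids")
assert send_follow_request(adult, child) is False
```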

Both Instagram worlds could be allowed to communicate, but every post featured in the Kids’ version would have to be approved in advance by a new team of content moderators who, rather than generalists, could be specialists in ‘for children’ content.

The Instagram community could help with this pre-selection: if, at posting time, users flagged for Instagram whether the content is meant to feature on Instagram Kids, an early filtering step would already have been done. With a smaller number of posts to review, combined with other technological tools, human review becomes achievable.
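A minimal sketch of how that pre-selection pipeline could work, with the poster’s own flag routing content into a specialist review queue before anything reaches the Kids feed (all names are hypothetical, not an existing Instagram mechanism):

```python
# A minimal sketch of community-assisted pre-moderation: only posts the
# poster flags "for kids" ever enter the specialist human-review queue,
# and nothing is published to the Kids feed without approval.
from collections import deque

review_queue: deque[dict] = deque()
kids_feed: list[dict] = []

def submit_post(content: str, flagged_for_kids: bool) -> None:
    """The poster's own flag routes the post; unflagged posts never reach kids."""
    if flagged_for_kids:
        review_queue.append({"content": content})

def moderate(approve) -> None:
    """Specialist moderators approve or drop each queued post before release."""
    while review_queue:
        post = review_queue.popleft()
        if approve(post):
            kids_feed.append(post)

submit_post("drawing tutorial", flagged_for_kids=True)
submit_post("nightclub photos", flagged_for_kids=False)  # filtered out up front
moderate(lambda post: "tutorial" in post["content"])
print(kids_feed)  # only the reviewed, approved post appears
```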

Further, new Terms of Service and Community Guidelines would be drafted for the Kids’ Version, which would also simplify the writing of those terms for the Instagram 13+ version. A platform that is adequate for children is expected to be born here, and things such as sex emojis or sex-related words could survive in the adult version of the app, provided nothing illegal is featured there.

Also, things such as nudity filters, anti-grooming technology and PhotoDNA-based CSAM detection could be implemented directly on the platform side, freeing child-protection technology from ‘misuse’ fears.
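For illustration only: PhotoDNA itself is proprietary Microsoft technology built on a perceptual hash that survives resizing and re-encoding, so the sketch below stands in a plain SHA-256 exact-match check for it, and the hash list is likewise hypothetical (real lists are maintained by bodies such as NCMEC). The point is where the check runs, not how the hash works:

```python
# A minimal sketch of server-side hash matching at upload time.
# SHA-256 is a stand-in: real CSAM detection uses robust perceptual
# hashes (e.g. PhotoDNA), and the hash list below is hypothetical.
import hashlib

KNOWN_CSAM_HASHES = {
    "3f2a...",  # placeholder entries; real lists come from bodies like NCMEC
}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True (block and report) if the upload matches a known hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CSAM_HASHES

# Running on the platform side means no detection code ships on the child's
# device, which is what frees the technology from client-side 'misuse' fears.
print(scan_upload(b"example image bytes"))  # False: no match in the list
```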

Further, privacy settings such as turned-off geolocation could be implemented by design, and ads could finally be reviewed against a criterion of not merely ‘also adequate for’ but ‘adequate for’ children.
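‘By design’ can be read quite literally: the protective settings become the constructor defaults of a Kids account, rather than options someone must remember to switch on. A minimal sketch, with hypothetical setting names:

```python
# A minimal sketch of privacy by design: on a Kids account, protective
# settings are the defaults, not features anyone has to activate.
# The settings object and field names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the settings cannot be loosened afterwards
class KidsAccountSettings:
    geolocation_enabled: bool = False   # off by design, never asked
    profile_public: bool = False        # private by default
    ads_criterion: str = "adequate_for_children"  # not merely 'also adequate'

settings = KidsAccountSettings()  # no activation step: safe out of the box
print(settings)
```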

Without the need to activate any platform function… everything by design.