Thumbnail of YouTube video “WHICH Sibling MAKES THE BEST TiKTOK Challenge!!” Credit: Not Enough Nelsons, YouTube (Public Domain)

Protecting Children from TikTok

Zeal Jain

--

Background

The internet and social media have become increasingly prevalent in everyday life, especially over the past year with the pandemic. Stuck at home, people looked for ways to destress and reclaim the social opportunities they were missing out on. As a result, TikTok, the social media platform where people share funny, inquisitive, and creative short videos, blew up. I myself have become an avid user, and once in a while I will catch myself popping out a dance move or repeating a trendy saying from the app. One day I made a joke referencing a video I saw on the app and was shocked to find that my 12-year-old sister understood it right away.

My sister and me. Credit: Zeal Jain

She mentioned that she had been using it for a while, along with almost every other person in her grade. Later I found out that, according to internal company documents reviewed by the New York Times in July of 2020, more than a third of the app’s daily users may be 14 years old or younger. My initial thought was “that’s cool, I guess,” but after thinking back to the many encounters I had with really mature or hateful content showing up on my feed, it did not feel right that so many young children were accessing the app at an age when they are naive and still have so much to learn. So I decided to research how regulation can be implemented to protect young users on the app.

The app was founded in Shanghai, China, and released in 2014 under its former name, Musical.ly, where people could post short lip-syncing videos and build followings. In 2017, ByteDance bought the company, rebranded it as TikTok, and expanded the format to short videos of all kinds.

Hierarchy of companies under ByteDance and history of acquisitions. Credit: GGC Capital, Cheetah Labs

The company says that to run its platform it must collect all sorts of data from its users, including personal, device, location, app usage, messaging, and browsing data. Veronica Barassi, an anthropologist and professor in Media and Communications who studies data and human profiling, states that with this kind of collection, TikTok and other big data companies have the means and technologies “to build unique ID profiles to track children through time and across different spheres of social life, with little transparency and accountability.”

With the popularity and reach of the application rising, it is important that these concerns are addressed and that measures to protect children from data and privacy violations and from harmful online content are examined.

Data and Privacy Protection

Under the Children’s Online Privacy Protection Act (COPPA), enacted in 1998, the age of digital consent in the United States is 13. Credit: Color-Inverted and Edited Screenshot (Zeal Jain) of COPPA Final Rule Amendments from the FTC (Public Domain)

If a child under 13 wants to use a digital service, the service must obtain parental consent. Because of the complexity and cost this adds, many digital service providers simply set the minimum age of use at 13. In 2019, the Federal Trade Commission (FTC) filed a lawsuit against TikTok for violating COPPA by illegally collecting personal information from underage children on the app. According to the FTC statement, the company had been collecting this data for three years under its former name, Musical.ly, which at the time did not ask users for their age. TikTok did not formally admit guilt, but it settled the claims for 5.7 million dollars, one of the highest fines paid over a social media privacy scandal, and agreed to take steps to increase protections for underage children.

In the 2020 version of TikTok, users are initially prompted to enter their birthday. If the entered age is below 13, they are immediately blocked from the general app on that device unless they supply proof that they are over 13.

When an eligible age is entered, users are immediately prompted with the next steps of setting up an account and can go back and re-enter their birthday. However, if an age younger than 13 is entered, a notification pops up stating that the user cannot access the regular, unmoderated version of the app and will have a different viewing experience (discussed more later). Credit: Zeal Jain

Even though it is still very easy to bypass TikTok’s age verification by lying from the start, a 2020 study by Liliana Pasquale, a professor at University College Dublin and Lero, compared age verification across popular social media platforms, including Facebook, Snapchat, Instagram, and WhatsApp. Of these, TikTok had the most robust method: it was the only one that performed a two-time age check and barred underage users from entering right away.
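To make this concrete, below is a minimal sketch in Python of what such an age gate might look like. It is my own illustration under one reading of the check described above (the birthday is collected twice, and any under-13 answer routes the device to the restricted experience); the function names and flow are assumptions, not TikTok’s actual implementation.

```python
from datetime import date

DIGITAL_CONSENT_AGE = 13  # age of digital consent in the US under COPPA


def age_in_years(birthday: date) -> int:
    """Whole years between the given birthday and today."""
    today = date.today()
    had_birthday_this_year = (today.month, today.day) >= (birthday.month, birthday.day)
    return today.year - birthday.year - (0 if had_birthday_this_year else 1)


def gate_signup(first_entry: date, second_entry: date) -> str:
    """Two-time age check: the birthday is collected twice, and any
    under-13 answer routes this device to the restricted experience."""
    if (age_in_years(first_entry) < DIGITAL_CONSENT_AGE
            or age_in_years(second_entry) < DIGITAL_CONSENT_AGE):
        return "restricted"  # under-13 viewing experience, no regular account
    if first_entry != second_entry:
        return "retry"       # entries disagree, ask again
    return "full_app"        # eligible for the regular app


# Example: a twelve-year-old is routed to the restricted experience.
today = date.today()
twelve_year_old = date(today.year - 12, 1, 1)
print(gate_signup(twelve_year_old, twelve_year_old))  # -> "restricted"
```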

Content Regulation

Filtering out hateful behavior and ideology online is a difficult task because of the constant creation, wide reach, and fast spread of content across the app. TikTok has implemented community guidelines that users are expected to adhere to; failure to do so can result in exclusion from monetization, suspension from commenting or posting, and even banning. However, this only works as a deterrent and does not completely keep harmful content out of the network. In 2018, Motherboard, a group of Vice reporters, found a number of pornographic posts on the app, along with posts from users, some of whom appeared to be under the age of 13, saying that people on the app were asking them for nude images of themselves. When Motherboard asked TikTok for a statement, all TikTok said was that this behavior is prohibited on the app and that it already has a report and moderation system to monitor it. Later that year, when another company under TikTok’s parent company ByteDance faced similar allegations, this time from China’s internet regulator, it immediately increased its staff of moderators from 6,000 to 10,000, showing how influential governmental pressure is in improving moderation.

Because complete filtering of content on the main app is currently impossible, TikTok hosts a separate viewing experience called TikTok for Younger Users for those who entered an age younger than 13. In this version of the app, the videos on the user’s feed are curated directly by TikTok. Interaction is quite limited, as users cannot maintain their own profiles, share videos, comment under videos, or send direct messages. This controlled setting prevents mature or harmful content from appearing on the child’s feed and, in the absence of social networking features, prevents cyberbullying.

Parental Involvement

Because parents or caregivers are in a better position than the app itself to intervene in what a child is and is not allowed to see, parental involvement could easily become one of the best ways to protect children. Eldar Haber, an associate professor of Law & Technology at the University of Haifa, discusses privacy implications for children arising from the Internet of Things and concerns about digital parenting in a peer-reviewed scholarly article. He argues that because digitalization has become so ingrained in society, it is inevitable that children will make some kind of contact with the internet, so parents should take steps to improve digital wellbeing and minimize surveillance. Some ways parents could protect their children include limiting interactions with platforms that collect sensitive data or allowing the use of some social network platforms only when they are present. He also mentions that parents could teach their children what to be wary of on the internet and how to deal with those risks, just as they teach them what to do in real-life situations such as meeting a stranger.

If a parent still allows their child to use TikTok, they can monitor the child’s experience using in-app parental controls. In 2020, the company introduced Family Pairing, which lets a parent link their account to their teen’s account (or a younger child’s, if the parent gives a child under 12 access) and control commenting, direct messaging, video search, screen time, whether the account is public or private, and content settings.

A guide explaining how to set up and use Family Pairing. Credit: @tiktoktips from TikTok
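To give a rough sense of what these controls amount to, here is a hypothetical sketch of a pairing settings object in Python. The field names and defaults are my own illustration of the categories listed above, not TikTok’s actual API.

```python
from dataclasses import dataclass


@dataclass
class FamilyPairingSettings:
    """Illustrative model of the controls a linked parent account can manage.
    Field names and defaults are hypothetical, not TikTok's actual API."""
    restricted_content_mode: bool = True   # limit content that may not be age-appropriate
    daily_screen_time_minutes: int = 60    # cap on daily watch time
    direct_messages: str = "off"           # "off", "friends", or "everyone"
    comments: str = "friends"              # who may comment on the teen's videos
    search_enabled: bool = False           # whether the teen can search for videos
    private_account: bool = True           # only approved followers can see posts


# A parent might start from a conservative baseline and relax one setting:
settings = FamilyPairingSettings()
settings.search_enabled = True
print(settings)
```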

At the end of the day, it depends on the relationship between parent and child and on what the parent is comfortable with their children seeing online, but it is important to recognize that parents are also well positioned to intervene, not only the government and the company itself.

Further Implications

TikTok has the infrastructure to support more content moderation and stronger privacy policies, but in reality, the company only implemented stronger protection mechanisms after facing substantial financial penalties or regulatory pressure, as seen when ByteDance increased its moderation staff and when TikTok deleted underage users’ data and hardened its age restrictions after the FTC lawsuit. Additionally, because of the fast-paced release and spread of videos on the app, it is difficult to truly protect individuals from seeing any kind of harmful content. This is why governmental pressure on the media company to enact greater protections, internal responsibility from the company, and outside intervention from caregivers are all necessary to protect children on TikTok. Still, because of how rapidly social media evolves, it is important that these steps are constantly questioned from different angles and that new methods of regulating social media are continually researched, so that the digital world can become safer for children across the globe.

Further Inquiry

Anderson, K. E. Getting acquainted with social networks and apps: it is time to talk about TikTok. Library Hi Tech News, 2020.

Barassi, V. Child data citizen: how tech companies are profiling us from before birth. The MIT Press, 2020.

Haber, E. The Internet of Children: Protecting Children’s Privacy in a Hyper-Connected World. University of Illinois Law Review, 2020, 1209–1248.

Pasquale, L. Digital Age of Consent and Age Verification: Can They Protect Children? IEEE Software, 2020.

In the News

A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions (The New York Times, August 14, 2020).

China’s King of Internet Fluff Wants to Conquer the World (The New York Times, October 29, 2018).

TikTok bans misinformation about Jews, but antisemitic conspiracy theories still get millions of views (The Independent, October 21, 2020).

TikTok, the App Super Popular With Kids, Has a Nudes Problem (Vice, December 6, 2018).

TikTok: Record fine for video sharing app over children’s data (BBC News, February 27, 2019).

TikTok’s Chief Is on a Mission to Prove It’s Not a Menace (The New York Times, November 18, 2019).

--
