
“We’ll confess to singing along to a Stevie Nicks song or doing an air guitar solo when no one’s looking. But some people take their lip syncing to the next level. More than 200 million people – 65 million of them in the U.S. – downloaded the app, [which is now known as TikTok]. It gave users a platform to create videos and synchronize them with popular songs,” the Federal Trade Commission (“FTC”) asserted in a formal statement this week. The app, which has been acquired by ByteDance, a Chinese company founded in 2012 by Zhang Yiming, “also allowed users to interact directly with each other.”

While “that may sound like fun for aficionados,” the FTC asserts, “it raises concerns for parents.” It is in that vein, and in light of a recently concluded investigation, that the operators of the app have agreed to pay a $5.7 million penalty for illegally collecting children’s data in the United States. To be exact, the company was penalized as part of a newly-announced settlement with the FTC for allegedly allowing children to use the app without parental consent, thereby running afoul of the Children’s Online Privacy Protection Act (“COPPA”).

Hardly a slap on the wrist, the $5.7 million fine is the largest civil penalty ever levied in connection with COPPA, according to the FTC.

Created to protect the privacy of children under the age of 13, COPPA is enforced by the FTC and requires that websites and applications obtain parental consent before collecting or using any personally identifiable information from children under 13. The federal law specifies what must be included in a company’s privacy policy (including the requirement that the policy itself be posted anywhere data is collected) and when and how to seek verifiable consent from a parent or guardian. It also defines the responsibilities that the operator of a website or app legally holds with regard to children’s privacy and safety online, including restrictions on the types and methods of marketing to those under 13.

By requiring users to provide their email address, phone number, full name, username, a profile picture, and a short bio, and by failing, for the first three years, to ask for a user’s age in order to prevent individuals under 13 from creating accounts, TikTok violated the law.

While only released to the public in 2016, TikTok has become particularly popular, with users flocking to the app to create short videos lip-syncing to music, share those videos, and interact with other users. The problem, according to the FTC? A “significant percentage” of those users were under 13, and TikTok (allegedly) knew that.

As the FTC asserted in its complaint in the since-settled suit, TikTok had “actual knowledge” that it was collecting personal information from children. “A look at users’ profiles reveals that many of them gave their date of birth or grade in school,” the FTC claims. “And since at least 2014, [TikTok] has received thousands of complaints from parents of kids under 13 who were registered users. In just a two-week period in September 2016,” for instance, “the company received over 300 complaints from parents asking [it] to delete their child’s account.”  

“The operators of TikTok knew many children were using the app, but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” FTC Chairman Joe Simons said in connection with the settlement, stating that the Commission “takes enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law.”

In addition to the extensive fine, which tops the previous record-breaking COPPA penalty (the $4.95 million that AOL Inc. paid after it allegedly allowed advertisers to target children under 13 with online ads), TikTok has agreed to remove all videos made by U.S. users under age 13.

TikTok stated that, going forward, it will “prompt new and existing users to enter their age into the app,” and that younger users will “be offered a limited, separate app experience that introduces additional safety and privacy protections designed specifically for this audience.”

UPDATED (September 4, 2019): The FTC and the New York Attorney General’s Office reached a $170 million settlement with Google and YouTube for alleged violations of COPPA, thereby marking a new record COPPA penalty.