European regulators fined TikTok €345 million ($367 million) for failing to protect children's privacy, the first time the short-form video app has been penalized for breaching Europe's strict data protection rules.
The Irish Data Protection Commission, the lead privacy regulator for many of the big tech companies with European headquarters in Dublin, said it had fined TikTok and reprimanded the company for violations dating back to the second half of 2020.
The regulator opened its investigation into TikTok's compliance with the European Union's General Data Protection Regulation (GDPR) in 2021.
Politico reported in August that the Irish Data Protection Commission was preparing to impose sanctions as its investigation focused on a range of TikTok features, including default account settings, Family Pairing and age verification.
After consulting with the European Data Protection Board, the Irish Data Protection Commission found that TikTok set children's accounts to public by default when they registered on the platform.
This meant that children's videos were publicly viewable by default, with comments, Duets and Stitch enabled by default.
Family Pairing is a feature the platform introduced in 2020 that allows a child's account to be linked to a separate adult account to manage settings such as limiting screen time, restricting direct messages and filtering potentially inappropriate content.
The Irish Data Protection Commission found that Family Pairing was not strict enough, because a child's TikTok account could be linked to an account that the company had not verified belonged to a parent or guardian. Once paired, the adult user could loosen the child's account settings, for example to enable direct messaging.
The ruling concluded that TikTok's age verification method did not itself violate the GDPR, but the Irish Data Protection Commission found that the company did not adequately protect the privacy of children under 13 who managed to create accounts.
In 2021, the platform made accounts for users aged 13 to 15 private by default.
The company said it disagreed with the decision, particularly the size of the fine, noting that the regulator's criticisms focused on features and settings that were in place three years ago.
“We made changes long before the investigation began, including setting all accounts for users under 16 to private by default and disabling direct messaging for 13- to 15-year-olds,” TikTok said.