Music video sharing app TikTok is again under investigation over how it collects, handles, and uses the personal information of kids, raising fresh concerns about the dangers that children face online.
Elizabeth Denham, head of the Information Commissioner’s Office (ICO) in the United Kingdom, said July 2 in a parliamentary hearing that the ICO has launched an investigation into whether TikTok is in violation of the European Union’s General Data Protection Regulation. The data privacy law requires companies to implement safeguards to protect children’s personal data.
Denham also pointed out that TikTok’s open messaging system allows any adult to communicate with children on the platform, possibly without their parents’ knowledge.
“We are looking at the transparency tools for children,” Denham said. “We’re looking at the messaging system, which is completely open; we’re looking at the kind of videos that are collected and shared by children online. So we do have an active investigation into TikTok right now, so you can watch that space.”
The investigation in the U.K. follows a similar move by the ICO’s U.S. counterpart, the Federal Trade Commission (FTC), which slapped a record $5.7 million fine on TikTok in February 2019.
The fine was the largest civil penalty ever imposed for children’s privacy violations, according to the FTC, as the app had failed to obtain parental consent for users under 13 years old. The Children’s Online Privacy Protection Act requires websites and apps that collect information from children under 13 to first secure parental consent.
TikTok’s collection of children’s personal information, now under investigation for the second time, is just one of the issues parents should be concerned about when they let their children loose online.
Google-owned YouTube, another video sharing platform, may have to implement major changes to its recommendation algorithm as it is under investigation by the FTC for how it handles videos aimed at children.
The investigation, which is in response to complaints made as far back as 2015, is examining accusations that YouTube is failing to protect children, particularly when the service’s algorithm recommends or queues inappropriate videos. The FTC is also checking whether YouTube improperly collects data from kids.
A New York Times report also called out YouTube’s automated recommendation system for involving otherwise innocent videos of children in the platform’s pedophile problem.