TikTok is one of the most popular video-sharing apps right now. It is widely used both privately and professionally, by everyone from celebrities and companies to teenagers. The app has over 1 billion users worldwide, and around 62% of them are between the ages of 10 and 29.
But are its young users, and the constant stream of content posted to the app, actually safe?
The UK’s Information Commissioner’s Office (ICO) claims the video-sharing platform inappropriately processed the data of under-13s without consent. The breach is said to have taken place over more than two years, from May 2018 to July 2020.
TikTok Inc. and TikTok Information Technologies UK Limited have received a “notice of intent” from the ICO, a legal document that precedes a potential penalty.
Information Commissioner John Edwards said:
“We all want children to be able to learn and experience the digital world, but with proper data privacy protections. Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.”
According to the ICO’s investigation, the platform was found to have:
- processed the data of children under the age of 13 without appropriate parental consent
- failed to provide proper information to its users in a concise, transparent, and easily understood way
- processed special category data without legal grounds to do so
TikTok and Privacy
TikTok has implemented a range of measures to improve user privacy and security, such as letting parents link their accounts to their children’s accounts and disabling direct messaging for users under the age of 16.