British regulators have demanded that 11 social media and video-sharing platforms strengthen their protections for children’s privacy, citing a review that exposed widespread deficiencies in safeguarding young users. The UK’s Information Commissioner’s Office (ICO) has intensified its enforcement against companies that fail to comply with the Children’s Code, a regulatory framework aimed at protecting minors online. The 11 platforms face scrutiny over their default privacy settings, handling of geolocation data, and age verification procedures.
Deputy Commissioner Emily Keaney emphasized the importance of children’s privacy in online services, stating that companies catering to children must prioritize privacy to avoid putting young people at risk of harm. The regulator is also examining advertising practices that target children, to ensure compliance with the Children’s Code and broader data protection laws. To gain deeper insight into how social media affects children’s privacy, the office has issued a call for evidence on two topics: how children’s personal information is used in recommender systems, and recent advances in age assurance technologies for identifying children under 13.
Researchers, industry stakeholders, and civil society organizations are encouraged to contribute their expertise to this research, which will inform future regulatory action to strengthen child protections. While the tech industry has made significant changes in response to the Children’s Code, the regulator stresses that continued vigilance is needed to create a safer online environment for young people. The ICO did not immediately respond to inquiries about which 11 platforms are under scrutiny, how long they have to respond, or what fines non-compliance could carry, but it says it remains committed to upholding children’s privacy rights in the digital space.
This latest warning to social media and video-streaming platforms comes after the ICO fined TikTok £12.7 million last year for multiple breaches of data protection law, including allowing more than one million children under 13 to use the platform without parental consent. The US government has also taken action on children’s privacy, reprimanding Meta for misleading parents about how it handles children’s data. Meta has pledged to vigorously defend itself against the allegations, characterizing them as a political maneuver.
The regulatory scrutiny of social media and video-sharing platforms underscores the need to prioritize children’s privacy online and to implement robust safeguards against potential harm. As the digital landscape evolves, bodies like the ICO play a crucial role in holding companies accountable and pressing for stronger protections for vulnerable users, especially children.

