Social Media Firms Fail to Protect Children’s Privacy, Says ICO
The UK’s Information Commissioner’s Office (ICO) has put 11 social media and video sharing platforms “on notice” for failing to do enough to safeguard children’s privacy.
The regulator warned the 11 platforms that they could face enforcement action if they do not bring themselves into compliance or demonstrate a compelling reason for their current approach.
This warning followed a review of the sign-up process that young people go through to create accounts on 34 social media and video sharing platforms.
Among the 34 platforms included in the review were Facebook, Reddit, Snapchat, X (formerly Twitter) and YouTube. The ICO did not name the 11 platforms where privacy concerns were found in the sign-up process.
In April 2024, the ICO warned that social media and video sharing platforms must work harder to safeguard the privacy of children using their services.
Major Children’s Privacy Concerns Identified
The ICO's review of the platforms highlighted a range of children's privacy issues. These included:
- Some platforms make children’s profiles public by default, and in a few cases, privacy only appeared possible if users agreed to pay a fee or to opt into a subscription service
- A small number of platforms appeared to enable children to receive friend or follow requests from strangers and to receive direct messages from strangers by default
- Some platforms appear to nudge children to switch geolocation settings on or encourage them to share their location with others through tagging or including location when posting content
- Some platforms may be profiling children for targeted advertising in a way that is not in the best interests of the child, for example through potentially excessive data collection and not giving children options to control advertising preferences
- While the majority of the platforms reviewed did use some form of age assurance at account set-up stage, with a minimum age of 13 specified, most relied on users’ self-declaration of their age, which is unlikely to be appropriate
The ICO has written to 11 platforms about these issues, in some cases warning of regulatory action if changes are not made to their practices.
It has also put questions to some platforms to better understand how their services affect children's privacy, including how children's personal information is used in recommender systems.
This evidence will inform the ICO's ongoing work to improve how social media and video sharing platforms protect children's privacy.
Emily Keaney, Deputy Commissioner at the ICO, commented: “There is no excuse for online services likely to be accessed by children to have poor privacy practices. Where organisations fail to protect children’s personal information, we will step in and take action.”
“Online services and platforms have a duty of care to children. Poorly designed products and services can leave children at risk of serious harm from abuse, bullying and even loss of control of their personal information,” she added.