EU Launches Investigation Into TikTok Over Privacy Concerns
The EU has opened an investigation into TikTok over concerns around the protection of minors, advertising policy and privacy.
The European Commission announced on February 19 that it was opening formal proceedings to assess whether the social media platform has breached the EU’s Digital Services Act (DSA).
The investigation will focus on several areas, including:
- Assessing whether TikTok has implemented the required mitigation of systemic risks, including algorithmic systems, that may stimulate behavioral addictions and/or create so-called ‘rabbit hole effects’ – i.e., the tendency to become engrossed in something, usually online, to the point of losing track of time and neglecting other responsibilities.
- Assessing whether TikTok has put appropriate and proportionate measures in place to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings.
- Assessing whether TikTok has provided a searchable and reliable repository for advertisements presented on its platform.
- Assessing the measures taken by TikTok to increase the transparency of its platform.
Failure to meet these requirements would breach Articles 34(1), 34(2), 35(1), 28(1), 39(1) and 40(12) of the DSA.
Today we open an investigation into #TikTok over suspected breach of transparency & obligations to protect minors:
📱Addictive design & screen time limits
🕳️ Rabbit hole effect
🔞 Age verification
🔐 Default privacy settings
Enforcing #DSA for safer Internet for youngsters
— Thierry Breton (@ThierryBreton) February 19, 2024
TikTok submitted its own risk assessment analysis to the European Commission in September 2023.
The Commission then invited representatives of the social media platform owner, ByteDance, to answer formal Requests for Information on illegal content, protection of minors and data access in October and November 2023.
The Digital Services Act Explained
The DSA is a piece of legislation recently adopted by the EU to regulate online platforms and create a safer digital space.
It sets out clear obligations for these platforms to tackle illegal content, protect users’ fundamental rights, and combat the spread of disinformation.
Any company in breach faces a fine worth up to 6% of its global annual turnover.
The DSA came into effect on October 27, 2022, but platforms had a grace period to comply with its requirements.
This grace period depended on the type of platforms as categorized by the DSA:
- Platforms with more than 45 million monthly active users in the EU are designated as Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs). For them, the DSA became fully applicable on August 31, 2023.
- All other platforms covered by the regulation needed to comply with the DSA by February 17, 2024.
TikTok was designated as a Very Large Online Platform on April 25, 2023, following its declaration of having 135.9 million monthly active users in the EU.
The EU opened its first formal probe under the DSA last year into social media company X over suspected breaches partly relating to posts following Hamas’ October 7 attack on Israel.
What’s Next?
The opening of formal proceedings relieves Digital Services Coordinators, or any other competent authority of EU Member States, of their powers to supervise and enforce the DSA in relation to the suspected infringements.
Such an investigation empowers the Commission to take further enforcement steps, including interim measures and non-compliance decisions.
The Commission can also accept commitments made by TikTok to remedy the issues under investigation.
The investigation can last as long as the Commission needs it to, as the DSA does not set any legal deadline for bringing formal proceedings to an end.
The Commission also said it will continue to gather evidence through other mechanisms, including sending additional requests for information, conducting interviews or inspections.