On May 15, 2025, the European Commission issued preliminary findings notifying TikTok (owned by ByteDance) that its ad repository does not meet the requirements set out in the Digital Services Act (DSA). According to the Commission, the platform does not provide sufficient or effective information about ad content, targeting, or financing, data that is critical for detecting scams, disinformation, and election interference.
What does the DSA require regarding the ad repository?
The DSA, in force since the end of 2022, requires very large online platforms (those with more than 45 million active users in the EU) to publish an ad repository that is:
Transparent and accessible, not only to the Commission but also to researchers and civil society.
Machine-readable, with filters by content, payer, audience, period, and reach.
Fully searchable, allowing users to identify fraudulent, political, or manipulative campaigns.
These requirements serve as a safeguard against scam ads, disinformation campaigns, and hybrid threats, especially during electoral processes.
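To illustrate what "machine-readable, with filters" could mean in practice, here is a minimal sketch of an ad-repository query. The record format, field names, and data are purely hypothetical and illustrative; they are not TikTok's actual schema or API, only an example of the kind of filtering by payer, period, and reach that the DSA expects repositories to support.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, simplified ad-repository record; field names are
# illustrative, not the DSA's or any platform's actual schema.
@dataclass
class AdRecord:
    content: str   # what is being advertised
    payer: str     # who paid for the ad
    audience: str  # targeting criteria, summarized
    start: date    # first day the ad ran
    end: date      # last day the ad ran
    reach: int     # number of users who saw it

def search_ads(repo, payer=None, since=None, until=None, min_reach=0):
    """Filter ad records by payer, period, and reach."""
    results = []
    for ad in repo:
        if payer is not None and ad.payer != payer:
            continue
        if since is not None and ad.end < since:
            continue
        if until is not None and ad.start > until:
            continue
        if ad.reach < min_reach:
            continue
        results.append(ad)
    return results

# Example: find ads from one payer that ran during a given period.
repo = [
    AdRecord("Vote for X", "PAC Alpha", "ages 18-25, region RO",
             date(2024, 11, 1), date(2024, 12, 5), 120_000),
    AdRecord("Buy shoes", "ShoeCo", "all adults",
             date(2024, 6, 1), date(2024, 6, 30), 50_000),
]
hits = search_ads(repo, payer="PAC Alpha", since=date(2024, 11, 15))
print(len(hits))  # 1
```

A repository exposing queries of this shape would let a researcher ask, for example, "which ads from this payer ran during the election window and reached more than N users" without manual inspection.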
Main failures identified by the EC
Lack of transparency in content, targeting, and financing:
It is not possible to identify who pays for an ad, who it is directed at, or what is being advertised.
Low usability:
The search engine does not support comprehensive queries, and the available filters are limited.
Poor technical implementation:
The system is not accessible, reliable, or suitable for third-party analysis.
These deficiencies prevent a “full inspection” of the risks inherent in TikTok’s advertising systems.
Right of reply and possible sanction
TikTok now has the right to respond: it can request access to the case file, examine the documents, and present arguments or evidence within the established timeframe. In addition, the European Board for Digital Services will be consulted in parallel.
If the infringement is confirmed, the following may be imposed:
A fine of up to 6% of global annual turnover
Corrective measures, with enhanced supervision
Periodic penalty payments if non-compliance persists
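To make the fine ceiling concrete, here is a trivial calculation using a purely hypothetical turnover figure (not ByteDance's actual revenue), just to show how the 6% cap scales.

```python
# Illustration of the DSA fine ceiling: up to 6% of global annual turnover.
# The turnover figure below is hypothetical, chosen only for the example.
hypothetical_turnover_eur = 100_000_000_000  # EUR 100 billion (illustrative)

# Integer arithmetic keeps the result exact.
max_fine = hypothetical_turnover_eur * 6 // 100
print(f"Maximum fine: EUR {max_fine:,}")  # Maximum fine: EUR 6,000,000,000
```

Even at a fraction of that ceiling, the exposure dwarfs typical national advertising-law penalties, which is why the DSA's sanction regime is considered a meaningful deterrent.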
Case Background and Regulatory Scope
The formal investigation began in 2024 and also covered risks arising from:
Addictive algorithmic designs (“rabbit-hole effect”)
Protection of minors and age verification
Access to data for research
Risks associated with electoral campaigns (including the Romanian presidential election, December 2024).
The current findings are the first formal action under the DSA against TikTok for advertising transparency violations, following similar preliminary findings issued to X (formerly Twitter) in 2024.
TikTok’s Reactions and Response
TikTok stated that it is “reviewing the preliminary findings” and reaffirmed its commitment to the DSA, although it noted disagreements with some regulatory interpretations and the lack of clear public guidelines.
The European Commission emphasized that:
“Transparency in online advertising—who pays and how it is targeted—is essential to safeguard the public interest.”
The case reinforces scrutiny of TikTok in other areas under investigation, such as child protection, algorithmic design, and electoral integrity.
Implications for digital platforms and advertisers
Stricter regulations: Platforms will need to improve transparency from technical design to data structures.
Researchers and civil society: They will have better tools to detect fraud, misinformation, and political manipulation.
Pressure on advertisers: Brands may be exposed to public scrutiny of their campaigns.
Knock-on effect: Other digital giants (Meta, Google, etc.) will face reinforced pressure to comply with the DSA.
