On 24 October 2025, the European Commission announced its preliminary findings that both TikTok and Meta Platforms may have breached the Digital Services Act (DSA). The statement marks another decisive step in the EU’s shift from legislative design to hands-on enforcement of its digital regulation.
Research access under scrutiny
One of the Commission’s key concerns is the difficulty independent researchers face in accessing public data. Under Article 40 of the DSA, very large online platforms (VLOPs), including TikTok, Facebook, and Instagram, must provide vetted researchers with meaningful access to data that “contributes to the detection, identification and understanding of systemic risks in the Union”, such as the spread of illegal or harmful content and disinformation.
According to the Commission, all three platforms failed to meet this standard: their data-sharing mechanisms appear overly restrictive or incomplete, undermining the transparency the DSA seeks to guarantee. The Commission's stance makes clear that researcher access is now a compliance matter in its own right, and that academic oversight and public scrutiny are becoming embedded in the EU's regulatory framework.
Dark patterns and Notice and Action mechanisms
Meta faces additional scrutiny over the design of its user interfaces. The Commission's preliminary view is that certain interface choices may constitute 'dark patterns': designs that nudge users toward choices they might not otherwise make. In particular, Meta's platforms are suspected of discouraging users from appealing content moderation decisions and of making it unnecessarily difficult to flag illegal content.
These findings show that interface design is not just a matter of aesthetics or user experience, but a regulatory issue. Under the DSA’s ‘Notice and Action’ mechanisms (Article 16), platforms must ensure that systems for reporting potential illegal content are “easy to access and user-friendly”.
What’s next?
TikTok and Meta now have the opportunity to respond to the Commission's preliminary findings and to take corrective measures before any formal decision is made. The stakes are high: confirmed breaches could lead to fines of up to 6% of global annual turnover, or ongoing periodic penalty payments for continued non-compliance.
Beyond the immediate penalties, the case illustrates how DSA enforcement is evolving, moving beyond simple ‘content take-down’ metrics to include data access and transparency reporting.
These proceedings are part of a broader enforcement wave that is testing the DSA's practical strength. They suggest that the EU's digital rulebook is no longer confined to legal theory but is becoming a living enforcement ecosystem, one where researcher access, user empowerment, and platform design all carry legal weight.
Just days later, on 29 October 2025, the delegated act on data access officially entered into force. This new framework allows qualified researchers to access not only public data but also certain non-public datasets held by VLOPs and search engines. This development promises to further enhance transparency and accountability, helping to identify and mitigate systemic risks more effectively and paving the way for a safer online experience.