The European Commission has issued preliminary findings that TikTok and Meta have breached their transparency and user protection obligations under the Digital Services Act (DSA). The Commission’s initial assessment indicates that both platforms failed to provide researchers with adequate access to public data, while Meta – for both Facebook and Instagram – may also have breached its obligations to give users effective mechanisms for flagging illegal content and for appealing content moderation decisions.
According to the Commission, Facebook, Instagram and TikTok appear to have imposed excessively complex and restrictive procedures on researchers seeking access to public data. These obstacles have reportedly left researchers with partial or unreliable datasets, hindering independent research into issues such as the exposure of users, particularly minors, to harmful or illegal content. Ensuring access to platform data is a key transparency requirement under the DSA, enabling greater public scrutiny of the societal impacts of large online platforms.
The Commission also raised concerns about Meta’s “Notice and Action” mechanisms, which are intended to allow users to flag illegal content. The current systems on Facebook and Instagram, the Commission found, do not appear to be user-friendly or easily accessible and may include unnecessary steps that deter users from reporting harmful material. In addition, the Commission noted the use of “dark patterns” – deceptive interface designs – which can mislead or discourage users, thereby undermining the effectiveness of the reporting process.
Under the DSA, online platforms are obliged to act expeditiously upon being notified of illegal content and to ensure users can easily submit such notices. Failure to do so can remove the liability exemption that platforms otherwise enjoy for third-party content.
The investigation further highlighted shortcomings in Meta’s content moderation appeal mechanisms. The Commission found that users challenging the removal of their content or account suspensions on Facebook and Instagram are not currently given sufficient opportunity to submit explanations or supporting evidence. This limitation, it said, weakens users’ right to an effective remedy under EU law.
The Commission’s findings are preliminary and do not prejudge the outcome of the ongoing investigations. TikTok and Meta now have the opportunity to review the case files and respond in writing to the Commission’s concerns. They may also take corrective measures to address the alleged breaches. The European Board for Digital Services will be consulted as part of the process.
If the findings are confirmed, the Commission may issue a non-compliance decision, exposing the companies to fines of up to 6% of their total worldwide annual turnover and to potential periodic penalty payments to compel compliance.
The Commission also announced that new rules on researcher access to non-public data from very large online platforms and search engines will take effect on 29 October 2025. These measures are designed to enhance transparency, accountability, and the identification of systemic risks arising from online platforms’ activities.
Commenting on the findings, Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, stated:
“Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice. With today’s actions, we have now issued preliminary findings on researchers’ access to data to four platforms. We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society.”
The preliminary findings form part of ongoing formal proceedings against Meta and TikTok under the DSA and are separate from other EU investigations into the companies’ compliance with different areas of EU law.
