This database monitors enforcement actions, including court cases, targeting very large online platforms, very large online search engines, and gatekeepers as defined under the Digital Services Act (DSA) and the Digital Markets Act (DMA). It covers initiatives at both the European Union and Member State levels. At present, the tracker records measures enforced under the General Data Protection Regulation (GDPR), the DSA, and the DMA; its scope will expand to the Artificial Intelligence Act (AIA) once enforcement under that framework begins. Updates are applied weekly (on Mondays by 17:00 CET).
For clarity and consistency, each enforcement action in the database is structured around several key pillars. The first pillar, known as the “Action Framework,” classifies each action into one of three categories based on the nature and formality of the enforcement process.
The category of “Official proceedings” refers to formal legal processes initiated by regulatory authorities to ensure compliance, in which platforms are granted the right to be heard in their defence. Examples of such actions include the opening of proceedings, specification proceedings, and preliminary findings. These actions are marked in blue.
In contrast, “Informal actions” encompass measures that occur outside the bounds of formal legal processes. These are typically early-stage or less formal interactions—for instance, requests for information, investigative steps, or retention orders—where the Commission exercises its investigative powers without immediately resorting to full legal enforcement. These actions are marked in pink. Note, however, that such informal actions can also be undertaken during the course of official proceedings, as further investigative steps may be needed to clarify the facts or respond to new information as it emerges. In that case, these actions are marked in blue.
The third category, “Court cases,” includes actions that have progressed to judicial resolution with decisions rendered by EU courts, such as the General Court or the Court of Justice of the EU. These actions are marked in green. Examples of court-related actions are judgments, applications to the Court, interim relief orders, and Advocate General opinions.
The second pillar, “Title,” provides a brief description of each enforcement action. This description generally begins with the legal type of action, followed by the name of the relevant company or service mentioned in the official press release, and is often accompanied by a short contextual note.

The third and fourth pillars identify the “Company” and the “Service” respectively. The company field denotes the legal entity involved, such as Alphabet, Meta, or ByteDance, while the service field specifies the particular service provided by the company (for instance, distinguishing between Facebook and Instagram for Meta, or between Google Search and YouTube for Alphabet). Services are color-coded by category, inspired by the European Commission's classification for the DMA and adapted to this tracker: blue for social media services based mostly on text (such as Facebook, Instagram, LinkedIn and X); pink for image- and video-based online platforms (such as Pinterest, TikTok and YouTube); green for search engines (such as Google Search and Bing); purple for services linked to devices, such as operating systems, web browsers and app stores; brown for marketplaces (such as Amazon, Zalando, Shein and Temu); yellow for pornography platforms (such as PornHub, XVideos and XNXX); and orange for intermediary online services (such as Booking.com, Google Shopping and Google Maps).
Within the fifth pillar, known as “Action,” each enforcement measure is categorized according to its legal nature—such as a request for information, the opening of proceedings, or an application to the court. Additionally, a few miscellaneous tags—“User complaint” and “Fine”—have been included to provide more context, indicating whether an action was initiated in response to a user complaint (as referenced in the press release) or resulted in a fine.
The sixth pillar, “Legislation”, indicates the relevant digital regulation(s) under which the enforcement action falls. Currently, the scope is limited to the GDPR, DSA, and DMA; the AIA is included in principle and will be fully integrated once its enforcement starts.
The seventh and eighth pillars—“Alleged violations” and the associated “Articles”—detail the specific non-compliance issues and articles mentioned in the official press release. Following them, the ninth pillar, “Regulator/Court,” identifies the authority responsible for initiating the action. This could range from regulatory bodies such as the European Commission to judicial entities like the Court of Justice of the EU or the General Court. The tenth pillar, “Jurisdiction,” specifies whether the enforcement action originates at the pan-European (EU) level or from an individual Member State.
The eleventh pillar records the “Publication date” of the enforcement action, ensuring accurate timeline tracking in line with the weekly update schedule. The twelfth pillar, “Status”, describes the current state of an enforcement action, categorizing it as either ongoing or closed. An action labeled “Ongoing” indicates that its effects are still in operation or that a deadline remains active—this might apply, for example, to a retention order during active proceedings or pending court applications. Conversely, a “Closed” action denotes that the case or proceeding has concluded, such as when a definitive judgment has been issued or a retention order’s deadline has passed.
The final pillars display “Case number” (if available), include a “Link” to the press release, and provide a prominent “Enforcement case” identifier that allows users to filter by enforcement cases or court cases containing multiple actions or entries. Each entry is also expanded into its own page, where a summary of the action is available.
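As an illustration, the pillar structure described above can be sketched as a simple record type. This is a minimal sketch only: the field names, types, and the filtering helper below are hypothetical and do not reflect any actual implementation of the tracker.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class EnforcementAction:
    # Pillars 1-5: framework, title, company, service, legal nature of the action
    action_framework: str           # "Official proceedings" | "Informal actions" | "Court cases"
    title: str                      # brief description of the action
    company: str                    # legal entity, e.g. "Meta"
    service: str                    # specific service, e.g. "Instagram"
    action: str                     # e.g. "Request for information"
    # Pillars 6-10: legislation, violations, articles, authority, jurisdiction
    legislation: list[str]          # e.g. ["DSA"]; GDPR, DSA, DMA (AIA in principle)
    alleged_violations: list[str]
    articles: list[str]
    regulator_or_court: str         # e.g. "European Commission"
    jurisdiction: str               # "EU" or a Member State
    # Remaining pillars: date, status, case number, link, enforcement case
    publication_date: date
    status: str                     # "Ongoing" | "Closed"
    case_number: Optional[str] = None
    link: Optional[str] = None
    enforcement_case: Optional[str] = None

def ongoing_dsa(entries: list[EnforcementAction]) -> list[EnforcementAction]:
    # Hypothetical filter mirroring the tracker's "Status" and "Legislation" pillars
    return [e for e in entries if e.status == "Ongoing" and "DSA" in e.legislation]
```

A user interested in, say, all ongoing DSA actions would combine the “Status” and “Legislation” pillars exactly as the `ongoing_dsa` helper does; the “Enforcement case” identifier plays the analogous grouping role for multi-entry cases.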
This well-defined methodology and structured approach ensure that we record, categorize, and present every enforcement action in a uniform manner. By adhering to these criteria, the Enforcement Tracker remains a transparent, reliable, and accessible resource for all users interested in the enforcement landscape of EU digital and data laws.
Advocate General (AG) opinion: The opinion of an Advocate General is sought in every case tried by the Court of Justice, unless the Court decides that the case raises no new point of law. Opinions are not binding on the judges of the Court.
Anti-Steering: Prohibits gatekeepers from steering users toward their own services over those of competitors (DMA Arts. 5–6).
Application to the Court: When a platform applies to the Court to contest a decision taken by the Commission and/or a legal provision.
Artificial Intelligence Act (AI Act): The EU's first comprehensive legal framework for AI. It adopts a risk-based approach: it prohibits unacceptable AI (e.g. subliminal manipulation), imposes strict requirements on high-risk systems (e.g. biometric identification, critical infrastructure), and sets transparency rules for lower-risk AI.
Choice Obligations: Gatekeepers must present a fair-choice screen for default services (DMA Art. 6).
Civic Discourse: Any actual or foreseeable negative effects on civic discourse, electoral processes, or public security are considered systemic risks to be addressed by digital services (DSA Art. 34).
Closing of proceedings: A decision closing the proceedings puts an end to the formal case against a platform, either because there has been a non-compliance decision or because the infringement has been resolved (e.g. through commitments).
Commitments: When platforms offer commitments to ensure compliance with a legal obligation, authorities may make those commitments binding through a decision and declare that there is no further ground for action.
Company: Refers to the legal entity responsible for the service in question. This includes corporate groups or parent companies such as Meta, Alphabet, or ByteDance. In enforcement contexts, the company is typically the official addressee of regulatory decisions or legal obligations.
Complaint Mechanism: Internal complaint-handling system and out-of-court dispute settlement for online platforms (DSA Arts. 20–21).
Contact Terms: Gatekeepers must allow business users and service providers to distribute their offerings via core platform services; that is, they must permit access to their platforms (DMA Art. 6).
Content Moderation: Obligations related to content moderation under the DSA (transparency or as part of risk assessments).
Court of Justice of the EU: The EU's highest court, which ensures the uniform interpretation and application of EU law across all Member States. The Court has sole jurisdiction over requests for preliminary rulings (interpretation of EU law) from national courts, and also has jurisdiction to review appeals, limited to points of law, against rulings and orders of the General Court.
Data Access: Access to platform data by vetted researchers and scrutiny bodies (DSA Art. 40).
Dark Patterns: Prohibition of manipulative interface designs that nudge users toward unintended actions (DSA Art. 25); the topic can also fall under the risk assessment obligation (Art. 34). Dark patterns can also be discussed in relation to consent under the GDPR.
Data Protection Authority: An independent regulator that oversees and enforces data protection laws (e.g. the GDPR) within its jurisdiction.
Decision of non-compliance: Such a decision concludes, following an investigation, that a platform has not complied with a legal obligation, ordering the platform to cease and desist from the infringement. Fines may be imposed.
Designation: Covers the Commission's designation of very large online platforms and search engines (DSA Art. 33) and of gatekeepers (DMA Art. 3).
Digital Markets Act (DMA): Targets “gatekeeper” platforms (e.g. large app stores, social networks, search engines). Imposes ex ante rules to ensure contestable and fair digital markets: bans self-preferencing, mandates interoperability, and requires data-sharing with business users to prevent market abuse.
Digital Services Act (DSA): Governs online intermediary services offered to users in the EU (e.g. platforms, marketplaces, hosting providers). It reaffirms the principle of liability exemption and establishes transparency and content moderation obligations, as well as more stringent due diligence obligations for very large online platforms (VLOPs) and very large online search engines (VLOSEs), such as risk-management duties (e.g. regarding illegal content, fundamental rights, civic discourse).
Digital Services Coordinator: The national authority in each Member State charged with supervising and enforcing the Digital Services Act (except for Section 5 obligations).
European Commission: The EU's executive arm, responsible for proposing legislation, implementing decisions, and ensuring compliance with EU treaties. The Commission is the sole enforcer of the DMA, and of the DSA when it comes to the compliance of VLOPs and VLOSEs with their obligations under Section 5.
General Court of the European Union: The court that hears direct actions for annulment brought by individuals and companies.
General Data Protection Regulation (GDPR): EU-wide framework for the protection of personal data. It sets strict rules on the collection, processing and storage of personal data of data subjects in the EU, grants individuals rights over their data (such as access, erasure and portability), and imposes fines for non-compliance.
Generative AI: Topic which can fall under the risk assessment obligation of VLOPs and VLOSEs, often in relation to illegal content and risks to civic discourse and fundamental rights (DSA Arts. 34–35).
Interim measures: Enforcement action taken on the basis of a prima facie finding of an infringement, or due to the risk of serious damage for the users of the service, at any point during the investigation.
Interim relief order: Platforms may also request interim measures, such as the suspension of the operation of an act challenged before the General Court. The Court may prescribe any interim measure, if a set of exceptional circumstances is met, before a decision is reached in the main action before the Court.
Interoperability: Gatekeepers must ensure interoperability of core platform services with third parties (DMA Art. 6).
Investigative steps: Investigatory steps taken to gather facts and information.
Judgment: Decision of the Court on a case. Judgments of the General Court can be appealed to the Court of Justice. Judgments of the Court of Justice are definitive.
Marketplace Obligations: Requirements for platforms to ensure traceability of traders, keep transaction records and combat illicit offers (DSA Section 4).
Mitigation Measures: Platforms must implement proportionate mitigation measures for systemic risks (DSA Arts. 34–35).
N&A (Notice & Action): The formal notice-and-action takedown procedure for illegal content (DSA Art. 16).
Online Advertising: Concerns personalized ads and the processing of personal data (GDPR); ad transparency, meaning clear labelling, advertiser identity, and why a user is targeted (DSA Art. 24); the ban on the use of sensitive data (and of personal data for children) based on profiling for ad targeting (DSA Arts. 26 and 28); and the obligation to publish an ad repository (DSA Art. 39).
Opening of proceedings: When an authority opens a formal case against a platform. Authorities open formal proceedings after investigations, when they suspect an infringement.
Personal Data Processing: Any operation on personal data requires a legal basis and accountability (GDPR Arts. 5–6); data-sharing obligations for gatekeepers must comply with data-protection rules (DMA Art. 5(2)).
Point of Contact: Obligation for platforms to appoint a single point of contact for authorities and users (DSA Art. 12).
Preliminary findings: The Commission informs the platform of its preliminary view that the company is in breach of the legislation; this does not prejudge the outcome of the investigation. The Commission explains the measures it is considering taking, or that it considers the platform should take, in order to effectively address the preliminary findings. The platform then has the opportunity to exercise its rights of defence by examining the documents in the Commission's investigation file and replying in writing to the preliminary findings.
Protection of Minors: Due diligence duties for platforms accessible to minors to ensure a high level of their privacy, safety and security (DSA Art. 28). One of the risks that VLOPs and VLOSEs must take into account in their risk assessment (DSA Art. 34).
Recommender Systems: Transparency on how content is ranked or recommended to users (DSA Art. 27); obligation to provide an option not based on profiling (Art. 38); can also fall under the risk assessment obligation (Art. 34).
Representation: Right of data subjects to be represented by a consumer-protection association; a representative action may be brought without a mandate and independently of a specific rights violation (GDPR Art. 80).
Request for information (RFI): Request to verify platforms' compliance. Fines can be imposed if a reply is incorrect, misleading or incomplete.
Retention orders: Request to a platform to retain documents which might be used to assess its compliance with legal obligations, so as to preserve available evidence and ensure effective enforcement.
Right to Information: Right to be informed about processing activities (GDPR Arts. 12–14).
Risk Assessment: Obligation for very large online platforms to conduct systemic risk assessments to identify and mitigate platform-level harms (DSA Art. 34).
Self-Preferencing: Prohibition on gatekeepers “self-preferencing”, i.e. treating their own services or products more favourably in ranking, indexing or related conditions than those of third parties (DMA Art. 6(5)).
Sensitive Data: Special categories of data (e.g. health, ethnicity, religion) require higher protection (GDPR Art. 9). Sensitive data based on profiling cannot be used to target ads (DSA Art. 26).
Service: Denotes the specific platform or product offered by the company that is the focus of the enforcement action. For example, Facebook and Instagram are services operated by Meta, while Google Search and YouTube are services offered by Alphabet. Enforcement measures may target particular services, even when obligations are imposed on the company as a whole.
Specification decision: Specification decisions are taken following specification proceedings. The Commission does not take a position on whether the gatekeeper complies with its obligations, but the measures in the decision are binding (DMA only).
Specification proceedings: Formalised regulatory dialogue enabling the Commission to identify concrete measures the gatekeeper should take to effectively comply with a certain obligation (DMA only).
Supervisory Fee: Annual fee for providers of very large online platforms and search engines upon designation (DSA Art. 43).