On 24 September 2025, the European Commission announced that it had sent formal requests for information under the Digital Services Act (DSA) to major digital players, namely Apple (App Store), Google (Play, Search), Microsoft (Bing), and Booking.com, seeking detailed information about how they identify and curb financial scams on their platforms.
While this step is not (yet) a formal infringement procedure, it is emblematic of the Commission’s intensifying enforcement posture under the DSA. In this article, we will:
1. Recap the relevant DSA regime (particularly for very large online platforms and search engines)
2. Analyze the Commission’s requests and what they ask of the platforms
3. Explore legal, compliance, and risk implications for platforms and intermediaries
4. Offer practical pointers for legal and compliance teams in the current environment
The DSA Framework: Key Features, Obligations & Sanctions
To understand the significance of the Commission’s move, one must appreciate what the DSA is, which entities it covers, and which obligations the platforms now face.
What is the Digital Services Act (DSA)?
The Digital Services Act (Regulation (EU) 2022/2065) is the EU's regulatory framework for digital intermediary services, adopted in October 2022 and phased into application in stages. Its purpose is to modernize the liability and oversight rules for online intermediaries, such as hosting providers, online platforms, app stores, and search engines, across the EU's single market.
It significantly updates the earlier E-Commerce Directive, introducing new obligations, risk-based scrutiny, transparency duties, and enforcement powers at both national and EU levels.
Tiered Obligations by Size and Risk
One of the central innovations of the DSA is its ‘graduated obligations’ approach. Not all intermediary services face the same burdens. Rather, obligations scale with the size, influence, and risk profile of the service.
All intermediaries, including hosting services and basic platforms, face baseline obligations such as notice-and-action procedures, transparency reporting, content moderation rules, and cooperation with authorities.
However, very large online platforms (VLOPs), generally defined as platforms with more than 45 million monthly active users in the EU (roughly 10% of the EU population), are subject to significantly more demanding risk management, independent auditing, reporting, and systemic obligations.
Search engines of similar reach are likewise captured under the ‘very large search engines’ (VLOSE) label and subject to analogous duties.
Because Apple’s App Store, Google’s Play and Search, Booking.com, and Microsoft’s Bing cross that threshold, they are among the designated services that must comply with the highest tier of DSA obligations.
Core Obligations Relevant to Fraud / Scams
Among the multitude of duties imposed on VLOPs / VLOSEs, several are especially relevant to the risk of financial scams:
Systemic risk assessment / mitigation: A platform must regularly identify and assess systemic risks arising from its operation, such as the dissemination of illegal content, consumer harm, manipulation, and disinformation. It must adopt and implement mitigation measures proportionate to those risks.
Know Your Business Customer (KYBC): Platforms must verify the identity of business users under certain circumstances, to avoid misuse by bad actors.
Transparency and traceability of ads: Platforms must maintain advertising repositories, internal databases accessible to regulators, researchers, and the public, to track which ads have been shown and to whom.
Proactive detection and removal of illegal content: Platforms must have adequate procedures, algorithmic and human-based, to detect and act on illegal content, including fraudulent content, even in the absence of user complaints.
Transparency reporting, external audits, and traceability: The largest platforms must publish detailed transparency reports, submit to external audits of their compliance, and provide access to regulatory oversight.
Cooperation with regulators and authorities: Platforms must respond to information requests from competent authorities, support investigations, and provide access to relevant data where allowed by law.
In sum, the DSA moves platforms from a reactive takedown posture, acting on user reports and notices from content owners, to a proactive, risk-based, accountable oversight role.
Enforcement & Sanctions
The Commission and national Digital Services Coordinators can open formal investigations and require remedies.
Fines can reach up to 6% of global annual turnover for the gravest breaches of VLOP / VLOSE obligations. Noncompliance with orders, such as a failure to remove content, can lead to periodic penalty payments.
Reputational, market, and contracting consequences are also substantial.
What the Commission Is Asking: Dissecting the Request
The Commission’s information request is not framed as a sanction or formal proceeding. At least, not yet. But the substance of the questions suggests where enforcement may be heading. Below is a breakdown of what is being asked and how it links to DSA obligations.
Scope of the Information Request
The Commission’s requests target:
1. Apple’s App Store
2. Google’s Play Store and Search
3. Booking.com
4. Microsoft’s Bing search engine
The questions focus on how these services identify, assess, mitigate, and monitor the risk of financial scams, such as fake banking or investment apps, fraudulent listings, and harmful ads, on their services.
Importantly, for app stores and booking platforms, the request probes how they verify business identities (KYBC) before accepting listings or apps. For search engines, the focus is on links and advertisements that lead users to fraudulent websites.
Key Themes & Question Areas
| Topic | What the Commission Seeks | Underlying DSA Link / Rationale |
| --- | --- | --- |
| Assessment of fraudulent content | Detailed description of how the platform identifies scams (e.g., fake banking apps, phishing sites, fraudulent listings) | Ties to systemic risk assessment, detection obligations, and proactive content review |
| Mitigation measures | Measures in place (technical, operational, human review) to reduce harms (e.g., removal, blocking, warning labels, blacklists) | Mitigation obligations, proportionate safeguards, algorithmic review |
| Business identity verification (KYBC) | How the platform verifies the identity of business users, advertisers, and app developers before allowing them to operate | KYBC duty aims to prevent misuse by anonymous bad actors |
| Ad repositories / ad transparency | Information about ad databases, how ads are logged, and how researchers and regulators access them | Transparency, traceability, and accountability of ad systems |
| Advertising / links to scam sites via search | How the platforms monitor and filter ads or search results that lead to fraudulent sites | The platforms’ duty to act on illegal content in their search and ad ecosystems |
| Interplay with consumer protection rules | Ensuring that DSA compliance is consistent with and complementary to existing EU consumer protection regulations | The Commission explicitly references complementarity with consumer protection law |
| Data, metrics & reporting | Concrete figures (e.g., number of fraudulent apps removed, volumes of detected scam ads, false positives, timeliness of responses) | Enables the Commission to gauge compliance effectiveness |
What This Means for Other Intermediaries & Platform Adjacents
Though the Commission’s request is directed at the largest platforms, the ramifications may extend well beyond:
Smaller platforms and niche app stores: They may become subject to similar scrutiny, whether via national authorities or as next-in-line ‘medium’ platforms. Preparing now may give them a competitive advantage.
Ad networks, DSPs, SSPs, and adtech intermediaries: Given that the DSA contemplates that some ad networks may themselves qualify as ‘platforms’, the pressure to improve transparency, ad provenance, and fraud detection may ripple outward.
Service providers to platforms, such as identity verification, content moderation outsourcing, and AI/ML vendors: Their performance, SLAs, error rates, auditability, and contract terms may be scrutinised.
Digital service providers in regulated sectors, such as fintech, investment apps, crypto, and markets: They may be cross-referenced by enforcement authorities in parallel regimes when platforms are asked to demonstrate how they vet or block them.
Thus, any actor in the broader ecosystem should be alert to escalation of expectations concerning fraud prevention, transparency, and accountability.
Conclusion
The European Commission’s request for information from Apple, Google, Microsoft, and Booking.com marks a significant step in DSA enforcement. Although it is not yet a sanction, it functionally serves as a probing test of how seriously platforms are treating financial scams under their new burden of accountability.
For legal, compliance, and regulatory teams, this moment should catalyze internal review, coordination, and realignment of fraud detection, onboarding, transparency, and reporting practices. The DSA has redefined the digital landscape: platforms can no longer treat fraud and scam content as peripheral, and must now build, monitor, and defend systems of oversight.