Europe’s Digital Services Act: All You Need to Know

By Cianan Clancy

The European Union’s Digital Services Act (DSA) has recently come into full effect, bringing significant changes to the regulation of digital marketplaces, platforms, and services. Fully applicable since February 17, 2024, the DSA introduces a set of rules and obligations aimed at addressing various challenges in the online space.

From combating illegal content and products to protecting minors and ensuring accountability, the DSA imposes new requirements on digital businesses operating within the EU. In this article, we’ll provide an overview of the Digital Services Act and its implications for online businesses in the region.

Scope and Objectives of the DSA

The Digital Services Act (DSA) encompasses a broad range of digital platforms and services operating within the European Union, including online marketplaces, social media platforms, hosting services, and other digital intermediaries. 

Its primary objectives are to address several key challenges prevalent in the online environment. These include combating the dissemination of illegal content, protecting minors from harmful online experiences, and enhancing accountability among digital businesses. The DSA sets out to establish a comprehensive framework for online governance, ensuring that platforms adhere to strict obligations to safeguard user interests and maintain the integrity of the digital ecosystem. Failure to comply with the DSA regulations could result in severe penalties. 

Who is Subject to the DSA 

The DSA applies to entities of all sizes. While larger platforms face stricter regulations, smaller platforms, particularly micro/small startups with fewer than 50 employees and an annual turnover below €10 million, are granted certain exemptions. Additionally, fast-scaling startups that surpass the micro/small criteria may receive a transitional exemption for some provisions over a 12-month period.

In addition to imposing content moderation regulations on platforms and “know-your-customer” (KYC) requirements on marketplaces, the regulation also imposes certain obligations on hosting services and other online intermediaries. These intermediaries include internet service providers (ISPs), domain name registrars, and network infrastructure providers.

The European Union expects at least a thousand companies to be subject to the rules outlined in the Digital Services Act (DSA).

Legal Obligations Imposed by the DSA

The Digital Services Act imposes several legal obligations designed to ensure a safer and more transparent online environment for users. Key legal obligations include:

  • Banning Illegal Products: Under the new law, online marketplaces must not allow users to buy or sell products that are illegal in the relevant EU market. For example, if gun sales are banned in a particular EU country, e-commerce platforms must block such listings there.
  • Illegal Content Moderation Rules: Platforms are required to implement effective content moderation mechanisms to remove illegal content promptly. This includes hate speech and other prohibited content according to EU laws.
  • Protection of Minors: Platforms that fall within the scope of the DSA must ensure a high level of privacy, safety, and security for minors. This includes prohibiting the use of minors’ data for targeted advertising purposes.
  • Transparency Requirements: Platforms are obligated to provide users with transparency regarding their content moderation practices. This includes providing users with a statement of reasons for content moderation decisions affecting them.
  • Cooperation with Authorities: Platforms must cooperate with relevant authorities, including providing information and cooperating with investigations related to compliance with DSA regulations.
  • Business Traceability Requirements: Online marketplaces are required to implement know-your-customer (KYC) rules to trace and verify the identities of the traders selling through their services.

These legal obligations aim to enhance accountability, transparency, and user safety in the digital environment, aligning with the EU’s broader objectives of ensuring a responsible and secure online space.

Penalties for Non-Compliance

Failure to comply with the regulations set forth in the Digital Services Act (DSA) carries significant consequences for digital entities falling within its scope. Penalties for non-compliance can vary depending on the severity of the breaches committed.

Entities found to be in violation of the DSA may face fines of up to 6% of their global annual turnover for confirmed breaches. In addition to financial penalties, non-compliant entities may also face legal ramifications, including litigation and legal proceedings. The DSA empowers both regulatory authorities and individual consumers to take legal action against companies that fail to uphold their obligations under the act.
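To make the 6% ceiling concrete, here is a minimal arithmetic sketch. The turnover figure below is invented purely for illustration; only the 6% cap comes from the article.

```python
# Hypothetical illustration of the DSA's fine ceiling: up to 6% of
# global annual turnover. The turnover figure is invented for the example.
global_annual_turnover_eur = 50_000_000_000  # EUR 50 billion (hypothetical)
max_fine_eur = 0.06 * global_annual_turnover_eur
print(f"Maximum possible fine: EUR {max_fine_eur:,.0f}")  # EUR 3,000,000,000
```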

Regulatory Considerations for Different Platform Sizes

Within the scope of the Digital Services Act (DSA), distinct regulatory considerations emerge for platforms of varying sizes, from small startups to global tech giants. Understanding these distinctions is essential for effective compliance and governance within the digital landscape.

Exemptions for Small Platforms and Startups

Small platforms and startups, typically characterised by limited resources and scale, benefit from certain exemptions within the Digital Services Act (DSA). Entities employing fewer than 50 staff members with an annual turnover below €10 million are exempt from the bulk of DSA provisions. However, they must still ensure clarity in their terms and conditions and designate a contact point for authorities. Fast-scaling startups that surpass the micro/small criteria receive a targeted exemption for some DSA provisions over a transitional 12-month period.
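As a rough sketch of how a compliance team might encode the exemption test described above: the two criteria (headcount and turnover) come from the article, while the function and field names are illustrative and not part of any official tooling.

```python
from dataclasses import dataclass

# Illustrative sketch only: encodes the two exemption criteria described
# above (fewer than 50 employees, annual turnover below EUR 10 million).
# All names here are hypothetical, not official DSA terminology.

@dataclass
class Platform:
    name: str
    employees: int
    annual_turnover_eur: float

def qualifies_for_small_platform_exemption(p: Platform) -> bool:
    """True if the platform meets the micro/small criteria stated above."""
    return p.employees < 50 and p.annual_turnover_eur < 10_000_000

startup = Platform("ExampleApp", employees=12, annual_turnover_eur=2_500_000)
print(qualifies_for_small_platform_exemption(startup))  # True
```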

Impact on Big Tech Companies

Major tech platforms and marketplaces face the most stringent level of regulation under the DSA. They’ve already met one compliance deadline, with a subset of rules focused on algorithmic transparency and risk mitigation in effect since late August 2023.

In December 2023, the Commission initiated its first formal investigation of a VLOP (very large online platform), targeting Elon Musk’s X (formerly Twitter), over multiple suspected breaches.

Nearly two dozen tech giants, including X, have been designated under the VLOP and VLOSE (very large online search engine) rules and are expected to comply with the DSA’s general obligations.

This entails providing content reporting tools, enabling users to challenge moderation decisions, cooperating with trusted flaggers (third parties authorised to submit reports to platforms), producing transparency reports, and adhering to business traceability requirements.

Regarding moderation, platforms must provide users with a “statement of reasons” for each moderation decision affecting them, such as content removals or demotions. The EU is compiling these statements in a public database, initially covering the larger platforms, which has amassed over four billion statements to date. The DSA also mandates that platforms disclose information about the ads they run and any algorithmic recommender systems (algorithms that rank and suggest content based on user data and behaviour) they employ.
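To make the transparency requirement more tangible, here is a minimal, hypothetical sketch of the kind of record a statement of reasons might capture, based on the elements mentioned above (the decision taken, its grounds, and the user’s redress options). The field names are illustrative and do not mirror the EU database’s actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of a "statement of reasons" record. Fields reflect
# the elements described in the article, not the EU's official schema.

@dataclass
class StatementOfReasons:
    content_id: str
    decision: str             # e.g. "removal" or "demotion"
    legal_or_tos_ground: str  # the rule the content allegedly breached
    automated_detection: bool
    redress_options: list[str]

sor = StatementOfReasons(
    content_id="post-12345",
    decision="removal",
    legal_or_tos_ground="illegal hate speech under national law",
    automated_detection=True,
    redress_options=["internal appeal", "out-of-court dispute settlement"],
)
print(sor.decision, sor.redress_options)
```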

The DSA specifically prohibits the use of minors’ data for advertising purposes, necessitating measures to prevent their information from being utilised in ad targeting systems. However, determining users’ age without infringing on privacy remains a complex challenge. 

“The problem is difficult to solve,” an EU official said in a press conference. “We are fully aware of the impact that [age verification] can have on privacy and we would not accept any measure for age verification… So my short answer is it’s complicated. But the long answer is that we are discussing together with Member States and with the Digital Services Coordinators, in the context of a taskforce that we have put in place already, to find which ones would be the acceptable solutions.”

Digital Services Coordinators

The Digital Services Act (DSA) introduces a new layer of oversight for tech giants operating within the European Union, placing responsibility for monitoring compliance primarily on Member State-level enforcers known as Digital Services Coordinators (DSCs). While the European Commission enforces the additional obligations that apply to VLOPs and VLOSEs, DSCs handle the general DSA rules, ensuring platforms adhere to regulatory standards across the region.

Country of Origin Principle and Enforcement

Under the DSA, oversight of tech giants’ compliance with general rules is based on the “country of origin” principle. This means that enforcement actions are led by authorities in the countries where the platforms are established. 

For instance, platforms like X, Apple, Meta, and TikTok, headquartered in Ireland, will likely have their compliance monitored by Ireland’s media regulator, Coimisiún na Meán.

Following the same logic, Amazon, which picked Luxembourg as its regional base, is likely to have its compliance with the DSA’s general rules monitored by Luxembourg’s competition authority, the Autorité de la concurrence.

Platforms without a regional establishment will face enforcement by competent bodies in any Member State, potentially exposing them to greater regulatory risk, assuming European authorities can in practice enforce the law against foreign entities that refuse to comply.

The EU has a webpage where you can find the DSC appointed by each Member State. However, as of now, not all appointments have been made, so there are still some gaps.

Role and Composition of Digital Services Coordinators

DSCs, composed of existing regulatory agencies in the EU Member States, play a crucial role in overseeing DSA compliance. They collaborate to tap relevant expertise and ensure effective oversight of platforms and businesses falling within the DSA’s scope. Additionally, a new body called the “European Board for Digital Services” facilitates coordination among DSCs, offering guidance and producing advice on applying the law.

DSCs also serve as points of contact for citizens who want to file complaints related to the DSA. If a complaint concerns a platform that a specific authority doesn’t oversee, the DSC will be responsible for forwarding it to the relevant body.

Moreover, EU consumers have the option of resorting to collective redress litigation if a company violates their rights under the Act. This means that non-compliant platforms face the risk of being sued.

The speed at which these new digital enforcers will act remains to be seen. Based on past implementations of EU digital rules, platforms are likely to be given some leeway to adapt and comply with the new regulations. This includes allowing time for the enforcement regime to establish itself, considering the decentralised nature of enforcement. However, this decentralised enforcement mechanism may also lead to differences in the pace of DSA interventions across the region.

Enforcement Powers and Impact

DSCs are empowered to issue fines of up to 6% of a company’s global annual turnover for breaches of the DSA. This mirrors the penalties the Commission can impose on VLOPs/VLOSEs for violating their specific obligations. The introduction of DSCs adds another layer of regulatory complexity for platforms in the EU, necessitating compliance with a myriad of obligations alongside existing laws like the General Data Protection Regulation (GDPR), the ePrivacy Directive, and the AI Act.

Implementation Challenges and Emerging Scenarios

The implementation of the Digital Services Act (DSA) brings forth a multitude of complexities and uncertainties, particularly concerning its application to evolving technologies and regulatory experiments. As the DSA unfolds, several key areas demand attention and scrutiny, including regulatory experiments in member states, the integration of fast-scaling AI tools, and the determination of DSA applicability to standalone AI technologies. Let’s delve into these challenges and emerging scenarios in detail: 

Regulatory Consultations in Ireland

Recent consultations by Ireland’s Coimisiún na Meán on rules for video sharing platforms signal potential regulatory shifts. While these consultations are based on EU audiovisual rules rather than the DSA, the involvement of Coimisiún na Meán as a Digital Services Coordinator (DSC) raises questions about similar approaches under the Digital Services Act. With many major platforms situated in Ireland, the DSC could initiate similar regulatory experiments affecting tech giants like Meta, TikTok, and X.

DSA Application to Fast-Scaling AI Tools

One intriguing question pertains to how the DSA will be implemented for rapidly evolving generative AI tools.

The explosive growth of AI chatbots like OpenAI’s ChatGPT occurred after EU lawmakers had already crafted and ratified the DSA. However, the intention behind the regulation was to make it adaptable and capable of encompassing new types of platforms and services as they emerge.

When questioned about this, a Commission official outlined two distinct scenarios regarding generative AI tools, TechCrunch reported. In one scenario, where a VLOP incorporates this AI into an in-scope platform (such as integrating it into a search engine or recommender system), the DSA is already applicable. 

The second scenario involves “standalone” AI tools that are not integrated into platforms already identified as falling within the scope of the regulation. In this case, DSA enforcers will grapple with the legal question of whether the AI technology qualifies as a platform or a search engine, as defined by the regulation.

According to the Commission, legal analysis will be needed to determine whether the AI tool functions as a search engine, or whether it essentially hosts content at users’ request and disseminates it to the public. If the criteria are met, the DSA applies, an EU official told TechCrunch.

However, the speed of this determination process remains uncertain, likely contingent upon the specific Digital Services Coordinator (DSC) involved.

Furthermore, standalone AI tools that meet the DSA’s platform or search engine definition and surpass the threshold of 45 million average monthly active users in the EU could potentially be designated as VLOPs/VLOSEs in the future. In such a scenario, the regulation’s additional requirements around algorithmic transparency and systemic risk would apply, with oversight and enforcement falling under the Commission’s control. Nonetheless, the AI Act will also influence the division of responsibilities, including whether both the AI Act and the DSA would apply concurrently to such tools.
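For illustration, the designation threshold mentioned above reduces to a simple comparison. This is a minimal sketch assuming only the 45 million figure from the article; the function name and inputs are hypothetical.

```python
# Minimal sketch of the designation threshold described above: a service
# with 45 million or more average monthly active users in the EU can be
# designated a VLOP/VLOSE. Names and inputs here are illustrative.
VLOP_THRESHOLD_MAU = 45_000_000

def may_be_designated_vlop(avg_monthly_active_users_eu: int) -> bool:
    return avg_monthly_active_users_eu >= VLOP_THRESHOLD_MAU

print(may_be_designated_vlop(60_000_000))  # True
print(may_be_designated_vlop(20_000_000))  # False
```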

Conclusion 

In conclusion, the Digital Services Act (DSA) represents a pivotal step in regulating online platforms and services within the European Union. As the DSA comes into full application, it introduces a comprehensive framework aimed at addressing various challenges prevalent in the digital landscape, including combating illegal content, protecting minors, and enhancing accountability among digital businesses. However, the implementation of the DSA is not without its challenges.

The regulation brings about significant changes and complexities, particularly concerning its application to emerging technologies and the regulatory experiments in member states. Regulatory bodies, such as Digital Services Coordinators (DSCs), play a crucial role in overseeing compliance and enforcement, but uncertainties remain regarding the interplay between the DSA and other forthcoming regulations, such as the AI Act.

Despite these challenges, the DSA sets the stage for a more transparent, accountable, and secure online environment for users across the European Union. As businesses navigate the evolving regulatory landscape, collaboration between stakeholders, ongoing discussions, and adaptation to emerging technologies will be key to ensuring effective implementation and enforcement of the DSA in the years to come.
