EU Digital Services Act

The EU Digital Services Act (the EU Act) was introduced in response to the rapid expansion of the digital economy. The EU Commission states that the EU Act’s aims, as regards citizens, are to better protect fundamental rights, provide more control and choice, strengthen the protection of children online and reduce exposure to illegal content. For businesses, the aims are to provide legal certainty and a unified EU approach, as well as access to EU-wide markets and a level playing field against providers of illegal content.

The EU Act therefore imposes new responsibilities on providers of online intermediary services for internet users in the EU, such as organisations operating social networks, search engines, content sharing platforms and marketplaces.

Timeline

This legislation applies to most in-scope service providers from 17 February 2024. However, online platforms and online search engines with more than 45m users in the EU are designated as “Very Large Online Platforms” (VLOPs) or “Very Large Online Search Engines” (VLOSEs) respectively, and these organisations have had to comply since August 2023. These organisations also attract a higher level of regulation. They include entities such as Amazon, Google Search, Facebook and YouTube.

Online services subject to the EU Act

The EU Act applies to service providers which have a substantial connection to the EU (for example, having a “significant” consumer base in a member state). It is not necessary for the organisation to be based in the EU to be caught. The EU Act distinguishes between five categories of service provider; each category is subject to the responsibilities of the previous one, along with additional responsibilities:

  • Intermediary service providers who merely provide conduit or caching services (fewest responsibilities)
  • Hosting service providers
  • Online platforms
  • Online consumer marketplaces
  • VLOPs and VLOSEs (most responsibilities)

Duties

The duties imposed are lengthy and detailed, and there are also a number of exceptions to certain provisions, especially for smaller providers. The following is a brief summary, by category of provider:

Duties applying to all intermediary service providers

All service providers, including those who merely provide conduit or caching services, must do the following:

  • Establish a single point of contact for the relevant authorities (including the European Commission and relevant member states), and make this contact easy to access;
  • Establish a single point of contact for service recipients, enabling communication in a quick and user-friendly fashion;
  • Designate a legal representative in an EU member state if it does business in the EU without being established there;
  • Provide clear terms and conditions of use covering a range of issues (such as complaints procedures); and
  • Publish annual reports covering their moderation activities, including how many orders they have received regarding illegal content and how long it took to deal with the issue, as well as their own moderation activities, such as staff training and the measures taken.

Additional duties – Hosting service providers

In addition to all the above, hosting service providers must do the following:

  • Include additional specified information in annual reports (for example, the number of complaints received and the grounds for complaining);
  • Have an accessible mechanism in place to allow users to notify the provider of any illegal content on their service;
  • Provide service recipients with a statement of reasons if content they have provided is refused or restricted; and
  • Inform the authorities of the relevant member state if they become aware of anything giving rise to a “suspicion” that a criminal offence involving a threat to life or safety has taken place or is likely to take place.

Additional duties – Online platforms

In addition to all the above, online platforms must do the following:

  • Operate an internal, user-friendly complaints handling system - and include additional information on their complaints handling system in their annual reports;
  • Make information available about a user’s right to use out-of-court dispute settlement procedures if they dispute a decision made by the platform (and list such disputes in annual reports);
  • Prioritise handling any notifications made by “trusted flaggers” (entities designated as having expertise in detecting illegal content);
  • Warn and suspend users who provide illegal content on their platform or who make unfounded notices or complaints (and list suspensions in annual reports);
  • Publish information on the platform’s average monthly service recipients in the EU;
  • Ensure the design of online interfaces is not deceptive or manipulative;
  • Provide clear information on any advertisements shown on the platform - including making clear that something is an advert;
  • Include clear information in their terms and conditions about their use of “recommender systems” (which determine the prominence given to results or information); and
  • Put in place measures to protect the privacy, safety and security of minors who use their platforms.

Additional duties – Online consumer marketplaces

In addition to all the above, online consumer marketplaces must do the following:

  • Obtain specified information from traders who intend to sell on their marketplace;
  • Design their marketplace in a way that enables compliance with EU legislation (e.g. the rules on information to be provided to consumers); and
  • Once they become aware that an illegal product has been sold using their marketplace, inform the relevant consumer(s) of the product’s illegality.

Additional duties – VLOPs and VLOSEs

In addition to all the above, VLOPs and VLOSEs must do the following:

  • Publish terms and conditions in all official languages of the member states in which they provide services, and make terms and conditions available in a concise and readily available format;
  • Conduct annual risk assessments and establish mitigation measures for identified risks;
  • Respond to a Commission decision in the event of a public security or health crisis, to assess whether their activities have contributed to this crisis;
  • Organise annual independent audits;
  • Provide additional information on advertisements they allow on their platforms;
  • Allow the Commission or relevant member states to access necessary data for compliance monitoring purposes;
  • Appoint a compliance officer who is independent from their operational functions;
  • Publish transparency reports every six months; and
  • Pay, if charged, a “supervisory fee” to cover the costs of the Commission supervising them.

The 45m threshold number for this designation will be adjusted periodically so that it continues to represent around 10% of the EU population.

Enforcement

The EU Act requires each member state to designate a Digital Services Coordinator with responsibility to enforce the EU Act within its jurisdiction. A Coordinator may impose fines for non-compliance; for any given fine, the amount is capped based on the following principles:

  • Fine for general non-compliance – 6% of annual worldwide turnover in the preceding financial year
  • Fine for failure to cooperate with an investigation – 1% of annual worldwide turnover in the preceding financial year
  • Periodic penalty payments – 5% of average daily turnover.

A Coordinator can also request that the relevant legal authorities suspend the service provided.

Relation to UK Online Safety Act

Although the EU Digital Services Act is not part of UK legislation, many UK businesses will need to comply with it due to their operations in the EU. In addition, the UK has developed its own legislation to address concerns surrounding internet safety in the form of the Online Safety Act 2023. This covers similar ground but there are many differences, and businesses operating in both the UK and the EU will need to understand both pieces of legislation and the differences between them, which will affect how they do business. For more information on the UK Online Safety Act, please see our corresponding briefing note here.

For more information or advice, please contact Beverley Flynn and Guy Cartwright in the commercial and technology team at Stevens & Bolton.

The information contained in this guide is intended to be a general introductory summary of the subject matters covered only. It does not purport to be exhaustive, or to provide legal advice, and should not be used as a substitute for such advice.
