ASG Analysis: New European Proposals to Regulate Big Tech
- In December 2020 the European Commission announced proposals that would change how the digital economy is regulated: the Digital Services Act (DSA), largely focusing on content and advertising; the Digital Markets Act (DMA), addressing the market power of the “gatekeepers” through which consumers, sellers, and service providers interact; and the European Democracy Action Plan (EDAP), responding to disinformation and threats to electoral integrity.
- These proposals now move to debate in the European Parliament and the Council of member states. Adoption of final legislation is not expected until at least 2023.
- Companies that participate in the digital economy, especially those hosting controversial content or selling through digital marketplaces, have strong reason to engage on the scope and bite of each proposal. The DMA in particular could fundamentally reshape how advertisers, publishers, and consumers receive information about services, products, and news.
- Engagement may be even more important if, as expected, the proposals feature in transatlantic talks with the incoming Biden administration, a broader process that we explored in a recent ASG Analysis. If the Biden administration finds itself unable to address digital issues at home, then European regulations may become the benchmark for democratic global standards, as the General Data Protection Regulation has since coming into effect in 2018.
- The Commission’s proposals come as antitrust lawsuits have been filed in the U.S. against Facebook and Google, and the UK has released its own proposals. The U.S. lawsuits will run on their own, separate timelines, but they highlight the connection between advertising, data, and the selection of content shown to consumers, themes that similarly run throughout the Commission’s proposals.
- How, where, and when companies and governments choose to intervene will do much to shape the digital future. Litigation will inform debate in Europe, and it is also possible that transatlantic agreements could affect the outcome of the various lawsuits or that legislative proposals could be seen as a model for settlements or remedies ordered by courts.
Introduced on December 15 by European Commissioners Margrethe Vestager and Thierry Breton, the Digital Services Act (DSA) and Digital Markets Act (DMA) propose a set of regulations and responsibilities for technology companies operating in the EU. Building on the EU’s strong focus on digital sovereignty and ambition to set its own rules and standards for the digital world, the proposed regulations update the 2000 e-Commerce Directive and reinforce the bloc’s toolbox to prevent anticompetitive and harmful market behavior. The proposals are designed to protect consumers from online harm, create fairer and more competitive digital markets, and support the growth of smaller platforms and SMEs.
The DSA and DMA are indicative of future digital competition policy in the EU and provide insight into the issues on which, and the extent to which, the EU and U.S. can cooperate on tech regulation and competition. In introducing the proposals, Commissioner Vestager alluded to ongoing discussions with the U.S. and stated her expectation that the U.S. will be a more “eager partner” under President-elect Biden. Recent U.S. antitrust lawsuits against several big tech companies may also create room for the Biden administration to pursue regulatory cooperation.
It will take several years for the DSA and DMA to be finalized as legislation. Disagreement already exists within Parliament on whether the DSA should include harmful but legal content and on how the DMA defines “gatekeepers.” Member states such as France and the Netherlands have advocated strongly for restrictions on big tech, while Ireland and the Nordics have warned against strict obligations and prohibitions that would harm innovation in the sector. There are also concerns that the digital market will further evolve while the proposals are being debated over the next several years. Therefore, companies potentially affected by the DSA and DMA should expect the proposals to change more than once as they make their way through the slow EU legislative process.
This ASG Analysis covers the Digital Services Act, the Digital Markets Act, and the European Democracy Action Plan in turn.
Digital Services Act
The Digital Services Act aims to establish greater transparency and accountability for online platforms by harmonizing obligations and responsibilities regarding advertising, illegal content, and data. The DSA will apply to all online intermediaries, hosting services, and certain platforms that target European consumers and operate in the EU single market. The obligations included in the DSA are proportionally applied depending on the nature, size, and impact of each digital service. Services likely to fall under the regulations include internet service providers, cloud services, app stores, online travel and accommodation platforms, messaging services, marketplaces, and social networks.
The proposal also includes additional requirements for “very large online platforms,” defined differently from gatekeepers in the DMA. The “very large” category includes those companies that reach at least 45 million monthly users in the EU (or a number equivalent to 10 percent of the population) and have a significant economic and societal impact. An appointed authority will verify at least every six months whether a company’s number of monthly users is equal to or higher than 45 million and will decide which platforms to designate as “very large.” The threshold for the number of users can be adjusted by the Commission so that it reflects 10 percent of the EU population.
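The “very large” designation described above is ultimately a simple quantitative rule: at least 45 million monthly EU users, a figure the Commission can adjust to track 10 percent of the EU population. The sketch below expresses that rule in code purely for illustration; the population figure and function names are our assumptions, not part of the DSA text.

```python
# Illustrative sketch of the DSA "very large online platform" threshold.
# EU_POPULATION is an approximate figure assumed for illustration only.

EU_POPULATION = 447_000_000  # approximate EU-27 population

def very_large_threshold(eu_population: int = EU_POPULATION) -> int:
    """The threshold is intended to track roughly 10 percent of the EU population."""
    return round(eu_population * 0.10)

def is_very_large_platform(monthly_eu_users: int,
                           threshold: int = 45_000_000) -> bool:
    """A platform at or above the threshold can be designated 'very large'."""
    return monthly_eu_users >= threshold

print(is_very_large_platform(50_000_000))  # True
print(is_very_large_platform(30_000_000))  # False
```

Because the Commission may revise the threshold as the EU population changes, the verification exercise repeated at least every six months is, in effect, a re-evaluation of this rule against updated user figures.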
The requirements outlined in the DSA for all digital service providers, as well as additional obligations for very large platforms, include:
- Advertising transparency. Users must be clearly informed when content is sponsored, including who paid for the ad, why the user was targeted, and on the basis of what data. Very large platforms will be subject to additional requirements: to publish more data on their online recommender systems, such as the algorithms that target users and can lead them to illegal content; to allow users to modify the criteria used to recommend content; and to allow users to choose not to receive targeted recommendations.
- Illegal content moderation. To curb illegal content, the DSA harmonizes notice and action mechanisms across the bloc. It requires platforms to offer a mechanism for users to flag content but not to monitor and censor what users post online, meaning that liability for online speech lies with the speaker and not the platform that hosts the content. Platforms may still carry out their own “voluntary” measures to detect and remove or disable access to illegal content, and a “Good Samaritan” clause will guarantee that platforms are still not liable should they decide to carry out such investigations. When platforms remove content, they must notify users of any decision taken, offer the reason for the decision, and provide a mechanism for the user to contest the decision. Very large platforms will be required to conduct risk assessments on how they are combatting illegal content on their services and submit to an independent audit of their compliance with DSA rules.
- Data access. Platforms can be required to give researchers access to anonymized key platform data in order to examine how platforms work and how risks evolve online. Very large platforms will have additional data access requirements for regulators and researchers.
- Traceability of business users. Platforms must adhere to “Know Your Business Customer” obligations in order to help identify sellers of illegal goods and services.
The DSA will be enforced at both the national and EU level. Each member state will appoint an independent Digital Services Coordinator responsible for supervising services established within their borders and imposing penalties for noncompliance. The Coordinators will cooperate within a cross-border European Board for Digital Services that will oversee pan-EU issues and intervene where local authorities lack the resources to do so.
Failure to comply with the obligations of the DSA could result in fines of up to six percent of the service provider’s annual global income, or turnover. Regulators will also have the power to demand information from companies or make on-site visits to ensure compliance; failure to provide accurate information or to submit to on-site inspection could result in fines of one percent of annual turnover. Any platform that continually violates DSA obligations can be subject to a temporary suspension of their service via court ruling.
Each member state will set its own rules on penalties assessed against services within its jurisdiction, provided the penalties do not exceed the maximum amounts listed above; very large platforms are the exception. The Commission will have direct supervision over very large platforms, including the ability to impose fines and penalties. A provision in the DSA proposal also allows the Commission to request that the Coordinator in the member state where a very large platform is established investigate any suspected noncompliance.
Interestingly, the DSA does not define what is illegal online, thus leaving in place national and EU laws that specify what is illegal. Several member states have expressed concern about a provision that allows any member state authority to order a platform operating in the EU to remove illegal content regardless of where the platform is headquartered, as this could result in certain states asking for the removal of content that they deem illegal – such as LGBTQ+ information – but is not viewed as such in other member states. The DSA also does not impose removal measures on harmful but legal content, instead aiming to limit platforms’ ability to amplify such behavior. Disinformation and harmful behavior will be more specifically addressed in the European Democracy Action Plan and a revised Code of Practice on Disinformation, both of which are discussed elsewhere in this paper.
In announcing the DSA, Commissioner Vestager estimated that, at best, it will take at least one-and-a-half years to finalize the legislation and an additional six months for full implementation. Per standard EU legislative processes, the proposal will go through several years of consultation and likely changes and must be approved by both the European Parliament and Council of member states. Thus, legislation on the DSA should not be expected before 2023.
Digital Markets Act
The Digital Markets Act is a set of ex-ante rules for certain “gatekeepers” that will allow the European Commission to better detect and prevent alleged unfair and anticompetitive behavior in the digital sector. The DMA defines gatekeepers as those companies that both control access to marketplaces and benefit from the data gathered from traffic through their platforms. The Commission believes that measures to limit the control of gatekeepers over information and the activities of their business users, and to give service providers, advertisers, and consumers more choice, will create a fairer and more competitive digital single market.
In defining which digital companies will qualify as gatekeepers, the DMA outlines a set of cumulative thresholds that firms must meet:
- A strong economic position. This includes any company with an annual turnover in the European Economic Area of at least €6.5 billion in each of the last three financial years or a market value of at least €65 billion in the last financial year. Gatekeepers also provide a “core platform service” – search engines, social networking services, messaging services, operating systems, video-sharing services, online intermediation services, cloud computing services, and/or advertising products – in at least three member states.
- Large user base. The company operates a core platform service with at least 45 million monthly active end users and 10,000 yearly active business users in the EU in the last financial year.
- Stability over time. If the company met the previous criteria in each of the last three financial years, it is presumed to have an “entrenched and durable position” in the EU digital market that allows it to control rivals’ ability to reach customers.
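Because the three thresholds above are cumulative, a company must clear every one of them to be presumed a gatekeeper. The sketch below expresses that logic in code for illustration only; the data structure and field names are our assumptions, not terms from the DMA text.

```python
# Illustrative sketch of the DMA's cumulative gatekeeper thresholds.
# The PlatformMetrics fields are assumed names for illustration only.
from dataclasses import dataclass

@dataclass
class PlatformMetrics:
    turnover_eea_eur: list      # annual EEA turnover (EUR), last three financial years
    market_value_eur: float     # market value (EUR), last financial year
    core_service_member_states: int  # member states where a core platform service is offered
    monthly_end_users: int      # monthly active end users in the EU
    yearly_business_users: int  # yearly active business users in the EU
    years_meeting_criteria: int # consecutive financial years the criteria were met

def presumed_gatekeeper(m: PlatformMetrics) -> bool:
    strong_economic_position = (
        (all(t >= 6.5e9 for t in m.turnover_eea_eur[-3:])
         or m.market_value_eur >= 65e9)
        and m.core_service_member_states >= 3
    )
    large_user_base = (
        m.monthly_end_users >= 45_000_000
        and m.yearly_business_users >= 10_000
    )
    entrenched_and_durable = m.years_meeting_criteria >= 3
    # All three thresholds are cumulative: each must be met.
    return strong_economic_position and large_user_base and entrenched_and_durable
```

Note that this only captures the presumption: as the following paragraph explains, the Commission may still designate a company that misses a threshold, and a company that meets them all can rebut the designation with evidence.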
Additional provisions allow for several platform services within a company to be designated as gatekeepers and for the Commission to evaluate specific situations where a platform may not meet all three thresholds but could still be considered a gatekeeper. The Commission retains the power to identify new gatekeepers that previously did not meet the thresholds or add core services through a market investigation tool included in the DMA. As part of the review process for identifying potential new gatekeepers or services, the Commission requires gatekeepers to inform and receive approval from Brussels when they plan to buy smaller rivals. The DMA will likely apply to a wide range of digital services, meaning more than just the U.S. tech giants could be designated as gatekeepers. Companies that meet the thresholds can avoid designation as a gatekeeper, however, by providing substantial evidence that they do not have a significant impact on the single market.
Companies identified as gatekeepers must abide by a broad set of rules, largely divided into the following categories:
- Obligations. Referred to as “do’s” or a “whitelist” of activities, these are behaviors that gatekeepers must proactively implement. They include but are not limited to: letting businesses offer their products on other platforms at different prices than those on the core platform service; providing business users with access to advertising and performance data; allowing business users to conclude contracts with customers outside a core platform; and refraining from combining personal data gathered across several different platforms.
- Prohibited behaviors. Referred to as “don’ts” or a “blacklist” of activities, these are behaviors that gatekeepers must refrain from engaging in but may require further specification from the Commission. Examples of such behaviors include: blocking users from un-installing any pre-installed software or apps; using data obtained from business users to compete with those businesses; treating their own services or products more favorably in search results or ranking services; forcing users to register with other services as a condition for accessing certain core platform functions; and blocking business users from accessing online marketplaces such as app stores. If the Commission finds that a gatekeeper’s reported measures for compliance with these prohibited behaviors are ineffective, it can take specific action to ensure compliance, thus giving the Commission some flexibility to tailor the obligations to individual platforms.
These changes, if adopted, could have far-reaching effects. Gatekeepers often are able to see much of the activity on their platforms, learn from it, sell or trade the information they gather, and use it to compete with rivals and smaller businesses. Each of these actions would be restricted under the DMA as proposed, potentially leading to a much different digital marketplace.
Under the DMA as proposed, companies will have six months to comply with the outlined obligations and prohibitions once designated as a gatekeeper and must report to the Commission the measures they have taken. The Commission can impose fines of up to 10 percent of a company’s global annual turnover and periodic penalty payments of up to five percent of global annual turnover for noncompliance with DMA rules. In the case of systematic infringements, and if no alternative measures are available, the Commission will have the power to impose additional structural remedies such as obliging gatekeepers to break up or sell parts of their business.
One area of concern is that the market investigation tool – the second pillar of the DMA in its original framework – is greatly scaled back compared to previous drafts of the proposal, with its original intended power to impose case-by-case remedies limited. Now, investigations conducted with the market tool cannot lead to remedies but only to the addition of new core platform services to the scope of ex-ante rules; the designation of new companies as gatekeepers; and the regulation of certain new practices. The Commission still anticipates, however, that the tool will reduce the time frame for antitrust investigations by unearthing anticompetitive practices before they severely impact the digital market.
A second concern arises from the DMA’s approach to interoperability and data portability. Gatekeepers are required to allow ancillary service providers, such as payment processing apps, to use their core services under the same terms as the gatekeeper’s own ancillary services. It is unclear, however, whether the same interoperability requirement would apply to a competitor’s core services. In addition, while the DMA lets consumers access their data in real time and transfer it to a rival service, it appears that consumers may need to retain an account with the original service and abide by the terms of that service.
As with the DSA, the DMA will be subject to negotiation and approval in European Parliament and the Council of member states before it is adopted as legislation. U.S. tech giants are expected to lobby intensely for changes to the initial proposal, and Commissioner Vestager believes that the legislation and implementation process for the DMA will be even longer than for the DSA.
European Democracy Action Plan
The European Democracy Action Plan – led by EU Commissioner for Values and Transparency Vera Jourová – combines legislative and non-legislative measures to tackle online disinformation and strengthen media freedom as well as to ensure the integrity of European elections. The plan also addresses the rise in disinformation related to Covid-19 and vaccines. While more internally focused and not a direct call for cooperation with the U.S., the EDAP echoes other proposals that have transatlantic components. Jourová herself is interested in the financial and operational transparency of social media platforms in the digital marketplace, an area of potential cooperation with U.S. regulators.
The EDAP is closely related to the DSA, DMA, and the December 3 Media and Audiovisual Action Plan and is likely to be affected by debates on each. In its focus on advertising transparency, it overlaps with the DSA as well as the Media and Audiovisual Action Plan, which is designed to help the sector recover and support media pluralism. The DMA’s emphasis on altering the power that gatekeepers have over how information is presented to consumers may affect the spread of disinformation or toxic speech.
The EDAP addresses potential measures in three key areas:
- Promoting free and fair elections. The Commission will propose legislation, expected in the third quarter of 2021, to improve the transparency of political advertising by targeting the sponsors of paid political content and clarifying the responsibilities of online platforms and advertisers in producing and distributing said content. The legislation would be designed to restrict micro-targeting and behavioral profiling techniques that use personal data to target specific voters; an outright ban on these advertising methods is not expected. The Commission also intends to revise rules on the financing of European political parties to prevent external interference and indirect funding by foreign interests. Other measures include bolstering cybersecurity and providing funds to secure elections throughout the EU.
- Strengthening media freedom and pluralism. The Commission is expected in late 2021 to recommend measures to improve journalist safety, particularly for female journalists, and an initiative to protect them from strategic lawsuits against public participation (so-called SLAPPs). The EDAP also intends to support media pluralism, transparency of media ownership, and the fair allocation of state advertising through a new Media Ownership Monitor. Several of these measures will align with and build on the Media and Audiovisual Action Plan. These provisions may prove controversial in the Council, because some EU member states have chosen to steer state advertising funds toward news outlets owned by or favorable to ruling parties.
- Countering disinformation. The main proposal of this section is to revise the existing Code of Practice on Disinformation – a 2018 voluntary framework to control disinformation online signed by all major platforms – by strengthening requirements for online platforms and implementing a permanent framework for monitoring compliance. Proposed requirements include increasing the visibility of reliable public information; limiting the placement of false ads; and restricting the algorithmic amplification of disinformation. Together with the DSA, the EDAP would establish a co-regulatory backstop to police measures included in the strengthened Code. Co-regulation has not generally been effective, however, so changes in advertising methods and algorithms may do more to alter the dissemination of disinformation than the revised Code. The Commission also proposes to improve its existing capabilities for countering foreign interference, including a potential sanctions regime for perpetrators of disinformation.
The EDAP is a statement of political intent from EU officials, rather than a specific legislative proposal. The Commission states throughout the EDAP that it will announce concrete measures beginning in 2021, but it could take several years to approve and implement any new laws. For example, the Commission plans to issue guidance on reforms to the Code of Practice on Disinformation in Spring 2021 and will then begin a long process of working with companies, regulators, and civil society to implement the new framework. EU officials explicitly note that the measures in the EDAP will be gradually enacted through the end of 2022, at which point the plan will be assessed to decide if further steps are needed. The Commission’s goal is to have a new regulatory system in place by the European Parliament elections scheduled for May 2024.
The EDAP, due to its non-legislative nature, leaves a number of unanswered questions and unexplained details. One outstanding issue is the lack of a shared definition of a political ad: does the category include only paid-for content sponsored by traditional political parties, or can it extend to issue ads on topics such as climate change and immigration? There are additional concerns that the EDAP’s strong focus on political advertising fails to address how disinformation and misinformation spread online, as non-paid content represents a much larger share of false or misleading content. The EDAP also raises more questions than it answers about the EU’s authority to govern elections, which mainly fall under the purview of national governments.
Albright Stonebridge Group (ASG) is the premier global strategy and commercial diplomacy firm. We help clients understand and successfully navigate the intersection of public, private, and social sectors in international markets. ASG’s worldwide team has served clients in more than 120 countries.