Going big, one problem at a time: Europe’s regulation of digital services and markets gathers pace

Remy Chavannes, Anke Strijbos and Dorien Verhulst[1]

While European courts are still working to interpret digital laws from the early years of the century, the EU legislative machine is rapidly churning out new regulations and directives designed to protect online consumers and competitors from the perceived abuses and vast carelessness of the global tech platforms. The dominant narrative is that, after two decades of under-regulation benefiting mainly non-European companies, it is time for regulatory catch-up, with rules which are much more closely targeted at the digital services and problems of today. In the process, the contours of a European “law of the internet”[2] are fast coming into focus. Coupled with major new initiatives in the sphere of data, data governance and artificial intelligence, all signs point to the emergence of an overarching EU regime for tech regulation – albeit one still struggling for coherence and consistency.

This overview of recent developments in EU digital regulation will be published in “Legal Frontiers in Digital Media,” MLRC Bulletin (June 2021).

Introduction

Comparing the paragraph above to our EU platform regulation update in this Bulletin from two years ago,[3] the reader may be forgiven for concluding that not much has happened in this area, or at least that the authors have failed to notice. After all, in 2019 we were already in the midst of the techlash, and Europe was already engaged in a process of tech re-regulation characterized by grand ambitions, a conviction that “these platforms” should be doing both “more” and “less”, and a reluctance to make hard policy choices. Then, too, Brussels was preaching rules for artificial intelligence that would fully and equally protect all economic, social and moral imperatives. While it is true that we are still in the same process of burgeoning European assertiveness in digital regulation, the legislative proposals are becoming more far-reaching and more fundamental, driven by a broad political consensus that major steps are needed to bring online platforms and algorithms to heel. In times of pandemic, online services have proven their immense worth, but also increased concerns about their indispensability.

All the same, we should not exaggerate the speed of travel. The new EU rules which we discussed in 2019, regulating upload platforms, press publishers’ rights, video-sharing platforms and online intermediation services, have barely entered into force at the level of individual EU member states. The European Commission’s ambitious new proposals on digital services, digital markets, data and AI may not come into force before the MLRC’s Legal Frontiers in Digital Media conference of 2023.

In recent years, the European tech debate has broadened and deepened, but has also become fiercer and more political. In the cacophony of hot takes, avidly shared in meme form on those same online platforms, opportunism and self-interest are more in evidence than informed debate or consistent policy. When online platforms, after years of being described as unchecked echo chambers of hate and disinformation, finally moved to ban President Trump and other policy violators following the Capitol riots, European politicians were quick to lambast them as arbitrary enemies of free speech in urgent need of regulation. Journalists at traditional European media outlets have increasingly followed their publishers in blaming online platforms for the media’s commercial troubles, with “paying for news” as the new “value gap” frame. Meanwhile, tech companies have preached “better regulation” while advocating rules that protect their own interests. While the debate over artificial intelligence is still relatively technical and niche, there is a growing awareness that algorithms can inherit the biases of their creators, users and the training data on which they have been raised.

Regulation of online intermediaries: the emerging EU law of the internet

The mostly quite targeted EU laws that made it over the legislative finish line just before the 2019 European Parliament elections are beginning to come into effect at the national level, affecting for example the transparency obligations of online intermediation services,[4] youth-protection obligations of video-sharing platform services[5] and copyright liability of upload platforms.[6] The newly installed European Commission and European Parliament are now working on much more ambitious projects, including rewriting the basic rulebook for digital services set out in the E-Commerce Directive from 2000,[7] and creating a new system of preventive competition oversight of digital markets designed to curb the power of the largest “gatekeeper” platforms.[8] These new proposals are predicated on the twin assumptions that the large online platforms are doing both “too little” and “too much”, and that their ability to decide for themselves what, if anything, they do is itself proof that they have too much power.

Combating illegal and undesirable content

Of all EU citizens, 65% use at least one social media service every day.[9] According to a growing number of critics, the way in which a handful of California-based tech companies are managing these services is too opaque and too arbitrary. The call for greater transparency and accountability is particularly strong with respect to content moderation: the formulation and enforcement of rules by online platforms determining what can and cannot be said on their services. This is hardly surprising: take-down or stay-up decisions about online speech – on a massive, global scale – have a major impact on freedom of information and, by extension, the democratic rule of law.[10] The most vivid illustration of this impact was unthinkable at the time of our last EU update: Facebook and Twitter denying a sitting US president access to their services; Apple and Google removing the social media app Parler, which had been used to organize the riots, from their app stores; Amazon denying its cloud services to the platform. The resulting fundamental debate about ‘de-platforming’ was not restricted to US academic and media circles.[11] If anything, the turmoil in the US caused European policymakers to redouble their efforts to ‘emancipate’ the bloc from American influence.[12] While platforms work to create self-regulatory oversight mechanisms to judge what users should and should not be able to post on their networks, the European Commission’s proposal for a Digital Services Act (DSA), discussed in more detail below, shows that the subject of content moderation will be center-stage in the coming years.

Although the role and impact of online services bear no comparison to what they were when the E-Commerce Directive was adopted in 2000, its rules on online service providers’ liability, and obligations to be helpful, are still being litigated. Over the past two years, the EU Court of Justice handed down judgments on issues such as the permissibility of preventative filtering measures,[13] the global reach of delisting orders,[14] and the regulatory regime applicable to ride-hailing apps.[15] Facebook, Google, Twitter, Mozilla, Microsoft and TikTok joined forces with advertisers to adopt a European code of conduct against disinformation.[16] During the pandemic, the big tech companies were quick to show responsibility, enacting and enforcing policies against misleading information about COVID-19, with mixed results.[17] The all-but-adopted Terrorist Content Regulation allows national authorities to order the removal of online terrorist content.[18] The complexity of the issue did not stop the EU legislature from imposing extremely short take-down deadlines (sometimes within an hour) and setting fines of up to 4% of global turnover.[19] In parallel, the European Commission is preparing legislation to more effectively combat child sexual abuse online.[20]

The emerging landscape of online speech regulation is vast and fragmented. There are more opinions and ways to express them online than humans are able to control. No tech company can have all the content on its service moderated, at least not by humans. No regulator can supervise this process at more than the most macro level. At best, a small fleet of patrol boats is cruising the oceans of online content.[21] Platforms receive a wide variety of notifications, from hideous criminality to difficult edge cases to naked attempts to censor unwelcome-but-legitimate speech.[22] The rules which providers must apply are European, national and internal, spread across an increasingly large range of (mostly thematic) legislative and policy instruments. With the sheer volume of requests, and an intermediary’s inherently limited view of the relevant facts, mistakes are par for the course. The largest tech companies have the resources to detect and reject unfounded requests, and litigate them if necessary. Smaller platforms have fewer resources to spend on automated detection, manual review, and lawyers. They logically opt for a more automated and/or risk-averse course, and will typically move more quickly to take down notified content in case of doubt. Although solid empirical research on overblocking is still scarce, the general picture is clear.[23]

Copyright: online use of press publications and liability of upload platforms

One specific, IP-focused intervention in the perceived power of large tech companies can be found in the DSM Copyright Directive (Directive (EU) 2019/790), which we discussed in our previous update.

Article 15 of the Directive gives press publishers a neighboring right which they can deploy against online use of their press publications by information society service providers. It is subject to the usual exceptions and limitations, and does not apply at all to hyperlinking, private use or – CJEU reference alert – “the use of single words or very short fragments of a press publication”.[24] The press publishers’ right seeks to address a serious problem – the demise of the traditional revenue model for quality journalism – but does so through a curious and irrational mechanism that is called an IP right but walks and talks like a state aid regime for large press publishers financed by a special tax on tech companies. Time will tell if it can help quality journalism survive.[25]

Article 17 of the Directive provides that a – CJEU reference alert – “provider of an online content-sharing service” is responsible for its users’ uploads of protected content, and must therefore obtain authorization from rightholders. The hosting safe harbor set out in Article 14 of the E-Commerce Directive does not apply, but is replaced by a specific safe harbor in Article 17(4), which applies if the provider demonstrates that it has made “best efforts” to obtain an authorization and to “prevent the availability” of works about which rightholders have provided sufficient information to enable them to do so. This cooperation between platforms and rightholders must protect the rights of both rightholders and users, while simultaneously avoiding overblocking, general monitoring and the identification of individual users. The largest platforms and rightholders may find a way to square all those circles, but the long tail of platforms and rightholders below them will struggle to understand their rights and obligations.

Although EU member states must transpose the Directive into their national laws by 7 June 2021, only one country – the Netherlands – has so far completed the necessary legislative procedures. The Dutch transposition is deliberately dull and devoid of any national interpretation, essentially for fear of getting it wrong.[26] By contrast, Germany is trying to put the impossible compromises of Article 17 into workable practice. Its legislative proposal, which is expected to pass into law in May, requires platforms to pay remuneration to collecting societies for their users’ right to upload fragments of protected content in the form of quotations, caricatures, parodies and pastiches, and requires them not to block “presumably authorized” uploads unless rightholders invoke a “red-button procedure”.[27] The German approach is creative and detailed, but also expensive and complicated – particularly for smaller platforms. For all its drafters’ efforts, we may not know for years whether the EU Court of Justice judges the result to be compatible with the Directive.

Media regulation: video-sharing platform services

We discussed the revised Audiovisual Media Services Directive in our previous update.[28] Since then, national transposition laws have (belatedly) started entering into force. Video-sharing platform services, which provide access to user-uploaded content over which they do not have editorial responsibility, are now subject to a degree of media-law regulation for the first time. These platforms are now required to take certain measures to protect users, particularly minors, from various types of harmful content. Many platforms which host (some) videos are still not sure whether they qualify as a video-sharing platform service, notwithstanding the European Commission’s (excessively) broad interpretation of the concept.[29]

Privacy regulation: the right to be forgotten

Since our last update, the EU Court of Justice issued two important judgments about the right of data subjects to have certain search results delisted, also known as the ‘right to be forgotten’. In the first ruling, about the territorial scope of a successful removal request, the Court held that the GDPR[30] does not provide a basis for an obligation to remove search results worldwide, but does not prohibit such an obligation either.[31]

In a second landmark ruling, issued on the same day, the Court dealt with the standard for the delisting of search results that refer to special categories of personal data such as data relating to a criminal conviction.[32] At first sight, the Court takes a strict approach, ruling that the GDPR’s ban on processing special personal data should also apply to search engines. However, the Court manages to avoid the seemingly inevitable conclusion that search engines therefore have to deindex all source pages containing information about any natural person’s racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, health, sex life or sexual orientation, or criminal convictions and offences (which would include most online news and human-interest content). The Court points to GDPR Article 17(3), which contains an exception to the right to be forgotten when the processing is necessary for the exercise of freedom of information, and holds that the search engine may refuse a request to be forgotten which concerns special personal data when the display of a search result is strictly necessary for the right to freedom of information of internet users. The judgment makes it clear that the assessment of right to be forgotten requests will always involve a balancing between privacy on the one hand and freedom of information on the other.

The European Data Protection Board (EDPB), comprising the 27 EU and 3 EEA EFTA national data protection authorities, published the first part of its guidelines on the right to be forgotten in July 2020.[33] The document is mostly focused on explaining differences between the former Privacy Directive and the GDPR on this topic. The second part, which would include a list of criteria for assessing requests to be forgotten, has yet to be published.

Consumer-protection law goes online

In the past two years, EU consumer-protection law has increasingly focused on online platforms and online sales in general. Consumer-protection law comprises a mishmash of mostly EU law which aims to protect consumers by (i) requiring traders to provide certain information,[34] (ii) prohibiting aggressive or misleading selling techniques,[35] and (iii) providing consumers with certain rights.[36]

Three EU Directives were adopted in 2019, which will have to be transposed into national law in the course of 2021. The Digital Content and Digital Services Directive[37] and the Directive on Contracts for the Sale of Goods[38] form a diptych, giving consumers more information and rights when buying “smart” devices, digital content or digital services. The Modernization Directive, also adopted in 2019,[39] introduces specific obligations for “online marketplaces”. These must provide mandatory information about the identity of merchants on the platform; the ranking of products displayed; and whether the marketplace guarantees that reviews actually come from users. The Directive also extends the scope of the legal information obligations to services for which consumers “pay” with their personal data.

In principle, only the trader is responsible for compliance with consumer-protection law, not the platform which the trader uses. However, in practice, the trader’s ability to comply depends in part on the design of the service. That creates a market incentive for platforms to enable compliance, but even then the extent to which a platform can be held responsible raises all sorts of questions: can there be a misleading omission on the part of a trader if there is no space to provide the relevant information in the listing on a shopping comparison service? Can regulators ask Apple to require app providers to prominently display privacy information in the App Store? Can online platforms be required to “de-platform” traders who violate consumer-protection rules?

Digital Services Act

In December 2020, the European Commission stepped into this crowded playing field with a proposal for a major expansion and tightening of the rules for online platforms: the Digital Services Act (DSA).[40] As a regulation rather than a directive, it would upon adoption apply automatically throughout the EU, without transposition by member states and all the potential for subtle or not-so-subtle differences which that entails. The proposal is 45 pages long excluding explanatory memorandum, recitals and financial statement, with 74 detailed articles which the Commission hopes will come into force within two years. While Member States and stakeholders have been providing their initial responses, the European Parliament has been wrangling over which committee will have primary responsibility for the proposal. A massive lobbying fight is looming.

The DSA leaves the E-Commerce Directive of 2000 largely intact, including the under-rated country of origin principle in Article 3.[41] The familiar safe harbors for access, caching and hosting providers, which have enabled both the modern internet and its downsides, will be retained. They are, however, moved into the DSA to prevent national transposition differences. The DSA codifies the CJEU’s case law on the safe harbor, and confirms that providers do not lose their entitlement to that safeguard by taking measures themselves to block or remove illegal content (the so-called Good Samaritan dilemma). Article 5(3) contains a quite arbitrary exception to the safe harbor in the situation where a hosting provider, in short, facilitates a remote transaction between consumer and trader in a way that suggests that the provider itself is the trader.

In essence, the DSA retains the ancient foundation of the E-Commerce Directive, and superimposes a mighty new pyramid of further obligations. All hosting providers become subject to a number of rules regarding the processing of complaints about “illegal” content and official orders to remove information or provide customer data. These rules are fairly basic, but still potentially burdensome, particularly for smaller platforms.[42]

More detailed rules apply to a subset of hosting providers which qualify as an “online platform”: a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates information to the public. This “dissemination to the public” is not quite the same as “communication to the public” in copyright law, although it is defined using similar concepts,[43] which the Court of Justice borrowed from media law 15 years ago and has attempted to explain ever since.[44] In the explanation on its website,[45] the European Commission lists online marketplaces, app stores, sharing economy platforms and social media platforms as examples of online platforms. This is a very eclectic group of services, which raises the question of whether it is appropriate to impose exactly the same obligations on them all.

Online platforms face mandatory procedures for complaints and disputes over content removal and account termination. However, those who were critical of the banning of Donald Trump will look in vain for substantive standards for when platforms may (or must) “de-platform” content or users. Online platforms must identify certain business customers, long a cherished wish of rightholders but one that may prove unnecessarily laborious for semi-professional users of platforms like Etsy.[46] In processing content complaints, platforms must give priority to – presumably soon to be very numerous – ‘trusted flaggers’. Platforms will also face additional reporting requirements, for instance on the number of disputes, users, suspensions and the use of automatic content moderation. They will also have to show users real-time information about the origin and targeting of online advertising; some in the European Parliament are calling for a total ban on targeted online advertising.

The major online platforms (“very large online platforms” or VLOPs) will be subject to additional and far-reaching, systemic supervision. For example, they will have to periodically publish externally verified audits of (their measures to counter) systemic risks of dissemination of illegal content, manipulation and violation of fundamental rights (in particular in relation to the use of content moderation, recommendation systems and targeted advertising). The definition of VLOP in the proposal is based purely on the number of users: an online platform with more than 45 million average monthly users in the EU is a VLOP. A hard, quantitative definition might serve legal certainty, but that certainty is limited because “monthly active users” is subject to interpretation by the European Commission through delegated legislation. Moreover, a hard quantitative trigger discourages medium-sized platforms from growing further, and leaves no room for (some of) the rules to be applied to smaller platforms (or not applied to larger platforms) based on, e.g., the social risks inherent in the platform’s design or business model, or its ability to mitigate those risks. The principle of proportionality should provide some flexibility, but a formalized conditional exemption mechanism would be much better. In any case, one can expect that many of the “bad users” who are ostracized by VLOPs will find refuge with less regulated and less well-equipped non-VLOPs, which will not improve the situation overall.
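To make the mechanics of that threshold concrete, the following is a minimal sketch, in Python, of the purely quantitative VLOP test as we read Article 25 of the proposal. The function and variable names are our own; the 45 million figure corresponds to roughly 10% of the EU population, and the methodology for counting “average monthly active recipients” is precisely what the Commission would still have to specify in delegated legislation.

```python
# Illustrative sketch only, not the DSA text: the proposal's VLOP test
# is a single, hard numerical trigger.

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU,
                             # roughly 10% of an EU population of ~447 million

def is_vlop(avg_monthly_active_recipients_eu: int) -> bool:
    """Purely quantitative test: no risk-based or proportionality adjustment."""
    return avg_monthly_active_recipients_eu >= VLOP_THRESHOLD

# The cliff edge in practice: just below the line, no systemic supervision,
# however risky the platform's design; just above it, the full VLOP regime.
assert is_vlop(44_900_000) is False
assert is_vlop(45_100_000) is True
```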

All these new rules must be enforced. This will be the duty and exclusive competence of the Member State where the provider has its headquarters. A provider offering services in the EU without an establishment in the EU must designate an EU representative and will fall under the jurisdiction of the Member State where that representative is located.[47] The responsible Member State must appoint one or more regulatory authorities, and designate one of them as the “Digital Services Coordinator” (DSC). The DSC must have sufficient independence, resources and powers, and must be able, among other things, to issue “effective, proportionate and dissuasive” fines of up to 6% of annual turnover. VLOPs are subject to enhanced supervision, including serious investigative powers for DSCs and the possibility for the European Commission to intervene if a VLOP breaches its obligations, either at the request of the competent DSC or, if that DSC fails to take sufficient action, at the request of other DSCs.

As an EU regulation, the DSA will have direct effect. Moreover, unlike the GDPR, the DSA leaves little room for further definition or detail at the national level. One question which Member States will have to answer is which supervisory authorities will be charged with enforcing the DSA, and which of them will have the status (in practice perhaps: bear the burden) of being the designated coordinator. Existing regulators for competition law, consumer law, media law etc. are obvious candidates, but which ones and in what relation to each other? In Member States where they exist, it will be hard to overlook the large, well-staffed, converged regulators for competition, telecoms and consumer protection, who are used to enforcing general and sector-specific rules against large companies. After all, the DSA is above all intended to strengthen sector-specific consumer protection. On the other hand, media regulators already regulate media and video platforms, and the DSA derives its urgency primarily from the perceived need to improve the regulation of online content. It is also conceivable that Member States will divide supervisory powers among several existing regulators, by subject or even by type of platform. The European umbrella organization of media regulators ERGA, for example, has argued that the national media authorities should in any event exercise supervision over online content platforms (which, according to ERGA, should then also be extended to cover undesirable or harmful content).[48] There are indeed good arguments to give media regulators a role in the application of DSA rules to media platforms, while leaving it to other regulators to enforce with respect to, for example, price comparison services or ride-hailing platforms. We should in any case bear in mind that most national regulators will be playing quite a modest role, since many of the largest platforms currently have their European headquarters in Ireland and will thus be regulated by the Irish regulator(s).

Competition Law

Online platforms continue to be at the center of European competition authorities’ minds. The European Commission kicked off 2019 with another hefty fine for Google – €1.49 billion this time – relating to AdSense for Search, which displays ads around search results on third-party websites. Google had allegedly entered into exclusive agreements with these websites, making it (virtually) impossible for other providers of advertising services to appear in the search results on those websites.[49] In addition, the Commission fined Valve, operator of the gaming platform Steam, together with five games publishers, for geo-blocking agreements.[50] The Commission further launched four investigations into Amazon and Apple. The first Amazon investigation concerns the use of competitively sensitive information from third-party merchants using the online marketplace.[51] In November 2020, the Commission launched a second investigation into possible preferential treatment of Amazon’s own products on the platform.[52] In June 2020, the Commission launched two investigations into Apple. The first concerns the mandatory use of Apple’s payment system in the App Store (with 30% commission for Apple),[53] the second Apple’s denial of third-party access to the NFC chip in iPhones.[54]

While regulators have clearly not stopped using competition law against potentially abusive conduct, they have also recognized the limitations of traditional competition law, particularly in terms of speed.[55] Regulators and legislators alike have mostly come to the conclusion that dynamic digital markets require additional competition-law regulation. Indeed, the conversation has mostly shifted from whether there should be additional regulation to what these rules should look like.

An early and targeted attempt to level the playing field was made in June 2019 with the Platform-to-Business (P2B) Regulation.[56] The Regulation, which became applicable from July 2020, notably introduces transparency and due diligence requirements for providers of online intermediation services and online search engines vis-à-vis their business users. According to its critics, the P2B Regulation did not tackle the problem with sufficient breadth or vigor, failing to impose substantive rules of conduct on online platforms.

The Digital Markets Act (DMA), published in December 2020 together with the DSA,[57] represents a far more radical and ambitious plan to improve the contestability of online markets. We mention three salient features of the DMA:

  1. Determination of platforms to be regulated. The Commission delineates the companies in the digital sector that fall within the scope of the DMA in the following way. First, the company must offer a “core platform service”. This includes online intermediation services (e.g. Amazon), search engines (e.g. Google), social networks (e.g. Instagram), video platforms (e.g. YouTube), number-independent interpersonal communication services (e.g. WhatsApp), operating systems (e.g. iOS), cloud services (e.g. AWS) and advertising services (e.g. Google Display Network). Second, the provider of a core platform service must qualify as a gatekeeper. This is determined using three criteria, all three of which are translated into quantitative benchmarks: i) a significant impact on the internal market, ii) operating an important gateway for business users to reach customers, and iii) a firmly established and sustainable position.[58] The idea is that this will make it relatively easy for a company to determine for itself whether it will be regulated, but that may be optimistic (a sketch of these presumptions follows this list).
  2. Obligations for gatekeepers. The DMA contains two detailed lists of obligations for gatekeepers. One list can be applied directly; the other requires further elaboration. The directly applicable list includes a prohibition on combining users’ personal data, an obligation to provide price information in advertising services, and a set of obligations aimed at preventing contractual lock-in of business users.[59] The list of obligations requiring further elaboration per platform includes obligations relating to the use of third-party data, apps installed by default, non-discrimination against third-party products, data portability and access to data generated by the platform. Both lists of obligations apply in principle to all gatekeepers.[60]
  3. European Commission and national authorities. The European Commission has the power to determine gatekeeper status, modify or add to the list of obligations, and impose sanctions if a gatekeeper structurally fails to comply with the obligations (sanctions range from commitments and large fines to, in extreme cases, breaking up the company). The role of Member States in the proposal is limited. Member States are explicitly prohibited from introducing other regulations for gatekeepers that are intended to “level the playing field”. The role of national regulators is equally limited: they only have a say on a committee which advises the Commission on its decisions.[61]
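For illustration, here is a minimal sketch, in the same spirit as the one above, of the gatekeeper presumptions as we read Article 3(2) of the December 2020 proposal. The data structure and function names are our own simplification; the thresholds may change during the legislative process, and we omit both the provider’s ability to rebut the presumption and the Commission’s power to designate gatekeepers that fall below the thresholds.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CorePlatformProvider:
    """Our own simplified view of a core-platform-service provider."""
    eea_turnover_eur_bn_last_3y: List[float]    # annual EEA turnover, EUR bn
    avg_market_cap_eur_bn: float                # last financial year, EUR bn
    member_states_served: int
    monthly_end_users_eu_last_3y: List[int]     # per financial year
    yearly_business_users_eu_last_3y: List[int] # per financial year

def presumed_gatekeeper(p: CorePlatformProvider) -> bool:
    # Art. 3(2)(a): significant impact on the internal market, presumed from
    # EEA turnover of at least EUR 6.5bn in the last three financial years or
    # an average market capitalisation of at least EUR 65bn, combined with
    # offering a core platform service in at least three Member States.
    significant_impact = (
        (all(t >= 6.5 for t in p.eea_turnover_eur_bn_last_3y)
         or p.avg_market_cap_eur_bn >= 65)
        and p.member_states_served >= 3
    )
    # Art. 3(2)(b): an important gateway, presumed from more than 45 million
    # monthly active end users and more than 10,000 yearly active business
    # users in the EU.
    gateway_by_year = [
        end_users > 45_000_000 and business_users > 10_000
        for end_users, business_users in zip(
            p.monthly_end_users_eu_last_3y,
            p.yearly_business_users_eu_last_3y,
        )
    ]
    # Art. 3(2)(c): an entrenched and durable position, presumed where the
    # gateway thresholds were met in each of the last three financial years.
    entrenched = len(gateway_by_year) == 3 and all(gateway_by_year)
    return significant_impact and entrenched
```

Even in this toy form, the hard questions are visible: how users are counted, whose turnover is relevant and how the three presumptions interact are exactly the points on which designation disputes can be expected.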

Meanwhile, some former and current Member States have already introduced their own regulations. In the UK, the competition regulator has recommended the creation of a specialized regulator, with the power to impose individual codes of conduct on specific platforms.[62] Germany has already gone further, with Parliament approving a proposal empowering the regulator to impose specific prohibitions on certain companies, choosing from a ‘menu’ of seven – broadly defined – behaviors. What is striking about both initiatives is the freedom, relative to the European proposal, which the national regulator is afforded to impose tailor-made measures.[63]

It remains to be seen how the DMA proposal will develop, after the European Parliament, the Member States and hordes of lobbyists and academics have provided their input. One thing is sure: digital competition law is about to change fundamentally.

What is not changing, at least not for the better, is the definitional jungle of EU platform regulation which drives tech lawyers to despair. New definitions discussed here, such as intermediation service, online platform and very large online platform (DSA), and gatekeeper and core platform service[64] (DMA), overlap to an as yet uncertain extent with existing definitions such as online intermediation service (P2B Regulation), video-sharing platform service (AVMS Directive), online content-sharing service (DSM Copyright Directive) and number-independent interpersonal communication service (EECC). These are all definitions which determine the scope of application of significantly burdensome regulations. Whether a certain service or service provider falls under a certain definition, and therefore has to deal with particular additional obligations and regulators, will be the subject of disputes and CJEU references for years to come. This has consequences, both for companies which may or may not have to comply with the associated rules and for the companies and consumers whom the rules are trying to protect. All this uncertainty comes at a price for the EU’s credibility as an exporter of effective tech regulation and breeding ground for the new tech champions of tomorrow.

The emerging European law of data

Open Data and Data Governance

Far from content with the GDPR, European lawmakers are continuing to legislate for the “data-driven economy” with “fair, practical and clear rules” for access to, and use of, data. The goal is to create an internal market for data, allowing data to flow freely but safely across the European Union, through all sectors. The Commission estimates the value of this data economy to be €829 billion by 2025, with 175 zettabytes of data worldwide and over 10 million people employed in the European data sector.[65]

One of the focal points of Europe’s digital strategy is access to high-quality data.[66] Since 2013, government bodies have been obliged to make public government information available for reuse, both commercial and non-commercial.[67] That regulatory regime was further strengthened with the Open Data Directive adopted in July 2019.[68] It encourages governments to make dynamic data available via APIs, in real time where possible. It limits public bodies’ ability to charge more than marginal costs, invoke sui generis database rights or enter into exclusive agreements. There is a new regime for ‘high-value data sets’, which are considered to be particularly suitable for developing further applications and services and thus bring even greater benefits to society and the economy.[69] The Directive identifies thematic categories within which high-value data sets, to be defined by the European Commission, should be available.[70] The Directive has to be transposed into national legislation by 16 July 2021, but many national transpositions look likely to be late. Meanwhile, in November 2020, the European Commission published its proposal for a Data Governance Act,[71] which aims to further promote reuse through a new regime for ‘data intermediaries’, rules on the sharing of certain protected data, and the facilitation of ‘data altruism’.[72]

AI Regulation

The EU is also building out the regulatory framework for data, ‘big data’ and artificial intelligence from other angles. For example, at the request of the European Commission, the University of Amsterdam’s Institute for Information Law examined whether the current European IP rules are suitable for works and inventions created by, or with the assistance of, artificial intelligence.[73] The researchers found that the current state of AI does not yet allow for fully autonomous creation or invention by computer systems, so that limited adjustments to European copyright and patent law will suffice for the short term. The European Commission adopted the report’s conclusions in its recent IP Action Plan.[74]

The European Commission published an overarching AI White Paper in February 2020.[75] As is usual in this genre, it wants to both seize all opportunities and mitigate all risks. The Commission advocates for a risk-based regulatory framework for AI, meaning that AI will be more strictly regulated if it is applied in a sector where significant risks are to be expected and in such a way that significant risks can actually occur. The Commission suggests formulating rules regarding training data; record-keeping; transparency; robustness and accuracy; human oversight; and specific rules for certain AI applications such as remote biometric identification. Based on public responses to the White Paper, the Commission is working on a legislative proposal, which is expected to be published within weeks.[76]

Conclusion

The EU is determined to evolve into a self-aware, digitally sovereign bloc that sets the rules by which global technology companies have to operate. However, it is still struggling to make sharp choices between conflicting objectives, such as between privacy and competition, or between consumer protection and the promotion of innovation. Where the Big Tech debate in the US often seems to revolve around extremes – the absolute limits of the First Amendment, calls to “break them up”, regulation through multi-billion-dollar lawsuits – the EU’s approach focuses on detailed, problem-specific and increasingly asymmetric market regulation and targeted behavioral remedies. There is much to be said for tailor-made regulation. However, this approach is causing the EU to respond to each perceived problem with separate pieces of legislation, each with its own definitions, rules, and jurisdictional and supervisory structures. The unpredictability and heavy-handedness of the new rules risk entrenching the existing big players who can afford to understand and implement them. While each proposal now comes standard with eye-popping fines to ensure appropriate attention at C-suite level, a coherent, systematically thought-through EU tech law rulebook is still several MLRC Digital Conferences away. The journey there will be interesting.

[1] The authors are attorneys at the technology and communications law firm Brinkhof in Amsterdam, where they specialize in copyright, media, and internet litigation. They would like to thank their colleagues Ella Meijaard, Sophie ten Bosch, Hanneke Kooijman, Leonie van Sloten and Bart Tromp for their valuable contributions. This update covers the period March 2019 – March 2021.

[2] A recent addition to the academic discourse on the existence and scope of such a thing as “internet law” is R. Leenes, ‘Of Horses and Other Animals of Cyberspace’, Technology and Regulation 2019, pp. 1-9, retrieved from https://techreg.org/index.php/techreg/article/view/3.

[3] Remy Chavannes & Dorien Verhulst, ‘Regulation of Online Platforms in the European Union – The State of Play’, MLRC Bulletin: Legal Frontiers in Digital Media 2019 No. 1, pp. 3-15, retrieved from https://blog.chavannes.net/2019/05/regulation-of-online-platforms-in-the-european-union-the-state-of-play/.

[4] Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services, applicable from 12 July 2020.

[5] Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities.

[6] Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

[7] The so-called Digital Services Act (DSA): Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final dated December 15, 2020.

[8] The so-called Digital Markets Act (DMA): Proposal for a Regulation of the European Parliament and of the Council on Contestable and Fair Markets in the Digital Sector (Digital Markets Act), COM(2020) 842 final dated December 15, 2020.

[9] ‘Social media usage in Europe – Statistics & Facts’ (Statista, February 10, 2020), https://www.statista.com/topics/4106/social-media-usage-in-europe, consulted on March 25, 2021.

[10] See in more detail: E. Douek, ‘Verified accountability: self-regulation of content moderation as an answer to the special problems of speech regulation’, Hoover Aegis Paper September 18, 2019 via lawfareblog.com; W. Benedek & M.C. Kettemann, Freedom of Expression and the Internet (second edition), Strasbourg: Council of Europe Publishing 2020.

[11] E. Douek, ‘Trump is banned, who is next?’, The Atlantic, January 9, 2021; J. Vincent, ‘Zoom cancels talk by Palestinian hijacker Leila Khaled at San Francisco State University’, The Verge, September 24, 2020.

[12] M. Karnitschnig, ‘Politico Playbook: What Europe thinks of America after this week’, Politico, January 9, 2021; R. Fahy et al., ‘Deplatforming politicians and the implications for Europe’, February 2021, https://www.sectorplandls.nl/wordpress/blog/deplatforming-politicians-and-the-implications-for-europe/.

[13] CJEU 3 October 2019, Case C-18/18 (Glawischnig-Piesczek / Facebook).

[14]  CJEU 24 September 2019, Case C-507/17 (Google / CNIL).

[15] CJEU 3 December 2020, Case C-62/19 (Star Taxi App).

[16] https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation.

[17] Joint Communication ‘Tackling disinformation related to COVID-19. Getting the facts right’, Brussels, June 10, 2020 (JOIN(2020) 8 final) and the reports of online platforms as part of the monitoring program; see also: ‘Managing the COVID-19 Infodemic’, Joint Statement by WHO, UN and others, September 23, 2020, who.int; A. Knuutila et al., COVID-related Misinformation on YouTube: The Spread of Misinformation Videos on Social Media and the Effectiveness of Platform Policies (COMPROP Data Memo 2020.6), University of Oxford 2020.

[18] Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (2018/0331 COD), adopted by the Council on 16 March 2021 (https://data.consilium.europa.eu/doc/document/ST-14308-2020-REV-1/en/pdf); cf. also Counter-Terrorism Agenda for the EU, December 9, 2020, COM(2020) 795 final.

[19] See for a critical discussion of the proposal A. Kuczerawy, ‘The proposed Regulation on preventing the dissemination of terrorist content online: safeguards and risks for freedom of expression’, December 5, 2018, https://bit.ly/2uD8MtL and D. Keller, ‘The EU’s terrorist content regulation: expanding the rule of platform terms of service and exporting expression restrictions from the EU’s most conservative Member States’, cyberlaw.stanford.edu/blog, March 25, 2019.

[20] The impact assessment for the proposal has been completed, a public consultation runs through April 15, 2021. See: https://ec.europa.eu/home-affairs/news/fighting-child-sexual-abuse-have-your-say_en. See also: EU strategy for a more effective fight against child sexual abuse COM/2020/607 final, p. 6.

[21] Notice & Takedown systems, supplemented by automated algorithmic checks, still form the basis of combating unlawful and undesirable content. Internet users can report unlawful or undesirable content to online platforms, which then remove the content if necessary. See: T. Gillespie, Custodians of the Internet; platforms, content moderation, and the hidden decisions that shape social media, Yale University Press 2018.

[22] D. Keller, ‘Empirical evidence of over-removal by internet companies under intermediation liability laws: an updated list’, February 8, 2021, see: cyberlaw.stanford.edu/blog.

[23] Id.

[24] A.-C. Lorrain, ‘Introducing an ancillary right for press publishers: a European law-making ambition for the press – but also on hyperlinking’, Computerrecht 2020/83.

[25] See e.g. Ben Thompson, ‘Publishing is Back to the Future’, 27 January 2021 (https://stratechery.com/2021/publishing-is-back-to-the-future/); and ‘Media, Regulators, and Big Tech; Indulgences and Injunctions; Better Approaches’, 14 May 2020 (https://stratechery.com/2020/media-regulators-and-big-tech-indulgences-and-injunctions-better-approaches/).

[26] Remy Chavannes, ‘The Dutch DSM copyright transposition bill: safety first (up to a point)’, Kluwer Copyright Blog June 11, 2020 (https://bit.ly/2QKCVp0).

[27] See amongst others Paul Keller, ‘German government draft on Article 17: Two steps forward, one step back’, Communia February 26, 2021 (https://bit.ly/3u83eUt).

[28] Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities.

[29] European Commission, Guidelines on the practical application of the essential functionality criterion of the definition of a ‘video-sharing platform service’ under the Audiovisual Media Services Directive, (2020/C 223/02).

[30] Strictly speaking, the case fell within the scope of the old Privacy Directive (95/46/EC), but the CJEU also took the GDPR into account in its judgment to ensure that the judgment remains useful in the future.

[31] CJEU 24 September 2019, Case C-507/17 (Google / CNIL).

[32] CJEU 24 September 2019, Case C-136/17 (GC/CNIL).

[33] EDPB, Guidelines 5/2019 on the criteria of the right to be forgotten in the search engines cases under the GDPR (Part 1), Version 2.0, adopted on July 7, 2020.

[34] See also Directive 2011/83 on Consumer Rights. Sometimes these obligations to provide information also overlap, resulting in confusion for the parties and the regulator.

[35] Unfair Commercial Practices Directive 2005/29/EC.

[36] Consumer Rights Directive 2011/83/EU and Unfair Terms Directive 93/13/EEC.

[37] Directive (EU) 2019/770 of the European Parliament and of the Council of 20 May 2019 on certain aspects concerning contracts for the supply of digital content and digital services, OJ L 2019/136, p. 1.

[38] Directive (EU) 2019/771 of the European Parliament and of the Council of 20 May 2019 on certain aspects concerning contracts for the sale of goods, OJ L 2019/136, p. 68.

[39] Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules, OJ L 2019/328, p. 7. Also referred to as the Omnibus Directive.

[40] Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final dated December 15, 2020.

[41] Recital 33 suggests that the principle does not apply to official orders to remove content or provide information. This seems to us to be incorrect and, moreover, undesirable because it would open the door to official takedown orders based on the different national law of 27 EU member states.

[42] See, for example, the response from the App Association, which advocates for smaller providers of software and online services, dated March 30, 2021 (https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12417-Digital-Services-Act-deepening-the-internal-market-and-clarifying-responsibilities-for-digital-services/F2163073).

[43] See Recital 14 DSA: “The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council, such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.”

[44] CJEU December 7, 2006, Case C-306/05 (Rafael Hoteles), referring to CJEU June 2, 2005, Case C-89/04 (Mediakabel).

[45] European Commission, “Digital Services Act: Ensuring greater security and accountability online,” https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en

[46] See Etsy’s reaction of March 31, 2021, https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12417-Digital-Services-Act-deepening-the-internal-market-and-clarifying-responsibilities-for-digital-services/F2163497.

[47] See Article 40 in conjunction with Article 11 DSA. Providers without an EU establishment can thus forum shop, although the question remains who would want to take on the role of representative given that it entails joint liability for fines.

[48] ‘ERGA welcomes the DSA and DMA proposals and points out ways for better enforceability’, press release with accompanying ‘Statement about the European Commission’s proposals for a Digital Services Act (DSA) and a Digital Markets Act (DMA)’, March 29, 2021, https://erga-online.eu/wp-content/uploads/2021/03/ERGA-DSA-DMA-Statement_29032021.pdf.

[49] European Commission, ‘Antitrust: Commission fines Google €1.49 billion for abusive practices in online advertising’, March 2019.

[50] European Commission, ‘Antitrust: Commission fines Valve and five publishers of PC video games € 7.8 million for “geo-blocking” practices’, January 2021.

[51] European Commission, ‘Antitrust: Commission opens investigation into possible anti-competitive conduct of Amazon’, July 2019.

[52] European Commission, ‘Antitrust: Commission sends Statement of Objections to Amazon for the use of non-public independent seller data and opens second investigation into its e-commerce business practices’, November 2020.

[53] European Commission, ‘Antitrust: Commission opens investigations into Apple’s App Store rules’, June 2020.

[54] European Commission, ‘Antitrust: Commission opens investigation into Apple practices regarding Apple Pay’, June 2020.

[55] As was also established by the European Court of Auditors: ‘The Commission’s EU merger control and antitrust proceedings: a need to scale up market oversight’, November 2020.

[56] Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services, OJ L 186/2019, p. 57.

[57] Proposal for a Regulation on Contestable and Fair Markets in the Digital Sector (Digital Markets Act) COM/2020/842 final.

[58] DMA, Article 3.

[59] DMA, Article 5.

[60] DMA, Article 6.

[61] Some competition regulators have already spoken out in opposition: ‘Give EU nations’ antitrust enforcers a role in gatekeeper platform regulation, says Dutch authority’s Snoep’, MLex, March 2, 2021.

[62] CMA, “Digital Markets Taskforce,” December 2020. Available at: https://www.gov.uk/cma-cases/digital-markets-taskforce

[63] GWB-Digitalisierungsgesetz, January 18, 2021, BGBl. 2021 I No. 1, p. 2.

[64] The definition of core platform service, according to Article 2(2) DMA, includes another set of underlying definitions, partly from other regulations and directives: online intermediation services, online search engines, online social networking services, video-sharing platform services, number-independent interpersonal communication services, operating systems, cloud computing services and advertising services.

[65] European Commission, ‘A European Data Strategy’, COM(2020) 66 final, February 19, 2020.

[66] See: https://ec.europa.eu/commission/presscorner/detail/en/fs_20_278

[67] Directive 2003/98/EC of 17 November 2003 on the re-use of public sector information, amended by Directive 2013/37/EU of 26 June 2013 on the re-use of public sector information.

[68] Directive (EU) 2019/1024 of 20 June 2019 on open data and the re-use of public sector information.

[69] See in particular Chapter V (Art. 2(10)) of the Open Data Directive.

[70] Annex 1 to the Directive provides the categories: geospatial data; earth observation and environment; meteorological data; statistics; businesses and business ownership; and mobility.

[71] European Commission proposal on European data governance, COM(2020) 767 final, November 25, 2020.

[72] Jay Modrall, EU Data Governance Regulation – A Wave of Regulatory and Antitrust Reform Begins, Kluwer Competition Law Blog November 30, 2020. See also the BNC fiche of 22 January 2021, published on rijksoverheid.nl.

[73] IvIR, Trends and Developments in Artificial Intelligence: Challenges to the Intellectual Property Rights Framework, September 2020 (https://www.ivir.nl/nl/ivir-study-for-european-commission-on-ai-and-ip/). See also: Daniel Gervais, ‘Is Intellectual Property Law Ready for Artificial Intelligence?’, GRUR Int Vol. 69, Issue 2, February 2020; Daniel J. Gervais, ‘Exploring the Interfaces Between Big Data and Intellectual Property Law’, Journal of Intellectual Property, Information Technology and Electronic Commerce Law 2019-3.

[74] European Commission, ‘Making the most of the EU’s innovative potential – An intellectual property action plan to support the EU’s recovery and resilience’, COM(2020) 760, November 25, 2020.

[75] European Commission, ‘White Paper on Artificial Intelligence – A European approach based on excellence and trust’, COM (2020) 65 final, February 19, 2020.

[76] https://digital-strategy.ec.europa.eu/en/policies/european-approach-artificial-intelligence.
