Regulation of Online Platforms in the European Union – The State of Play

For the past two decades, the 2000 Electronic Commerce Directive has been the cornerstone of internet regulation in the European Union, and its safe harbors have played a crucial part in the creation of the modern internet.[1] In recent years, however, rightholders, politicians, civil society groups and others have increasingly challenged the status quo, arguing that Europe has naively allowed itself to be exploited by unaccountable – mostly American – tech giants. Although these critics often disagree strongly about exactly what needs to be done and why, they have convinced legislators that something must be done. As a result, dozens of laws have been passed or proposed at the EU level in the last few years, transforming the legal environment in which platforms operate on everything from privacy, copyright, and free speech to competition, labor law, and tax. By acting first and asking questions later, the EU is hard at work establishing a de facto global standard for tech regulation.[2]

The process is anything but smooth, and not everyone is convinced of the quality, cohesion or wisdom of the EU’s regulatory approach. Like it or loathe it, however, EU regulation has become a significant business concern for global tech companies. Although Europe’s population is dwarfed by those of rising powers such as China and India, its market of 500 million affluent consumers spread over 28 – more or less – coordinated member states is hard to ignore. Moreover, its regulatory innovations have a habit of catching on in other jurisdictions, and some elements of its approach to Big Tech are even gaining traction in the US.[3] The EU is also projecting power beyond its borders by promoting its regulatory agenda within international bodies like the G7 and the OECD.[4]

In this article, we discuss the current state of online platform regulation in Europe.[5] European intermediary liability law has traditionally focused on the liability of internet access and hosting providers for their users’ unlawful activities, leaving it to individual member states’ national law to craft a wide range of Good Samaritan obligations in relation to, for example, the production of subscriber data or the blocking of infringing websites. In recent years, this discussion has become more fundamental, more wide-ranging, and more fragmented, all at the same time. The essential question is what online platforms must do to keep their services safe and lawful, and to what extent they are responsible when they fail to do so.

Publication with Dorien Verhulst in MLRC Bulletin: Legal Frontiers in Digital Media 2019/1 – now paywalled, full text continues below.

The Legal Landscape

Digital Single Market Strategy

The key political driver behind the recent flurry of EU internet legislation is the Digital Single Market (‘DSM’) strategy of the European Commission under its current President, Jean-Claude Juncker, whose mandate will end with the European elections in late May 2019. The DSM strategy aims to do many things at once: to make the European economy more competitive with the US and China, and to create an ethically and environmentally respectful digital society in which fundamental rights and fairness are protected.[6]

Under the umbrella of the DSM strategy, the Commission has presented over thirty legislative proposals, covering a wide range of areas, from consumer law to copyright to media and telecoms regulation. Halfway through its term, in September 2017, most of those were still in the making.[7] In a subsequent final sprint, however, the Commission has been able to bring many of these proposals over the line, while identifying three tech files (e-privacy, e-evidence and terrorist content) as “unfinished business” for the next mandate.[8] With many of these new rules being implemented at member-state level over the next two years, both EU and non-EU based technology businesses can look ahead to a wide range of regulatory changes.

The measures that have ended up becoming law have often been heavily amended over the course of the legislative process, rarely becoming clearer in the process. The EU legislative process is slow, complicated and opaque. Many of the adopted directives and regulations[9] are heavy with impossible compromises between different member states and legislators, fought over by armies of lobbyists. The rules are often fragmented, difficult to read, diluted and downright contradictory. It then falls to the EU Court of Justice (CJEU) to ‘interpret’ the rules, and to integrate them into the existing European rulebook, one question at a time.

Between broadcasting and telecoms

From an EU legal perspective, internet platforms operate between the Audiovisual Media Services Directive (‘AVMS Directive’), which regulates media services, on the one hand, and the European rules for electronic communications, which apply to telecommunications services, on the other. Both were revised as part of the DSM strategy, and now also cover a wide range of internet services that were previously regulated only by the light-touch Electronic Commerce Directive.

The revised AVMS Directive,[10] which member states must implement by September 2020, now also applies to ‘video-sharing platform services’, which are defined as providing programs and user-generated video for which they do not bear editorial responsibility – YouTube being the example most frequently cited. The regulatory regime for such platforms is still light and flexible compared to the regime for the traditional broadcasting and on-demand services which are subject to the editorial control of their provider. Video platform services must take ‘appropriate measures’ to protect minors from harmful content, ban certain types of criminal content, and comply with basic advertising rules.

The new rules do not just apply to providers of video-sharing platform services that are established in the EU, i.e., have the centre of their activities relating to the video-sharing platform service in an EU member state.[11] The rules also apply to providers that are established outside the EU but have one or more group entities (subsidiary, parent, or other group member) in the EU.[12] Whether a video-sharing platform service actually targets one or more member states is irrelevant: if a US-based video platform opens as much as a sales office in a European city, its entire video platform stands to become subject to the revised AVMS rules and supervision by the media authority of an EU member state. Only platforms established outside the EU, with no presence in one of its member states, remain out of scope. If a service is within scope, figuring out which EU member state is competent to regulate the service can be extremely complex.[13]

On the telecoms front, a new directive establishing a European Electronic Communications Code merges four of the five existing EU telecommunications directives into a new whole. The directive must be implemented at the national level by 21 December 2020.[14] Much of the new code is familiar; there are only a few major changes. Under the new rules, over-the-top services such as WhatsApp, iMessage or Google Hangouts will be governed by the EU telecoms rules as ‘number-independent interpersonal communications services’.[15] Here too, it starts with a relatively light-touch regime: regulated services must meet basic (technical and operational) security requirements and provide notification of incidents; be transparent about their commercial offerings and conditions; and potentially be subject to interoperability obligations, but only if they have substantial coverage and use, and after a lengthy consultation process.

Intermediary liability

The European Commission has refrained from proposing any explicit changes to the safe harbors regime of the E-commerce Directive.[16] Instead, it has followed what it called a ‘sectoral, problem-based approach’.[17] Nonetheless, it has effectively limited or restricted the safe harbor in various areas, and increased regulation across the board. Moreover, the issue of “opening up the E-Commerce Directive” is likely to reappear on the scene later in 2019.

A specific ‘platform-to-business regulation’ is very close to being adopted, and will apply from one year after official publication.[18] The regulation sets out minimum rules governing the relationship between ‘online intermediation services’ and their business users, such as online sellers, app developers or hotels. Under the new rules, sudden and unexpected account suspensions are banned. In addition, providers must provide specific written justification for any account restriction or termination, with possibilities for appeal. Applicable terms and conditions must be easily available and provided in plain and intelligible language. Changes must be notified at least 15 days in advance, or longer if they are complex. Providers of search engines must set out “the main parameters, which individually or collectively are most significant in determining ranking and the relative importance of those main parameters, by providing an easily and publicly available description, drafted in plain and intelligible language, on the online search engines of those providers.”

In parallel, the EU member states failed to reach agreement on the introduction of an EU-wide digital services tax.[19] The matter has now heated up at the OECD, after France led a push to force big tech companies to pay more into national treasuries.[20]

The CJEU has been called upon to rule on several of the hybrid data platforms that operate in the slipstream of internet pioneers such as Amazon, Google and Facebook. Services such as Airbnb and Uber have frequently been at odds with traditional regulated sectors. A war of ideas runs in the background.[21] In 2017, the CJEU held that Uber does not qualify as an information society service, but is a transport service, and hence can be subject to much greater regulatory scrutiny by member states.[22] CJEU Advocate-General Szpunar recently advised the court that Airbnb should be considered an information society service.[23] If the CJEU follows this non-binding opinion, it would represent a significant victory for the platform in its battle against a French umbrella organization of hotels, travel agencies and trade unions. Meanwhile, numerous national courts have given their take on the internet companies that are disrupting their cities. In Amsterdam, for example, a subdistrict court judge decided that Deliveroo meal deliverers are not self-employed, but are entitled to an employment contract.[24]

Fighting illegal and harmful content

It has become a cliché in debates about the responsibilities of online platforms to say that they “must do more” when it comes to the moral hygiene of what their users upload and share.[25] In 2017, the European Commission started modestly with a ‘communication’ on combating illegal online content. A year later, a ‘recommendation’ followed, which suggested specific operational measures.[26] The Commission urged platforms to cooperate with authorities and trusted flaggers, while accelerating, simplifying and improving notice & takedown processes.[27] In parallel, fake news and disinformation were coming to the center of Europe’s political attention.[28]

The European Commission remained dissatisfied with the speed of progress. In September 2018, it proposed a regulation aimed at fighting online terrorist content.[29] The proposal was clearly inspired by the controversial German online enforcement law Netzwerkdurchsetzungsgesetz (‘NetzDG’).[30] Under the new regulation, national authorities would have the power to order hosting providers to take down terrorist content. Platforms would have to remove manifestly unlawful content within one hour of a removal order, subject to a fine up to 4% of their global annual turnover in case of systematic and persistent non-compliance.[31] The European Parliament has approved an amended version of the regulation, which will now be subject to further negotiations. Given the upcoming elections for the European Parliament and, thereafter, the nomination of a new European Commission, the measure seems unlikely to come into force before 2020.[32]

The Commission’s attempts to fight illegal online content raise complex and controversial issues. Private companies risk being enlisted as ‘speech arbitrators’ for public policy objectives. Whether the inevitable interference with fundamental rights has a sufficiently precise statutory basis is questionable: many of the European initiatives in the area of content moderation take the form of policy initiatives and stakeholder dialogues rather than legislation.[33] Critics also point to the risk that a handful of major players will effectively determine what can still be said, seen and heard online. It goes without saying that there are considerable local differences when it comes to online speech regulation. European rules and court orders are not always compatible with the US First Amendment.[34] However, this is of bigger concern to US tech companies than it is to EU lawmakers and courts.

Over the years, European courts have become fairly comfortable with the conflicts between privacy, intellectual property and freedom of expression that frequently arise in cases involving online platforms. In the Delfi and MTE cases, the European Court of Human Rights[35] ruled that online platforms cannot simply present themselves as the mere channel of their users’ expressions.[36] In Tamiz, the court stressed the important role of online intermediaries like Google in facilitating access to information and public debate online.[37] In Høines, the court confirmed that a Norwegian platform could not be held responsible for sexually charged comments about a well-known female attorney. The platform operated a team of moderators, implemented warning buttons, and had an effective complaints system and thus acted with sufficient care.[38] In the Magyar Jeti case, the ECtHR stressed the importance of hyperlinks on the internet, formulating five criteria for the assessment of a hyperlink in the context of Article 10 ECHR.[39] In the Kablis case, the ECtHR condemned the blocking of a user’s social network account by the Russian government, focusing on the collateral damage of a blocking measure for legal speech.[40] Meanwhile, in the Buivids case, the CJEU considered the uploading of a video on YouTube a processing of personal data that did not automatically fall within the scope of the journalistic exception, accentuating the tension between the two European courts.[41]

Search Engines and the ‘Right to be Forgotten’

Five years have passed since the CJEU ruled in the Costeja judgment that natural persons have, under certain circumstances, the right to require removal of certain search results that appear in a search query for their name.[42] Over that time, Google, Microsoft and other search engine providers have received and processed delisting requests for several million URLs, and Google in particular has defended civil and administrative cases in numerous EU member states, two of which have led to important new CJEU references. The first asks whether removals based on the right to be forgotten should be carried out globally.[43] The A-G advised the court that a removal should be applied across all top-level domains, but only for searches conducted from within the EU.[44]

The second case asking the CJEU to clarify the application of the ‘right to be forgotten’ concerns how the statutory prohibition on the processing of sensitive personal data should be applied in the context of search engines.[45] If this rule were applied literally, it could have the absurd result of requiring search engines to delist all source pages containing information about anyone’s health, ethnicity, politics, religious beliefs, sex life, trade-union membership or criminal convictions. The Advocate-General has suggested that search engines should be able to invoke, indirectly, the exception for journalistic activities, but it remains to be seen how the court will rule.[46] Whatever the outcome of this case, it may turn out to be of largely academic interest, because it pertains to the old EU Privacy Directive, which was replaced by the GDPR in 2018. Article 17(3)(a) GDPR provides a broader exception to the right to be forgotten, covering all situations in which the processing of personal data is necessary to safeguard freedom of information.

Meanwhile, the ECtHR confirmed that two German convicts did not have a right to have their names removed from news publications in the online archive of newspaper Der Spiegel.[47] In the Manni case, the Court of Justice interpreted the ‘right to be forgotten’ in the context of a companies register, creating a much higher barrier for a successful removal request in that context.[48]

Infringement of IP Rights

Of all the components of the European Commission’s DSM Strategy, the proposed directive on ‘Copyright in the Digital Single Market’ has proved to be by far the most controversial. It was marketed as a much-needed modernization of copyright rules, to regulate text and data mining, protect European news publishers, and close the ‘value gap’ between online platforms’ revenues from third-party content and their payments to rightholders. The proposals ran into vociferous opposition from tech companies, academics and NGOs, who decried the ‘link tax’ and ‘upload filter’ provisions as endangering the free internet. After a tortuous legislative process, the European Parliament approved the text in March 2019, later followed by member state governments.[49] During voting in the European Parliament, a proposal to vote on various amendments deleting the most controversial provisions was rejected by a margin of five votes; the subsequent revelation that ten legislators had accidentally voted the wrong way provided an unintended illustration of the Parliament’s struggles with modern technology.

Article 17 of the Directive (better known by its original numbering: Article 13) provides that an ‘online content sharing service provider’ is deemed to be making a communication to the public. As a result, the provider must obtain authorization in order to communicate works to the public, and cannot invoke the safe harbor of the E-Commerce Directive. If no authorization is granted, the provider is liable for unauthorized acts of communication to the public, unless it has made ‘best efforts’ to obtain authorization, and made best efforts to ‘prevent the availability’ of the works for which they had received necessary information, and acted expeditiously, upon receiving notice of infringement, to remove the infringing works and prevent their reappearance. All the while, providers must avoid identification of individuals, processing of personal data in violation of the GDPR, general monitoring, and any blocking of non-infringing content, including content covered by an exception or limitation.

The Directive defines an ‘online content sharing service provider’ as “a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.” It specifies a number of specific exemptions, and the recitals suggest that a broader range of services may fall outside scope if they do not compete with licensed audio and audiovisual services for the same audiences. Assessing whether a specific service qualifies as an online content sharing service provider will be an uncertain exercise in many cases, which is particularly problematic given the stark, binary consequences of qualifying as such.

Article 15 of the Directive (formerly Article 11) gives news publishers a new exclusive right in their press publications, which is intended to allow them to receive compensation for ‘digital uses’ of (parts of) their articles. Whether this will solve the existential crisis of the newspaper sector is questionable; experiences with similar rights in Germany and Spain suggest the opposite. The new directive also provides for some rudimentary rules on copyright contract law and a number of new exceptions and limitations, including two mandatory (but partly opt-outable) exceptions for text and data mining.

The new directive, and Article 17 in particular, will throw up a large number of questions, and thus likely lead to a significant number of referrals to the CJEU. In this respect, the Directive looks set to share in the agony of the right of ‘communication to the public’ as defined in the original EU copyright directive of 2001, which the CJEU has now spent several dozen judgments trying to interpret.[50] The net result of that case law is cryptic: even to insiders, EU digital copyright law has become incomprehensible without (and possibly with) a large-sized flowchart. Over the past two years alone, the CJEU has ‘clarified’ that the seller of a video streaming set-top box with pre-programmed links to illegal streaming sites performs an infringing communication to the public.[51] This was also the case for The Pirate Bay, which played an active role facilitating infringing torrents on its platform, by offering a search function, the removal of non-functioning trackers, and by filtering certain types of content.[52] That case arose from a long-running Dutch case on the question whether internet access providers should block access to the website of The Pirate Bay.[53] The CJEU also held that the holder of an internet connection cannot evade liability for infringements of copyright committed through that internet connection merely by suggesting that a family member could also have access to that connection.[54]

In two pending cases referred to the CJEU by German courts, pertaining to YouTube and a file-sharing site called Uploaded, the CJEU will have the opportunity to rule in a broader sense on the tension between the ever-expanding concept of communication to the public and the safe harbors in the E-commerce Directive.[55] The cases may be joined by a third case in which the Dutch Supreme Court referred questions on the same subject in a dispute between Dutch anti-piracy organization BREIN and Usenet provider NSE.[56] In another reference from Germany, the CJEU must rule on whether framed linking to a lawfully published copyright-protected work constitutes an infringement if it circumvents protective measures against such framing taken by the rightholder.[57]

Competition Law

‘Oil is no longer the world’s most valuable resource; data is’, according to one headline in The Economist from 2017.[58] The article focused on the fact that the companies controlling this new commodity – Amazon, Apple, Alphabet, Facebook and Microsoft – were simultaneously the five most valuable listed companies in the world. Against this background, it is no surprise that competition law is increasingly invoked to address a range of real and perceived problems around Big Tech.[59]

Over the past two years, online platforms have been at the center of attention of EU and national competition authorities, and in the process EU Competition Commissioner Margrethe Vestager has become a familiar name in tech circles.[60] In 2016, Apple was presented with a € 13 billion tax bill after the European Commission ruled that a sweetheart tax deal between Apple and the Irish tax authorities amounted to illegal state aid.[61] The Commission subsequently imposed three fines on Google for abuse of dominance, for a total amount of € 8.25 billion. In the Google Shopping case, Google was found to have unlawfully favored its own shopping service.[62] The Android case revolved around the fact that Google linked its operating system and access to the Google Play store to pre-installation of Chrome and Google Search on new phones.[63] In the AdSense case, Google was accused of limiting opportunities for online publishers to display advertisements from other ad brokers.[64] The European Commission also fined Facebook for providing misleading information during its acquisition of WhatsApp.[65] In a pending investigation, the Commission is examining whether Amazon abuses its dual role as both a seller on and the provider of the Amazon sales platform.[66] More recently, Spotify complained to the European Commission that Apple was illegally favoring its position in the music streaming market through the App Store.[67]

While the European Commission maintains a strong stance, its approach is that of a classic competition authority: in its decisions, virtually no general interests other than the protection of competition in digital markets play a role. National European competition authorities are dealing with the challenges posed by online platforms in different ways. In one Facebook case, the German competition authority appears to have taken an activist role,[68] arguing that Facebook was illegally exploiting consumers by requiring them to consent to data collection by third-party websites in order to access the platform. In effect, it held that Facebook was abusing its dominant position by violating users’ privacy.

The role of competition law in solving social problems in relation to big tech should not be overestimated. Competition law operates in a narrow economic framework, i.e. (only) where competition has been affected. Enforcement measures on the basis of competition law come, by definition, at a late stage.[69] By then, the playing field may have changed beyond recognition.

Concluding Remarks

No matter where you look in the EU, from competition law investigations to online speech issues, the era of laissez-faire policy and law making on online issues has come to an end. Indeed, the “tech-lash” is far from over.[70] Tech companies have shown themselves sensitive to these developments, and increasingly ready to acknowledge a social responsibility that goes beyond simply building great services. Amidst concerns about election manipulation, filter bubbles, fake news and jihad recruitment, tech CEOs have emphasized in numerous hearings, speeches and interviews that they “must do more”.[71] At the same time, concerns about the large information or market power of the major online platforms have led to complaints that they “do too much”, i.e. interfere too much with what users can post or read online.

The pivotal question, of course, is what tech companies must do more about, and how. Suggestions abound: taking more responsibility for nasty or illegal information on their platforms, removing more erroneous information, paying more to publishers and record companies, paying more to the tax authorities, paying more to employees (temporary or otherwise), or offering more space to competitors. One of the main legal and policy-related issues is how equitably to share tasks, responsibilities, costs and risks between platforms, governments, users and other stakeholders. The European approach is to try to avoid two undesirable extremes: an unregulated Wild West in which the loudest and most extreme voices prevail, and an unfree state internet in which only approved speech gets bandwidth. Europe knows it does not want to be America and does not want to be China, but it is still struggling to make the real choices that are needed for its third way to come into focus.

European legislators and regulators are still getting used to what technology can and cannot do, and to the very different capabilities, risk profiles and business models of different platforms. In relation to companies like Uber, the regulatory conversation in the EU is largely about employment conditions, safety and the relationship to regulation for the taxi industry. In relation to Amazon, it is about taxes, wages and consequences for small retailers. The advent of platforms like Airbnb has consequences for cities and housing prices, while speech-focused platforms such as Twitter are challenged in relation to hate speech and disinformation. To complicate things further, tech companies are quick to adapt their business models, change their practices, and enter new markets. By the time regulators have caught up, everyone will be making flying cars.

Nowhere are the piecemeal approach and political compromises that characterize recent EU platform regulation more obvious than in its definitions. Under recently or soon to be adopted EU rules, one single online platform could be – simultaneously or successively – a ‘video-sharing platform service’ regulated under the revised AVMS Directive;[72] a ‘number-independent interpersonal communications service’ regulated under the Electronic Communications Code;[73] an ‘online intermediation service’ regulated under the Platform-to-Business Regulation;[74] a ‘hosting service provider’ regulated under the Terrorist Content Regulation;[75] and an ‘online content sharing service provider’ regulated under the DSM Copyright Directive.[76] This legislative killer Sudoku will eventually reach the CJEU, in what undoubtedly will be a string of national referrals about different aspects of the legislation. The real meaning of these rules will not be clear for many years.

As our article explains, the EU is pioneering a distinct approach to tech regulation, which aims to protect users by giving them control over their own information, making the world a safer place online, protecting fairness and fundamental rights, and encouraging competition. There is merit to much of the criticism of individual legislative initiatives, in relation to policy choices, legal certainty and overall cohesion and consistency. While the new rules will be interpreted and refined in coming years, however, the EU is unlikely to soften its new-found ambition to be a global trendsetter in tech regulation.




* Remy Chavannes and Dorien Verhulst are attorneys at the technology and communications law firm Brinkhof in Amsterdam, where they specialize in copyright, media, and internet litigation.

[1] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’).

[2] For a discussion, see: M. Scott, ‘Want to regulate Big Tech? Sorry, Europe beat you to it’, 11 April 2019; ‘How Big Tech learned to love regulation’, 11 November 2018 (updated 19 April 2019); ‘Europe’s tech ambition: To be the world’s digital policeman’, 20 August 2017.

[3] In both California and Washington State, new privacy laws have incorporated concepts from the EU General Data Protection Regulation. See: ‘California leads the way on data regulation’, 24 February 2019; ‘New Washington State Privacy Bill Incorporates Some GDPR Concepts’, 31 January 2019.

[4] G7 outcome document, ‘Combating the use of the internet for terrorist and violent extremist purposes’, April 2019.

[5] Although by no means every European nation is a member of the European Union or the European Economic Area, we use “Europe” and “EU” interchangeably in this article.

[6] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘A Digital Single Market Strategy for Europe’, Brussels, 6 May 2015, COM(2015) 192 final. See also the press release of the European Commission of 6 May 2015.

[7] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on the Mid-Term Review on the implementation of the Digital Single Market Strategy, ‘A Connected Digital Single Market for All’, 10 May 2017, COM(2017) 228 final.

[8] ‘Europe in May 2019: Preparing for a more united, stronger and more democratic Union in an increasingly uncertain world’, Annex IV, ‘Unfinished business: the Top 10 EU issues awaiting final agreement’, 30 April 2019.

[9] EU Regulations have direct effect and become immediately enforceable as law in all member states simultaneously. EU Directives, on the other hand, need to be transposed into national law first, which can take several years and result in differences between member states.

[10] Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities.

[11] Article 28a (1) AVMS Directive, Article 3(1) E-Commerce Directive.

[12] Article 28a(2) AVMS Directive.

[13] Article 28a(3) and (4) AVMS Directive.

[14] Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code, OJ EU L 321/36.

[15] The answer to the pending requests for a preliminary ruling on whether SkypeOut and Gmail qualify as electronic communications services could therefore come (too) late to be relevant; see the cases C-142/18 (Skype) and C-193/18 (Google).

[16] See the Communication from the Commission on online platforms and the digital single market, 25 May 2016, COM(2016) 288 final, which was preceded by a public consultation for the review of the liability regime for intermediary service providers laid down in the e-Commerce Directive.

[17] See the Mid-term review of the DSM strategy. See footnote 7, supra.

[18] Proposal for a Regulation of the European Parliament and of the Council on promoting fairness and transparency for business users of online intermediation services.

[19] ‘EU states fail to agree plans for digital tax on tech giants’, Financial Times 6 November 2018.

[20] ‘Public Consultation Document: Addressing the Tax Challenges of the Digitalisation of the Economy’, 13 February – 6 March 2019.

[21] See for example: ‘Berlin had some of the world’s most restrictive rules for Airbnb rentals. Now it’s loosening up.’, The Washington Post 28 March 2018; ‘Could London Set Up A Nonprofit, Cooperative Alternative To Uber?’, 10 February 2017.

[22] CJEU 20 December 2017, ECLI:EU:C:2017:981 (Uber).

[23] Opinion of Advocate General Szpunar 30 April 2019 in case C-390/18 (Airbnb).

[24] District Court Amsterdam 15 January 2019, ECLI:NL:RBAMS:2019:198 (Deliveroo).

[25] In a communication of 28 September 2017, the European Commission had already pointed to online platforms’ special social responsibility with respect to the accessibility of such content via their services. See: Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Tackling Illegal Content Online: Towards an enhanced responsibility of online platforms’, COM(2017) 555.

[26] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Tackling Illegal Content Online: Towards an enhanced responsibility of online platforms’, COM(2017) 555, and Commission Recommendation on measures to effectively tackle illegal content online, C(2018) 1177 final.

[27] Id., p. 6. The Commission also wanted platforms to proactively search for illegal content from now on, for example with automatic detection and filtering systems. In the Commission’s view, doing so would not preclude platforms from invoking the limitation of liability in Article 14 of the e-Commerce Directive; a reasonable and logical view, but one that is not clearly supported by the case law of the EU Court of Justice.

[28] Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Tackling online disinformation: a European Approach, COM/2018/236 final.

[29] Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online, a contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018, COM/2018/640 final.

[30] The NetzDG was adopted in Germany in June 2017 with the aim of fighting terrorist and extremist content online. The act obliges social media platforms to block or remove content that falls under any of some twenty provisions of the German Criminal Code. Companies must remove manifestly unlawful content within 24 hours and ‘ordinary’ unlawful content within 7 days. Companies that repeatedly violate the act may be fined up to 50 million euros.

[31] See for a critical review of the proposal: A. Kuczerawy, ‘The proposed Regulation on preventing the dissemination of terrorist content online: safeguards and risks for freedom of expression’, 5 December 2018, and D. Keller, ‘The EU’s terrorist content regulation: expanding the rule of platform terms of service and exporting expression restrictions from the EU’s most conservative member states’, 25 March 2019.

[32] Press release of 17 April 2019, ‘Terrorist content online should be removed within one hour, says EP’.

[33] A. Kuczerawy, ‘Private enforcement of public policy: freedom of expression in the era of online gatekeeping’ (diss. KU Leuven), 2018.

[34] See for a review in the context of hate speech: N. Alkiviadou, ‘Regulating Internet Hate: A Flying Pig?’, JIPITEC 2016/3.

[35] The Strasbourg-based ECtHR is not a body of the European Union, but of the Council of Europe, a much larger organisation which includes large non-EU countries such as Turkey and Russia. The ECtHR rules on Council of Europe member states’ compliance with the European Convention on Human Rights.

[36] ECtHR 16 June 2015, case no. 64569/09 (Delfi); ECtHR 2 February 2016, case no. 22947/13 (MTE/Hungary).

[37] ECtHR 19 September 2017, Application no. 3877/14 (Tamiz/UK).

[38] ECtHR 19 March 2019, case no. 43624/14 (Høiness/Norway).

[39] ECtHR 4 December 2018, case no. 11257/16 (Magyar Jeti Zrt/Hungary), see: ‘Magyar Jeti Zrt v. Hungary: the Court provides legal certainty for journalists that use hyperlinks’, 18 January 2019.

[40] ECtHR 30 April 2019, Applications nos. 48310/16 and 59663/17 (Kablis/Russia).

[41] CJEU 14 February 2019, case C-345/17 (Buivids), see: D. Erdos, ‘European data protection and freedom of expression after Buivids: an increasingly significant tension’, 21 February 2019.

[42] CJEU 13 May 2014, case C-131/12 (Google Spain and Google/AEPD and Mario Costeja González).

[43] Press release Conseil d’Etat 19 July 2017.

[44] Opinion of Advocate General Szpunar 10 January 2019, case C-507/17.

[45] Press release Conseil d’Etat 24 February 2017.

[46] Opinion of Advocate General Szpunar 10 January 2019, case C-136/17.

[47] ECtHR 28 June 2018, case nos. 60798/10 and 65999/10 (M.L. and W.W./Germany), NJ 2019/97 with note EJD.

[48] CJEU 9 March 2017, case C‑398/15 (Manni).

[49] European Parliament legislative resolution of 26 March 2019 on the proposal for a directive of the European Parliament and of the Council on copyright in the Digital Single Market (COM(2016)0593 – C8-0383/2016 – 2016/0280(COD)).

[50] See Remy Chavannes, ‘Communication to the public in Europe: recent developments in EU copyright law in relation to digital media services’.

[51] CJEU 26 April 2017, case C-527/15 (Filmspeler).

[52] CJEU 14 June 2017, case C-610/15 (The Pirate Bay).

[53] Dutch Supreme Court 29 June 2018, ECLI:NL:HR:2018:1046 (Stichting Brein/Ziggo).

[54] CJEU 18 October 2018, case C-149/17 (Lübbe); P. ten Tije, ‘Bastei Lübbe: “Fundamental Rights as a defence to circumvent enforcement of Copyright protection? No!”, says CJEU’, Kluwer Copyright Blog 11 February 2019.

[55] Pending cases C-682/18 (YouTube) and C-683/18 (Uploaded); J. van Mil, ‘German BGH – Does YouTube Perform Acts of Communication to the Public?’, 27 January 2019.

[56] Opinion Van Peursem 13 July 2018, ECLI:NL:PHR:2018:789 (Brein/NSE).

[57] See the press release of the Bundesgerichtshof of 25 April 2019 (No. 054/2019).

[58] ‘Regulating the internet giants. The world’s most valuable resource is no longer oil, but data’, The Economist 6 May 2017.

[59] We are grateful to our Brinkhof colleague Hanneke Kooijman for her valuable expertise and input for this article in the area of competition law.

[60] ‘Vestager: “I do work with tax and I am a woman”’, 18 July 2018.

[61] Commission Decision of 30 August 2016 on State Aid SA.38373 (2014/C) (ex 2014/NN) (ex 2014/CP) implemented by Ireland to Apple, C(2016) 5605 final, Brussels 30 August 2016.

[62] Decision by the European Commission, 27 June 2017 (AT.39740).

[63] Decision by the European Commission, 18 July 2018 (AT.40099). Google responded by offering users a choice between search engines and browsers (Google Blog, 19 March 2019).

[64] An interesting aspect of this case is that Google had already committed to the European Commission back in 2014 to amend the contracts in question. At the time, the Commission considered those commitments insufficient. Google nevertheless implemented the amendments. It is therefore striking that the Commission now concludes that the violation of competition law ended with that implementation: why did the Commission not accept the commitments in the first place?

[65] Decision by the European Commission, 17 May 2017, C(2017) 3192.

[66] Financial Times, 19 September 2018, ‘EU opens probe into Amazon use of data about merchants’.

[67] Spotify newsroom, 13 March 2019, ‘Consumers and Innovators Win on a Level Playing Field’ and

[68] Bundeskartellamt, 19 December 2017, ‘Preliminary assessment in Facebook proceeding: Facebook’s collection and use of data from third-party sources is abusive’ and Bundeskartellamt, 7 February 2019, ‘Bundeskartellamt prohibits Facebook from combining user data from different sources’.

[69] It often takes many years of investigation before an authority even reaches a decision. The implementation of that decision within the company concerned must then be negotiated, which is followed by years of litigation over the amount of the fine.

[70] M. Scott, ‘In 2019, the ‘techlash’ will go from strength to strength’, 30 December 2018.

[71] ‘The Metamorphosis of Silicon Valley C.E.O.s: From Big to Boring’, New York Times 12 September 2018; ‘Can Mark Zuckerberg fix Facebook before it breaks democracy?’, New Yorker 17 September 2018; ‘Airbnb CEO pledges to take more responsibility for impact to housing’, Reuters 23 February 2018.

[72] Art. 1(1)(b)(aa) revised AVMS Directive: “video-sharing platform service” means a service as defined by Articles 56 and 57 of the Treaty on the Functioning of the European Union, where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks within the meaning of point (a) of Article 2 of Directive 2002/21/EC and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.

[73] Art. 2(5) Electronic Communications Code: ‘interpersonal communications service’ means a service normally provided for remuneration that enables direct interpersonal and interactive exchange of information via electronic communications networks between a finite number of persons, whereby the persons initiating or participating in the communication determine its recipient(s) and does not include services which enable interpersonal and interactive communication merely as a minor ancillary feature that is intrinsically linked to another service.

[74] Art. 2(2) Platform-to-Business Regulation: ‘online intermediation services’ means services which meet all of the following requirements: (a) they constitute information society services within the meaning of Article 1(1)(b) of Directive (EU) 2015/1535 of the European Parliament and of the Council; (b) they allow business users to offer goods or services to consumers, with a view to facilitating the initiating of direct transactions between those business users and consumers, irrespective of where those transactions are ultimately concluded; (c) they are provided to business users on the basis of contractual relationships between, on the one hand, the provider of those services and, on the other hand, both those business users and the consumers to which those business users offer goods or services.

[75] Art. 2(1) Terrorist Content Proposal: ‘hosting service provider’ means a provider of information society services consisting in the storage of information provided by and at the request of the content provider and in making the information stored available to third parties.

[76] Art. 2(6) DSM Directive: “ ‘online content-sharing service provider’ means a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.

Providers of services, such as not-for-profit online encyclopedias, not-for-profit educational and scientific repositories, open source software-developing and sharing platforms, electronic communication service providers as defined in Directive (EU) 2018/1972, online marketplaces, business-to-business cloud services and cloud services that allow users to upload content for their own use, are not ‘online content-sharing service providers’ within the meaning of this Directive.”
