Today, 47% of the world's population, or just over three billion people, use the internet. Interaction in the vast enabling environment that is the World Wide Web is made possible by the work of online intermediaries: a diverse group of actors that facilitate interactions and transactions between third parties on the internet. Intermediary roles and types vary significantly: some provide access to the web through wire-based or wireless communication technology; others offer hosting, payment, storage and on-demand services such as music streaming; still others aid navigation of the online space (e.g. search engines) or create social hubs where people come together to share content, news, advice and much more. Without intermediaries there would be no access to the internet, no means of sharing information with others and no collaborative economy of the kind that has changed the way we view ownership and property use.
In the early days of the dot-com era, intermediaries were subject to limited regulation. But as the internet evolved, pressure on intermediaries to police the content on their services grew concomitantly. Today, intermediary liability – the legal responsibility of intermediaries for illegal or harmful activities performed by users through their services – can arise in many circumstances and from many issues, including child sexual abuse material (CSAM), hate speech, racist and xenophobic content, defamation, infringement of intellectual property (IP) rights, terrorist and extremist content, illegal offers related to gambling, pharmaceutical products and banking, the sale of counterfeit goods, as well as malware, spam and data protection infringements.
Exactly how much of this content exists online at any one point is hard to quantify. For one thing, there is no single repository that aggregates data across the board. For another, different sources tend to specialise in different kinds of illegal content, which makes it difficult to build a complete picture of how much illegal material was online at any given time. For example, online intermediaries like Google, Twitter and Tumblr report extensively on copyright and trademark infringements, but hardly at all on CSAM. By contrast, national hotlines prioritise CSAM, and only some (e.g. hotline IE) report notices related to financial scams, violence and hate speech. Additional IP-related information can be found in the Lumen database and in the publications of some actors from the rightsholders camp, such as the Software Alliance, which has published figures on unlicensed software installation for several years and world regions, including Europe. But many other, equally important types of illegal content (e.g. malware, spam) remain un- or under-reported.
Of course, not all content reported online is illegal. Some complaints draw attention to material that is in fact legal under national laws, and some reports can be disputed through the counter-notification procedures provided by many online intermediaries. But even allowing for material that was wrongly identified as illegal or eventually restored, the sheer volume of illegal internet content is staggering. For example, in the year to 12 January 2017, Google removed 916 million URLs affecting 353,000 websites. Between January and June 2016, Twitter received 24,874 takedown requests, 75% of which led to the removal of content. Over the same period, Tumblr received 12,864 notices, which led to the removal of 97,403 pieces of content and 61,053 posts and the termination of 1,558 accounts. Considering that this is just a small fraction of the actual extent of the problem, it is hardly surprising that illegal online content has very significant implications for the stakeholders affected, the economy and society at large.
Impact on Stakeholders
Take the most obvious victim: the music industry. Before Napster emerged on the file-sharing scene in 1999, the global music market was worth $37 billion. Although the service disappeared a couple of years later, the industry has suffered greatly from the drop in sales to which file-sharing platforms like Napster contributed. (In 2015, the record industry's income amounted to just $15 billion.)
It goes without saying that online piracy and revenue loss go hand in hand. In 2014, the EU observatory on IP infringements reported that the European music industry lost 5.2%, or €170 million, of its total annual sales to physical and digital piracy. Studies for other countries report similarly negative effects. Stephen Siwek, for example, estimated a total sales loss to the U.S. economy of $58 billion per year, along with 373,000 jobs lost as a result of global piracy. This translated into an earnings loss of $16.3 billion and reduced tax collections of $2.6 billion.
The trade-off between the benefits of innovation and the consequences of its disruptive potential makes regulating digital space no easy task for a policymaker. On one side are those who call for a more restrictive approach, arguing that a lax environment is conducive to unsustainable revenue loss, reduced social welfare and wider economic consequences such as unemployment. On the other are those who say that internet access is a human right and that people should have access to a virtual public space just as they have the right to assemble offline. Digital rights activists further argue that enforcing restrictive requirements on online content amounts to restraining people's access to a potentially unlimited market of opportunities and many different kinds of experiences, social and cultural ones included. Accessing online content, and making it available to all, can therefore be intrinsically linked to individuals' right to freedom of expression.
Liability Regime in Europe
In the EU, the intermediary liability regime is governed by the E-Commerce Directive, which shields intermediaries acting as mere conduit, caching and hosting providers from liability, on condition that “the provider does not have actual knowledge of illegal activity or information and… the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information” (Article 14).
Although the directive prevents member states from imposing a general monitoring obligation on service providers, it states that “member states may establish obligations for information society service providers promptly to inform the competent public authorities of alleged illegal activities undertaken or information provided by recipients of their service or obligations to communicate to the competent authorities, at their request” (Article 15). This means that the directive offers intermediaries no absolute protection: they are not immune from prosecution and liability. Such legal uncertainty, combined with a lack of clarity about the requirements for notice-and-action procedures, is a cause of considerable anxiety for many intermediaries.
The realities on the ground, coupled with findings from several consultation rounds carried out by the European Commission, point to a number of issues in the current liability regime that need to be resolved as soon as possible, starting with the role of intermediaries in the information transmission process.
- Intermediary role. There is a need to clarify what is meant by the “mere technical, automatic and passive nature” of the information transmission that intermediaries facilitate. Some stakeholder groups, including the intermediaries themselves, find the meaning sufficiently clear. This is not the case, however, for notice providers, who argue that, owing to this lack of clarity, the concept cannot be applied uniformly.
- Intermediary type. The internet changes at breakneck speed. Among many other things, this means that definitions that are valid today can quickly become obsolete. As digital space rapidly expands with new services and platforms, it is worth asking whether the definition of an online intermediary is still apt. More than half of the consultation respondents (notice providers, researchers, rightsholders and civil rights organisations) believe that new categories need to be established or existing ones clarified.
- Notice-and-action. There is also a lack of clarity over how a typical notice-and-action (N&A) procedure should be implemented, and very little information about actual practices and processes is shared by some types of intermediaries, notably ISPs. Further, consultation findings show a preference for different N&A approaches for different types of illegal content.
- Counter-notification. Counter-notification is the norm in many complaint procedures administered by tech companies like Google, Twitter, Tumblr and Vimeo, to name just a few. But should this norm also be embraced by EU-based intermediaries, and if so, which ones? (As to the first question, the consultation results suggest a firm yes: over 80% of respondents agreed that this mechanism should be introduced.)
- Illegal content and a duty of care. Illegal content is recognised as a serious problem, and many intermediaries have taken proactive – sometimes very advanced – voluntary measures to remove it from their sites. Despite this, the question of a duty of care, and the extent to which it should apply, continues to dominate discussions of intermediary liability. In the consultation, over half of the respondents rejected a duty of care, arguing that it would be too costly, raise entry barriers and lead to abuse. Many intermediaries prefer that any duty of care remain voluntary. Holders of IP rights, however, believe that actions such as filtering, blocking and internal policies should be part of an obligation.