
Upholding copyright or breaking the web?

10 December 2018

Intellectual property briefing: the European Parliament has approved the draft Copyright Directive with its controversial article 13. What will be the impact of the current version of the draft?

by Alison Bryce

On 12 September 2018, the European Parliament voted to approve the draft EU Directive on Copyright in the Digital Single Market. The directive seeks to alter how companies and individuals use and profit from the internet. The centre of controversy has been article 13, heralded as an “extinction-level event” by digital rights activist Cory Doctorow. The Electronic Frontier Foundation, meanwhile, claims that it will “break the internet”.

What is article 13?

Article 13 seeks to close what is termed the “user-generated content loophole”. This refers to a gap in protection under the current e-Commerce Directive, where legal responsibility for breach of copyright is primarily placed on the uploader. Online platforms are only required to take action once illegality has been directly brought to their attention.

The drafters of the e-Commerce Directive, working in the days of dial-up internet, could not have predicted the monolithic rise of companies such as Facebook, YouTube and its parent company, Google. For many years, such companies have evaded liability for hosting unlicensed user-uploaded copyrighted material, while profiting massively from a near unlimited supply of free user-generated content. Copyright owners, whose works have been used to drive traffic to these platforms, have been left uncompensated.

Article 13 seeks to address this problem. Under it, “online content-sharing service providers and right holders shall cooperate in good faith in order to ensure that unauthorised protected works or other subject matter are not available on their services”. The intended effect is to “improve licensing practices… reduce transaction costs and increase licensing revenues for right holders”.

How will it be implemented?

There is as yet no consensus on how platforms are expected to accurately identify and remove infringing content. In the original draft text, this was to be achieved through “effective content recognition technologies”. In practice, this means installing filters that either prevent users from uploading potentially copyrighted material, or verify that content is properly licensed.

The latest version of the directive removes references to such technologies, and inserts an undertaking for the European Commission and member states to work together to ensure that “automated blocking of content is avoided”. However, given the extent of the liability placed on platforms for hosting infringing material, the likelihood is that some form of filtering will be required.

Filtering technology is not in itself new. Platforms have long deployed smart content filters that combine artificial intelligence with human review. Facebook, for example, can scan photos to identify faces and suggest that specific individuals be tagged. Such technology is also capable of identifying and flagging infringing material for review, or placing adverts on it which are monetised by the rights holder – a service YouTube already provides to rights holders on a commercial basis.
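Neither the directive nor this briefing prescribes any particular technology, but at its core a content filter compares a fingerprint of each upload against a registry of fingerprints supplied by rights holders. The Python sketch below is purely illustrative (the registry, function names and exact-hash matching are assumptions for the example; real systems such as YouTube's Content ID use perceptual fingerprints that survive re-encoding and cropping):

```python
import hashlib

# Hypothetical registry mapping fingerprints of registered works to their owners.
rights_registry: dict[str, str] = {}

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint; production filters use perceptual hashes instead."""
    return hashlib.sha256(data).hexdigest()

def register_work(data: bytes, owner: str) -> None:
    """A rights holder registers a work with the platform."""
    rights_registry[fingerprint(data)] = owner

def check_upload(data: bytes) -> str:
    """Return a moderation decision for a user upload."""
    owner = rights_registry.get(fingerprint(data))
    if owner is None:
        return "allow"
    # In practice a match might be blocked, flagged for human review,
    # or monetised on behalf of the rights holder.
    return f"flag: matches work registered by {owner}"

register_work(b"song master recording", "Example Records")
print(check_upload(b"song master recording"))  # flag: matches work registered by Example Records
print(check_upload(b"original home video"))    # allow
```

Exact hashing is also why such filters struggle with context: a byte-identical match says nothing about whether the use falls within an exception such as parody, the problem discussed below.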

What are the obstacles?

An issue with such filters is that they often fail to take proper account of the context in which copyrighted material is posted. This leads to inconsistent results, and systems which are incapable of distinguishing infringing content from that which adapts copyrighted materials within the confines of a legitimate exception, such as parody. The potential for stifling user-driven creativity has led to article 13 being popularly referred to as the “meme ban”, but in reality all extracts of audio, pictures, text and video shared online risk being automatically flagged and removed by the filters.

There is arguably no platform more reliant on user-generated content than YouTube, where more than 400 hours of video are uploaded every minute. YouTube’s CEO, Susan Wojcicki, a vocal critic of article 13, claims that the site may simply block European users’ access to certain content in order to avoid potential liabilities; and that “if implemented as proposed, article 13 threatens hundreds of thousands of jobs, European creators, businesses, artists and everyone they employ”.

What is the risk to SMEs?

There has also been broader concern over the potential effects of article 13 on SMEs. For most companies of this size, developing smart content filters would be expensive. Requiring this technology across the board may deter tech startups and venture capitalists, leading to a shift in investment towards jurisdictions with less constrictive rules.

Such concerns are addressed in the latest draft directive, which requires special focus on “ensuring that the burden on SMEs remains appropriate”. Of course, incentives offered to SMEs could have the unintended consequence of discouraging companies from upscaling to the point of being caught by more stringent article 13 obligations.

Ultimately, nobody truly knows what standard of compliance will be expected of online platforms, or where the line will be drawn between the tech giants and SMEs. However, some things are clear. In its current form, article 13 will provide valuable protection to individual and corporate rights holders by shifting responsibility for breach to those who have historically profited at their expense.

While there are valid concerns relating to freedom of expression online, the biggest losers are almost certainly going to be the tech giants’ profit margins.

What next for article 13?

Having passed its second reading, the directive now enters “trilogue”, an informal negotiation between the European Commission, Council and Parliament. This will result in a final draft text which will be put before the EU Legal Affairs Committee, and then a final reading before the European Parliament in January 2019.

Alison Bryce, partner, Dentons UK & Middle East LLP
