Article 13 in the Information Age


Article 13 from 10,000 Feet:

Article 13 of the EU Directive on Copyright in the Digital Single Market has generated significant discussion and controversy over the liability of social media platforms for user-generated content that infringes intellectual property (IP) rights. The directive was adopted in April 2019 and came into force on June 7, 2019. It expands EU copyright law with the aim of protecting legitimate online publications, reducing the “value gap” between the profits made by internet platforms and those made by content creators, and encouraging collaboration between the two groups. While expert legal counsel should be retained if you anticipate that your business or creative rights will be affected by the changes under the EU’s copyright directive, this article covers the basics of what Article 13 means for user-generated content and online hosting platforms.

In practice, Article 13 obliges service providers to take “appropriate and proportionate” measures to filter out and remove copyright-infringing content uploaded to their platforms. These measures include providing accessible complaint and redress mechanisms, participating in stakeholder dialogues to define best practices for content control, and deploying content recognition technologies that automatically identify and filter infringing content.
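As a rough illustration of what those obligations might look like in software, the sketch below pairs an automated takedown path with a way for uploaders to contest a removal. This is a minimal sketch only: the directive prescribes outcomes rather than any particular implementation, and every class, method, and status name here is a hypothetical placeholder.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Status(Enum):
    LIVE = auto()
    BLOCKED = auto()          # removed by the automated filter
    UNDER_REVIEW = auto()     # uploader has filed a complaint
    REINSTATED = auto()       # review overturned the block


@dataclass
class Upload:
    upload_id: str
    uploader: str
    status: Status = Status.LIVE
    complaint: str | None = None


class ModerationQueue:
    """Hypothetical complaint-and-redress workflow for blocked uploads."""

    def __init__(self) -> None:
        self._uploads: dict[str, Upload] = {}

    def register(self, upload: Upload) -> None:
        self._uploads[upload.upload_id] = upload

    def block(self, upload_id: str) -> None:
        # Called when the content filter flags a match against a protected work.
        self._uploads[upload_id].status = Status.BLOCKED

    def file_complaint(self, upload_id: str, reason: str) -> None:
        # Accessible redress mechanism: the uploader can contest the block,
        # for example by asserting a licence or a quotation/fair-dealing exception.
        upload = self._uploads[upload_id]
        upload.complaint = reason
        upload.status = Status.UNDER_REVIEW

    def resolve(self, upload_id: str, reinstate: bool) -> None:
        # Human review decides whether the automated block stands.
        self._uploads[upload_id].status = (
            Status.REINSTATED if reinstate else Status.BLOCKED
        )
```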

Article 13 replaces an earlier liability exemption that shielded service providers who acted merely as conduits for infringing content. The new conditional exemption applies where a hosting platform has exercised sufficient due diligence to satisfy a “best efforts” standard for identifying and removing copyright-infringing material on its platform. A host whose content-policing standards fall short of that mark may be held liable for infringing content uploaded by its users. In assessing what “best efforts” requires, Article 13 considers the size and resources of the hosting service provider, the volume of content regularly uploaded, and the effectiveness of the copyright-infringement controls in place.

The Policy Underlying Article 13:

Article 13 is largely a response to the prevalence of copyright infringement on the internet and the growing need to curtail the illegal distribution of recreational media such as music, movies, and television shows. Previously, large media-hosting websites like YouTube were not liable for damages resulting from copyright infringement and were only responsible for removing content reported as infringing third-party rights, much as under the Canadian and U.S. copyright systems. Article 13 aims to place a greater onus on hosts to invest time and resources in scrutinizing uploaded content and protecting the rights of legitimate content creators whose intellectual property might be unlawfully exploited on their infrastructure. It is likewise intended to protect genuine content creators against revenue lost to piracy and illegal distribution of copyrighted materials, and to foster a reciprocal relationship between content creators and hosts.

One innovative tool for combating online infringement is sophisticated media recognition software that can automatically identify uploaded content that infringes copyright and then either block it from being hosted or notify the rightful copyright owner for direction on how the infringement ought to be addressed. In practice, however, it is in the host’s interest to apply such auto-filtering as broadly as possible to mitigate any risk of liability for infringement, even at the cost of filtering out some legitimate content. The underlying reasoning is that the revenue lost from blocking some legitimate media is likely to be less than the cost of resolving disputes with large media entities whose copyrighted material has been unlawfully distributed through the host’s platform.
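To make the mechanics concrete, here is a deliberately simplified sketch of an upload filter based on fingerprint matching. It is not how any real platform’s system works: the fingerprint function, the reference catalogue, and the similarity threshold are all hypothetical placeholders, chosen only to show how the threshold trades missed infringements against blocked legitimate uploads.

```python
import hashlib
from dataclasses import dataclass


def fingerprint(media_bytes: bytes) -> str:
    """Stand-in for a perceptual fingerprint; real fingerprints survive re-encoding."""
    return hashlib.sha256(media_bytes).hexdigest()


# Hypothetical reference catalogue: fingerprints registered by rights holders.
REGISTERED_FINGERPRINTS = {
    fingerprint(b"licensed master recording bytes"): "Example Records, 'Sample Song'",
}


@dataclass
class ScanResult:
    blocked: bool
    matched_work: str | None
    similarity: float


def similarity(fp_a: str, fp_b: str) -> float:
    """Toy similarity score: fraction of matching hex characters."""
    return sum(a == b for a, b in zip(fp_a, fp_b)) / len(fp_a)


def scan_upload(media_bytes: bytes, threshold: float = 0.9) -> ScanResult:
    """Block the upload if it resembles any registered work closely enough.

    Lowering the threshold catches more borderline uploads but also blocks
    more legitimate content: the over-blocking incentive described above.
    """
    fp = fingerprint(media_bytes)
    best_score, best_work = 0.0, None
    for ref_fp, work in REGISTERED_FINGERPRINTS.items():
        score = similarity(fp, ref_fp)
        if score > best_score:
            best_score, best_work = score, work
    blocked = best_score >= threshold
    return ScanResult(blocked=blocked,
                      matched_work=best_work if blocked else None,
                      similarity=best_score)
```

In this sketch, a platform would run something like scan_upload() at ingest time and route anything it blocks into a complaint-and-redress workflow like the one outlined earlier, so that legitimate uploaders caught by an overly broad threshold have a route to reinstatement.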

Public Concern Over Article 13:


The primary concern with Article 13’s implementation is whether automated filtering will actually help solve the internet’s infringement problem and, if it does, whether that gain will come at the cost of legitimate content creators inadvertently having their content barred from being uploaded. Automated filtering systems may catch content that does not actually violate any laws and dissuade genuine contributors from making fair use or fair dealing of another’s work in their own creative content. Furthermore, businesses with large copyright portfolios may pressure hosts to remove any similar content that sits in a legal “gray area” to reduce the amount of media competing for their target audience’s attention.

To date, many online commentators and companies have expressed concern about how Article 13 will affect their business models and intellectual property rights. The digital economy is still in its infancy, and legal instruments like Article 13 will be instrumental in shaping its evolution for years to come.
