GenAI User Content, Platform Intermediary Liability and Fundamental Rights

In previous blog posts we discussed copyright issues arising from the training of AI models and the first rulings coming from both U.K. and German courts. Interestingly, in Germany a Munich court found that memorisation in an AI model can amount to a reproduction (“Vervielfältigung”) under §16 UrhG, and that the operators were liable for infringing outputs because they exercised control (“Tatherrschaft”) over the system, after having presumably relied on the CDSM data-mining exception. This article deals exclusively with the EU copyright aspects for platform operators.

In the EU, copyright law is largely harmonised through the Information Society Directive 2001/29/EC, the Copyright and related rights in the Digital Single Market (CDSM) Directive 2019/790 and others. Unlike U.S. law, EU law does not provide for a general “Fair Use” doctrine. Exceptions are narrow and mainly confined to those in Art.5 InfoSoc Directive and Title II of the CDSM.

When it comes to liability for digital platform operators offering services within the European Union, two types of user-uploaded digital media pose a particular risk of liability for copyright-infringing content:

  • Explicitly copyrighted materials
  • AI generated user content that infringes copyright

What’s the GenAI risk?

While the infringing use of explicitly copyrighted materials is often mitigated outright by upload filters, the picture is less clear when it comes to AI-generated content, which frequently operates in a grey zone of derivative works that may not be caught by current upload-filter software.

Who may be liable for uploaded GenAI content that infringes copyright?

The answer is: it depends on who the platform operator is! The Digital Services Act (DSA), Regulation 2022/2065, applies to copyright infringement as per Recital 12 and provides part of the liability regime as well as its exemptions.

  • Art.4 DSA “Mere Conduit” operators – There is a liability exemption for those operators only “transmitting communications, not selecting the receiver and not selecting or modifying the information”.
  • Art.5 DSA “Caching” operators – There is a liability exemption if, amongst other conditions, the operator “acts expeditiously to remove or disable access to the content…upon obtaining actual knowledge”.
  • Art.6 DSA “Hosting” operators – There is a liability exemption where the information is stored at the request of a recipient of the service and the operator (a) “does not have actual knowledge of illegal activity or illegal content and…is not aware of facts or circumstances from which the illegal activity or illegal content is apparent”, or (b) “upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content”.
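
The three exemption tiers above can be sketched as a simple decision routine. This is a purely illustrative sketch of the Art.6 DSA hosting conditions, not a legal test; the type and function names, and the reduction of the statutory language to boolean flags, are our own simplifying assumptions:

```python
from dataclasses import dataclass

@dataclass
class HostingFacts:
    # Hypothetical fact pattern for an Art.6 DSA "hosting" operator
    stored_at_user_request: bool   # information stored at the recipient's request
    actual_knowledge: bool         # actual knowledge of illegal content (Art.6(1)(a))
    aware_of_circumstances: bool   # awareness from which illegality is apparent
    removed_expeditiously: bool    # acted expeditiously after knowledge (Art.6(1)(b))

def hosting_exemption_applies(f: HostingFacts) -> bool:
    """Rough, illustrative mapping of the Art.6 DSA hosting exemption conditions."""
    if not f.stored_at_user_request:
        return False
    if not (f.actual_knowledge or f.aware_of_circumstances):
        return True                    # no knowledge or awareness: exemption applies
    return f.removed_expeditiously     # knowledge obtained: must act expeditiously

# A host that gains knowledge but removes the content promptly keeps the exemption
print(hosting_exemption_applies(HostingFacts(True, True, False, True)))   # True
print(hosting_exemption_applies(HostingFacts(True, True, False, False)))  # False
```

Real cases of course turn on how each of these flags is established, which is exactly what the case law below addresses.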

The Art.6 DSA requirements for liability exemption

The seminal case of C-324/09 — L’Oréal v eBay (2011) sheds some light on how “hosting” operators may benefit from the Art.6 DSA liability exemption, if:

  • The information was stored at the request of the user.
  • The operator has not “…played an active role of such a kind as to give it knowledge of, or control over, the data”, meaning it has a passive role and provides the service in a neutral way.
  • The operator has not “failed to act expeditiously” in removing infringing content after having obtained knowledge.

The Art.6 DSA requirement to “not have actual knowledge of illegal activity or illegal content” raises the question whether only “actual knowledge” disqualifies an operator from the exemption. The court in L’Oréal v eBay clarified that actual knowledge is not the only trigger: “it is sufficient, in order for the provider of an information society service to be denied entitlement to the exemption from liability… for it to have been aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality in question and acted in accordance…”

Art.8 DSA – No General Monitoring Obligation, but…

Art.8 DSA stipulates clearly that “No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed…”

However, the national law of EU Member States may still prescribe a “specific” obligation to monitor for certain information, as explicitly provided for in Art.6(4) DSA: “This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State’s legal system, to require the service provider to terminate or prevent an infringement.”

The court in C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited gave some guidance as to what may amount to a “specific” monitoring obligation:

  • ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be illegal, or to block access to that information, irrespective of who requested the storage of that information
  • ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be illegal, or to block access to that information

From this it can be derived that a general monitoring obligation would require checking all uploads against all possible infringements, while a specific monitoring obligation requires checking all uploads against specific, previously identified content. This still leaves hosting operators exposed to considerable obligations under the national law of individual Member States.
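
The distinction can be illustrated with a toy matcher: uploads are checked only against a specific set of items previously declared illegal, flagging exact (“identical”) matches and near-duplicates (“equivalent”). This is an illustrative sketch only; the normalisation heuristic is a made-up stand-in for real content-matching technology, and all names are hypothetical:

```python
from typing import Optional

def normalise(text: str) -> str:
    # Crude stand-in for fingerprinting / perceptual hashing of content
    return " ".join(text.lower().split())

def specific_monitor(upload: str, declared_illegal: set) -> Optional[str]:
    """Check an upload ONLY against specific, previously flagged content
    (the C-18/18 'identical or equivalent' pattern), never against all
    conceivable infringements."""
    if upload in declared_illegal:
        return "identical"
    norm = normalise(upload)
    for item in declared_illegal:
        if normalise(item) == norm:
            return "equivalent"
    return None

flagged = {"Defamatory statement X"}
print(specific_monitor("Defamatory statement X", flagged))    # identical
print(specific_monitor("defamatory   statement x", flagged))  # equivalent
print(specific_monitor("Unrelated post", flagged))            # None
```

A general monitoring obligation, by contrast, would have no `declared_illegal` set to compare against: the operator would have to assess every upload against every conceivable infringement on its own initiative.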

Online Content-Sharing Service Providers – not exempt under DSA!

The Copyright and related rights in the Digital Single Market (CDSM) Directive 2019/790 defines an Online Content-Sharing Service Provider (OCSSP) in Art.2(6) as:

“…a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.”

This provision is mainly aimed at social media platforms or file-sharing sites of different sizes, and Art.17 CDSM serves as lex specialis (lex specialis derogat legi generali) to the DSA stipulations on intermediary liability, stating:

  • an online content-sharing service provider performs an act of communication to the public or an act of making available to the public for the purposes of this Directive when it gives the public access to copyright-protected works or other protected subject matter uploaded by its users
  • When an online content-sharing service provider performs an act of communication to the public or an act of making available to the public under the conditions laid down in this Directive, the limitation of liability established in [Article 6 DSA] shall not apply

In Art.17(4) the Directive assigns primary liability to an OCSSP that has let its users upload copyright-infringing GenAI content, stating:

  • If no authorisation is granted [e.g. Licence agreements], online content-sharing service providers shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter…

The CDSM Content Monitoring Paradox

The CDSM in Art.17(8) reiterates, just like Art.8 DSA, that there should be no general monitoring obligation. However, an OCSSP may only evade liability under Art.17(4) CDSM for copyright-infringing content uploaded by its users if it demonstrates that it has:

  • “made best efforts to obtain an authorisation” [licence]
  • made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information
  • acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads
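
Read together, the Art.17(4) conditions form a cumulative checklist, which can be sketched as follows. This is an illustrative simplification, not the statutory test; the parameter names are hypothetical shorthand for the quoted provisions:

```python
def ocssp_escapes_liability(
    authorisation_granted: bool,         # e.g. a licence agreement is in place
    best_efforts_to_license: bool,       # Art.17(4)(a): best efforts to obtain authorisation
    best_efforts_unavailability: bool,   # Art.17(4)(b): given rightholder-provided information
    notice_and_staydown: bool,           # Art.17(4)(c): expeditious removal + preventing re-uploads
) -> bool:
    """Hypothetical sketch of the cumulative Art.17(4) CDSM conditions."""
    if authorisation_granted:
        return True                      # authorised use: no unauthorised communication
    # Absent authorisation, ALL three conditions must be demonstrated
    return best_efforts_to_license and best_efforts_unavailability and notice_and_staydown

print(ocssp_escapes_liability(False, True, True, True))   # True
print(ocssp_escapes_liability(False, True, True, False))  # False
```

The "and" between the three branches is what drives the paradox discussed next: conditions (b) and (c) are hard to satisfy at scale without filtering every upload.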

The paradox is that EU law prohibits general monitoring, yet Article 17 seems to require it in practice to comply with copyright obligations. In order to identify content that potentially requires a licence, to employ “high industry standards of professional diligence”, and to identify re-offending user uploads, an OCSSP is compelled to use upload-filter software that may constitute “general monitoring”.

This position was addressed by the court in the case C-401/19 Republic of Poland v European Parliament and Council of the European Union, where the Court held that this does not constitute a general monitoring obligation, because the OCSSP does not prevent the upload of content on its own initiative but rather “only on condition that the rightholders concerned provide them with the relevant and necessary information with regard to that content”. In a nutshell, the court argued that this is a specific monitoring obligation and not a general one. The intellectual soundness of this reasoning may lie in the eye of the beholder.

The Fundamental Rights Conundrum

Another aspect of the case C-401/19 Republic of Poland v European Parliament and Council of the European Union was the impact of content upload filters on the right to freedom of expression set out in Article 11(1) of the Charter of Fundamental Rights of the European Union, which is further specified in Art.17(7) CDSM and Art.14(4) DSA.

The court clarified that a “filtering system which might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications, would be incompatible with the right to freedom of expression and information, guaranteed in Article 11 of the Charter, and would not respect the fair balance between that right and the right to intellectual property“.

It further stated that an OCSSP is not authorised to make an independent assessment of the content’s legality, but is rather required to rely on the information provided by the rightholder. Hereby the OCSSP may block content which is either “identical” or “equivalent” to the information provided by the rightholder.

However, the technical implementation often ignores the court’s fine distinctions, as it is cheaper and safer for platforms to simply block everything that triggers a match. In practice this has led to over-enforcement by platform operators over the last years, clearly with the aim of staying within the liability exemptions.

The surge in GenAI-generated content uploaded by users does not make the situation easier for platform operators. Rather the opposite may be true. The obvious choice is to employ even stricter upload-detection rules in order to identify derivative works that may or may not be subject to copyright protection, at the cost of almost certainly over-enforcing even more and thereby venturing deeper into fundamental-rights territory.


Disclaimer:
The content of this blog is provided for general informational purposes only and does not constitute legal advice. While we strive to ensure that the information is accurate and up to date, it may not reflect the most current legal developments or the specific circumstances of your organization. Readers should not act upon any information contained in this blog without first seeking professional legal counsel. No attorney-client relationship is created through your use of or reliance on the information provided herein.