One of the most debated issues is the proposed Art. 17 Copyright Directive (the former Art. 13 Copyright Directive), which requires member states to provide that online content-sharing service providers make best efforts to ensure the unavailability of copyright protected works (Art. 17 para. 4b Copyright Directive) and make best efforts to prevent their future upload (Art. 17 para. 4c Copyright Directive), turning the established notice-and-takedown principle on its head. These provisions amount to filtering mechanisms that detect copyright protected works and block their upload as long as the upload is neither licensed nor covered by the exceptions of Art. 17 para. 7 Copyright Directive and recital 70 Copyright Directive (content generated for the specific purposes of quotation, criticism, review, caricature, parody or pastiche).
The mandated technology would therefore only meet the requirements of the Copyright Directive if it were able to recognize not only the content (the actual image, audio or text file) but also the context of the upload. This would require information about the circumstances of the upload, which in turn would inevitably require the processing of personal data. The questions arising from all this are: On what legal basis would service providers be able to process said personal data? Are the obligations the Copyright Directive puts upon service providers compatible with the GDPR, or will providers be caught between a rock and a hard place?
The overlap between the Copyright Directive and the GDPR
The GDPR applies to all processing of personal data carried out by a controller, while Art. 17 para. 4 Copyright Directive applies to online content-sharing services as defined in Art. 2 para. 6 Copyright Directive. As the GDPR is technology- and business-model-agnostic, there is no indication that online content-sharing services fall outside its scope. The GDPR defines controllers as bodies that alone or jointly determine the purposes and means of the processing of personal data. The obligation to implement upload prevention mechanisms compliant with Art. 17 para. 4 Copyright Directive requires service providers to demonstrate their best efforts. The actual burden of deciding which technologies to implement is therefore placed on the service providers, which in turn qualifies them as controllers under the GDPR. Depending on the actual technical implementation there will of course be room for joint controllership or controller-processor relationships with providers of filtering services. A deeper analysis of these relationships would, however, exceed the scope of this article.
It could be argued that filtering mechanisms would primarily need to detect and compare a given uploaded piece of copyright protected content with a defined license database to determine whether the upload can proceed or needs to be blocked. Such filtering might – at first glance – be possible without processing information about the user who attempts the upload. This, however, would only be true as long as it were possible to determine whether a piece of content is covered by a particular license without information that goes beyond knowledge of the content itself. This is not the case under the Copyright Directive, as the context of the upload is highly relevant to determine whether a specific upload – while not being covered by a license agreement – is covered by Art. 17 para. 7 Copyright Directive.
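To make the distinction concrete, such a content-only check can be sketched in a few lines of Python. This is purely illustrative: all names are invented, and a cryptographic hash stands in for the perceptual fingerprinting (audio/video matching) that real systems use. The decisive feature is that nothing about the uploader is consulted:

```python
import hashlib

# Hypothetical license database: fingerprints of works the platform
# has licensed. (SHA-256 is a stand-in; real content matching uses
# perceptual fingerprints that tolerate re-encoding and cropping.)
LICENSED_FINGERPRINTS = {
    # fingerprint of the example upload b"test"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Reduce an upload to a comparable fingerprint of its bytes."""
    return hashlib.sha256(content).hexdigest()

def content_only_check(content: bytes) -> bool:
    """Allow the upload iff its fingerprint matches a licensed work.
    Note: no information about the uploader is processed here."""
    return fingerprint(content) in LICENSED_FINGERPRINTS

print(content_only_check(b"test"))        # True: covered by a license
print(content_only_check(b"unlicensed"))  # False: would be blocked
```

A check of this shape processes nothing but the file itself – which is precisely the assumption that the exception regime of Art. 17 para. 7 Copyright Directive undermines.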
Automated detection of parody, quotation or criticism is – if possible at all – highly dependent on knowledge about the circumstances of the upload. The same piece of content may be legally uploaded as part of a movie criticism when uploaded by an arts student in one scenario, but not in another. Detecting parody is equally dependent on meta information about the upload. A copyright protected piece of video (e.g. an advertisement clip) could be seen as parody at one time (e.g. after said product is revealed to be highly harmful) but not at another. The identity of the uploader and the place, date and time of an upload would therefore be relevant for the processing by the filtering mechanisms. This information has to be seen as personal data according to Art. 4 No. 1 GDPR, and its analysis by the filtering algorithm as processing according to Art. 4 No. 2 GDPR.
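The structural point can be illustrated with a hypothetical sketch (the field names and the crude keyword heuristic are invented for illustration; no real filtering system is this simple). Whatever the actual detection logic, any check for the Art. 17 para. 7 exceptions must consume exactly the kind of upload metadata – uploader identity, place, time, accompanying communication – that constitutes personal data:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UploadContext:
    # Each field relates to an identified or identifiable natural
    # person and is therefore personal data (Art. 4 No. 1 GDPR).
    user_id: str
    location: str
    timestamp: datetime
    accompanying_text: str  # e.g. the caption of a review or parody

def exception_may_apply(ctx: UploadContext) -> bool:
    """Toy heuristic for the Art. 17 para. 7 exceptions. Real systems
    would need far richer context analysis; the point is that any such
    check necessarily processes the personal data in UploadContext."""
    keywords = ("review", "criticism", "quote", "parody", "pastiche")
    return any(k in ctx.accompanying_text.lower() for k in keywords)

ctx = UploadContext("user-42", "DE", datetime(2019, 6, 7),
                    "My review of the film")
print(exception_may_apply(ctx))  # True: context suggests criticism/review
```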
The legal basis for processing data for filter mechanisms
The GDPR requires all data processing to be covered by a legal basis. This legality principle is laid down in Art. 8 para. 2 of the Charter of Fundamental Rights of the European Union (CFR) and implemented in Art. 6 para. 1 GDPR, which enumerates six legal bases.
One might consider basing the necessary data processing on consent (Art. 6 para. 1a GDPR) or making it part of the terms of service (Art. 6 para. 1b GDPR). Both options seem unsuitable, though. Consent would hardly be considered freely given (Art. 7 para. 4 GDPR), as the legal obligation to comply with the Copyright Directive has to be met by all services, which in turn leaves users without alternatives. Making upload filters part of the contractual agreement, on the other hand, seems equally shaky: a possible upload-filter clause would have to stand the test of consumer protection law and might be held void insofar as it goes beyond what is legally required – and insofar as it is required by law, such a clause would simply be unnecessary. The main focus of legal examination therefore lies with Art. 6 para. 1c GDPR and with the question of which mechanisms exactly will be considered necessary to comply with the obligations of Art. 17 para. 4 Copyright Directive.
Service providers will have clear incentives to implement rather extensive filtering mechanisms, as non-compliance with Art. 17 para. 4b and 4c Copyright Directive leads to liability for all user-uploaded content according to Art. 17 para. 4 Copyright Directive. Given the technical complexity of mechanisms that are even considered able to meet the high demands of Art. 17 para. 4b and 4c Copyright Directive, and given the danger of liability, it seems likely that especially smaller providers will – notwithstanding Art. 17 para. 5 Copyright Directive – choose to implement third-party services offered by platforms experienced with such technologies, like Google's Content ID algorithm (however insufficient even these state-of-the-art technologies still are). Just as Google's centralized advertisement platform AdSense has seen widespread adoption, the same could happen with its filtering technologies. The legal issues, especially in the area of data protection, would be similar: Centralized ad networks require the provider of the ad network to process personal data of website visitors; centralized filtering services would require processing personal data of the users who upload copyright protected material. This would effectively lead to a few big service providers being able to process information about the circumstances, date, time and content of uploads by the majority of users.
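The data flow this implies can be sketched as follows. The request shape is entirely invented – no real API is described – but it shows what a platform would, at a minimum, have to transmit to a hypothetical centralized filtering service: the content fingerprint bundled with personal data about the uploader and the circumstances of the upload.

```python
# Hypothetical payload a platform would send to a centralized
# third-party filtering service (all names illustrative).
def build_filter_request(content_fingerprint: str, user_id: str,
                         upload_time: str, context_text: str) -> dict:
    return {
        "fingerprint": content_fingerprint,  # identifies the work
        "uploader": user_id,                 # personal data
        "timestamp": upload_time,            # personal data (circumstances)
        "context": context_text,             # personal data (communication)
    }

req = build_filter_request("9f86d0…", "user-42",
                           "2019-06-07T12:00:00Z", "my parody clip")
# The central provider thus receives, for every upload on every client
# platform, who uploaded what, when, and in which context.
print(sorted(req.keys()))  # ['context', 'fingerprint', 'timestamp', 'uploader']
```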
With regard to Art. 6 para. 1c GDPR this leads to the conclusion that the Copyright Directive results in a legal obligation to implement technologies that require centralized filtering infrastructures, as equally effective but less extensive solutions, such as locally implemented filtering mechanisms, are hardly able to reliably detect the exceptions of Art. 17 para. 7 Copyright Directive.
Compatibility of centralized filtering mechanisms with the Charter of Fundamental Rights
The GDPR additionally requires all legal obligations within the meaning of Art. 6 para. 1c GDPR to be proportionate to the legitimate aim pursued (Art. 6 para. 3 sentence 4 GDPR). The GDPR thereby expressly implements the principle of proportionality of Art. 52 para. 1 CFR. The obligations in Art. 17 para. 4b and 4c Copyright Directive can only serve as a basis for Art. 6 para. 1c GDPR if they do not infringe the fundamental rights laid down in the CFR. In this context, Art. 17 para. 4b and 4c Copyright Directive must be measured against Art. 7 CFR (the right to respect for private life) and Art. 8 para. 1 CFR (the right to protection of personal data).
In its Sabam decision (Case C-360/10, recital 52) the Court of Justice of the European Union (CJEU) already stated that requiring a service provider to install a filtering system would not respect the requirement that a fair balance be struck between the right to intellectual property on the one hand and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information on the other. The CJEU later went even further, stating in its Schrems decision (Case C-362/14, recital 94) that legislation giving public bodies generalized access to the content of communications not only fails to meet the principle of proportionality but even violates the essence of the right to private life. Art. 17 para. 4b and 4c Copyright Directive would effectively require just that: systematically accessing and processing communication between a user and the service. It does not, however, yet give public bodies access to this information – but it undoubtedly lays down the necessary infrastructure for future legislation doing just that, such as the Proposal for a Regulation on preventing the dissemination of terrorist content online (COM(2018) 640 final).
The mandated best efforts to prevent uploads according to Art. 17 para. 4b and 4c Copyright Directive must therefore be seen as a legal obligation in violation of Art. 7 and Art. 8 CFR, and controllers are asked to choose between two equally unacceptable risks: processing data without a legal basis, because Art. 17 para. 4b and 4c Copyright Directive does not meet the requirements of Art. 6 GDPR, or facing the liability Art. 17 para. 4 Copyright Directive imposes on them.
Dr. Malte Engeler is a judge at the administrative court of the German state of Schleswig-Holstein and the former deputy head of the supervisory unit of the data protection authority of Schleswig-Holstein.