Germany does not yet have any regulations specifically targeting the metaverse. In December 2022, the parliamentary committee for digitalisation of the German Bundestag held expert hearings on the opportunities and risks of the metaverse and other Web 3.0 concepts, but these have not yet resulted in any legislative initiatives. As an EU member state, Germany will be bound by any future legislation of the European Union that may specifically target the metaverse or other Web 3.0 applications. The legal landscape surrounding the metaverse can be expected to evolve constantly over the coming months and years.
However, existing laws and regulations in Germany may already apply to legal challenges that the metaverse poses. In this regard, a briefing of June 2022 by the European Parliament has identified several risks and policy implications in connection with the metaverse, such as data protection, cybersecurity, competition and liability.
Data Privacy
Given the sheer amount of personal data that is processed by a variety of actors in connection with the provision of metaverse-related services, establishing and maintaining sufficient data privacy and cybersecurity will be particularly challenging. In Germany, the processing of personal data is governed by the General Data Protection Regulation (GDPR) and the German Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG).
Cybersecurity
With regard to cybersecurity – apart from the requirement to implement sufficient technical and organisational measures to ensure the security of processing under the GDPR – specific rules under the NIS2 Directive of December 2022 covering critical infrastructures will have to be transposed into national law by 17 October 2024. These rules target many different sectors, among them cloud computing service providers and other digital infrastructure providers, but also digital providers such as search engines and social networking sites. These categories could potentially encompass providers of metaverse-related services.
Intellectual Property
Finally, the metaverse also poses challenges to the protection of intellectual property. The unauthorised use of registered trade marks in the metaverse is on the rise, as are copyright violations through other Web 3.0 concepts such as non-fungible tokens (NFTs). Additionally, more and more trade mark applications are being filed that refer to the metaverse or NFTs. In June 2022, the European Intellectual Property Office (EUIPO) therefore issued guidance on virtual goods, NFTs and the metaverse for classification purposes.
In Germany, the digital economy is not governed by a single law, but rather by a multitude of different provisions scattered across several national and Union laws, some of which pose risks of significant monetary sanctions in case of non-compliance.
E-Commerce
One of the legal foundations for the digital economy in Germany is the e-Commerce Directive 2000/31/EC, transposed into German law by the German Telemedia Act (Telemediengesetz, TMG). It regulates the respective liabilities of online intermediaries (such as mere conduit/access providers, caching providers and host providers) and defines content moderation obligations. Most notably, it stipulates a “notice and takedown” procedure for host providers in case they are made aware of illegal content on their services.
Digital Services Act
Against the backdrop of existing regulations in the German Telemedia Act and other, sector-specific rules on the moderation of certain kinds of content in EU law (such as the Digital Single Market (DSM) Directive on copyright, the Audiovisual Media Services (AVMS) Directive and the Regulation on Terrorist Content Online (TCO)) and in national law (such as the NetzDG, a law stipulating content moderation obligations, and specific youth media protection acts), the Digital Services Act (DSA) was introduced by the EU in 2022. It provides for a common set of rules on the obligations of intermediaries, including transparency reporting and content moderation obligations regarding illegal content (such as hate speech), as well as a ban on certain types of targeted advertising.
Very large online platforms (VLOPs) and very large online search engines (VLOSEs) are subject to stricter rules and are required to meet additional risk management, data sharing and auditing obligations. The liability concept introduced by the e-Commerce Directive and currently in place in the TMG remains largely untouched; it is expected that the corresponding provisions in the TMG will soon be repealed. The DSA is generally directly applicable from 17 February 2024, although certain notification requirements have already applied since 17 February 2023.
Digital Markets Act
Another “centrepiece” of the EU’s digital strategy is the Digital Markets Act (DMA), which was introduced alongside the DSA. It aims to regulate large online platforms that act as “gatekeepers” in order to ensure free and fair competition across the EU, and it limits such platforms’ ability to impose unfair terms and conditions on competitors and consumers. Companies with a strong economic position that are active in multiple EU member states and hold a strong intermediation position between users and businesses will have to comply with a broad set of rules, among which are obligations to allow third parties to interoperate with the company’s services and to access data generated on these platforms. Additionally, these “gatekeepers” are prohibited from treating their own services more favourably in rankings than other services or from preventing their customers from linking up to businesses outside of their platform. The DMA is directly applicable in all EU member states from 2 May 2023.
Codes of Conduct
Finally, there are Codes of Conduct in Germany published by the German Association for the Digital Economy (Bundesverband Digitale Wirtschaft) covering a variety of areas, such as affiliate and content marketing, search engine advertising (SEA) and search engine optimisation (SEO).
Cloud and edge computing are not regulated as such, but these technologies are affected by several different laws in Germany that concern data protection, cybersecurity and highly-regulated industries.
Data Privacy
As far as the processing of personal data is concerned, the General Data Protection Regulation (GDPR) applies. In this regard, the transfer of personal data to third countries regularly raises practical issues. Data transfers are governed by Chapter V of the GDPR and are subject to specific restrictions. The use of US cloud computing services is particularly challenging, specifically since the Schrems II decision by the European Court of Justice of 16 July 2020, which introduced extensive transfer impact assessment requirements to be met when exporting data to third countries.
Additionally, the use of processors, such as cloud service providers, requires the conclusion of a data processing agreement (DPA) which must meet certain substantive requirements. This often proves difficult when large service providers are involved as these processors regularly impose their own conditions that do not necessarily align with the requirements set out in Article 28 GDPR.
NIS and NIS2 Directives
Both the NIS and NIS2 Directives (Directives (EU) 2016/1148 and (EU) 2022/2555) specifically target cloud computing services and impose comprehensive cybersecurity obligations on them, such as the implementation of appropriate technical and organisational measures to tackle risks, as well as incident reporting obligations. The NIS Directive has been transposed into German national law by the IT Security Act (IT-Sicherheitsgesetz), which amended several German laws. The NIS2 Directive will have to be transposed into national law by 17 October 2024.
Regulated Industries and Guidance by Supervisory Authorities
In Germany, certain sector-specific restrictions apply to highly regulated industries (such as insurance and banking), which limit outsourcing via cloud services. For example, the outsourcing of activities and processes is subject to restrictions under Section 32 of the Insurance Supervision Act (Versicherungsaufsichtsgesetz, VAG), Section 5(3) of the Stock Exchange Act (Börsengesetz, BörsG), Section 80(6) of the Securities Trading Act (Wertpapierhandelsgesetz, WpHG) and Section 25b of the Banking Act (Kreditwesengesetz, KWG).
In this regard, several German supervisory authorities have issued guidance on cloud services and outsourcing, such as the independent conference of the German data protection supervisory authorities (Datenschutzkonferenz, Orientierungshilfe – Cloud Computing, 9 October 2014) or the Federal Financial Supervisory Authority (BaFin, Merkblatt – Orientierungshilfe zu Auslagerungen an Cloud-Anbieter, November 2018).
The legal landscape surrounding artificial intelligence is constantly evolving. While Germany still does not have specific legislation regulating AI, there are legislative proposals on an EU level and discussions among legal scholars concerning certain AI-related issues.
AI Act
Aiming to provide harmonised rules on artificial intelligence, the EU is in the late stages of enacting an AI Act, the Commission's proposal for which was published as early as 2021. Following a risk-based approach, the proposed AI Act envisages a graduated regulatory framework that subjects different AI systems to different requirements, depending on the specific risks associated with these systems.
While some AI systems posing unacceptable risks are banned outright (such as AI systems employing subliminal techniques beyond a person’s consciousness, social scoring systems for general purposes, and real-time remote biometric identification systems in publicly accessible places for certain law enforcement purposes), other AI systems are generally permitted, yet placed under certain rules.
So-called “high-risk AI systems” (such as AI systems intended to be used as a safety component of a product, or AI systems used for purposes of biometric identification of natural persons, education, employment, law enforcement, etc) are subject to specific requirements under the proposed AI Act: they may only be trained with data that meets certain data quality requirements, and specific record-keeping, transparency, human oversight and conformity assessment requirements apply.
For certain lower-risk AI systems (such as emotion recognition systems, biometric categorisation systems or deepfakes), transparency obligations apply that are intended to ensure that natural persons are informed that they are interacting with an AI system.
For low-risk AI systems, the AI Act encourages the development of, and voluntary adherence to, specific codes of conduct.
Data Privacy
Training artificial intelligence (for example, through machine learning algorithms) often involves the processing of large amounts of data. These sets of training data often contain both personal and non-personal data. Where personal data is concerned, the processing must rest on a lawful basis and must be compatible with the purposes for which the data was collected. Moreover, processing for training purposes will regularly require the use of anonymised data. As the anonymisation of personal data itself constitutes a processing of personal data, it also requires a lawful basis, and the anonymisation for training purposes must be compatible with the purposes for which the data was collected. In many cases, a data protection impact assessment (DPIA) will have to be conducted. As far as non-personal data is involved, Regulation (EU) 2018/1807 applies.
The independent conference of the German data protection supervisory authorities (Datenschutzkonferenz, DSK) published a policy paper on artificial intelligence on 3 April 2019 (DSK, Hambacher Erklärung zur Künstlichen Intelligenz, 3 April 2019), outlining seven guiding principles that should govern the development and utilisation of AI.
Liability for AI Malfunctions
With regard to AI malfunctions, there are debates among German legal scholars on questions of liability. While some argue that existing contract or tort law should be applied to address damage inflicted by an AI, other voices in academia and practice propose that such damage should be covered by specific insurance solutions. Additionally, there are proposals that call for strict liability of AI manufacturers and/or operators of AI. Other approaches involve granting AI systems a legal personality of their own (so-called “e-persons”); however, this proposal has apparently not been able to attract broad support.
At EU level, the Commission adopted two proposals in September 2022 to adapt and modernise liability rules to current technical and market developments. One of these is the proposal for an AI Liability Directive, which aims to foster the enforcement of claims by individuals who have been harmed by AI products by alleviating the victims’ burden of proof and facilitating access to relevant evidence.
Intellectual Property Issues
Under German copyright law (Urheberrechtsgesetz, UrhG), big data collections are not protected as such. The creation of a big data collection merely involves the systematic and indiscriminate collection of data points, which does not constitute an “author’s own intellectual creation” as required under Section 4(2) of the UrhG to warrant protection as a database work. Protection of the collection under the right of the maker of a database pursuant to Section 87a et seq of the UrhG is, however, conceivable under certain circumstances, namely if the maker of the database has made significant investments in obtaining, verifying and/or presenting the data and if the data is arranged systematically or methodically.
Finally, big data collections may also enjoy sui generis protection as a trade secret under the German Trade Secrets Act (Geschäftsgeheimnisgesetz, GeschGehG).
Furthermore, the creation of new text and art by AI tools such as ChatGPT will pose significant challenges to copyright law, as these AI-driven systems create text and/or art based on smaller or larger fragments of works that enjoy copyright protection. Apart from the question of whether these fragments still enjoy copyright protection, it is entirely unclear how such copyrights could be enforced. The new AI creations themselves will likely not enjoy copyright protection of their own.
There is no law in Germany that specifically governs the Internet of Things (IoT). The notion of the IoT usually involves devices that monitor or communicate with smart cars, smart homes, work environments or physical activities. In its Opinion 8/2014 on recent developments on the IoT of 16 September 2014, the EU’s Article 29 Working Party defined the concept of the “Internet of Things” as referring to an “infrastructure in which billions of sensors in common, everyday devices (…) are designed to record, process, store and transfer data and, as they are associated with unique identifiers, interact with other devices or systems using networking capabilities”.
Although there is no specific legislation targeting the IoT, several laws apply that govern the processing of personal data necessarily involved in the use of IoT devices.
Data Privacy
Privacy-related issues connected to the IoT are manifold. Several of these issues have been identified by the Article 29 WP in its Opinion 8/2014 on recent developments on the IoT.
To address these privacy issues, developers of IoT systems must take a number of specific precautions.
Cybersecurity
IoT devices can raise cybersecurity concerns, some of which had already been identified in the Article 29 WP’s Opinion 8/2014. Vulnerabilities of IoT devices can be exploited and misused for purposes of surveillance and data theft or to attack critical infrastructures. Some vulnerabilities can also be linked to a lack of regular security updates or of appropriate end-to-end encryption.
Some of these risks will be addressed by the Cyber Resilience Act (CRA), proposed by the European Union in September 2022, which seeks to close regulatory gaps with regard to IoT products. To this end, a coherent cybersecurity framework for such products shall be established.
Audio-Visual Media Services
Due to Germany’s federal structure, audio-visual media services are governed at both the federal and state levels by the State Media Treaty (Medienstaatsvertrag, MStV) and the State media laws. Under these laws, broadcasting requires authorisation by the respective media authorities. Broadcasting is defined as any linear provision and distribution of moving images and/or sound for simultaneous reception by the general public, in accordance with a broadcasting schedule, using electromagnetic transmission paths (be it, for instance, via satellite, cable, terrestrial transmission or the internet). Audio-visual media provided “on demand” (eg, on video-sharing platforms) generally falls outside this definition.
However, livestream content on streaming platforms can often constitute “broadcasting” within the meaning of the regulations mentioned above, especially where it follows a broadcasting schedule. The fees for the authorisation procedure range between EUR1,000 and EUR2,500 or, in some cases, up to EUR10,000, depending on the administrative effort required of the authorities and on the economic value for the company requesting the authorisation.
Video-Sharing Platforms
Video-sharing platform services are also specifically targeted by a variety of German laws. Most importantly, the latest revision of the Audio-Visual Media Services Directive ((EU) 2018/1808 of 14 November 2018) has brought many changes to several media laws in Germany over the past few years. The legal situation in Germany for video-sharing platforms tends to be rather complex, as the relevant provisions are scattered across many different laws.
The obligations for video-sharing platform services, which are laid out across several of these laws, mainly concern content moderation duties to protect minors and the general public from hate speech, the incitement of violence and other hateful content, as well as from criminal and child pornographic content.
In addition, other legislation that does not specifically target video-sharing platform services but more generally stipulates obligations for the moderation of harmful or illegal content for many different kinds of platform or host services may apply as well.
Telecommunications Regulation
In Germany, telecommunications are regulated by the Telecommunications Act (Telekommunikationsgesetz, TKG), which has transposed the rules laid out in the European Electronic Communications Code (EECC ‒ Directive (EU) 2018/1972) into German law. To promote competition in the telecommunications sector and efficient telecommunications infrastructures, the TKG follows a technology-neutral approach to ensure that adequate and sufficient services can be provided across Germany. Among the services regulated under the TKG are interpersonal telecommunications services (also called “OTT-I” services, ie, over-the-top-I services, such as instant messaging, webmail services and internet (video) telephony). These are services usually provided for remuneration that enable a direct interpersonal and interactive exchange of information over telecommunications networks between a finite number of persons; services that enable interpersonal and interactive communication merely as a subsidiary ancillary function inextricably linked to another service are not included. OTT-I services are partially governed by the TKG, whereas OTT-II services (eg, blogs, streaming services or websites) are not covered by the TKG.
Pre-marketing Requirements
The Telecommunications Act regulates both communications networks and communications services. Under the EECC, the provision of such networks or services is subject to a “general authorisation” (Article 12, EECC) and thus does not require the grant of a licence or specific authorisation. However, any intended commencement, change or termination of such services must be notified to the Federal Network Agency (Bundesnetzagentur, BNetzA) without undue delay. The same applies to any changes to an operator’s or service provider’s name/company name, legal structure or address.
Technology agreements (ie, agreements relating, inter alia, to the development, sale, transfer, provision and maintenance of software, hardware or cloud solutions, etc) are not specifically regulated as such under German contract law. Instead, general contract law and the law of general terms and conditions (Allgemeine Geschäftsbedingungen, AGB) of the German Civil Code (Bürgerliches Gesetzbuch, BGB) apply. Depending on the subject of the technology contract, the rules of different contract types are applicable, such as sales contracts, service contracts, rental contracts or work contracts. In addition, antitrust rules may apply which are not covered by this analysis.
Exact Specification of the Services and Products
Since the different contract types entail different rights and obligations, especially with regard to the provision of the technology subject to the contract as well as to warranties and liability for defects, a precise and detailed description of the services/products is necessary to clearly define the scope and nature of the respective technology agreement. In any event, parties must take care not to mix different contractual elements or to choose descriptions that refer to a type of contract that is not appropriate for the service or product provided and that establishes rights and obligations that are unsuitable or too far-reaching. This often occurs when pre-formulated standard clauses are used that are not specifically adapted to the concrete service or product. For example, Software-as-a-Service (SaaS) contracts and software rental contracts involve different service descriptions that result in different rights and obligations for both parties.
Licensing and Liability
Software licensing can cause difficulties, especially when open-source software (OSS) is involved. Many OSS licences require the licensee, if they publish modifications based on the OSS, to make the source code available and publish it under the same licence terms as the original OSS licence (so-called “Copyleft” principle). This can lead to problems where open-source software code is combined with other open-source or proprietary software code under different, and possibly conflicting, licence terms. Here, careful consideration is necessary to avoid violating any licensing conditions.
Finally, in cases where a group licence is to be granted, attention must be paid to the means by which it can be granted in a legally secure manner. Notably, as a general rule, no further licences can be granted on the basis of a simple (non-exclusive) licence.
Wherever general terms and conditions are used, some restrictions can also stem from German laws specifically targeting general terms and conditions. Extensive exemptions and limitations of liability are often deemed invalid by German courts, as are comprehensive disclaimers of warranties.
German law sets out specific rules on trust services in the Trust Services Act (Vertrauensdienstegesetz, VDG) and in the Trust Services Ordinance (Vertrauensdiensteverordnung). These include various regulations on trust services, electronic signatures and seals, as well as on the validation of such identification methods, and they complement the eIDAS Regulation (EU) 910/2014 on electronic identification and trust services for electronic transactions in the internal market. Formal requirements for electronic signatures are established in the German Civil Code, which regulates the cases in which a mandatory written form can be replaced by an electronic signature.
However, electronic signatures are only catching on slowly because they require special hardware and software ‒ an effort that many users, apart from those in certain industries, have so far tended to avoid.
Data Privacy Litigation in Germany – A Growing Concern for Businesses?
While the first few years after the General Data Protection Regulation (GDPR) took effect were marked by increasingly rigorous public enforcement efforts by EU member state supervisory authorities and a growing number of administrative fines of several hundred million euros, the last few months in particular have seen a considerable uptick in private enforcement activities across Germany. Increasingly, German courts have had to deal with claims for immaterial damages or injunctions for alleged privacy violations. Given the potential for mass enforcement of such claims by professional litigants or through representative actions, businesses in virtually all sectors face a growing risk of being subjected to private enforcement efforts in the field of data privacy in Germany.
Immaterial damages for GDPR violations
Claiming compensation for immaterial damages pursuant to Article 82(1) of the GDPR for privacy violations has become one of the most common private enforcement practices in Germany. Recently, these claims have begun to cover an increasingly broad variety of alleged violations and processing situations. However, given the many legal uncertainties still surrounding Article 82 of the GDPR, it remains unclear which kinds of privacy violations are eligible for compensation and under which conditions immaterial damages may be claimed at all. Unsurprisingly, many questions surrounding Article 82 of the GDPR have already been referred to the Court of Justice of the European Union (CJEU) for a preliminary ruling.
Among the most pressing issues is the question of whether any kind of violation can, in principle, lead to a claim for compensation of immaterial damages, or whether there is a “de minimis” threshold that excludes minor damages or “petty” violations of GDPR provisions that do not inflict any meaningful, perceptible damage on an affected data subject. Despite numerous decisions having been rendered by German courts in the meantime (far more than a hundred published decisions so far), case law still does not paint a clear picture.
Referring to a “de minimis” threshold, many local, regional, higher regional and labour courts have denied claims, arguing that a merely “perceived impairment” does not amount to immaterial damages eligible for compensation under the GDPR. However, other local, regional and labour courts have been more generous in awarding immaterial damages for a broad variety of violations, taking the view that the notion of “immaterial damages” ought to be interpreted broadly so that data subjects receive “full and effective compensation”. In addition, some courts have argued that the amounts granted as compensation must have a “dissuasive” effect. Notably, the German Federal Labour Court took the view in August 2022 that, generally, any violation of the GDPR automatically leads to immaterial damages eligible for compensation under the GDPR.
Thus, German courts have awarded compensation for immaterial damages in amounts ranging from as low as EUR25 to approximately EUR5,000 for a broad range of GDPR violations.
Given the significant differences in the severity of these privacy violations, awarding immaterial damages does not seem appropriate in every case. Indeed, there is good reason to conclude that some of these violations are so trivial that awarding any compensation does not seem warranted. Incidentally, this view is shared by Advocate General (AG) of the CJEU Campos Sánchez-Bordona, who in early October 2022 delivered his opinion on questions referred to the CJEU for a preliminary ruling concerning, among other issues, the existence of a “de minimis” threshold and whether compensation for immaterial damages must have a dissuasive effect.
In the opinion of October 2022, the AG concluded that not every violation of the GDPR automatically leads to immaterial damages but that actual damages must have occurred. In fact, a “mere annoyance” or “upset” in connection with a privacy violation is not sufficient to constitute immaterial damages and there is no irrebuttable presumption that immaterial damages occur in every instance of a privacy violation. Finally, the compensation of immaterial damages is not intended to have a dissuasive effect as is the case for administrative fines, since the GDPR does not provide for punitive damages.
The CJEU is set to finally decide on these questions in 2023. Until then, it remains open in which direction the CJEU will steer the future of private enforcement in Germany. Should the CJEU decide in line with the AG’s position, this would mean great relief for many businesses, particularly those involved in large-scale data processing activities.
Current risk environment in Germany
Despite the many legal uncertainties surrounding immaterial damages, the risk environment for private enforcement in Germany has been characterised by a growing volume of claims asserted by a diverse set of actors in an ever-broader range of sectors. The areas most often in the crosshairs of privacy litigation notably include the employment context, international data transfers and data breaches, each of which is discussed below.
In many cases, privacy litigation starts with a data subject access request (DSAR). It is therefore vital to have adequate DSAR management in place to answer requests completely and in a timely manner, as even a belated or insufficient response to an access request can itself give rise to potential claims for immaterial damages. It also makes sense to adequately identify the risk associated with each type of DSAR (eg, DSARs arising in an employment or post-breach context are often of higher risk than DSARs sent by consumers via a standardised template form).
Another risk for privacy litigation can stem from negative press following media reports of a data breach or of an inquiry by a supervisory authority. Such reports sometimes lead to a rising number of DSARs and, potentially, subsequent claims for immaterial damages.
Litigation risks in the employment context
As regards the employment context, companies have been witnessing a proliferation of data subject access requests made during termination proceedings with the intention of uncovering GDPR violations and potentially threatening to litigate them as part of subsequent settlement negotiations. Given that some labour courts have granted immaterial damages claimed during termination proceedings, the mere threat of privacy litigation often suffices to significantly strengthen the negotiating position of disgruntled former employees. This once again illustrates the importance of a co-ordinated and smooth off-boarding procedure for companies.
Litigation risks regarding international transfers
Another area that has been exposed to increased scrutiny through private enforcement activities concerns international transfers of personal data to third countries outside the European Union. Since the CJEU’s Schrems II decision of 16 July 2020, general uncertainty as to the legality of international transfers has become the norm rather than the exception, leaving much room for error for companies ‒ room which is readily exploited by litigious individuals and professional litigants. These claimants can already rely on existing case law to support their claims, since German courts have granted immaterial damages for unlawful transfers of personal data (for instance, the Regional Court Munich I in January 2022).
Litigation risks regarding data breaches
Finally, a third important area giving rise to serious privacy litigation risks involves data breaches and cyber-attacks, including phishing mails, malware, DDoS, Advanced Persistent Threat (APT) and ransomware attacks. Given an ever-evolving cyber threat landscape, there is a growing risk for companies in every sector of being affected by cybercrime, service outages and data leaks, potentially affecting many data subjects at once. Not only have some courts already awarded immaterial damages to data subjects affected by data breaches (such as the Regional Court Munich I in December 2021), but specialised law firms have also started advertising their services specifically to individuals potentially affected by certain data breaches ‒ most notably those that have received great media attention.
Mass litigation and representative action
Considering recent and future developments surrounding mass claims and representative actions, it is large-scale processing of personal data in particular that can increasingly prove a liability for businesses. For instance, specialised law firms have begun representing large numbers of data subjects (in the hundreds or even thousands) affected by data breaches. In addition, there are companies that provide legal assistance to affected data subjects asserting claims for immaterial damages in court.
What is more, the European Union’s Representative Action Directive (EU) 2020/1828, which had to be transposed into national law by EU member states by December 2022, will further incentivise mass claims on behalf of consumers affected by privacy infringements. This should make the enforcement of damages claims considerably easier in cases affecting large numbers of individuals, as is often the case in employment- or consumer-related matters. Additionally, the rules provided in the Directive regarding the disclosure of evidence will facilitate private enforcement of privacy violations.
This enhanced environment for mass claims and professional litigation is also likely to make inquiries and/or the imposition of fines by supervisory authorities (eg, after breaches or unlawful international transfers) increasingly dangerous starting points for follow-up litigation by entities and individuals pursuing compensation for immaterial damages. Since the Representative Action Directive also provides for actions brought by qualified entities established in other member states, this may expose businesses to even greater risk of litigation by different interest groups across the EU. In fact, there could be an increase in forum shopping leading claimants to sue in jurisdictions known to award high damage amounts, should the CJEU not follow the AG and should the particularly low bar for immaterial damages set by some German courts thus ultimately prevail.
Warning letters and cease and desist orders
In late 2022 in particular, Germany saw a spike in abusive claims for immaterial damages asserted via warning letters against companies and individuals. Attracted by the particularly low bar for immaterial damages in Germany, lawyers have been deliberately targeting thousands of website providers using the web font service “Google Fonts” after a ruling by the Munich Regional Court I in early 2022 found that using the service triggers data transfers to the US that the court deemed unlawful under the GDPR. Hence, a single court decision sufficed to lead lawyers to send out warning letters to a multitude of website operators with the intention of securing a settlement payment ‒ a tactic that has already paid off, given that a number of companies have agreed to pay the proposed settlement fees for fear of litigation.
The absence of a “de minimis” threshold thus also provides a significant incentive to file abusive claims. Beyond this, new possible avenues for abusive claims are already on the horizon: over the past few months and years, a great number of German courts have also granted injunctions sought by data subjects against certain processing operations and practices. Perhaps not coincidentally, the Munich Regional Court I, in its “Google Fonts” ruling discussed above, also issued an injunction against the use of the service. It therefore seems highly likely that data subjects will increasingly assert claims for injunctive relief and, in addition, try to enforce their rights by way of interim relief. The latter, in particular, would be certain to add a whole new dynamic and pace to private enforcement developments.
Outlook
The assertion of immaterial damages claims has paved the way for ramped-up private enforcement efforts over the last two years and will likely remain one of the defining privacy topics in Germany and across Europe in 2023. With an ever-broader range of areas affected, businesses of all kinds will have to expect further increases in private enforcement activities and are well advised to resolve any shortcomings in their privacy compliance that are particularly vulnerable to litigation. Finally, there is hope that, in the coming months, the CJEU will provide much-needed light at the end of the tunnel and put a stop to litigation practices pursuing immaterial damages in merely trivial cases.
Heinrich-Heine-Allee 12
40213 Düsseldorf
Germany
+49 211 3678 7269
cschroeder@orrick.com www.orrick.com/en/People/E/A/D/Christian-Schroeder