The fundamental national legislation applicable to personal data in France is as follows:
Some other complementary national legislation applies to personal data, as follows:
Under French law, the CNIL is an independent administrative authority whose main missions are to inform, educate, advise, anticipate and innovate, investigate and impose sanctions.
The CNIL's investigations may be initiated on the basis of current events, complaints received by the CNIL or its annual inspection programme (published each year), and may form part of, or relate to, previous investigations carried out by the CNIL (follow-up to a formal notice or to previous sanctions); they may also form part of the co-operation programme with other European data protection authorities.
The CNIL's inspections are regulated by the CNIL's internal rules. Online and on-site inspections can be conducted. An official and publicly available list is published each year, identifying the agents of the CNIL authorised to carry out audits and inspections.
During Inspections
During inspections, particularly on-site inspections, CNIL agents may request access to all documents necessary to fulfil their mission (eg, records of processing activities), regardless of the medium, and make copies of them. They can access software programs and request their transmission by any appropriate means. They can also request that certain documents or information be communicated after the inspection.
At the end of the inspection, the CNIL issues an inspection report, and may request additional documents.
Consequences of an Inspection
If the information and documents provided by the data controller during and after the inspection do not call for any particular observations, the inspection procedure is closed.
However, if the inspection leads to the identification of a lack of compliance with the applicable data protection rules, the CNIL can decide to:
Appeal of a Sanction
Once a sanction has been issued by the CNIL, the company concerned can appeal it to the French Administrative Supreme Court (Conseil d'Etat) within two months of the CNIL's sanction being imposed.
As a member of the European Union, France has to comply with the European legal and regulatory framework in terms of personal data protection, which includes the GDPR, the Police-Justice Directive and the guidelines of the EDPB. The e-Privacy Regulation is still at the draft stage, but once it is adopted French companies will also have to comply with it.
France is also impacted by Brexit, as far as data transfers from France to the United Kingdom are concerned. On 28 June 2021, the European Commission adopted two adequacy decisions in relation to the UK: one under the GDPR, and one under the Police-Justice Directive. Transfers of personal data from the EU to this third country can now take place based on these decisions.
The major privacy/data protection non-governmental organisations in France are as follows:
In terms of the protection of personal data, the French legal and regulatory framework is one of the most developed and strictest in the world. Before the GDPR, the French Data Protection Act had been in place since 1978 and the CNIL was already one of the most active and influential authorities in Europe. Over the years, it has created a veritable toolbox for both professionals and individuals, with deliberations, recommendations, guidelines and practical guides. It was also the first European authority to sanction a GAFA company, imposing a EUR50 million fine on 21 January 2019 (confirmed by the Conseil d'Etat on 19 June 2020).
Furthermore, through two deliberations handed down on 7 December 2020, the CNIL imposed fines on GAFA companies (EUR100 million and EUR35 million, respectively) concerning cookies and trackers placed on the terminals of their users. It also rendered two deliberations in December 2021 sanctioning two GAFA companies regarding the choice made available to their users for refusing cookies (EUR150 million and EUR60 million, respectively).
Key developments in the past 12 months in France include the publication of the following documents (in chronological order):
At the European level, the EDPB published the following guidelines (in chronological order):
The EDPB has also published the following recommendations:
The European Commission has published the following documents:
Significant topics over the next 12 months in France are as follows:
Data Protection Officers
Article 37 of the GDPR requires the appointment of a Data Protection Officer (DPO) for public organisations and for organisations whose core activities consist of either operations that require regular and systematic monitoring of individuals on a large scale, or large-scale processing of sensitive data or data relating to criminal convictions and offences.
In case of doubt, the appointment of a DPO is strongly recommended by data protection authorities.
The French Data Protection Act does not provide additional requirements relating to the appointment of a DPO.
Criteria to Authorise Collection, Use or Other Processing
Article 6 of the GDPR defines six legal bases upon which to process personal data:
“Privacy by Design” or “by Default”
The concepts of “privacy by design” and “privacy by default” are included in Article 25 of the GDPR, and the EDPB has published Guidelines 4/2019 on this subject.
The “privacy by design” concept consists of implementing – from the very early stage of the conception of personal data processing – organisational and technical measures to implement the data protection principles and necessary safeguards to meet the GDPR requirements and protect the rights of data subjects.
The “privacy by default” concept requires the implementation of organisational and technical measures to ensure that data protection principles are respected and the necessary safeguards applied by default, at every stage of the processing.
Privacy Impact Analyses
The data protection impact assessment (DPIA) consists of analysing a processing operation and its risks in order to assess and mitigate them. Article 35 of the GDPR requires a DPIA for the following:
In addition, the Article 29 Working Party issued guidelines in which it lists nine criteria to be assessed. If one criterion is satisfied, a DPIA is recommended; if two criteria are satisfied, a DPIA is required. The criteria are as follows:
The CNIL also published a list of processing operations for which a DPIA is required, as well as a list for which one is not required.
Adoption of Internal or External Privacy Policies
The principle of accountability (Article 5-2 of the GDPR) requires organisations to document their compliance. In particular, organisations are required to implement documentation such as information notices, records of processing activities, global privacy policies, data retention policies, a handling procedure for complaints and data subject requests, data breach procedures, security policies, "privacy by design" and "privacy by default" procedures, etc.
In France, the French Toubon Law No 94-665 of 4 August 1994, relating to the use of the French language, and French employment law require policies and procedures to be translated into French in order to be enforceable against French employees. Privacy policies should also be translated into French.
Data Subject Access Rights
Under the GDPR, data subjects' rights include the following:
In France, Law No 2016-1321 of 7 October 2016 for a Digital Republic introduces an additional right for French data subjects, which is the right to define guidelines about the processing of their personal data after their death.
Use of Data Pursuant to Anonymisation, De-identification or Pseudonymisation
De-identification is a concept not used under the GDPR, which only refers to anonymisation and pseudonymisation.
Anonymisation is a processing operation that consists of using a set of techniques in such a way as to make it impossible, in practice, to identify the person by any means whatsoever and in an irreversible manner. The GDPR does not apply to anonymised data because such data is no longer of a personal nature. Before the GDPR entered into application, the Article 29 Working Party published a detailed opinion on anonymisation techniques (Opinion 05/2014 on Anonymisation Techniques adopted on 10 April 2014), which still provides useful guidance on how to anonymise personal data properly.
Pseudonymisation is the processing of personal data in such a way that data relating to a natural person can no longer be attributed to said person without additional information. In practice, pseudonymisation consists of replacing directly identifying data in a dataset with indirectly identifying data (alias, number in a classification, etc). Pseudonymisation thus makes it possible to process data on individuals without being able to identify them directly. In practice, however, it is often possible to trace the identity of individuals using third-party data. For this reason, pseudonymised data remains personal data.
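By way of illustration, the sketch below shows one common pseudonymisation approach (replacing a direct identifier with a keyed digest). The code, field names and key handling are hypothetical assumptions, not a CNIL-prescribed method; any real implementation would need its own assessment.

```typescript
// Illustrative pseudonymisation sketch (hypothetical, for explanation only).
// Direct identifiers are replaced with keyed HMAC digests; the secret key is
// the "additional information" that must be kept separately (Article 4(5) GDPR).
import { createHmac } from "node:crypto";

interface CustomerRecord {
  fullName: string; // direct identifier
  city: string;     // indirect identifier, kept as-is
  purchases: number;
}

// The key must be stored apart from the pseudonymised dataset and access-controlled.
const PSEUDONYMISATION_KEY = process.env.PSEUDO_KEY ?? "replace-with-a-secret-key";

function pseudonymise(record: CustomerRecord) {
  const alias = createHmac("sha256", PSEUDONYMISATION_KEY)
    .update(record.fullName)
    .digest("hex")
    .slice(0, 16); // shortened alias for readability
  // The output remains personal data: whoever holds the key can re-identify.
  return { alias, city: record.city, purchases: record.purchases };
}

console.log(pseudonymise({ fullName: "Jean Dupont", city: "Paris", purchases: 3 }));
```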
Profiling, Automated Decision-Making, Online Monitoring or Tracking, Big Data Analysis, Artificial Intelligence, Algorithms
Article 22 of the GDPR provides a framework for automated decision-making that produces legal or similarly significant effects. It applies to decisions based exclusively on automated processing that "produce legal effects" (a decision has legal effect when it affects human rights and freedoms) or "significantly affect individuals" (a decision can have a significant impact, similar to a legal effect, when it influences a person's environment, behaviour or choices, or when it leads to a form of discrimination).
In principle, individuals have the right not to be subject to a decision based solely on automatic processing and producing legal effects concerning them or significantly affecting them in a similar manner. However, in specific cases, a data subject may be the subject of a fully automated decision, even if it has a significant legal effect or impact on them – for instance, if the decision is based on the explicit consent of the data subject, is necessary for the conclusion or performance of a contract, etc.
The Concept of “Injury” or “Harm”
"Injury" and "harm" caused by the unlawful processing of personal data are not explained by the French Data Protection Act, its implementing Decree or the guidelines of the CNIL. There is currently no sanction in France providing additional information on the notion of harm in terms of personal data protection. Any data subject may therefore seek compensation for any damages they have suffered in connection with privacy and data protection by invoking Article 1240 of the French Civil Code, which defines the general principle of responsibility under French law. Also, it is possible to invoke Article 226-1 of the French Criminal Code, which condemns the damage of the privacy, as well as Articles 226-16 to 226-22-1 and Articles R.625-10 to R.625-13 of the French Criminal Code, imposing sanctions for failure to comply with data protection rules.
Sensitive Data
Under Article 9 of the GDPR, sensitive data includes personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation.
In principle, the collection and processing of such data is prohibited, except under specific circumstances (eg, when the data subject's consent is collected, when the information is manifestly made public by the data subject, when the processing is necessary for the protection of the vital interests of the data subject, when the processing is justified by the public interest and authorised by the CNIL, etc).
Financial Data
Financial data is not deemed sensitive. Nevertheless, in the CNIL's deliberation No 2018-303 of 6 September 2018 regarding the processing of payment card data for the sale of goods or the provision of services at a distance, financial data, including payment card data, is qualified as "highly personal data", given the serious impact on data subjects that its violation could have.
Health Data
Health data is sensitive data. Personal data concerning health is data relating to the past, present or future physical or mental health of a natural person (including the provision of healthcare services), which reveals information about that person's state of health. Providers hosting such health data in France must obtain a specific certification, regardless of where the data is located.
Communications Data
Communications data is governed in French law by Article L.34-5 of the PECC transposing the e-Privacy Directive. See 2.3 Online Marketing.
Voice telephony and text messaging are not deemed sensitive.
The content of electronic communications is not deemed sensitive, but is subject to specific rules and retention periods under French law. See 2.3 Online Marketing.
Children's or Students' Data
Children’s or students' data is not deemed sensitive, although minors are considered vulnerable persons by the CNIL. Where the legal basis for a processing operation is consent and the data subject is a minor under the age of 15, the data controller must obtain the consent of both the minor concerned and the holder(s) of parental authority over that minor. In addition, the processing of the personal data of vulnerable persons (children) is one of the criteria used by the CNIL to require a DPIA.
Employment Data
Employment data is not deemed sensitive, but the CNIL has issued practical recommendations on the processing thereof. In addition, the processing of the personal data of vulnerable persons, such as employees, is one of the criteria used by the CNIL to require a DPIA.
Social Security Number (NIR)
Although not listed as sensitive data in the French Data Protection Act, the Social Security Number is considered by the CNIL to be a very specific piece of personal data. Its use in France is strictly limited to specific purposes, as detailed in Decree No 2019-341 of 19 April 2019 relating to the use of such data, which provides a detailed list of authorised purposes limited to the sectors of welfare protection, healthcare, work and employment, financial, tax and customs, etc.
Internet, Streaming and Video Issues
Cookies and beacons
The use of cookies, beacons and other tracking devices (“cookies”) is framed by the e-Privacy Directive, as amended by Directive 2009/136/EC. The EDPB adopted Opinion 5/2019 on 12 March 2019 on the interplay between the e-Privacy Directive and the GDPR, which also contains provisions about trackers. On 17 September 2020, the CNIL issued updated guidelines about operations in a user's terminal (in particular, cookies and other tracking devices), repealing its former guidelines from 2019; these have been supplemented by a recommendation clarifying the guidelines and giving practical insights and best practices. They have been enforceable since 31 March 2021.
The data controller must now obtain the prior consent of the user (given freely, specifically, in an informed and unambiguous manner, by means of a declaration or a clear positive act) before storing information on a user's equipment or accessing information already stored. This applies particularly to cookies related to targeted advertising operations, certain audience measurement cookies, performance cookies and social network cookies generated in particular by share buttons when they collect personal data without the consent of the data subjects. Certain audience measurement cookies are exempted from consent collection if cumulative criteria defined by the CNIL are met (eg, clear and complete information is provided, an easy objection mechanism is accessible and usable on all browsers and all types of terminals, there is no cross-checking with other processing operations, the retention period is limited, there is no possibility to follow the navigation of the internet user on other sites, etc).
In any case, all cookies, including strictly necessary cookies, require the provision of information to the user prior to their placement.
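For illustration only, a minimal browser-side sketch of this consent gating is set out below; the banner function, cookie names and lifetimes are assumptions, not a CNIL-prescribed implementation.

```typescript
// Hypothetical consent-gating sketch: strictly necessary cookies may be set
// without consent (but with prior information); all other cookies wait for a
// clear positive act from the user.
type ConsentChoices = { audience: boolean; advertising: boolean };

function setCookie(name: string, value: string, days: number): void {
  const expires = new Date(Date.now() + days * 86_400_000).toUTCString();
  document.cookie = `${name}=${encodeURIComponent(value)}; expires=${expires}; path=/; SameSite=Lax`;
}

// Hypothetical banner stub: a real banner would render UI where refusing is
// as easy as accepting, per the CNIL's guidelines and recommendation.
async function showConsentBanner(): Promise<ConsentChoices> {
  return { audience: false, advertising: false }; // refusal by default
}

function applyConsent(choices: ConsentChoices): void {
  // Strictly necessary cookie (eg, a language preference): no consent needed.
  setCookie("session_lang", "fr", 1);

  // Consent-based cookies are written only after an explicit positive choice.
  if (choices.advertising) setCookie("ad_id", crypto.randomUUID(), 390);
  if (choices.audience) setCookie("analytics_id", crypto.randomUUID(), 390);

  // Record the choice itself so it can be re-displayed and withdrawn later.
  setCookie("consent_log", JSON.stringify(choices), 180);
}

showConsentBanner().then(applyConsent);
```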
The CNIL published recommendations on 13 October 2021 concerning alternatives to "third party" cookies for advertising targeting, providing that these alternatives must always comply with the prior consent and rights of data subjects.
Location data
Large-scale processing of location data is one of the processing operations for which an impact assessment must necessarily be carried out, according to the list drawn up by the CNIL.
Do Not Track, and tracking technology
Do Not Track (DNT) is a function integrated into web browsers that allows internet users to indicate that they do not want to be tracked for advertising purposes. To date, no regulation obliges websites to take this objection into account.
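For completeness, the sketch below shows how a site that voluntarily chooses to honour the signal might read it; the browser property is real but deprecated, and the handling logic is purely an assumption.

```typescript
// Reading the (deprecated) Do Not Track signal in the browser. No French or
// EU rule currently obliges a site to honour it; treating it as a refusal of
// tracking is a voluntary design choice.
function userOptsOutViaDNT(): boolean {
  const dnt = (navigator as any).doNotTrack ?? (window as any).doNotTrack ?? null;
  return dnt === "1" || dnt === "yes";
}

if (userOptsOutViaDNT()) {
  console.log("DNT signal detected: skipping advertising trackers.");
}
```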
Social media, search engines, large online platforms
French case law considers social networks to be hosting providers under Law No 2004-575 of 21 June 2004 on confidence in the digital economy (transposing the e-Commerce Directive 2000/31/EC), meaning that, regarding user-generated content, their liability can only be engaged where they do not act expeditiously to remove or disable access to the litigious information upon obtaining knowledge or awareness of it.
Dereferencing (“right to be forgotten”) allows the user to remove one or more results provided by a search engine after a query based on the identity (surname and first name) of a person.
On 6 December 2019, the Conseil d'Etat issued 13 decisions following the judgment of the Court of Justice of the European Union dated 24 September 2019 (Case C-136/17), in which it applied a proportionality test weighing the data subject's rights against the public's right to information, in light of three criteria:
The nature of data must also be taken into account in this weighing, and affects the scope and effectiveness of a delisting request.
Sensitive data and data relating to judicial proceedings or criminal convictions and offences benefit from a higher level of protection: such a request may only be refused where access to such data is strictly necessary to inform the public.
On 2 December 2019, the EDPB adopted guidelines on the criteria of the right to be forgotten in search engine cases, in the context of delisting requests, which set out six grounds upon which data subjects can rely to obtain the delisting of their personal data by search engines, as well as the exceptions that search engines may apply to them. Furthermore, these guidelines clarify that the right to be forgotten concerns not only the right for data subjects to obtain from search engines the removal of links to websites containing their personal data, but also the right to object to the processing of such data under Article 21 of the GDPR.
Hate speech, disinformation, terrorist propaganda, abusive material, political manipulation, bullying, etc
Pursuant to the French law against the manipulation of information (Fake News Law) of 22 December 2018, operators may stop the dissemination of "allegations or statements that are inaccurate or misleading in relation to a fact which may affect the upcoming vote's sincerity" for a definite period before general elections and until the publication of the results. The Fake News Law specifically targets operators of online platforms whose activity exceeds a specific threshold of connections from French territory. This law also fights against “deepfakes”, which are punishable under the provisions of the French Penal Code.
A new law of 24 June 2020 on preventing the dissemination of hate speech (Hate Speech Bill) has been enacted. Some of its provisions were deemed unconstitutional by the French Constitutional Court on 18 June 2020, particularly those relating to operators' obligations to remove certain illicit content within a very short timeframe (eg, content pertaining to terrorism and child pornography). However, the new law establishes an online hate observatory, placed under the authority of ARCOM (formerly the Superior Audiovisual Council – Conseil supérieur de l’audiovisuel), which is responsible for monitoring and analysing the evolution of hate content in conjunction with operators, associations and researchers.
The Law of 24 August 2021 reinforcing the respect of the principles of the Republic partially pre-implemented the European Digital Services Act by creating content moderation obligations targeting "content-sharing" online platforms. The Law targets large platforms and very large platforms, as defined by Decree 2022-32 released on 14 January 2022.
Finally, the EU adopted Regulation 2021/784 on addressing the dissemination of terrorist content online on 29 April 2021 and Regulation 2021/1232 regarding the fight against online child sexual abuse by providers of number-independent interpersonal communications services on 14 July 2021.
Article L.34-5 of the PECC defines direct marketing as "any message intended to promote, directly or indirectly, goods, services or the image of a person selling goods or providing services."
This definition is very broad, as it covers both the direct promotion (advertising campaigns by email, or prospectuses or brochures sent by email) and the indirect promotion (any material aiming to promote the seller’s brand image) of products, services or the image of a company.
As a matter of principle, any B2C online direct marketing operation requires the data subject's explicit consent to receive direct marketing (opt-in), which must be obtained at the time of the collection of contact details.
Two exceptions exist, as follows:
Any electronic marketing communication must also provide the individual with a free, simple, direct and easily accessible means to opt out of marketing communications (eg, through a hyperlink at the bottom of the communication).
B2B electronic marketing communications are subject to a lighter regime. To send a B2B electronic marketing communication, the company must provide the following at the time of collecting the contact details:
For marketing communications carried out by post or by phone, an opt-out regime applies. When collecting the individual’s contact details, the company must inform the individual that their details can be used for direct marketing purposes and give them the possibility to opt out of direct marketing communications at any time (eg, through an unchecked box).
The CNIL published recommendations on direct marketing on 26 January 2022, reiterating the above rules and providing examples for the collection of opt-in/opt-out.
Some provisions of the French Labour Code complement the French data protection legal framework, as follows:
Monitoring Workplace Communications
The CNIL has issued guidelines about HR processing, which underline that employers can control and limit the use of the internet and messaging at work for personal purposes, in order to ensure the security of networks that may be affected by attacks and to limit the risks of abusive personal use of the internet or of a professional mailbox at work.
However, the employer must define and inform the employees of the rules for personal/private use of professional devices and the internet. In addition, it cannot use key loggers to remotely record all the actions performed on a computer, except in exceptional circumstances linked to a strong security imperative.
When recording employees' calls, for instance for training purposes or to improve the services provided, employers cannot couple a call with a screen-capture system on the employee's computer workstation, and may not set up a permanent or systematic listening or recording device, except as provided for by law (eg, for some specific cases, such as emergency services). The employer must provide employees with telephone lines that are not connected to the recording system, or with a technical device enabling them to switch off the recording, for personal calls. The same applies to calls made by staff representatives in the exercise of their duties.
Labour Organisations or Works Councils
Works Councils have an active role in France, and some personal data processing carried out by employers may require the information and consultation of the Works Council before the implementation thereof (eg, whistle-blowing schemes, geolocation of employees' vehicles, employees' performance monitoring system, etc).
Whistle-Blower Hotlines and Anonymous Reporting
On 10 December 2019, the CNIL published new guidelines relating to whistle-blowing schemes, together with an FAQ, which take into account the new French legal framework relating to anti-bribery and the duty of vigilance. The guidelines apply to schemes required by law pursuant to the “Sapin 2” Law (Articles 8 and/or 17) and the French law relating to the duty of vigilance, and to schemes implemented by companies on a voluntary basis to collect alerts relating to non-compliance with the company’s code of ethics or code of conduct.
The CNIL maintains its former position on anonymous reporting and indicates that, even if whistle-blowers can choose to remain anonymous, it is strongly recommended to encourage them to identify themselves. When exceptionally dealing with anonymous reports, companies must deploy specific and additional measures to assess the severity of the reported violation and determine the special care required.
E-discovery Issues
Discovery proceedings are investigations and investigative phases prior to civil and commercial litigation, and are essential to any legal action in the United States. Discovery requests made to French companies may include requests to produce and transfer thousands of emails from employees.
On 23 July 2009, the CNIL adopted a recommendation underlining that the required transfers of information can be subject to the French Blocking Statute and must necessarily be carried out in accordance with the Hague Convention, which is the only international convention binding France and the United States with regard to legal proceedings. In addition, documents must be exhaustively listed and have a direct and precise link with the object of the dispute, in compliance with applicable data protection laws. The CNIL has not issued any updated opinion since then.
A sanction procedure can be initiated against an organisation if a breach of the GDPR or the French Data Protection Act is found following the filing of a complaint, a data breach or an inspection carried out by the CNIL. In most cases, the CNIL will issue a formal warning and offer the company a chance to remediate the infringements identified, although it is not compelled to do so.
A Rapporteur is designated to draft a report, which is communicated to the data controller and the other supervisory authorities concerned if the CNIL acts as lead supervisory authority. The data controller can then submit written observations before the hearing.
After the hearing, if the CNIL acts as lead supervisory authority, it communicates the draft decision to the other supervisory authorities concerned, which must provide the CNIL with their comments within a period of four weeks. The data controller is then notified of the CNIL's decision, which can be public or not.
The data controller can appeal the CNIL's decision before the Conseil d'Etat within two months.
Law 2022-52 of 24 January 2022 on criminal liability and domestic security created a simplified sanction procedure for simple matters, pursuant to which the administrative fine cannot exceed EUR20,000 and the penalty payment cannot exceed EUR100 per day of non-compliance.
Potential Enforcement Penalties
Non-compliance with the GDPR could result in administrative fines and criminal sanctions, as follows:
Leading Enforcement Cases
Recent major fines issued by the CNIL range from EUR3,000 to EUR150 million, concerning both small and medium companies as well as large ones. The latest decisions of the CNIL have sanctioned infringements such as lack of valid consent, lack of appropriate retention periods, lack of appropriate security and confidentiality measures, failure to inform data subjects and respect their rights, failure to comply with the supervisory authority, failure to comply with rules relating to cookies, failure to comply with rules relating to direct marketing, etc.
Private Litigation
Article 77 of the GDPR recognises the right of data subjects to lodge a complaint with a national supervisory authority. Complaints in France can be addressed to the CNIL.
In addition, the French Data Protection Act provides two types of class actions:
The leading cases are as follows:
Law enforcement’s access to data has been deeply modified and reinforced by Law No 2016-731 dated 3 June 2016 for “strengthening the fight against organised crime, terrorism and their financing, and improving the efficiency and safeguards of criminal proceedings.”
Under some circumstances, the Investigating Judge may prescribe the interception, recording and transcription of correspondence sent by electronic communications (Articles 100 to 100-8 of the French Criminal Procedure Code). Such actions are subject to several procedural safeguards and limitations, particularly in terms of duration.
Moreover, the Investigating Judge (or police officers authorised by the Liberties and Detention Judge) can – remotely and without informing the data subject – access correspondence stored via electronic communications accessible by means of a computer identifier, for a maximum period of one month, renewed under strict conditions (Articles 706-95-1 to 706-95-3 of the French Criminal Procedure Code).
The Investigating Judge may also perform investigations at the premises concerned, to make any useful findings or conduct searches (Articles 92 to 99-5 of the French Criminal Procedure Code). All objects, documents or computer data placed under judicial control shall be immediately inventoried and placed under seal.
Law No 2015-912 of 24 July 2015 on intelligence and its implementing Decree No 2016-67 of 29 January 2016 on intelligence gathering techniques define the legal framework that authorises the intelligence services to use information access techniques, particularly means of access to connection data and computer data capture, while guaranteeing the right to privacy.
For certain crimes and offences, the Investigating Judge may authorise the interception, recording and transcription of electronic correspondence. By way of exception and for the most serious crimes, it may also authorise the following, with or without the individual’s consent:
Procedural safeguards regulating such seizure are provided by Articles 76 and 97 of the French Criminal Procedure Code.
An organisation must not transfer personal data to a foreign government if such transfer is not compliant with French and European data protection laws. Indeed, the CLOUD Act explicitly provides that the service provider from which the data is requested always has the possibility to object on the grounds that the request would lead to an infringement of the legislation of a foreign country and expose them to sanctions (conflict of laws situation).
France does not yet participate in a CLOUD Act agreement with the USA. The Mutual Legal Assistance Treaty between France and the United States has proven inefficient, and discussions have been entered into between the USA and the EU to sign an EU-wide Mutual Legal Assistance Treaty. The provider may refuse to disclose the requested data on the basis of the common law principle of comity recognised by the US courts, according to which, in the application of US law, the important interests of other countries must be taken into account and, where appropriate, US legislation must not be applied or must be applied in a nuanced manner.
Several laws in France are likely to hinder requisitions by the US authorities of data stored in Europe, particularly Articles 44 et seq of the GDPR (which lay down the conditions under which personal data may be transferred to third countries or international organisations), the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
La Quadrature du net, the French Data Network and La Fédération des fournisseurs d'accès à internet associatifs have challenged the constitutionality of certain provisions contained in Law No 2015-912 of 24 July 2015 on intelligence, before the French Constitutional Council.
The French Constitutional Council abrogated two provisions relating to the following:
Pursuant to Articles 44 to 50 of the GDPR, transfers of personal data outside the European Union are not authorised unless the third-country recipient of the personal data ensures an adequate level of protection, or appropriate guarantees are applied.
The European Commission has so far recognised Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, South Korea, Switzerland, the United Kingdom and Uruguay as providing adequate protection.
Where a third state is not recognised as offering an adequate level of protection, appropriate safeguards must be deployed, such as the execution of SCCs, Binding Corporate Rules (BCR), etc. The appropriate safeguards should be complemented by supplementary measures necessary to bring the level of protection of the transferred data up to the EU standard of essential equivalence if the law or practice of the third country may impinge on the effectiveness of the appropriate safeguards. The Schrems II Decision (C-311/18), the EDPB Guidelines on supplementary measures and the European Commission's implementing decision require a prior assessment of the law or practice of the data-importing jurisdiction before proceeding with personal data transfers, also called a Transfer Impact Assessment (TIA).
In the absence of adequacy decisions or appropriate safeguards, a transfer of personal data can rely on some exceptions (Article 49 of the GDPR), such as the explicit consent of the data subject, the performance of a contract between the data subject and the controller, or the implementation of pre-contractual measures taken at the data subjects' request.
The Privacy Shield was a self-certification mechanism for companies established in the United States, which had previously been recognised by the European Commission as providing an adequate level of protection for personal data transferred to such companies. The European Court of Justice invalidated the EU-US Privacy Shield because it was not circumscribed in a way that satisfies requirements essentially equivalent to those required under EU law (“Schrems II”, C-311/18).
SCCs are model contracts for the transfer of personal data adopted by the European Commission. There were originally two types of SCC: those governing transfers between two data controllers, and those governing transfers between a data controller and a data processor. In the Schrems II case, Maximilian Schrems challenged the sufficiency of the SCCs to protect personal data transferred to the United States. The Court of Justice considered that SCCs for the transfer of personal data to third countries are valid in principle. However, the decision imposes an obligation on the data exporter and the recipient of the data to verify, prior to any transfer, whether the level of protection is respected in the third country concerned; see 4.1 Restrictions on International Data Issues. New sets of SCCs were published by the European Commission on 4 June 2021, including four modules based on the parties’ qualifications:
BCR constitute intra-group data protection agreements for the transfer of personal data. They must be evaluated and validated against the EDPB's standards. BCR must be legally binding and implemented by all relevant entities of the group of companies, must expressly confer rights on data subjects regarding the processing of their personal data, and must meet the requirements set out in the GDPR.
No government notifications or approvals are required to transfer data internationally.
The French Data Protection Act does not include provisions requiring personal data to be localised in France. The GDPR rules on transfers of personal data apply in France, and no stricter rules are imposed by the French Data Protection Act.
Public archives are considered national treasures, and French law provides that national treasures cannot be exported. Hosting data outside France is considered exporting. The following Articles of the French Heritage Code apply to exporting:
No software codes, algorithms or similar technical details are required to be communicated to the government.
The collection and transfer of personal data in connection with foreign government data requests may be limited in France through Articles 44 et seq of the GDPR pertaining to transfers of personal data outside the European Union, the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
The "French blocking statute" is Law No 68-678 of 26 July 1968 relating to the communication of documents and information of an economic, commercial, industrial, financial or technical nature to foreign individuals or legal entities.
The key measures of this law are as follows:
Big Data Analytics
See 2.2 Sectoral and Special Issues (Cookies and beacons) for the regulation of analytics cookies and online trackers.
Profiling
A profiling processing operation is based on the establishment of an individualised profile, relating to a particular person, in order to evaluate certain personal aspects of that person, with a view to making a judgement or drawing conclusions about him or her.
A fully automated decision is a decision taken with respect to an individual, through algorithms applied to that individual's personal data, without any human being intervening in the process.
In practice, the two notions are closely linked: profiling a person frequently leads to a decision being made about them, and many fully automated decisions are made on the basis of profiling.
Article 22 of the GDPR regulates automated decision processing, whether it is based on profiling or not. Pursuant to this article, individuals have the right not to be subject to a decision based exclusively on automated processing and producing legal effects concerning them or significantly affecting them in a similar way. However, in certain specific cases, a person may be the subject of a fully automated decision, even if it has a legal effect or a significant impact on them. These exceptions concern:
Artificial Intelligence
Article 22 of the GDPR provides a framework for the use of personal data necessary for the operation of algorithms, as mentioned above.
Article 47 of the French Data Protection Act provides an additional exception for individual administrative decisions, provided that the processing does not involve sensitive data. Such decisions must include an explicit mention that the individual decision has been taken on the basis of algorithmic processing, and the data controller must be able to explain in detail, and in a form intelligible to the data subject, how the processing has been carried out.
On 21 April 2021, the European Commission published a proposal for a Regulation laying down harmonised rules on artificial intelligence (COM/2021/206 final).
Internet of Things (IoT)
The CNIL has issued several practical guides on IoT and on securing connected devices, and in particular on connected TVs, toys, cars and kitchen robots, to raise awareness and to invite manufacturers to implement data protection by design and by default measures. In this context, the CNIL makes the following recommendations in particular:
Autonomous decision-making
See above sections on automated decision-making and artificial intelligence.
In addition, in 2017 the CNIL issued guidelines on responsible data usage for connected vehicles, which analyse three processing scenarios under which the data collected in the vehicle:
The CNIL promotes the first scenario, which involves processing data locally in the vehicle, without transmission to service providers, as it provides for the best data protection guarantees.
The CNIL also states that the licence plate number and the vehicle serial number are personal data, as is any data relating indirectly to a data subject (data relating to the journeys made, the condition of the vehicle's parts, the dates of technical inspections, the number of kilometres driven, driving style, etc).
The EDPB also issued guidelines 1/2020 on Connected Vehicles, on 9 March 2021.
Although the CNIL's compliance package currently applies to connected vehicles, two of its obligations have a considerable impact on the development of autonomous vehicles: the implementation of the data protection by design and by default principles, as well as the security by default underlined by the GDPR.
Facial Recognition
The CNIL considers that a balanced overview is necessary in order to avoid any confusion and any blanket conclusion on facial recognition technology, as it covers a wide range of possible uses (from unlocking smartphones to opening bank accounts and recognising a person sought by police in a crowd), each raising different issues.
Biometric processing, including facial recognition, has been identified by the CNIL as a key challenge under high scrutiny and is regulated by several legal frameworks.
The CNIL issued a position paper on the use of facial recognition techniques, which supported a risk-based approach to determine which risks are not acceptable in a democratic society and which ones can be assumed with appropriate safeguards. It also requires that the proportionality of the means deployed and the special protection afforded to children are both guaranteed. When the legal basis of the processing is the consent of the data subjects, the CNIL specifies that the provision of an alternative means that does not involve biometric devices, without additional cost or particular constraints for the data subjects, is essential in order to guarantee freedom of consent.
The CNIL also specifies that the data controller must regularly ensure that the automatic erasure of biometric templates is effective.
The CNIL pays close attention to facial recognition; for example, it issued a formal notice to a company providing facial recognition software for using photos and videos publicly accessible on the internet. The CNIL also issued a draft guide on intelligent and augmented video devices in places open to the public (open to public consultation until 11 March 2022).
Geolocation
Geolocation is only considered by the CNIL for specific applications – eg, geolocation of employees' vehicles or geolocation to control employees' working time. The CNIL excludes the use of geolocation of employees' vehicles for monitoring compliance with speed limits, for the permanent monitoring of employees, for the collection of location data outside working hours (including to fight against theft or to check compliance with the conditions of use of the vehicle), or to calculate the working time of employees when another device already exists. In addition, geolocation processing is considered by the CNIL to be intrusive and is likely to require a DPIA.
Drones
There is no privacy-specific law provision in France dedicated to drones, but the data protection and security by default principles apply. French Law No 2016-1428 dated 24 October 2016 only contains general provisions pertaining to security and training requirements regarding the use of certain drones. On 12 January 2021, the CNIL sanctioned the French Ministry of the Interior for illegally using drones equipped with cameras, particularly to monitor compliance with COVID-19 lockdown measures. The CNIL required the Ministry to cease all drone activity until a legal framework specifically authorises such use, or until a technical system preventing the identification of individuals is implemented.
Disinformation or Other Online Harms
See 2.2 Sectoral and Special Issues (Hate speech, disinformation, terrorist propaganda, abusive material, political manipulation, bullying, etc).
Many collective and sectoral initiatives have been launched by industry stakeholders to mitigate the risks relating to the use of disruptive technologies and to develop guidance for reasonable uses. The CNIL and industry professionals are developing sectoral codes of conduct to introduce best practices and harmonisation for specific and commonly used processing operations within an industry.
See 2.5 Enforcement and Litigation.
Several privacy-related issues occur in the context of due diligence processes in corporate transactions. Such implications are stressed by the recent statement of the EDPB adopted on 19 February 2020 regarding the privacy implications of mergers. Corporate transactions entail more and more privacy issues, such as the combination of personal data, the disclosure of personal data in the transaction process and securing the transaction process. For instance, the setting up of a data room as part of the due diligence process must comply with the main principles of the GDPR (security measures to be implemented for access to the data room, prohibition of shared accounts, minimisation of communicated personal data, etc).
The security of networks and information systems is governed by the Network and Information System Security Directive (NIS) of 6 July 2016, transposed into French law by Law No 2018-133 of 26 February 2018 laying down various provisions adapting national law to European Union law in the field of security, and by its implementing Decree No 2018-384 of 23 May 2018 on the security of the networks and information systems of essential service operators and digital service providers. On the same matter, the EU is now working on the adoption of a revised NIS 2 Directive, following a first revision proposal of 16 December 2020. More details are provided in the Cybersecurity: France Law and Practice and Cybersecurity: France Trends and Developments chapters.
17, avenue Matignon
CS 30027
75378 Paris cedex 08
France
+33 1 5367 4747
+33 1 5367 4748
www.hoganlovells.com

Adoption of a New Regulatory Framework for AI
On 21 April 2021, the European Commission published a proposal for a regulation laying down the basis of harmonised rules on artificial intelligence (AI) (COM/2021/206 final) (the AI Proposal). The AI Proposal complements the 2019 "Ethics Guidelines for Trustworthy AI", the purpose of which was to highlight the processes and best practices around AI. The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) analysed the content of the AI Proposal and published a joint opinion on 18 June 2021, highlighting some provisions to be improved, including in order to have better correlation between the AI Proposal and the GDPR.
Among a number of provisions, the proposed regulation provides specific rules for AI systems that present a high risk to the health, safety or fundamental rights of natural persons. Following a risk-based approach, these high-risk AI systems are allowed on the European market under the strict condition that they comply with mandatory requirements listed in the regulation and that an ex ante assessment of their compliance is carried out.
In particular, these requirements include ensuring the quality and governance of data to limit risks and discriminatory effects, implementing detailed technical documentation, logging access and actions, having robust security measures, ensuring transparency, maintaining human control, etc. The EDPB and EDPS recommended that these measures be reinforced and, above all, supplemented with requirements to comply with data protection regulation (compliance with which is not currently a condition for market entry for AI systems under the current version of the AI Proposal).
The assessment must be carried out before deploying the AI system, and must then be regularly updated. It must take into account the specific use that can be made of the AI system as well as the context of its use.
The AI Proposal also includes a list of prohibited AI systems, which the EDPB and the EDPS found to be too limited. They suggested that the list be completed by adding AI systems that impact human dignity, social scoring systems, automatic biometric systems in public spaces, etc.
The AI Proposal is particularly welcome given the general increase in the use of AI in various fields of our lives, notably through biometric identification techniques, which raise both ethical and legal questions and elicit significant attention from data protection authorities. For example, the French Data Protection Authority (CNIL) has published several guidelines on new video usages (September 2018), facial recognition (November 2019) and the use of smart and thermal cameras in the context of the COVID-19 pandemic (June 2020). The AI Proposal will also ensure a harmonised framework within EU Member States.
Although the EU regulation is not yet in force, it provides clear insight into the future of AI regulation in Europe. Companies that intend to continue to innovate with AI must address the technology’s risks and take steps to prepare for this future regulation as well as for subsequent national regulations that are to follow, in particular because sanctions based on the AI Proposal are higher than the GDPR ones, ranging up to EUR30 million or 6% of the global annual turnover, whichever is higher.
CNIL Policy on Cookies and Online Trackers
The use of cookies and other tracking technologies is a major focus of the CNIL's 2021 investigations programme.
In September 2020, the CNIL published new guidelines and a recommendation in which it explained how the provisions of the EU ePrivacy Directive should be implemented, including regarding the information to be provided on cookies (purposes, categories of cookies, retention periods, etc), the possibility to refuse cookies as easily as accepting them, the possibility for a user to change their preferences regarding cookies, examples of how cookie banners should be displayed, etc. Although the guidelines and recommendation are not legally binding and should be seen only as interpretation and guidance from the CNIL on how to implement the ePrivacy rules, the CNIL expects them to be strictly followed and any deviations to be duly justified.
Based on these guidelines and the recommendation, on 21 March 2021 the CNIL initiated a proactive enforcement strategy, with multiple online controls of companies’ websites regarding their use of cookies and other tracking technologies. This triggered several public sanctions, particularly towards two of the GAFA companies, which were sanctioned with regard to cookie refusal modalities (EUR150 million and EUR60 million, respectively). The CNIL's position is regularly supported by the French Administrative Supreme Court (Conseil d’Etat), which has issued decisions in appeal proceedings validating the CNIL’s sanctions (the most recent one on cookie matters upholding a EUR100 million CNIL fine issued to a GAFA company in December 2020).
The enforcement of these new rules seriously impacts digital operators that rely on cookies and trackers for their ad-based revenue model, leading them to limit the use of third-party cookies and look for alternatives to develop new devices, encouraged by the CNIL's October 2021 recommendations regarding the following alternatives to third-party cookies:
The CNIL points out, however, that these practices must still comply with the ePrivacy rules and the GDPR.
Data Transfers and Digital Sovereignty
On 16 July 2020, the Court of Justice of the European Union (CJEU) invalidated Decision 2016/1250 of the European Commission on the EU-US Privacy Shield, ruling that this tool did not ensure an adequate level of protection for personal data transferred to the United States, in particular due to US surveillance laws that may trigger possible access by the US government to personal data transferred from the EU.
The CJEU also ruled on the validity of the European Commission's standard contractual clauses (SCCs) as tools for transfers to third countries that are not adequate, provided that they are supplemented with additional safeguards. Such safeguards include a Transfer Impact Assessment (TIA) to analyse the potential risks of access by authorities and the measures limiting those risks, reinforced security measures such as encryption, and the completion of a vendor questionnaire to identify the measures taken to protect personal data.
On 4 June 2021, the European Commission adopted a new set of SCCs on the transfer of personal data to a third country, replacing the existing versions. They are built according to four modules depending on the parties’ qualifications:
The new SCCs increase the obligations on data exporters and importers. To help them meet these obligations, the EDPB published recommendations on measures that supplement transfer tools, including mapping data transfers to third countries, identifying the corresponding IT tools, assessing the law and practices of the recipient country, identifying the necessary supplementary measures (encryption, pseudonymisation, etc) and implementing them, taking formal procedural steps if required, and re-evaluating these measures on a regular basis.
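Purely as an illustration of these steps, an exporter might keep an internal transfer register along the following lines; the structure and field names below are hypothetical, not an EDPB-prescribed format.

```typescript
// Hypothetical transfer-register entry mirroring the EDPB's recommended steps.
interface TransferRecord {
  recipient: string;               // data importer
  country: string;                 // third country of destination
  transferTool: "SCC" | "BCR" | "AdequacyDecision" | "Art49Derogation";
  itToolsInvolved: string[];       // step: identify the corresponding IT tools
  localLawAssessed: boolean;       // step: TIA on the recipient country's law and practice
  supplementaryMeasures: string[]; // step: encryption, pseudonymisation, etc
  nextReviewDate: string;          // step: re-evaluate on a regular basis
}

const exampleTransfer: TransferRecord = {
  recipient: "ExampleCloud Inc.", // fictitious importer
  country: "US",
  transferTool: "SCC",
  itToolsInvolved: ["CRM SaaS", "support ticketing tool"],
  localLawAssessed: true,
  supplementaryMeasures: ["encryption at rest (keys held in the EU)", "pseudonymisation"],
  nextReviewDate: "2023-01-01",
};

console.log(`Re-assess transfer to ${exampleTransfer.recipient} by ${exampleTransfer.nextReviewDate}`);
```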
Discussions around data transfers outside the EU and the remaining risks of potential access by local authorities, particularly US authorities, are increasingly fuelling debate in France around “digital sovereignty”. For example, the CNIL has publicly stated that it is not in favour of using US cloud service providers, favouring hosting in the EU and, if possible, in France.
In 2021, several European authorities, including the CNIL, approved a European Code of Conduct for Cloud Infrastructure Services (CISPE). This first authorisation of a cloud code of conduct could help European operators to position themselves and foster the emergence of digital sovereignty. In this respect, two projects should be mentioned:
The French Secretary of State for Digital Affairs, M. Cédric O, announced a EUR1.8 billion investment plan for developing a French cloud offer. The emergence of European cloud operators could also help in offering such a sovereign cloud.
17, avenue Matignon
CS 30027
75378 Paris cedex 08
France
+33 1 5367 4747
+33 1 5367 4748
www.hoganlovells.com