Data Protection & Privacy 2022

Last Updated March 10, 2022

UK

Law and Practice

Authors



Hunton Andrews Kurth has more than 1,000 lawyers located across the world, and is an international law firm with a renowned global privacy and cybersecurity practice. The lawyers are known for their deep experience, breadth of knowledge and outstanding client service. In addition to its legal practice, the firm distinguishes itself through its Centre for Information Policy Leadership, a global privacy and security think-tank that works with industry leaders, regulatory authorities and policymakers to develop global solutions and best practices for privacy and the responsible use of data to enable the modern information age. For the latest resources in privacy, data protection and cybersecurity, visit www.huntonprivacyblog.com.

UK General Data Protection Regulation and Data Protection Act 2018

The UK left the European Union on 31 January 2020, and the Brexit transition period ended on 31 December 2020; from that date the EU General Data Protection Regulation (EU) 2016/679 (EU GDPR) no longer applies directly in the UK. Instead, the EU GDPR has been incorporated into UK law (the UK GDPR) by the Data Protection, Privacy and Electronic Communications (Amendments etc.) (EU Exit) Regulations 2019 (as amended by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2020) (together, the DPPEC Regulations). For practical purposes, the UK’s data protection regime is almost identical to that which applied prior to Brexit, although that position is likely to change over time.

The UK GDPR and the Data Protection Act 2018 (DPA 2018) govern the processing of personal data – ie, data that relates to an identified or identifiable individual. The UK GDPR has extra-territorial effect, so that organisations that offer goods and services to, or monitor the behaviour of, individuals in the UK are subject to its provisions even where the collection and processing of personal data take place outside the UK. Furthermore, the UK GDPR applies to processing that takes place in the context of the activities of a UK establishment (ie, an entity or branch located within the UK).

Generally, the UK GDPR applies to automated processing, but it also governs non-automated processing of personal data that forms part of a “filing system”. A filing system includes manual files that are sufficiently structured such that information about an individual is readily accessible.

"Personal data" is defined broadly to include identifiers such as name, identification number, location data, online identifiers and factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of individuals. Accordingly, an organisation will hold personal data where an individual’s identity is not known but the individual can be differentiated from other individuals – for example, by reference to a cookie identifier. The UK GDPR does not apply to data that has been irreversibly anonymised, but pseudonymised data remains within its scope.

The UK GDPR regulates controllers and processors. "Controllers" are organisations that determine the means and purposes of the processing of personal data; "processors" are organisations that process personal data on behalf of a controller. Processors have no real discretion as to how or why data is processed (although a controller’s instructions may permit limited discretion as to how those instructions are carried out by the processor).

The DPA 2018 supplements the UK GDPR by:

  • addressing issues under the UK GDPR reserved for member states to address under national law (eg, sensitive data processing, additional processing grounds, and certain exemptions);
  • implementing the Law Enforcement Directive (EU2016/680), which governs the processing of personal data for law enforcement purposes; and
  • extending the UK regime to cover intelligence services processing, and certain areas outside the scope of the UK GDPR.

Following Brexit, the DPA 2018 must be read in light of the DPPEC Regulations.

ePrivacy Directive/Privacy and Electronic Communications (EC Directive) Regulations 2003

The ePrivacy Directive supplements the UK GDPR by imposing detailed requirements for the use of personal data in electronic communications. Specifically, it regulates electronic marketing and the use of cookies, and addresses security and privacy in electronic communications services. The ePrivacy Directive was incorporated into UK law by the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). PECR overlaps with the UK GDPR when electronic communications involve the processing of personal data, such as when individuals are tracked using cookies and when they receive direct marketing at an email address from which they are identifiable.

The ePrivacy Directive is currently under review, with negotiations between the Council of the European Union, the European Parliament and the European Commission on a new ePrivacy Regulation underway. As of the date of publication, a number of areas have been agreed, but significant areas of disagreement remain. Once the Regulation is finalised, the UK will need to decide whether to adopt similar legislation.

Investigatory Powers Act 2016 and Investigatory Powers (Interception by Businesses etc. for Monitoring and Record-keeping Purposes) Regulations 2018

The provisions of the ePrivacy Directive that address confidentiality of communications now fall under the Investigatory Powers Act 2016 (IPA). The Investigatory Powers (Interception by Businesses etc. for Monitoring and Record-keeping Purposes) Regulations 2018 (IPIBR) contain exceptions to the general prohibition on the interception of communications that allow businesses to monitor communications lawfully, subject to satisfying strict requirements (see 2.4 Workplace Privacy).

Overview of Enforcement Environment

The UK’s data protection laws are enforced by the Information Commissioner's Office (ICO) (see 1.2 Regulators), which has a range of investigative, corrective and advisory powers. Long an advocate of a proportionate approach to enforcement, the ICO is an active regulator that is willing to use its investigative and corrective powers as required. In 2021, the largest single fine it imposed was GBP500,000 (compared with GBP20 million in 2020), the total amount of fines imposed under the UK GDPR was GBP535,000 (GBP40.65 million in 2020), and the total amount imposed under PECR was GBP3.27 million (GBP1.77 million in 2020). The ICO is currently leading an extended, industry-wide investigation into data brokers and the adtech ecosystem. In November 2021, the ICO announced its provisional intent to impose a fine of just over GBP17 million on Clearview AI Inc.

The DPA 2018 also creates criminal offences for the unlawful obtaining of personal data (which includes procuring the disclosure of personal data without the controller’s consent), re-identifying, without the controller’s consent, information that has been de-identified, and altering, blocking, erasing, destroying or concealing personal data to prevent its disclosure to an individual – for example, in response to an access request.

The ICO is the UK’s independent data protection and information rights regulator, responsible for regulatory oversight of, amongst others, the UK GDPR, PECR, the Network and Information Systems Regulations 2018 (NIS Regulations), Environmental Information Regulations and freedom of information matters.

Regulatory Powers

Under the UK GDPR, the ICO has the following broad categories of powers:

  • investigative – requiring a controller or processor to provide information (including by obtaining access to premises and/or processing equipment), conducting audits, and reviewing certifications;
  • corrective – issuing warnings and reprimands, requiring compliance with individuals’ requests, ordering remediation of processing, requiring notification of data breaches to individuals, restricting or banning processing, requiring rectification or restriction of processing, suspending data flows or imposing fines of up to GBP17.5 million or 4% of a company’s worldwide annual turnover for the most serious breaches of the UK GDPR;
  • authorisation and advisory – responding to the prior consultation procedure, issuing opinions (including to Parliament), approving codes of conduct, authorising processing, accrediting certification bodies and issuing certifications, authorising standard data protection contractual clauses, and approving binding corporate rules.

Under PECR, the ICO may impose fines of up to GBP500,000 per breach and has a range of enforcement powers, including the investigation of complaints, requiring information to be provided, a limited form of audit power, requiring steps to be taken or not taken, and requiring that personal data processing ceases.

The extraterritorial scope of the UK GDPR means that the ICO has power to take action against controllers and processors based outside the UK. To date, this power appears to have been used only once, when the ICO issued an enforcement notice against Canadian company Aggregate IQ (in the context of the ICO’s Cambridge Analytica investigation), requiring it to delete the personal data of UK citizens. If imposed, the ICO’s proposed GBP17 million fine against Clearview AI Inc would be the second occasion on which the ICO has used its enforcement powers against a non-UK company. In practice, the ICO’s ability to regulate foreign companies remains largely untested.

ICO Investigations and Enforcement

The ICO has power to initiate investigations and act on complaints utilising the following tools.

  • Information notices to require a controller, processor or another person to provide information to assist the ICO in carrying out its functions. It is a criminal offence to make a false statement in response to an information notice.
  • Assessment notices to require a controller or processor to permit the ICO to carry out an assessment of whether they comply with data protection legislation. Under these notices, the ICO can enter premises, request documents and information, and require that individuals are made available for interview.
  • Enforcement notices may be issued at the conclusion of an investigation, requiring a controller or processor to take steps, or to refrain from taking steps.
  • Monetary penalty notices impose fines for failure to comply with data protection legislation, or failure to comply with an information notice, assessment notice or enforcement notice.
  • Fixed penalties of up to GBP4,000 may apply where a controller or processor fails to pay its annual fee to the ICO.

The ICO seeks to take a proportionate approach to enforcement, prioritising cases in which there is a significant risk of harm to individuals. The process by which the ICO will assess whether and, if so, when to issue a notice and exercise its enforcement powers is set out in the ICO’s Regulatory Action Policy (currently under review) and its Statutory Guidance on Regulatory Action (recently the subject of a public consultation). There is a right of appeal against an enforcement notice to the First-tier Tribunal (Information Rights) within 28 calendar days.

Brexit and EU

On 31 January 2020, the UK left the EU and entered a transition period, which ended on 31 December 2020. On 28 June 2021, the European Commission published two adequacy decisions in respect of the UK: one for transfers under the EU GDPR and the other for transfers under the Law Enforcement Directive. These decisions contain the Commission’s assessment of the UK’s laws for protecting personal data and formally designate the UK as adequate. The EU GDPR adequacy decision confirms that the UK provides adequate protection for personal data transferred from the EU to the UK under the EU GDPR.

The adequacy decisions are due to expire on 27 June 2025, although the Commission may withdraw them before that date if it determines that the UK no longer provides an adequate level of protection for personal data. In 2024, the Commission will decide whether to extend the decisions for a further period of up to four years.

Following Brexit, UK organisations must consider whether any of their data processing activities continue to be governed by the EU GDPR (eg, where the UK entity offers goods or services to individuals in the EU, or monitors their behaviour, per Article 3(2) of the EU GDPR). In those circumstances, the UK entity may need to appoint a representative in the EU under Article 27 of the EU GDPR. EU organisations will need to undertake a similar analysis to assess whether they are subject to the UK GDPR.

ePrivacy Regulation

The ePrivacy Regulation is set to replace the ePrivacy Directive in the EU once a final text is agreed. As a regulation, once implemented, it will apply directly in all EU member states, replacing any domestic implementation of the ePrivacy Directive. As the UK is no longer an EU member state, the ePrivacy Regulation will not automatically apply, but the UK may choose to implement similar provisions.

Data Protection NGOs

There are several specialist privacy and data protection NGOs in the UK, with the most well-known being:

  • Privacy International (with a particular focus on government surveillance);
  • Open Rights Group (government and corporate surveillance, online copyright issues);
  • MedConfidential (health data); and
  • Big Brother Watch (data protection and surveillance technologies).

These organisations lobby, participate in government consultations and sometimes engage in litigation.

Industry Self-Regulatory Organisations

Organisations engaging in advertising in the UK should be aware of the industry self-regulatory organisations: the Advertising Standards Authority (ASA) and the Committee of Advertising Practice (CAP), together with the Internet Advertising Bureau UK (IAB UK).

Following Brexit, the UK’s data protection regime is essentially the same as that of the EU – namely, an omnibus data protection framework. The UK’s implementation of the ePrivacy Directive, PECR, is aligned with equivalent laws across EU member states, subject to permitted national variations. The enforcement of both legal frameworks in the UK is generally proportionate, and the ICO is an active regulator.

See 1.4 Multilateral and Subnational Issues and the discussion of Schrems II under 1.8 Significant Pending Changes, Hot Topics and Issues and 4.2 Mechanisms or Derogations that Apply to International Data Transfers.

UK Adequacy

As discussed in 1.4 Multilateral and Subnational Issues, the European Commission published two adequacy decisions in respect of the UK, in relation to transfers under the EU GDPR and the Law Enforcement Directive. The EU GDPR adequacy decision confirms that the UK provides adequate protection for personal data transferred from the EU to the UK under the EU GDPR.

Schrems II

In the wake of the Schrems II judgment and the invalidation of the Privacy Shield, UK organisations must undertake a Schrems II transfer risk assessment for transfers of personal data that rely on appropriate safeguards under Article 46 (ie, Standard Contractual Clauses (SCCs) and binding corporate rules). These transfer risk assessments require UK organisations to assess whether an adequate level of protection is provided to personal data in the destination jurisdiction. Where this is not the case, UK organisations must consider whether additional safeguards can be implemented in connection with the transfer that would ensure an adequate level of protection. These may include legal safeguards (eg, additional contractual obligations), technical safeguards (eg, encryption of the data in transit and/or pseudonymisation) and organisational safeguards (eg, a procedure for handling and challenging government authorities’ requests for access to or disclosure of personal data). If no adequate safeguards can be implemented, the transfer should not take place.

New Standard Contractual Clauses

On 4 June 2021, the European Commission adopted new SCCs for cross-border transfers to non-adequate third countries. The new SCCs took effect on 27 June 2021. An 18-month grace period, lasting until 27 December 2022, allows organisations to transition arrangements based on the old SCCs to the new SCCs. Until that date, the old SCCs can still be relied upon as a transfer mechanism for existing arrangements entered into before 27 September 2021.

Following Brexit, the new SCCs cannot be relied upon to facilitate transfers of personal data by organisations subject to the UK GDPR. The ICO has confirmed that organisations should continue to rely on the old SCCs for UK transfers until such time as the UK publishes its own transfer tools. To this end, the ICO has published its own draft SCCs, which are expected to take effect on 21 March 2022, subject to parliamentary approval.

ePrivacy Regulation

In February 2021, the Council of the European Union agreed its text for the ePrivacy Regulation, paving the way for trilogue negotiations to begin. The ePrivacy Regulation will not automatically apply in the UK, but the UK government may choose to implement similar provisions.

COVID-19

In response to the COVID-19 pandemic, the ICO published guidance reminding organisations that, while data protection laws continue to apply, these should not stop organisations from using personal data proportionately. The ICO encouraged organisations to focus on the key principles of necessity, minimisation, transparency, fairness and security, and to ensure that individuals could continue to exercise their rights.

Data Protection Principles

Overarching principles are set out under Article 5 of the UK GDPR as follows.

  • Lawfulness, fairness and transparency – controllers must provide notice of data processing activities to individuals, and ensure that processing is fair and based on a lawful basis (discussed below).
  • Purpose limitation – data must be collected for specified, explicit and legitimate purposes and not further processed in a manner incompatible with those purposes.
  • Data minimisation – data must be adequate, relevant and limited to what is necessary in relation to the processing purposes.
  • Accuracy – data must be accurate and kept up-to-date.
  • Storage limitation – data must be kept in identifiable form for no longer than necessary.
  • Integrity and confidentiality – data must be processed in a manner that ensures appropriate security. Article 32 of the UK GDPR sets out specific security measures (see below). Appropriate safeguards for personal data depend on the state of the art, the costs of implementation and the nature and context of the processing, as well as inherent risk. The suggested security measures include pseudonymisation and encryption, and processes for regular testing, assessment and evaluation of the effectiveness of the technical and organisational measures used.
  • Accountability – the controller is responsible for, and must be able to demonstrate, compliance with these principles.

Accountability

Under Article 5(2), the principle of accountability requires controllers to implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing complies with the UK GDPR, and to keep those measures under review, updating them as required. Accountability also requires transparency with respect to the individuals whose data is processed. Increasingly, when investigating a complaint, the ICO reviews how controllers have complied with the accountability principle.

The UK GDPR permits a risk-based approach to compliance. Accordingly, controllers have a degree of flexibility and the measures implemented will differ between organisations, based on risk. Generally, the greater the risk presented by a processing activity, and the more intrusive it is to data subjects, the more the controller will need to do to implement robust security and accountability measures. Examples of accountability measures include internal policies and procedures, training, transparency measures, risk assessments, monitoring of internal compliance, ensuring senior leadership and oversight, and strong internal enforcement in response to complaints.

Lawful Basis

To ensure data processing is lawful, the controller must satisfy one of the following legal bases (provided by Article 6 of the UK GDPR).

  • Consent – the data subject must provide freely given, specific, informed and unambiguous consent to the processing of their personal data for one or more specified purposes. Consent is not appropriate where there is an imbalance of power between the data subject and the controller, such as in an employment context, since such consent generally cannot be considered freely given. Consent requires a proactive, affirmative act, and generally cannot be implied. Individuals must be able to withdraw their consent at any time.
  • Contract performance – the processing must be necessary for the performance of a contract to which the data subject is a party, or to take steps at the data subject's request prior to entering into a contract.
  • Controller's legal obligation – the processing must be necessary to enable the controller to comply with a legal obligation, other than one imposed by contract.
  • Vital interests – the processing must be necessary to protect the vital interests of the data subject.
  • Public interest – the processing must be necessary for the performance of a task carried out in the public interest or in the exercise of the controller's or a third party's official authority.
  • Legitimate interests – the processing must be necessary for the purposes of the legitimate interests pursued by the controller or by third parties to whom the data is disclosed, except where those interests are overridden by the interests of the data subject. In each instance where the legitimate interests basis is relied on, the controller must carry out and document a balancing test.

Special Category Data

To process special category data and data relating to criminal offences, controllers must not only ensure a lawful basis for processing, but also establish that they can rely on an exemption to the UK GDPR’s general prohibition on the processing of such data under Article 9 (see 2.2 Sectoral and Special Issues).

Data Protection Officer

Article 37 of the UK GDPR requires controllers and processors to appoint a Data Protection Officer (DPO) if they are a public authority or where their core activities:

  • require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
  • consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.

Organisations may appoint a DPO voluntarily, in which case they will be held to the same standards with respect to their DPO as organisations that are obliged to appoint one.

The role of a DPO is to monitor data protection compliance, inform and advise the organisation on data protection (including in relation to Data Protection Impact Assessments (DPIAs), as discussed below) and to co-operate with the supervisory authority. For example, the DPO will ensure that appropriate training is provided to employees, and act as a contact point for regulators and for data subjects.

Data Protection Impact Assessments

Where data processing activities are likely to result in a high risk to the rights and freedoms of individuals, having regard to the nature, scope, context and purposes of the processing, controllers must undertake a DPIA under Article 35 of the UK GDPR. A DPIA is typically required where processing involves the use of new technologies, and must be undertaken where the processing involves:

  • systematic and extensive evaluation of personal aspects relating to data subjects, based on automated processing (including profiling), on which decisions are based that produce legal effects in relation to individuals;
  • large-scale processing of special category personal data or data relating to criminal convictions and offences; or
  • systematic monitoring of a publicly accessible area on a large scale.

A DPIA is a key accountability measure for organisations, intended to assist in identifying and mitigating risk. What constitutes “high risk” is not specified in the UK GDPR, but the ICO has published guidance on the matter. The ICO considers that a DPIA should always be undertaken when profiling is carried out on a large scale, where data is combined, compared or matched from multiple sources, or where profiling, automated decision-making or special category data is used to help make decisions concerning access to a service, opportunity or benefit.

Where a DPIA does not demonstrate that the risks of processing may be adequately mitigated, the controller must consult the ICO under the prior consultation procedure before commencing the processing.

Data Processing Agreements

When a controller engages a processor to perform processing activities on its behalf, it must enter into a data processing agreement that complies with Article 28 of the UK GDPR. Obligations imposed on the processor under any such agreement must, in turn, be flowed down to any subsequent processor in the chain – ie, a “sub-processor”. While not required by the UK GDPR, such contracts generally include an apportionment of liability for data breaches and other violations of data protection laws between the parties.

Security

Article 32 of the UK GDPR requires controllers and processors to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk of their processing. To determine which measures are appropriate, organisations should take into account the state of the art, the costs of implementation and the nature, scope, context and purposes of the processing, as well as the likelihood and severity of the risk for individuals’ rights and freedoms.

The UK GDPR suggests a range of measures that may be appropriate. These include pseudonymisation and encryption, measures that ensure the ongoing confidentiality, integrity, availability and resilience of processing systems, measures that ensure the availability of and access to personal data in the event of a physical or technical incident, and a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring security.

Notably, security measures form an important part of the privacy “by design and by default” approach to data protection that organisations are required to adopt under Article 25 of the UK GDPR, in addition to the measures and obligations set out above.

Breach

A personal data breach under the UK GDPR is a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data. Article 33 requires a controller to report a personal data breach to the ICO without undue delay and, where feasible, within 72 hours of becoming aware of the breach. Notification is not required if the breach is unlikely to result in a risk to the rights and freedoms of natural persons. Processors must notify the relevant controller of any breach without undue delay, but are not required to notify the ICO.

Under Article 34, if the breach is likely to result in a high risk to the rights and freedoms of the affected data subjects, the controller must notify those data subjects without undue delay.

Special Category Data

The UK GDPR prohibits the processing of “special category” data (ie, personal data concerning racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for the purpose of uniquely identifying a natural person, health data and data relating to a natural person's sex life or sexual orientation), unless one of a limited number of exceptions applies. Personal data relating to criminal convictions and offences is not “special category” data, but forms a separate category of personal data. This data is also regarded as sensitive, and may only be processed for limited purposes. Financial data is not a sensitive data category.

Special category personal data may be processed where the individual has provided explicit consent or where the processing:

  • is carried out by a foundation or not-for-profit body with a political, philosophical, religious or trade union aim, provided that the processing relates to members and the organisation's purposes;
  • relates to data that is manifestly made public by the data subject; or
  • is necessary:
    1. in the context of employment, social security and social protection law;
    2. to protect the vital interests of the data subject;
    3. for the establishment, exercise or defence of legal claims;
    4. for reasons of substantial public interest and proportionate to the aim pursued;
    5. for the purposes of preventative or occupational medicine, for the assessment of working capacity or medical diagnosis;
    6. for reasons of public interest in the area of public health; or
    7. for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes.

Certain additional grounds for processing special category data are provided in Parts 1, 2 and 3 of Schedule 1 to the DPA 2018, which replicate and extend grounds previously included in the DPA 1998. Many of these grounds now require the controller to implement an Appropriate Policy Document to document how the processing complies with the data protection principles.

Criminal Convictions Data

The processing of criminal convictions data may be carried out under the control of an official authority or where authorised by a UK law providing appropriate safeguards for the rights and freedoms of data subjects. The DPA 2018 effectively provides that such data may be processed on the same basis as special category personal data, as described above.

Children’s Data

On 2 September 2021, the ICO’s Age Appropriate Design Code (AAD Code) came into force following a 12-month implementation period. The AAD Code consists of 15 design principles to ensure that the interests of children are central to the design and development of online services that they are likely to access. Requirements include the need for settings to be “high privacy” by default, that only the minimum amount of data is collected, that geolocation services default to “off”, and that "nudge techniques" are not utilised to weaken privacy safeguards (eg, by encouraging children to provide unnecessary data).

The ICO will enforce compliance in accordance with its Regulatory Action Policy, requiring organisations to demonstrate compliance. While failure to comply with the AAD Code is not a breach of the UK GDPR per se, non-compliance makes it difficult for organisations to demonstrate compliance with the UK GDPR and/or PECR. As such, they may be subject to sanctions, including warnings, reprimands, orders to cease processing data and fines.

Data Subject Rights

Chapter III of the UK GDPR provides data subjects with certain rights with respect to their data, as summarised below. The DPA 2018 contains a significant number of exemptions that determine whether the data subject rights apply in individual cases.

  • Information – a data controller must provide data subjects with the information listed in Articles 13 and 14 of the UK GDPR, ideally at the point at which their data is collected, or within a reasonable period after obtaining the data. This information is typically provided in a privacy notice.
  • Access – data subjects have the right to obtain confirmation from a data controller as to whether their personal data is processed and to obtain details of the processing, including information about the logic and consequences of processing in the case of automated decision-making and profiling.
  • Rectification – data subjects have the right to have inaccurate data corrected and incomplete data completed.
  • Erasure – data subjects have a qualified right to require controllers to erase their data in certain specified circumstances, such as where the personal data is no longer necessary for the purposes for which it was collected or where consent for processing is subsequently withdrawn.
  • Restriction of processing – data subjects can restrict the controller’s processing in certain circumstances, including where the data subject challenges the accuracy of the data or where processing is unlawful but the data subject opposes erasure.
  • Data portability – data subjects have a qualified right to obtain a copy of the personal data provided to the controller in a structured, commonly used and machine-readable format, and to have that data transmitted to another controller.
  • Objection – data subjects have a qualified right to object to processing based on the public interest or legitimate interests legal bases. Data subjects also have an unqualified right to object to direct marketing.

Data subjects have a right under Article 22 not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or which similarly significantly affects them.

Controllers are required to facilitate and comply with requests from data subjects to exercise these rights, and must take the action requested without undue delay and, in any event, within one month of receipt of the request. This deadline may be extended by a further two months where necessary.

PECR and Marketing Communications

Electronic marketing communications are governed by PECR in the UK (see 1.1 Laws). PECR requires prior consent to be obtained before any unsolicited electronic communications (ie, emails and texts) are sent to individuals in their personal capacity. An exemption, known as the “soft opt-in”, is available where the consumer provides their contact details during a sale (or negotiations for a sale), during which they have an opportunity to opt out of marketing communications. Organisations may then deliver unsolicited marketing for products or services that are similar to those purchased by the consumer, provided there is an opt-out opportunity in each subsequent marketing communication.

Business recipients must be permitted to opt out of further communications, but prior consent to electronic communications is not required.

Telephone and Fax

PECR also governs the use of marketing conducted via telephone and fax. Organisations must not make live marketing calls to numbers registered with the Telephone Preference Service (TPS) or Corporate TPS, unless the person has specifically consented to receive their calls.

Marketing activities will also fall within the scope of the UK GDPR where they involve the processing of personal data, even where the recipient is contacted via their business email address. Accordingly, the requirements of both regimes need to be considered.

PECR and Cookies

PECR also governs the use of cookies and similar technologies, requiring prior consent for non-essential cookies. Where consent is required under PECR, the appropriate legal basis under the UK GDPR for any associated processing of personal data collected by cookies will likely also be consent. The standard of consent required under PECR is the standard set out by the UK GDPR; accordingly, consent must be freely given, specific, informed and unambiguous. The ICO has indicated that it will focus on the use of cookies for behavioural advertising in forthcoming investigations and enforcement actions.

Monitoring

All monitoring in the workplace must be for a lawful purpose and fair and proportionate. Employers must balance their objectives against employees’ right to privacy. While employers have a legitimate interest in securing their systems, this must be balanced against the intrusiveness of any workplace monitoring. Only in limited circumstances can organisations undertake covert monitoring (eg, where there is reasonable suspicion of criminal behaviour).

Monitoring must be conducted in a transparent manner and the nature of the monitoring should be within the reasonable expectation of employees, who must be given notice. The UK GDPR’s other processing principles (eg, purpose limitation and data minimisation) should also be complied with.

Workplace monitoring of electronic communications is regulated by the IPA and the IPIBR. Under the IPIBR, interception of communications during transmission by employers is permitted only where the monitoring relates to business activities and the employer has made all reasonable efforts to inform both parties to the communication that interception may take place. Monitoring should only be conducted on communications systems provided by the employer, although personal communications sent using business systems may sometimes be intercepted.

Whistle-Blower Hotlines

Whistle-blower hotlines (whether telephone or web-based) typically involve the processing of personal data, and must comply with data protection requirements. Relevant personal data is likely to concern the whistle-blower, the subject of the report, the incident and details of any follow up. The principles described in 2.1 Omnibus Laws and General Requirements must be met. Additional complexity will arise if the hotline is part of a global reporting tool, and steps will need to be taken to determine whether personal data needs to be transferred abroad (eg, to the USA).

The European Parliament and Council of the European Union adopted the Whistleblower Directive 2019/1937, which member states were required to implement into their respective national laws by December 2021. Following Brexit, the UK is under no legal obligation to implement the Whistleblower Directive and, as at the date of writing, has not chosen to do so. The UK was among the countries the European Commission had already deemed to provide whistle-blowers with comprehensive protection (under the Public Interest Disclosure Act 1998, as incorporated into the Employment Rights Act 1996). While much of the content of the Whistleblower Directive is already reflected in UK law, the UK whistle-blower regime does not cover everything contained in the Directive.

Under the UK GDPR, the ICO has extensive enforcement powers, as described in 1.3 Administration and Enforcement Process. The sanctions imposed on controllers and processors that infringe the UK GDPR will depend on the severity of the breach, considering the nature, gravity and duration of the infringement, as well as whether the infringement was intentional, the degree of responsibility of the organisation for the incident, previous infringements and mitigating steps (see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation).

Implementation of the GDPR has triggered an increase in class actions, which typically (but not exclusively) operate on an opt-in basis. The UK GDPR also explicitly provides that compensation should be available to any data subject who suffers damage (eg, where a data subject has lost money), even where the damage is not material (eg, where a data subject has suffered distress). A recent decision has clarified that compensation is not available for mere “loss of control” of personal data in the absence of material damage.

In addition, the ICO may conduct investigations and is currently investigating the adtech sector generally, signalling that enforcement will follow.

The IPA regulates the targeted and bulk interception and acquisition of electronic communications by law enforcement agencies (such as police forces) in the UK. The IPA prohibits such activities unless they are carried out with “lawful authority”.

Part 2 of the IPA allows a law enforcement agency to obtain a warrant relating to the targeted interception of communications (ie, to a particular person or organisation, or a group of persons who share a common purpose or carry out an activity together).

Part 3 of the IPA allows designated senior officers of certain public authorities (including police forces) to authorise officers to obtain data from any person that relates to a telecommunications system or is derived from a telecommunications system, where it is necessary to obtain that data for the purpose of a specific operation or investigation.

Warrants issued under Part 2 of the IPA are issued by the Secretary of State and must be approved by a Judicial Commissioner appointed under the IPA. The Investigatory Powers Commissioner is responsible for oversight of actions by public authorities carried out under the IPA.

Section 2 of the IPA imposes a general duty in relation to privacy upon public authorities (including police forces) exercising powers authorised by the IPA. In particular, public authorities must have regard to whether what is sought to be achieved by a warrant or authorisation issued under Part 2 or 3 of the IPA could reasonably be achieved by other less intrusive means.

In addition to the IPA, Part 3 of the DPA 2018 imposes a number of obligations on law enforcement agencies in relation to their processing of personal data. For example, law enforcement agencies must comply with six data protection principles (such as that processing of personal data for law enforcement purposes must be lawful and fair), data subjects may exercise rights (such as the right of access) against law enforcement agencies that process personal data, and law enforcement agencies are subject to a number of accountability obligations (such as an obligation to maintain records of personal data processing activities).

The IPA also applies to the obtaining of information by government agencies for intelligence, anti-terrorism or other national security purposes. In addition to obtaining warrants under Part 2 of the IPA or authorisation under Part 3, government agencies may also obtain data in the following ways.

  • Part 4 of the IPA grants power to the Secretary of State to require telecommunications service providers to retain communications data in certain circumstances by means of a retention notice. A retention notice may relate to a particular service provider or description of service providers, may relate to all data or a description of data, and may identify the period for which the data is to be retained.
  • Part 6 of the IPA allows government agencies and intelligence services to obtain warrants requiring the large-scale retention and acquisition of communications data by telecommunications providers in certain circumstances. Warrants under Part 6 are issued by the Secretary of State.
  • Part 7 of the IPA allows government agencies and intelligence services to obtain warrants allowing the retention or examination of bulk personal datasets retained or acquired under other parts of the Act. Warrants issued under Part 7 are issued by the Secretary of State.

Any decision to issue a retention notice under Part 4, or a warrant under Part 6 or 7, must be approved by a Judicial Commissioner.

Part 4 of the DPA 2018 imposes obligations on intelligence services in relation to their processing of personal data, although those obligations are more limited than those imposed on law enforcement agencies.

There is no specific legal basis in UK law that permits UK organisations to collect and transfer personal data in connection with a foreign government access request. In some instances, UK-based organisations may have a legitimate interest under Article 6 of the GDPR (to process personal data) and Article 49 of the GDPR (to transfer personal data). Organisations should consider any such collection and transfer of personal data carefully, as in many instances an individual’s rights and freedoms are likely to override any legitimate interest the organisation has in complying with the foreign government access request.

The UK and USA entered into a Data Sharing agreement on 7 October 2019 in connection with the US CLOUD Act 2018, for the purpose of facilitating cross-border data sharing by law enforcement agencies to counter serious crime.

UK intelligence agencies have extensive powers under the IPA to obtain data (including personal data) for a range of purposes. There is ongoing debate about whether those powers are necessary and proportionate. This was a key consideration for the European Commission in assessing the adequacy of the UK’s data protection regime. Even with an adequacy decision in place, privacy activists have signalled that they will likely challenge it, arguing that UK intelligence services have data acquisition and retention powers that go beyond what is necessary in a democratic society.

The UK GDPR prohibits the transfer of personal data from the UK to jurisdictions or international organisations unless the provisions of the UK GDPR are complied with. These are discussed in 4.2 Mechanisms or Derogations that Apply to International Data Transfers.

Personal data may lawfully be transferred outside the UK in the following circumstances.

  • Adequacy determination: the transfer is to a jurisdiction that has received an adequacy determination. Prior to leaving the EU, the UK recognised all existing adequacy determinations made by the EU, so that transfers from the UK are not restricted when made to the following jurisdictions: Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Switzerland and Uruguay. The UK also recognised the European Economic Area member countries (EU member states plus Iceland, Norway and Liechtenstein) and Gibraltar as providing adequate protection.
  • Appropriate safeguards: a data transfer mechanism, such as SCCs or binding corporate rules, is utilised to ensure appropriate safeguards. Following the decision in Schrems II, a UK data exporter must undertake a transfer risk assessment to assess the legal system of the recipient jurisdiction and government access rights. Additional technical, contractual and organisational safeguards must be added where the recipient jurisdiction does not in fact ensure sufficient protection for personal data.
  • Derogations: in limited circumstances, it may be possible to rely on a derogation, such as the data subject’s explicit consent, or important reasons of public interest.

Following the Schrems II case, the Privacy Shield is no longer recognised as a valid data transfer mechanism.

There are no general requirements to obtain government approval for international data transfers, except in the limited scenarios where organisations rely on the compelling legitimate interests derogation.

There are no data localisation requirements under UK law. There are restrictions on how personal data may be transferred internationally (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).

There are no general requirements under UK law for software code, algorithms or similar technical detail to be shared with the government.

An organisation subject to the UK GDPR seeking to transfer personal data abroad in connection with foreign government data requests, foreign litigation proceedings or an internal investigation will need to meet the requirements of the UK GDPR. There must be an appropriate legal basis for the data processing itself (ie, collating the relevant personal data for the purpose of the request) and an appropriate legal basis for any transfer of the data abroad. Such requests must be examined carefully, on a case-by-case basis. The nature of the data and the proposed purpose of the processing will be relevant, as will the manner in which the data is to be disclosed and used abroad.

The UK does not have any “blocking” statutes in the context of data protection. Specifically, the UK has removed Article 48 of the GDPR from the UK GDPR. Previously, this provision operated as a blocking provision where a court or tribunal in a third country required an EU controller or processor to disclose personal data.

Emerging and developing technologies all raise similar compliance issues, particularly concerning transparency, proportionality and explainability. The UK GDPR and DPA 2018 do not explicitly reference new technologies, except in relation to the obligations to carry out DPIAs (see 2.1 Omnibus Laws and General Requirements).

The ICO has issued guidance on AI and machine learning, entitled Explaining decisions made with AI (developed in collaboration with the Alan Turing Institute). This is also relevant to autonomous decision-making and linked issues. It emphasises that providing explanations regarding AI-assisted decisions is one way to demonstrate that the individual has been treated fairly and in a transparent manner. This can be challenging: AI and machine learning systems are often designed to solve problems and spot patterns beyond the capability of humans. The manner in which they achieve this may not be fully understandable to those deploying the relevant technologies, let alone explainable to those whose personal data is utilised. Information provided to individuals should avoid overly technical language so that they can understand the nature of a sophisticated system’s processing activity.

The topic of online harms is currently under government review and a White Paper for consultation was issued in 2020, in which the government proposes specific regulation of a number of areas.

Biometrics, including facial recognition, are not directly addressed in UK law, but see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation for action taken by the ICO and the courts in respect of biometric data.

UK organisations are starting to develop ethical approaches to the use of personal data. These are usually intended to guide employees, but they also build trust with customers. For example, IBM published an ethical framework for big data analytics dealing with how data will be collected and used, and Vodafone published a set of privacy commitments focusing on transparency and minimising the risks associated with using people's personal data. These organisational data processing frameworks typically align with the key data protection principles and requirements outlined in the UK GDPR, often with a focus on fairness and transparency.

Fines for Data Breaches

In 2021, the ICO imposed fines on the following companies:

  • Cabinet Office (GBP500,000);
  • HIV Scotland (GBP10,000); and
  • Mermaids (GBP25,000).

In 2020, the ICO imposed fines totalling GBP39.65 million. It also issued an enforcement notice against Experian, requiring it to make changes to its data processing practices.

Voice Recognition Technology

In May 2019, the ICO issued an enforcement notice against Her Majesty's Revenue and Customs (HMRC) for processing biometric data without a valid legal basis under the GDPR. HMRC had introduced a voice verification system and collected the biometric data of 7 million data subjects, but the consent it obtained failed to meet the GDPR standard for valid consent, leaving the processing without a legal basis. The ICO did not issue a monetary penalty, but required HMRC to delete all biometric data held under the Voice ID system for which it did not have explicit consent.

Automated Facial Recognition Technology

A civil liberties campaigner, Mr Bridges, brought judicial review proceedings after South Wales Police (SWP) launched a project involving the use of automated facial recognition (AFR) technology. SWP deployed AFR technology in certain public locations where crime was considered likely to occur, matching captured images with "watchlists" of wanted persons in police databases using biometric data analysis. Mr Bridges challenged the use of AFR technology as being unlawfully intrusive, including under Article 8 of the European Convention on Human Rights (right to respect for private and family life). On appeal, the Court of Appeal held that SWP's use of AFR technology was unlawful, including on the ground that it breached Article 8.

In November 2021, the ICO announced its provisional intent to impose a potential fine of just over GBP17 million on Clearview AI Inc. In addition, the ICO issued a provisional notice to stop further processing of the personal data of people in the UK and to delete it following alleged serious breaches of the UK’s data protection laws. The fine relates to a number of issues, including Clearview AI Inc’s use of biometrics for facial recognition.

In June 2021, the ICO published a formal Information Commissioner’s Opinion addressing privacy concerns on the use of live facial recognition technology in public places. The Opinion was informed in part by six ICO investigations into the use, testing or planned deployment of live facial recognition systems.

Class Actions and Collective Redress

There have been a number of class actions alleging privacy infringements, in part due to the GDPR providing that compensation should be available to any data subject who suffers damage, even if not material damage. Furthermore, the GDPR permits representative bodies to pursue actions on behalf of data subjects. Class actions have been brought following cybersecurity breaches, with British Airways, Marriott and easyJet all facing claims. In July 2021, British Airways settled a class action brought by thousands of customers impacted by the cybersecurity incident.

WM Morrisons Supermarkets plc v Various Claimants is a class action arising out of the deliberate and unlawful publication of employee payroll data by a disgruntled employee. Affected employees commenced litigation against Morrisons for breaches of the DPA 1998 (in force at the time of the breach), misuse of private information and breach of confidence. The claimants also argued that Morrisons was vicariously liable for the actions of its disgruntled employee. Morrisons was found not to be vicariously liable, but the Supreme Court left open the possibility of employee class actions following data breaches in future.

In November 2021, the UK Supreme Court issued its judgment in Lloyd v Google. The Supreme Court was asked to consider whether an opt-out style of representative action is permitted (it being accepted that all members of the class suffered the same loss), and whether loss of control over personal data can be compensated in the absence of any other material harm. The Supreme Court ruled in favour of Google, finding that the representative claim against Google should not be allowed to proceed. Moving forward, the decision means that actions for collective redress in relation to violations of data protection law in the UK will likely need to be based on the affirmative opt-in of the represented claimants, and each claimant will need to demonstrate the material damage that they suffered.

Privacy issues often arise in the context of a corporate transaction, notably where the target organisation is a data-heavy business. Due diligence represents a key activity in most corporate transactions, enabling the buyer to assess the value of the target business, having regard to compliance gaps and risks associated with the target company. From a privacy perspective, the buyer will typically use the due diligence phase to examine compliance with applicable data protection laws, usually focusing on the target’s approach to employee data, consumer data, and carefully examining the development and deployment of key data processing technologies.

Most due diligence processes will involve the seller sharing personal data with the buyer, and often with the buyer’s professional advisers and a virtual data room provider. This creates certain risks, including the risk of personal data being inadvertently disclosed to, or accessed by, unauthorised third parties. Sellers and buyers typically take steps to mitigate the risks associated with conducting due diligence, including the following.

  • Transparency information: data controllers are required to provide data subjects with information about their processing activities. The seller should ensure its notices to its staff address the need to share data in the context of a sale or other corporate transaction. While the buyer and its professional advisers will be data controllers for the purposes of the diligence, the exceptions included in Article 14(5) of the UK GDPR will avoid the need to provide further notification to data subjects.
  • Data room providers: the buyer (and its advisers) will often access personal data via a virtual data room. The provider will typically be a data processor, and the seller will need to ensure that an Article 28-compliant data processing agreement is in place with the provider, and that adequate security measures are in place to protect any personal data uploaded to the data room.
  • Access controls: the seller should ensure that access to diligence documents is limited to buyer personnel and advisers that need access to this information. The seller should consider what technical controls can be implemented in the data room, including ensuring that access is limited to authorised users and that personal data can only be viewed on a read-only basis.
  • Data minimisation: the seller should disclose only personal data that is necessary for the purposes of the due diligence. This will typically involve the seller anonymising and/or pseudonymising personal data prior to disclosing it to the buyer – for example, by redacting names from supplier agreements or employment contracts. In the case of employee lists, employee names should be removed and information limited to job title, salary, etc.

There are no non-privacy/data protection-specific laws that mandate disclosure of an organisation’s cybersecurity risk profile or experience.

NIS Regulations

For completeness, operators of essential services (OES) such as utilities, and relevant digital service providers (RDSP), such as providers of digital marketplaces, online search engines and cloud services, are subject to the EU Network and Information Systems Directive, implemented in the UK by the NIS Regulations. The NIS Regulations seek to ensure common levels of security to safeguard critical infrastructure. The ICO is the competent authority for RDSPs in the UK.

The UK NIS regime also includes an implementing act for digital service providers (known as the DSP Regulation), and specifies security requirements and incident reporting thresholds for certain organisations. While the UK GDPR concerns personal data, the NIS Regulations concern the security of network and information systems. That said, there is a significant crossover between the UK GDPR and the NIS Regulations, due in particular to the UK GDPR’s security requirements. In this respect, the application of the NIS Regulations is broader as it applies to digital data, and not just personal data.

Under the NIS Regulations, RDSPs are required to notify the ICO of any incident having an actual adverse effect on the security of network and information systems. Similar to the UK GDPR, this notification must be made without undue delay and within 72 hours of becoming aware of the incident. The NIS Regulations specify the information that must be included.

The ICO’s enforcement powers in respect of the NIS Regulations include issuing enforcement notices, exercising powers of inspection and imposing monetary penalties of up to GBP17 million in the most serious cases.

Hunton Andrews Kurth (UK) LLP

30 St Mary Axe
London
EC3A 8EP
United Kingdom

+44 (0)20 7220 5700

+44 (0)20 7220 5772

info@HuntonAK.com
www.HuntonAK.com

Trends and Developments



UK Data Protection After Brexit: Challenges and Opportunities

Data protection regulation is exploding around the globe. Driven by the necessity of processing personal data, rapid developments in technology, and evolving consumer expectations, countries are increasingly turning to omnibus privacy frameworks, frequently modelled on the EU’s General Data Protection Regulation (GDPR).

Within this rapidly evolving regulatory environment, the UK is perhaps uniquely placed. It has a long history of regulation in this area: domestic debates on data protection date back to the 1960s and 1970s, and the UK went on to ratify the Convention for the Protection of Individuals with regard to automatic processing of personal data (Treaty 108) and to pass its first comprehensive data protection law, the Data Protection Act, in 1984. While the UK’s post-Brexit data protection framework is the UK GDPR, now that the UK is no longer a part of the EU, there is an opportunity for the UK to shape its own data protection future. What does this mean for data protection regulation in the UK going forward, and what should companies expect?

The UK’s post-Brexit data protection framework

In June 2021, the European Commission adopted a decision recognising the UK’s data protection regime to be adequate. Notably, the European Commission’s decision mandates a review after four years, with adequacy lapsing unless reaffirmed within the four-year timeframe. Any significant changes to the UK’s data protection regime, particularly those that weaken the rights of data subjects, will be considered by the European Commission in reviewing the decision and may result in the decision not being reaffirmed.

The UK government has expressed a desire to reform the UK’s post-Brexit data protection regime, and to that end published a consultation on potential reforms to the UK’s regime, with the goals of reducing barriers to innovation, reducing burdens on business, reducing barriers to international data flows, and reforming the Information Commissioner’s Office (ICO). In the coming months and years, the UK will need to balance this potential liberalisation of its data protection regime against the possibility that significant changes may result in the non-renewal of the UK’s adequacy decision and limitations on the free flow of personal data from the EU to the UK.

Data transfers

One of the most discussed post-Brexit data protection topics is data transfers. In the post-Schrems II world, the GDPR data transfer framework continues to challenge businesses. The ICO has published guidelines on how organisations should assess the risks associated with international data transfers following the Schrems II decision. The ICO has also published an International Data Transfer Agreement and an Addendum to the EU SCCs, both of which will be available for use from 21 March 2022. The International Data Transfer Agreement will be the UK’s answer to the EU SCCs, and will constitute a form of appropriate safeguards under the UK GDPR. The Addendum may be appended to the EU SCCs by organisations subject to the UK GDPR, allowing those organisations to rely on the EU SCCs in relation to data transfers under the UK GDPR.

Those operating in the UK need to navigate several issues, such as:

  • the need to undertake transfer risk assessments, in accordance with the ICO’s guidance, when relying on appropriate safeguards under Article 46 (such as standard contractual clauses (SCCs) and binding corporate rules);
  • replacing existing EU SCCs with the UK’s International Data Transfer Agreement (or with the new EU SCCs in conjunction with the UK Addendum to the EU SCCs); and
  • prioritising data transfers for remediation where vendors had relied on the now invalid EU-US Privacy Shield (to the extent this has not already been addressed).

It remains to be seen to what extent the ICO will seek to bring enforcement actions in relation to international data transfers from the UK; historically, this has not been an area of focus for the ICO.

Global privacy framework

The concept of exporting applicable rights and obligations together with data is not unique to the EU; it is also a feature of the Cross-Border Privacy Rules (CBPR) under the Asia-Pacific Economic Cooperation (APEC) privacy framework. This approach creates challenges for data importers, who must ensure that they can honour these requirements across their platforms even where the requirements differ from their own domestic data protection laws. This is one of the factors driving interest in the creation of a global data protection standard. The GDPR is increasingly influential around the globe, forming the basis of emerging data protection laws in many countries.

However, the GDPR is far from the only approach. The Council of Europe’s Convention 108 has recently been modernised (as Convention 108+), and the APEC privacy framework (based on the OECD Guidelines) continues to expand. Unlike the EU approach, the APEC privacy framework does not require members to have the same laws, but requires a consistent approach based on nine privacy principles. The CBPR system enables data transfers between participating APEC economies. As the limits of the EU’s adequacy approach emerge, notably the challenge of ensuring "essential equivalence", the search for creative solutions that enable global data flows while ensuring strong data protection will continue.

The UK is well-placed to build bridges with other privacy regimes, and may be incentivised to do so given that data has become an inextricable part of global trade negotiations. In August 2021, the UK government published its mission statement entitled International data transfers: building trust, delivering growth and firing up innovation, which underlines its intention to ensure the free flow of data between the UK and key global economies by recognising those countries as adequate, or by adopting alternative data transfer mechanisms that will facilitate data transfers to those countries. Priority countries identified by the UK for the adoption of adequacy decisions include the USA, Australia, Brazil, India, Korea and Singapore, although, as at the time of writing, none has yet been adopted.

Localisation

One consequence of the Schrems II decision, and the more conservative approach that it signals to EU data exports, has been increased discussion of data localisation and data sovereignty. Anecdotally, vendors are ramping up their EU data processing capacity, creating EU clouds and EU service capability so that EU clients’ personal data does not need to be transferred outside the region. There has been open discussion at a political level of the merits of Europe pursuing a data localisation strategy. While localisation and increased data protectionism is not new, and invariably arises in discussions about big data analytics, increasingly these issues are on the table in trade negotiations. It is notable that the EU-UK Trade and Cooperation Agreement does not contain a localisation provision. As the UK explores trade opportunities outside the EU, it is hoped that it will seek to maintain, rather than constrain, the free flow of personal data across borders.

Regulatory overlap

An emerging trend that is gathering pace is the potential for regulatory overlap with data protection principles. The proposed EU Digital Markets Act, the Digital Services Act and the e-Privacy Regulation raise questions as to how the need to regulate important aspects of the digital ecosystem will sit alongside or overlap with the core data protection framework of the GDPR. In addition, there is growing convergence between competition regulators, consumer rights regulators and data protection regulators, each with a different focus, seeking to regulate an increasingly similar set of issues. The UK is attempting to address these issues and, outside of the EU, may now have the freedom to create solutions that better meet the UK’s needs.

Regulatory enforcement

The ICO has made more limited use of its fining powers in the past 12 months, following the significant fines it issued in 2020 against British Airways (GBP20 million), Marriott (GBP18.4 million) and Ticketmaster (GBP1.25 million). In November 2021, the ICO announced its provisional intent to fine Clearview AI Inc approximately GBP17 million in relation to its collection of a database of more than 10 billion facial images, used to carry out biometric identification of individuals, but no other significant fines were issued in 2021. Notably, both the British Airways (BA) and Marriott fines were reduced significantly from the amounts originally announced by the ICO (down from GBP183.39 million in BA’s case and from GBP99 million in Marriott’s), and Ticketmaster has appealed its fine, although the hearing has been stayed until 2023.

The reason for the ICO’s more limited enforcement activity in 2021 compared to 2020 is unclear, but it is possible that the reduction of the BA and Marriott fines has caused the ICO to adopt a more conservative stance on issuing significant monetary penalties. It remains to be seen whether that stance will persist or whether the slowdown in fines is a temporary hiatus.

Private rights of action

The past year has also seen important developments in relation to private rights of action for data protection infringement. Historically, under the old Data Protection Act 1998 (DPA 1998), recovery of damages for breach of data protection was rare. Typically, the loss that arises from data protection infringement is distress, rather than financial loss, and it was generally thought that the DPA 1998 required the establishment of financial loss before non-financial loss could be considered by the courts. That position changed in Vidal-Hall v Google Inc. [2015] EWCA Civ 311, when the Court of Appeal, noting that the purpose of data protection legislation is not to protect economic rights, determined that financial loss is not a prerequisite to damages in data protection claims. Now, under the UK GDPR, it is clear that compensation can be awarded for both material and non-material damage (such as mental distress).

In addition, the Supreme Court in Lloyd v Google LLC [2021] UKSC 50 considered whether compensation under UK data protection law can be awarded for mere “loss of control” of personal data in the absence of material damage (ie, financial loss) or distress. The Supreme Court ultimately decided that compensation is recoverable only where a violation of the law results in material damage or distress to the affected individuals. The decision will provide reassurance that technical breaches of data protection law that do not give rise to real harm to individuals will not be sufficient grounds for a compensatory award.

Usually, claims for breach of the DPA 2018 are made in conjunction with other claims, such as for breach of privacy, misuse of private information, breach of confidence, defamation and breach of employment rights. Damages awards for breach of the DPA 1998 and the DPA 2018 have varied. The majority to date have been for modest sums, but that position may well evolve in future.

Growth in UK class action litigation

Of interest to potential defendants in the UK is the emergence of class action litigation involving data protection claims. In the UK, these actions generally take one of two forms: (i) representative actions, in which the claimants must have the same interest in the claim, and (ii) group litigation orders, in which each individual effectively makes a separate claim. Group litigation orders proceed on an opt-in basis (unlike US class actions, which are opt-out). There has been an increase in class action claims under the GDPR, with most of the larger cybersecurity breaches spawning such claims and specialist class action law firms ready to facilitate them. Accordingly, British Airways, Marriott and easyJet, among others, are all facing claims. The Supreme Court in Lloyd v Google [2021] UKSC 50 severely curtailed the ability of claimants to bring representative actions, finding that a representative action can proceed only where each individual represented in the claim has suffered the same loss, and that Lloyd was unable to demonstrate that each represented claimant had, in fact, suffered the same loss. The decision sets a high bar for collective redress under UK data protection law.

Going forward, actions for collective redress generally will need to proceed on an opt-in basis, and each claimant will need to demonstrate the material damage that they have suffered. Representative action claims generally will be appropriate only in circumstances where there is a very high degree of conformity in the damages suffered by each of the represented claimants, such that there is no need for each claimant to individually demonstrate the damage they have suffered. Further developments in this area are expected in the coming years.

The ICO’s role, post-Brexit

Against the backdrop of Brexit, it is reasonable to consider what the future role of the ICO should be. Following the UK’s departure from the EU, the ICO has no formal role within the European Data Protection Board (EDPB), and is no longer part of the EU GDPR’s consistency mechanism or one-stop shop. Once the largest data protection regulator in the EU (with a headcount of 822 permanent staff as of March 2021), the ICO provided significant support to the work of the EDPB, and was known for its proportionate and pragmatic approach to issues. Now, the ICO must find its feet outside of the EU. Indeed, this task began some time ago, marked by publication of the ICO’s International Strategy in 2017.

The ICO is influential within the international community of data protection regulators, currently chairing the Global Privacy Assembly and the International Conference of Information Commissioners. Following Brexit, the ICO is in an important position to bridge the gap between the APEC nations and the GDPR nations, and ideally placed to be the voice of reason challenging some of the more conservative views on regulatory enforcement. There is growing interest in the role of smart regulation in data protection, and in regulatory approaches that do not merely wield a stick, but incentivise responsible data usage to create efficiencies and value, while respecting individual rights. The ICO should be a leading voice in this debate, which is of increasing importance in the context of converging technologies and ever-smarter data processing. The debate is also of practical importance to the UK as it navigates new trade opportunities following Brexit.

Just as the UK must seize the opportunity to forge a relevant role on the international stage following Brexit, the ICO must also create a meaningful international role as a data protection regulator with regulatory roots in the EU GDPR. There is a true opportunity for the UK to contribute fresh thinking and creativity to advance (if not solve) some of the more difficult data protection problems, including cross-border data transfers and smart regulation. Organisations with interests in the UK should monitor these developments closely.

Hunton Andrews Kurth (UK) LLP

30 St Mary Axe
London
EC3A 8EP
United Kingdom

+44 (0) 20 7220 5700

+44 (0) 20 7220 5772

info@HuntonAK.com
www.HuntonAK.com

