Data Protection update - July 2022

Welcome to our data protection bulletin, covering the key developments in data protection law from July 2022.  

Data protection

Cyber security 

Enforcement

Civil litigation

Data protection

Data Protection and Digital Information Bill introduced to Parliament

On 18 July 2022, the Data Protection and Digital Information Bill (the "Bill") was introduced to Parliament. The aim of the Bill is to update and simplify the UK's data protection framework, which the UK government hopes will reduce burdens on organisations while maintaining high data protection standards. In this summary we provide an overview of the key changes proposed by the Bill.

Definition of personal data

The Bill retains the UK GDPR's basic definition of personal data as any information relating to an identified or identifiable individual. However, the Bill clarifies when an individual should be considered "identifiable", making their information personal data. Rather than assessing identifiability in the hands of anyone, it need only be considered whether the controller, the processor or others reasonably likely to receive the information are reasonably likely to be able to identify the individual. This more subjective approach to identifiability is likely to narrow the scope of what is considered personal data.

Lawfulness of processing

The Bill creates a list of certain "recognised" legitimate interests that do not require the completion of an assessment balancing the interests and rights of the data subject against the legitimate interests of the organisation. The current proposed list covers matters of "public interest", such as national security, public security, defence, emergencies, preventing crime, safeguarding and democratic engagement. By contrast, under the current law, where the legitimate interests basis is relied upon as a lawful ground for processing data, the rights and interests of data subjects must be assessed in all cases.

Purpose limitation principle

The purpose limitation principle under the UK GDPR provides that personal data should be collected for specified, explicit and legitimate purposes and not further processed in a manner incompatible with those purposes. The Bill clarifies that controllers only need to judge compatibility of purpose against their own purpose for obtaining personal data, not against any purpose for which the data was originally obtained by a third party. It also introduces additional scenarios in which processing for a new purpose will be compatible with the original purpose, including to enable controllers to comply with their legal obligations.

Vexatious or excessive requests and time limits for responding to access requests

The "manifestly unfounded or excessive" threshold for refusing data subject access requests ("DSARs") under the UK GDPR is replaced with a "vexatious or excessive" threshold. This is designed to reduce the burden on businesses, as the new threshold expands the circumstances under which a DSAR may be refused.   In addition, the Bill introduces several factors to consider when determining if a request meets the new threshold, such as the resources of the controller, requests intended to cause distress or not made in good faith and requests that are an abuse of process.

Information to be provided to data subjects

The Bill makes it less onerous for businesses to comply with transparency obligations when processing personal data for research, archiving or statistical purposes. Transparency information does not need to be provided, even where data is obtained from a third party, if providing it is impossible or would require disproportionate effort. The UK GDPR already contains similar provisions, but only where personal data has not been obtained directly from the data subject.

Automated decision-making

Under the UK GDPR, data subjects have a right not to be subject to decisions based solely on automated processing. The Bill loosens this restriction, which will apply only where the automated decision-making involves the processing of special category data. Additional safeguards must be put in place for such decision-making, for example the ability to contest decisions and the requirement to provide the data subject with information about solely automated decisions.

Representatives for controllers or processors outside the UK

The Bill significantly simplifies compliance by removing the requirement for controllers and processors to appoint a UK representative if they are not established in the UK.

Duty to keep records

The Bill maintains the obligation for controllers and processors to keep records of their processing activities, although the content requirements are less prescriptive than at present. The exemption from these requirements has also been expanded. Under the UK GDPR, organisations with fewer than 250 employees are currently exempt where processing is unlikely to result in a risk, is only occasional and does not include special category or criminal data. Under the Bill, any organisation with fewer than 250 employees which does not conduct high-risk processing is exempt from the record-keeping requirements, allowing these businesses more autonomy in deciding how to demonstrate accountability.

Consulting the ICO prior to processing

The Bill removes the requirement under the UK GDPR for controllers to consult the ICO where a DPIA finds that processing is high risk and measures are not in place to mitigate that risk.

International transfers

The Bill largely maintains the current UK GDPR regime for transferring personal data to third countries and international organisations, and does not give greater flexibility in meeting the legal requirements for legitimising international data transfers. However, it does introduce a "data protection test" for the Secretary of State to apply when making adequacy regulations and for exporters to apply when sending personal data overseas in reliance on safeguards such as the standard contractual clauses. This outcomes-based test is that the standard of protection in the recipient jurisdiction must not be "materially lower" than the standard in the UK.

Processing personal data for research purposes

The Bill facilitates broad consents to processing for scientific research purposes and clarifies that scientific research has a broad meaning that includes privately funded research.

Cookies and similar technologies

The Bill raises the maximum fines for breaches of the privacy and electronic communications regime from the current £500,000 to the levels under the UK GDPR, namely £17.5 million or 4% of annual turnover, whichever is greater. It also broadens the list of exemptions from the cookie consent requirements, so that opt-in consent is not required: (i) for collecting statistical information about an information service in order to make improvements; (ii) for installing necessary security updates on a device; and (iii) to identify the geolocation of an individual in an emergency.

Direct marketing

The general rule under the privacy and electronic communications regime is that electronic mail used for direct marketing purposes requires prior consent, unless the organisation obtained the individual's contact details in the course of a previous sale or provision of goods or services (with an ability to opt out). The Bill seeks to extend this soft opt-in to non-commercial organisations for the purposes of furthering charitable, political or other non-commercial objectives, where they have obtained contact details from an individual expressing interest.

Enforcement powers

The Bill retains the basic framework of UK data protection law and does not represent a radical departure from EU data protection law. However, further changes may yet be made as the Bill makes its way through Parliament, with a second reading to take place in the autumn after the Parliamentary recess.

Katie Hewson, Data Protection Partner at Stephenson Harwood, has expressed concerns that diverging too far from the current legal framework could threaten the UK's adequacy status, commenting that "this could create huge administrative stumbling blocks for British businesses operating in the EU".

ICO launches draft ICO25 strategic plan for consultation

On 14 July 2022, the ICO launched its draft three-year strategic plan, ICO25. The ICO25 contains four strategic objectives: (i) safeguarding and empowering people; (ii) empowering responsible innovation and sustainable economic growth; (iii) promoting openness, transparency and accountability; and (iv) continuously developing the ICO's culture, capability and capacity. The ICO25 also includes an annual action plan setting out the ICO's priorities for the coming year. The ICO is now consulting on its purpose, objectives and performance measures as described in the ICO25. The consultation will close on 14 September 2022 and the findings will be used to inform the final version of the ICO25 plan, to be published in autumn. To access the call for views, click here.

Joint statement released by US and UK on Data Access Agreement

The US and the UK have released a joint statement setting out their intention to bring into force, on 3 October 2022, a data access agreement designed to counter serious crime (the "Agreement"). The Agreement will allow investigators from each country better access to vital data to combat serious crime, and information and evidence relating to serious crime held by service providers in each country will also be more readily accessible. The joint statement also clarified that the Agreement will maintain democratic and civil liberties standards, although it is unclear whether the Agreement will have any implications for the UK's EU adequacy status given the additional powers that US and UK authorities will have to access personal data.

European Parliament formally adopts Digital Markets Act and Digital Services Act

On 5 July 2022, the European Parliament formally adopted its Digital Services Package, comprising the Digital Markets Act ("DMA") and the Digital Services Act ("DSA"). The DMA will regulate the main services provided by the biggest online platforms operating in the EU and will subject these platforms, known as gatekeepers, to a series of obligations and prohibitions in their daily operations to ensure fair and open digital markets. The DSA will regulate how online platforms handle illegal or potentially harmful online content by establishing a powerful transparency and accountability framework and placing obligations on online companies proportionate to their role and size. Both the DSA and the DMA must now be formally adopted by the Council of the European Union and then published in the Official Journal. The acts are expected to enter into force later this year.

Personal data issues to consider for Hong Kong businesses providing services to Mainland China

Stephenson Harwood's Hong Kong office, in association with the law firm Wei Tu, has recently published a briefing note on personal data issues that are important to businesses operating in Hong Kong ("HK") which provide services to individuals within Mainland China (the "PRC"). The PRC's primary data protection legislation is the Personal Information Protection Law (the "PIPL") and in HK it is the Personal Data (Privacy) Ordinance (the "PDPO"). The briefing note covers:

  1. the extent to which businesses outside of the PRC and HK have to comply with the PIPL and PDPO;
  2. steps that businesses should take after obtaining consent to process personal data;
  3. recommendations for handling employee personal data;
  4. the consequences of breaching the PIPL and PDPO; and
  5. requirements relating to cross-border transfers of personal data.

To read the briefing note in full click here.

Cyber security

ICO calls for government to stop conducting business on WhatsApp

The ICO has published a report detailing its year-long investigation into the use of private correspondence channels (including private email and WhatsApp) by Ministers and officials at the Department of Health and Social Care ("DHSC") during the pandemic. The investigation found that important information relating to the government's response to the pandemic had been lost or handled insecurely due to a lack of clear controls and a rapid increase in the use of messaging apps. The ICO recognised that the use of private channels throughout the pandemic brought some real operational benefits, but raised concerns that such practices continued without any review of their appropriateness or the risks they presented. The ICO made several recommendations, including that the DHSC improve its management of FOI requests and address inconsistencies in existing FOI guidance, and called for the government to set up a review of private correspondence channels. The ICO also issued a reprimand under the UK GDPR requiring the DHSC to improve its processes and procedures around the handling of personal information through private correspondence channels.

ICO and NCSC advise against paying ransomware demands in joint letter

The ICO and the National Cyber Security Centre ("NCSC") have written a joint letter to the Law Society and Bar Council reminding members of the legal profession to advise clients not to pay ransoms in the event of a cyber-attack. The ICO has clarified that paying money to criminals who have attacked an IT system will not be considered to mitigate the risk of harm to the individuals involved in a data breach, and will therefore not reduce any penalties incurred through ICO enforcement action. However, the ICO will recognise mitigation of risk where organisations take steps to fully understand and learn from what has happened, have raised the incident with the NCSC where appropriate, or can demonstrate compliance with appropriate NCSC guidance and support.

Enforcement

ICO investigates whether AI systems show racial bias

The ICO has announced that it plans to investigate whether AI systems are showing racial bias when dealing with job applications.

The ICO said AI-driven discrimination could have "damaging consequences for people's lives", including someone being rejected for a job or wrongfully denied a bank loan. It will investigate "concerns over the use of algorithms to sift recruitment applications, which could be negatively impacting employment opportunities of those from diverse backgrounds".

Such conduct may breach Article 22 of the UK GDPR, which provides that data subjects "shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her". It may also represent a breach of other legislation, such as the Equality Act 2010.

Amazon will have to pay 35 million euros for slipping on cookies

Amazon has failed to persuade the Council of State, France's highest administrative court, to overturn the French supervisory authority's ("CNIL") December 2020 decision imposing a fine of €35 million for Amazon's breaches of the obligation under the French Data Protection Act to obtain explicit consent before deploying cookies that enable targeted adverts personalised for individual users. The CNIL had held that "the information provided [by Amazon] was neither clear nor complete in the information banner of the site" and "only contained a general and approximate description of the purposes of all cookies, without details on their role [targeted advertising], the right to refuse and how". This decision, and the significant fine imposed in circumstances where Amazon had rectified the identified issues, serves as an important reminder to tech companies of the CNIL's increased focus on protecting the privacy of internet users.

Facebook threatened with EU data processing ban

Facebook faces the threat of being blocked across the EU unless it implements a radical change to its data handling, as Ireland's privacy regulator, the Irish Data Protection Commission ("DPC"), has doubled down on its order to stop the firm's data flows to the United States.

In a draft decision, which is subject to review by other European data protection authorities, the DPC has told Facebook that its current set-up breaches GDPR rules, closing off the last legal route available to Meta (Facebook's parent company) for transferring large chunks of data to the US.

In 2020, the European Court of Justice (ECJ) invalidated the EU-US data flow pact, Privacy Shield, because of fears over U.S. surveillance practices.  This ruling (often referred to as the Schrems II judgment) also made it more difficult to use standard contractual clauses (SCCs) to transfer personal data to the US. The DPC's decision now means that Facebook could be forced to stop relying on SCCs too.

Meta has repeatedly warned that such a decision would shutter many of its services in Europe, including Facebook and Instagram. In a filing to the U.S. Securities and Exchange Commission in March 2021, Meta said: "If a new transatlantic data transfer framework is not adopted and we are unable to continue to rely on SCCs or rely upon other alternative means of data transfers from Europe to the United States, we will likely be unable to offer a number of our most significant products and services, including Facebook and Instagram, in Europe".

This blocking order, if confirmed by the group of European national data protection regulators, is likely to have a knock-on effect for the wider business community, amidst the continuing uncertainty over how data can lawfully be sent from Europe to the US following the ECJ's ruling in 2020.

This draft decision also comes against the background of EU and US negotiations on a new data-transfer agreement that would allow companies like Meta to continue to ship data across the Atlantic irrespective of the Irish order. A preliminary agreement was reached at the political level in March 2022, but negotiations on the legal fine print have stalled and it is understood that a final deal is unlikely to be agreed before the end of the year.

Denmark bans Google Workspace over GDPR non-compliance

The Danish supervisory authority ("Datatilsynet") has become the latest European regulator to find the use of Google's Workspace productivity suite incompatible with the GDPR as a result of Google's non-compliant international data transfers. Datatilsynet has imposed a general ban on the processing of personal data with Google Workspace until adequate documentation and an impact assessment have been produced and the processing operations have been brought in line with the GDPR.

Datatilsynet's ruling follows a risk assessment of personal data processing by primary schools in the Helsingør municipality, in north-eastern Denmark, and will require that from August 2022 all public sector organisations in the region stop using Google Workspace, which includes Gmail and the Google Docs suite of apps, as well as its Chromebook laptops. Datatilsynet has further stated that parties who do not comply with the ban could face imprisonment.

Greek regulator fines Clearview €20m and orders biometric data to be deleted

The facial recognition platform Clearview AI has been fined €20 million ($20.1 million) by a further data regulator in Europe for violating several articles of the GDPR with its facial recognition service. Like regulators in the UK, Italy and France, the Greek regulator has also banned Clearview from processing the personal data of people living in Greece, and has ordered it to delete any data on Greek citizens that it has already collected.

The decision stems from a series of events in 2021, beginning with a request made in March 2021 by a data subject, Marina Zacharopoulou, to know what data of hers was held in Clearview's database of images scraped from public internet pages.

According to the Greek regulator, the request was acknowledged, but when the data subject sent a reminder email in April, Clearview responded that it could not find the original request and asked for a photo to use in the process. The company attributed the difficulty in locating the initial email to a formatting issue. This prompted the data subject to complain to the regulator.

The regulator found that Clearview had violated the GDPR and levied its sizable fine in view of the nature, gravity and duration of the infringement, which it found was not an isolated incident but systematic, and concerned the basic principles of lawful processing. Other factors were also cited, including a "lack of cooperation", Clearview having declined to attend the regulator's hearing on the matter.

In response, Clearview raised the same argument it has made in the actions taken against it by other European regulators. Its CEO, Hoan Ton-That, stated: "Clearview AI does not have a place of business in Greece or the EU, it does not have any customers in Greece or the EU, its product has never been used in Greece, and does not undertake any activities that would otherwise mean it is subject to the GDPR".

Garante warns TikTok Italy and TikTok Technology in relation to planned advertising activities based on legitimate interest

The Italian supervisory authority ("Garante") recently issued a formal warning to TikTok under Article 58(2)(a) GDPR and Section 154(1)(f) of Italy's data protection law. The Garante warned that processing data stored on users' devices on the basis of TikTok's 'legitimate interest' would conflict with the current regulatory framework (Article 5(3) of Directive 2002/58/EC and the national law transposing it), which states explicitly that the data subject's consent is the only legal basis for "the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user".

The Garante was also concerned about the measures in place to protect child users registered with the platform, given the difficulties TikTok has encountered in implementing adequate age verification measures for access and the associated risk that 'personalised' ads, including those with unsuitable content, would be visible to children under 14 years of age on the basis of the company's legitimate interest.

The Garante has reserved the right to take additional measures, including urgent measures, should they prove necessary. While declining to comment further given the Garante's ongoing evaluation, TikTok has confirmed its commitment to "respecting the privacy of our users, being transparent about our privacy practices, and operating in compliance with all relevant regulations".

CNIL fines TotalEnergies €1,000,000

The CNIL has fined the French energy company TotalEnergies France €1 million after identifying multiple breaches of the GDPR arising out of 18 complaints. These breaches included failing to: respond to data subject access requests (for example, by failing to disclose when and how personal data was collected); respond to requests for deletion; and offer the option to object to processing for marketing purposes.

Civil litigation

DSG Retail Limited v ICO (EA/2020/0048, 6 July 2022)

The First-tier Tribunal ("FTT") has recently handed down judgment in DSG Retail Limited v ICO (EA/2020/0048, 6 July 2022), arising out of an appeal by DSG Retail Limited ("DSG") against a Monetary Penalty Notice ("MPN") issued by the Information Commissioner on 7 January 2020 fining DSG £500,000, the maximum available under the Data Protection Act 1998, for a number of data security failings in the context of a significant data breach in 2017-18.

The FTT determined that, although the MPN was "wrong in law", a penalty of £250,000 was appropriate. The FTT applied a "holistic approach" to assessing compliance with the seventh data protection principle ("DPP7"), allowing "a degree of permissiveness in the exercise of judgement", and declined to treat post-breach remedial actions as indicative of earlier breaches of DPP7.

DPP7 requires organisations to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data. Of the ten breaches of DPP7 relied on by the ICO in the MPN, the FTT found that only two were made out on the facts: DSG's failure to maintain up-to-date security patches and issues with DSG's password policy, both of which had been flagged to DSG's senior management but not rectified. Given the nature and volume of personal data put at risk by DSG's conduct, the FTT imposed a significant, albeit reduced, fine.

CJEU AG sets high bar for responses to data subject access requests

Advocate General Giovanni Pitruzzella (the "AG") has recently delivered an opinion (the "Opinion") on the interpretation of an individual's right of access under Article 15 GDPR. Specifically, the Opinion addresses an individual's right to access information about "the recipients or categories of recipient to whom the personal data have been or will be disclosed", pursuant to Article 15(1)(c) GDPR. The AG delivered the Opinion in the context of Case C-154/21 (the "Case"), which is currently pending before the ECJ. Whilst the ECJ is not bound to follow the Opinion, it generally follows the opinions of its Advocates General.

The Opinion sets out:

  • The wording of Article 15(1)(c) GDPR is not, of itself, sufficient to provide a definitive answer on the referred question, noting that: (i) the terms 'recipients' and 'categories of recipient' are used in Article 15(1)(c) in succession, in a neutral way, without any order of priority, and that (ii) Article 15(1)(c) does not expressly specify whether a choice may be made between 'recipients' or 'categories of recipient' or who (i.e., the data subject or the data controller) might be entitled to make such determination;
  • However, the main purpose of the right of access is to enable data subjects to be aware of the processing activities involving their personal data and to verify the lawfulness of such processing, including that personal data has only been disclosed to authorised recipients. The AG finds that restricting the information to categories of recipients would not allow data subjects to achieve that purpose; and
  • The GDPR requires data controllers, in response to a data subject access request, to identify the specific recipients of the data subject's personal data, with the caveat that in at least two circumstances controllers may limit their response to categories of recipients, namely: (i) if it is materially impossible to provide details of specific recipients (arguably applicable where the recipients have not yet been identified by the controller), or (ii) if the request is manifestly unfounded or excessive (the burden of proving which lies with the controller).

If the CJEU follows the AG's Opinion (which, as discussed above, it generally does), organisations are likely to be expected to identify specific recipients of personal data as a matter of course when responding to data subject access requests. For many organisations, effectively identifying and mapping specific recipients of personal data disclosures would significantly increase the administrative burden placed on them. The CJEU's ruling is therefore awaited to provide greater clarity on the likely enhanced expectations placed on controllers.

Dutch Court closes VoetbalTV case, but balance is yet to be restored to the GDPR legitimate interest ground

The Dutch Administrative Jurisdiction Division of the Council of State ("RvS") has published its judgment in VoetbalTV v Dutch Data Protection Authority ("DPA"), rejecting the DPA's appeal. The DPA had imposed a €575,000 fine on VoetbalTV for "unlawfully" relying on the legitimate interest ground in Article 6(1)(f) GDPR to process personal data for purely commercial purposes by broadcasting amateur football via camera systems installed along the fields. The fine had already been cancelled by the Court of First Instance in 2020.

The central question in this case was what exactly constitutes a legitimate interest and under what conditions the processing of personal data for that purpose is lawful. The RvS ruled that the organisation processing the data must indicate what its interests are and why the processing is necessary to serve them. It is then up to the DPA to assess the actual activities of the data controller: whether these correspond to the stated interests, whether those interests are served by the data processing, and whether those interests are justified.

In finding that the fine was wrongly imposed, the RvS held that the Dutch DPA did not correctly assess all the interests that VoetbalTV had put forward and therefore failed to appreciate that those interests were not strictly commercial in nature. Importantly, the RvS declined to address whether purely commercial interests alone can qualify as a valid legitimate interest, while not closing the door to this, electing instead to refer this question to the CJEU.