Case Studies: Recent FTC Enforcement Actions - High-Profile Cases of Privacy Violation: Uber, Emp Media, Lenovo, Vizio, VTech, LabMD

Smith Gambrell & Russell

Uber Technologies

The scenario: In August 2018, the FTC announced an expanded settlement with Uber Technologies for its alleged failure to reasonably secure sensitive data in the cloud, resulting in a data breach of 600,000 names and driver's license numbers, 22 million names and phone numbers, and more than 25 million names and email addresses.

The settlement: The expanded settlement is a result of Uber's failure to disclose a significant data breach that occurred in 2016 while the FTC was conducting its investigation that led to the original settlement. The revised proposed order includes provisions requiring Uber to disclose any future consumer data breaches, submit all reports for third-party audits of Uber's privacy policy and retain reports on unauthorized access to consumer data. 2

Emp Media Inc. (Myex.com)

The scenario: The FTC joined forces with the State of Nevada to address privacy issues arising from the "revenge" pornography website, Myex.com, run by Emp Media Inc. The website allowed individuals to submit intimate photos of the victims, including personal information such as name, address, phone number and social media accounts. If a victim wanted their photos and information removed from the website, the defendants reportedly charged fees of $499 to $2,800 to do so.

The settlement: On June 15, 2018, the enforcement action brought by the FTC led to a shutdown of the website and permanently prohibited the defendants from posting intimate photos and personal information of other individuals without their consent. The defendants were also ordered to pay more than $2 million. 3

Lenovo and Vizio

The scenario: In 2018, FTC enforcement actions led to large settlements with technology manufacturers Lenovo and Vizio. The Lenovo settlement related to allegations the company sold computers in the U.S. with pre-installed software that sent consumer information to third parties without the knowledge of the users. With the New Jersey Office of Attorney General, the FTC also brought an enforcement action against Vizio, a manufacturer of "smart" televisions. Vizio entered into a settlement to resolve allegations it installed software on its televisions to collect consumer data without the knowledge or consent of consumers and sold the data to third parties.

The settlement: Lenovo entered into a consent agreement to resolve the allegations through a decision and order issued by the FTC. The company was ordered to obtain affirmative consent from consumers before running the software on their computers and implement a software security program on preloaded software for the next 20 years. 4 Vizio agreed to pay $2.2 million, delete the collected data, disclose all data collection and sharing practices, obtain express consent from consumers to collect or share their data, and implement a data security program. 5

VTech

The scenario: The FTC's action against toy manufacturer VTech was the first time the FTC became involved in a children's privacy and security matter.

The settlement: In January 2018, the company entered into a settlement to pay $650,000 to resolve allegations it collected personal information from children without obtaining parental consent, in violation of COPPA. VTech was also required to implement a data security program that is subject to audits for the next 20 years. 6

LabMD

The scenario: LabMD, a cancer-screening company, was accused by the FTC of failing to reasonably protect consumers' medical information and other personal data. Identity thieves allegedly obtained sensitive data on LabMD consumers due to the company's failure to properly safeguard it. The billing information of 9,000 consumers was also compromised.

The settlement: After years of litigation, the case was heard before the U.S. Court of Appeals for the Eleventh Circuit. LabMD argued, in part, that data security falls outside of the FTC's mandate over unfair practices. The Eleventh Circuit issued a decision in June 2018 that, while not stripping the FTC of authority to police data security, did challenge the remedy imposed by the FTC. 7 The court ruled that the cease-and-desist order issued by the FTC against LabMD was unenforceable because it required the company to implement a data security program adhering to a standard of "reasonableness" that was too vague. 8

The ruling points to the need for the FTC to provide greater specificity in its cease-and-desist orders about what is required by companies that allegedly fail to safeguard consumer data.

1 15 U.S.C. § 45(a)(1)

2 www.ftc.gov/news-events/press-releases/2018/04/uber-agrees-expanded-settlement-ftc-related-privacy-security

3 www.ftc.gov/system/files/documents/cases/emp_order_granting_default_judgment_6-22-18.pdf

4 www.ftc.gov/news-events/press-releases/2018/01/ftc-gives-final-approval-lenovo-settlement

5 www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-newjersey-settle-charges-it

6 www.ftc.gov/news-events/press-releases/2018/01/electronic-toy-maker-vtech-settlesftc-allegations-it-violated

7 The United States Court of Appeals for the Third Circuit has rejected this argument. See FTC v. Wyndham Worldwide Corp., 799 F.3d 236, 247-49 (3d Cir. 2015).

8 www.media.ca11.uscourts.gov/opinions/pub/files/201616270.pdf

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

Marcia M. Ernst

U.S. Government Accountability Office

Data Protection: Actions Taken by Equifax and Federal Agencies in Response to the 2017 Breach

Hackers stole the personal data of nearly 150 million people from Equifax databases in 2017.

How did Equifax, a consumer reporting agency, respond to that event? Equifax said that it investigated factors that led to the breach and tried to identify and notify people whose personal information was compromised.

In addition, three federal agencies that use Equifax services made their own security assessments and modified contracts with Equifax. Moreover, other federal agencies that oversee consumer reporting agencies started investigating Equifax and gave further advice to consumers on how to protect themselves against security breaches.

Hackers can make intrusions into your computer and steal personal information


What GAO Found

In July 2017, Equifax system administrators discovered that attackers had gained unauthorized access via the Internet to the online dispute portal that maintained documents used to resolve consumer disputes (see fig.). The breach resulted in the attackers accessing personal information of at least 145.5 million individuals. Equifax's investigation of the breach identified four major factors (identification, detection, segmentation of access to databases, and data governance) that allowed the attacker to successfully gain access to its network and extract information from databases containing personally identifiable information. Equifax reported that it took steps to mitigate these factors and attempted to identify and notify individuals whose information was accessed. The company's public filings since the breach occurred reiterate that the company took steps to improve security and notify affected individuals.

The Internal Revenue Service (IRS), Social Security Administration (SSA), and U.S. Postal Service (USPS)—three of the major federal customer agencies that use Equifax's identity verification services—conducted assessments of the company's security controls, which identified a number of lower-level technical concerns that Equifax was directed to address. The agencies also made adjustments to their contracts with Equifax, such as modifying notification requirements for future data breaches. In the case of IRS, one of its contracts with Equifax was terminated. The Department of Homeland Security offered assistance in responding to the breach; however, Equifax reportedly declined the assistance because it had already retained professional services from an external cybersecurity consultant. In addition, the Bureau of Consumer Financial Protection and the Federal Trade Commission, which have regulatory and enforcement authority over consumer reporting agencies (CRAs) such as Equifax, initiated an investigation into the breach and Equifax's response in September 2017. The investigation is ongoing.

How Attackers Exploited Vulnerabilities in the 2017 Breach, Based on Equifax Information


Why GAO Did This Study

CRAs such as Equifax assemble information about consumers to produce credit reports and may provide other services, such as identity verification to federal agencies and other organizations. Data breaches at Equifax and other large organizations have highlighted the need to better protect sensitive personal information.

GAO was asked to report on the major breach that occurred at Equifax in 2017. This report (1) summarizes the events regarding the breach and the steps taken by Equifax to assess, respond to, and recover from the incident and (2) describes actions by federal agencies to respond to the breach. To do so, GAO reviewed documents from Equifax and its cybersecurity consultant related to the breach and visited the Equifax data center in Alpharetta, Georgia, to interview officials and observe physical security measures. GAO also reviewed relevant public statements filed by Equifax. Further, GAO analyzed documents from the IRS, SSA, and USPS, which are Equifax's largest federal customers for identity-proofing services, and interviewed federal officials related to their oversight activities and response to the breach.

Recommendations

GAO is not making recommendations in this report. GAO plans to issue separate reports on federal oversight of CRAs and consumer rights regarding the protection of personally identifiable information collected by such entities. A number of federal agencies and Equifax provided technical comments which we incorporated as appropriate.

Full Report

GAO Contacts

Michael Clements Director [email protected] (202) 512-8678

Nick Marinos Managing Director [email protected] (202) 512-9342

Office of Public Affairs

Sarah Kaczmarek Acting Managing Director [email protected] (202) 512-4800

  • Machine Identity Management
  • October 20, 2023
  • 9 minute read

7 Data Breach Examples Involving Human Error: Did Encryption Play a Role?

David Bisson

Despite an overall increase in security investment over the past decade, organizations are still plagued by data breaches. What’s more, we’re learning that many of the attacks that result in breaches misuse encryption in some way. (By comparison, just four percent of data breaches tracked by Gemalto’s Breach Level Index were “secure breaches” in that the use of encryption rendered stolen data useless). Sadly, it’s often human error that allows attackers access to encrypted channels and sensitive information. Sure, an attacker can leverage “gifts” such as zero-day vulnerabilities to break into a system, but in most cases, their success involves provoking or capitalizing on human error.

Human error has a well-documented history of causing data breaches. The 2022 Global Risks Report, released by the World Economic Forum, found that 95% of cybersecurity threats were in some way caused by human error. Meanwhile, the 2022 Data Breach Investigations Report (DBIR) found that 82% of breaches involved the human element, including social attacks, errors and misuse.

I think it’s interesting to look at case studies on how human error has contributed to a variety of data breaches, some more notorious than others. I’ll share the publicly known causes and impacts of these breaches. But I’d also like to highlight how the misuse of encryption often compounds the effects of human error in each type of breach.


Data breach examples

Here is a brief review of seven well-known data breaches caused by human error.

1. Equifax data breach—Expired certificates delayed breach detection

In the spring of 2017, the U.S. Department of Homeland Security's Computer Emergency Readiness Team (CERT) sent consumer credit reporting agency Equifax a notice about a vulnerability affecting certain versions of Apache Struts. According to former CEO Richard Smith, Equifax sent out a mass internal email about the flaw. The company’s IT security team should have used this email to fix the vulnerability, according to Smith’s testimony before the House Energy and Commerce Committee. But that didn’t happen. An automatic scan several days later also failed to identify the vulnerable version of Apache Struts. Plus, the device inspecting encrypted traffic was misconfigured because of a digital certificate that had expired ten months previously. Together, these oversights enabled a digital attacker to crack into Equifax’s system in mid-May and maintain their access until the end of July.

How encryption may become a factor in scenarios like this:  Once attackers have access to a network, they can install rogue or stolen certificates that allow them to hide exfiltration in encrypted traffic. Unless HTTPS inspection solutions are available and have full access to all keys and certificates, rogue certificates will remain undetected.

Impact:  The bad actor is thought to have exposed the personal information of 145 million people in the United States and more than 10 million UK citizens. In September 2018, the Information Commissioner's Office issued Equifax a fine of £500,000, the maximum penalty amount allowed under the Data Protection Act 1998, for failing to protect the personal information of up to 15 million UK citizens during the data breach.

2. Ericsson data breach—Mobile services go dark when the certificate expires

At the beginning of December 2018, a digital certificate used by Swedish multinational networking and telecommunications company Ericsson for its SGSN–MME (Serving GPRS Support Node—Mobility Management Entity) software expired. This incident caused outages for customers of various UK mobile carriers including O2, GiffGaff, and Lyca Mobile. As a result, a total of 32 million people in the United Kingdom alone lost access to 4G and SMS on 6 December. Beyond the United Kingdom, the outage reached 11 countries including Japan.

How encryption may become a factor in scenarios like this:  Expired certificates do not only cause high-impact downtime; they can also leave critical systems without protection. If a security system experiences a certificate outage, cybercriminals can take advantage of the temporary lack of availability to bypass the safeguards.

Impact:  Ericsson restored the most affected customer services over the course of 6 December. The company also noted in a blog post that “The faulty software [for two versions of SGSN–MME] that has caused these issues is being decommissioned.”

3. LinkedIn data breach—Millions miss connections when the certificate expires

On 30 November, a certificate used by business social networking giant LinkedIn for its country subdomains expired. As reported by The Register, the incident did not affect www.linkedin.com, as LinkedIn uses a separate certificate for that particular domain. But the event, which involved a certificate issued by DigiCert SHA2 Secure Server CA, did invalidate us.linkedin.com along with the social media giant's other subdomains. As a result, millions of users were unable to log into LinkedIn for several hours.

How encryption may become a factor in scenarios like this:  Whenever certificates expire, it may indicate that overall protection for machine identities is not up to par. Uncontrolled certificates are a prime target for cybercriminals who can use them to impersonate the company or gain illicit access.

Impact:  Later in the afternoon on 30 November, LinkedIn deployed a new certificate that helped bring its subdomains back online, thereby restoring all users’ access to the site.

4. Strathmore College data breach—Student records not adequately protected

In August 2018, it appears that an employee at Strathmore Secondary College accidentally published more than 300 students' records on the school's intranet. These records included students' medical and mental health conditions such as Asperger's, autism and ADHD. According to The Guardian, they also listed the exposed students' medications along with any learning and behavioral difficulties. Overall, the records remained on Strathmore's intranet for about a day. During that time, students and parents could have viewed and/or downloaded the information.

How encryption may become a factor in scenarios like this:  Encrypting access to student records makes it difficult for anyone who doesn’t have the proper credentials to access them. Any information left unprotected by encryption can be accessed by any cybercriminals who penetrate your perimeter.

Impact:  Strathmore’s principal said he had arranged professional development training for his staff to ensure they’re following best security practices. Meanwhile, Australia’s Department of Education announced that it would investigate what had caused the breach.

5. Veeam data breach—Customer records compromised by unprotected database

Near the end of August 2018, the Shodan search engine indexed an Amazon-hosted IP address. Bob Diachenko, director of cyber risk research at Hacken.io, came across the IP on 5 September and quickly determined that it resolved to a database left exposed without password protection. The database contained 200 gigabytes of data belonging to Veeam, a backup and data recovery company. Among that data were customer records including names, email addresses and some IP addresses.

How encryption may become a factor in scenarios like this:  Usernames and passwords are a relatively weak way of securing private access. Plus, if an organization does not maintain complete control of the private keys that govern access for internal systems, attackers have a better chance of gaining access.

Impact:  Within three hours of learning about the exposure, Veeam took the server offline. The company also reassured TechCrunch that it would “conduct a deeper investigation and… take appropriate actions based on our findings.”

6. Marine Corps data breach—Unencrypted email misfires

At the beginning of 2018, the Defense Travel System (DTS) of the United States Department of Defense (DOD) sent out an unencrypted email with an attachment to the wrong distribution list. The email, which the DTS sent within the usmc.mil official unclassified Marine domain but also to some civilian accounts, exposed the personal information of approximately 21,500 Marines, sailors and civilians. Per Marine Corps Times, the data included victims' bank account numbers, truncated Social Security Numbers and emergency contact information.

How encryption may become a factor in scenarios like this:  If organizations are not using proper encryption, cybercriminals can insert themselves between two email servers to intercept and read the email. Sending private personal identity information over unencrypted channels essentially becomes an open invitation to cybercriminals.
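Alongside transport encryption, a simple guard at the mail gateway can catch the distribution-list mistake itself before anything is sent. A hypothetical sketch (the `usmc.mil` allow-list stands in for whatever recipient policy the sending organization actually enforces):

```python
ALLOWED_DOMAINS = {"usmc.mil"}  # hypothetical policy: official domain only

def split_recipients(recipients: list[str]) -> tuple[list[str], list[str]]:
    # Partition a recipient list into (allowed, blocked) by a domain
    # allow-list, before the message leaves the gateway.
    allowed, blocked = [], []
    for addr in recipients:
        domain = addr.rsplit("@", 1)[-1].lower()
        (allowed if domain in ALLOWED_DOMAINS else blocked).append(addr)
    return allowed, blocked
```

A gateway using a check like this would have held back the civilian addresses for review instead of delivering sensitive data to them.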

Impact:  Upon learning of the breach, the Marines implemented email recall procedures to limit the number of email accounts that would receive the email. They also expressed their intention to implement additional security measures going forward.

7. Pennsylvania Department of Education data breach—Misassigned permissions

In February 2018, an employee in Pennsylvania’s Office of Administration committed an error that subsequently affected the state’s Teacher Information Management System (TIMS). As reported by PennLive, the incident temporarily enabled individuals who logged into TIMS to access personal information belonging to other users including teachers, school districts and Department of Education staff. In all, the security event is believed to have affected as many as 360,000 current and retired teachers.

How encryption may become a factor in scenarios like this:  If you do not know who’s accessing your organization’s information, then you’ll never know if it’s being accessed by cybercriminals. Encrypting access to vital information and carefully managing the identities of the machines that house it will help you control access.
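A default-deny read check is the core fix for this class of misassigned-permission error. A minimal sketch, assuming a record store keyed by user id (the ids, roles, and fields here are invented for illustration, not TIMS's actual schema):

```python
# Hypothetical record store keyed by user id.
RECORDS = {
    "t-1001": {"name": "A. Teacher", "ssn_last4": "1234"},
    "t-1002": {"name": "B. Teacher", "ssn_last4": "5678"},
}

def get_record(requesting_user: str, record_id: str, is_admin: bool = False) -> dict:
    # Default deny: a user may read only their own record unless an
    # explicitly granted admin role says otherwise.
    if requesting_user != record_id and not is_admin:
        raise PermissionError(f"{requesting_user} may not read {record_id}")
    return RECORDS[record_id]
```

The point of the design is that access is denied unless a rule affirmatively grants it, so a configuration slip fails closed rather than exposing everyone's records.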

Impact:  Pennsylvania’s Department of Education subsequently sent out notice letters informing victims that the incident might have exposed their personal information including their Social Security Numbers. It also offered a free one-year subscription for credit monitoring and identity protection services to affected individuals.

How machine identities are misused in a data breach

Human error can impact the success of even the strongest security strategies. As the above attacks illustrate, this can compromise the security of machine identities in numerous ways. Here are just a few:

  • SSH keys grant privileged access to many internal systems. Often, these keys do not have expiration dates. And they are difficult to monitor. So, if SSH keys are revealed or compromised, attackers can use them to pivot freely within the network.
  • Many phishing attacks leverage wildcard or rogue certificates to create fake sites that appear to be authentic. Such increased sophistication is often required to target higher-level executives.
  • Using public-key encryption and authentication in two-step verification makes it harder for attackers to gain malicious access. But easy access to SSH keys stored on computers or servers makes it easier for attackers to pivot laterally within the organization.
  • An organization’s encryption is only as good as that of its entire vendor community. If organizations don’t control the keys and certificates that authenticate partner interactions, then they lose control of the encrypted tunnels that carry confidential information between companies.
  • If organizations are not monitoring the use of all the keys and certificates that are used in encryption, then attackers can use rogue or stolen keys to create illegitimate encrypted tunnels. Organizations will not be able to detect these malicious tunnels because they appear to be the same as other legitimate tunnels into and out of the organization.
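The first bullet above, unmonitored SSH keys, is straightforward to start auditing. A rough sketch that flags PEM-encoded private keys carrying no passphrase-encryption marker (heuristic string matching only; a production inventory tool would parse the key material properly, since newer OpenSSH-format keys do not expose encryption in their headers):

```python
from pathlib import Path

PRIVATE_MARKERS = ("PRIVATE KEY",)  # appears in PEM header lines
ENCRYPTED_MARKERS = ("ENCRYPTED",)  # e.g. 'Proc-Type: 4,ENCRYPTED'

def find_unprotected_keys(root: str) -> list[Path]:
    # Walk `root` and collect files that look like private keys
    # but show no sign of passphrase encryption.
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            head = path.read_text(errors="ignore")[:4096]
        except OSError:
            continue
        if any(m in head for m in PRIVATE_MARKERS) and not any(
            m in head for m in ENCRYPTED_MARKERS
        ):
            hits.append(path)
    return hits
```

Even a crude sweep like this gives you a starting inventory of keys that grant privileged access but have no second factor protecting them at rest.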

How to avoid data breaches

The best way to avoid a data breach is to make sure your organization is using the most effective, up-to-date security tools and technologies. But even the best cybersecurity strategy is not complete unless it is accompanied by security awareness training for all who access and interact with sensitive corporate data.

Because data breaches take many different forms and can happen in a multitude of ways, you need to be ever vigilant and employ a variety of strategies to protect your organization. These should include regular patching and updating of software, encrypting sensitive data, upgrading obsolete machines and enforcing strong credentials and multi-factor authentication.

In particular, a zero-trust architecture will give control and visibility over your users and machines using strategies such as least privileged access, policy enforcement, and strong encryption. Protecting your machine identities as part of your zero trust architecture will take you a long way toward breach prevention. Here are some machine identity management best practices that you should consider: 

  • Locate all your machine identities.  Having a complete list of your machine identities and knowing where they’re all installed, who owns them, and how they’re used will give you the visibility you need to ensure that they are not being misused in an attack.
  • Set up and enforce security policies.  To keep your machine identities safe, you need security policies that help you control every aspect of machine identities — issuance, use, ownership, management, security, and decommissioning. 
  • Continuously gather machine identity intelligence.  Because the number of machines on your network is constantly changing, you need to maintain intelligence about their identities, including the conditions of their use and their environment.
  • Automate the machine identity life cycle.  Automating the management of certificate requests, issuance, installation, renewals, and replacements helps you avoid error-prone manual actions that may leave your machine identities vulnerable to outage or breach.
  • Monitor for anomalous use.  After you’ve established a baseline of normal machine identity usage, you can start monitoring and flagging anomalous behavior, which can indicate a machine identity compromise.
  • Set up notifications and alerts.  Finding and evaluating potential machine identity issues before they become exposures is critical. This will help you take immediate action before attackers can take advantage of weak or unprotected machine identities.
  • Remediate machine identities that don’t conform to policy.  When you discover machine identities that are noncompliant, you must quickly respond to any security incident that requires bulk remediation.
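The monitoring and alerting practices above reduce, at their simplest, to a scheduled expiry sweep over an identity inventory. A small sketch (the inventory mapping and the 30-day window are illustrative assumptions, not a vendor API):

```python
from datetime import date

def expiring_soon(inventory: dict[str, date], today: date,
                  threshold_days: int = 30) -> list[str]:
    # Flag identities that are already expired or fall within the
    # alert window, soonest (or already expired) first.
    flagged = [(exp, name) for name, exp in inventory.items()
               if (exp - today).days <= threshold_days]
    return [name for _, name in sorted(flagged)]
```

Feeding this from a complete, continuously updated inventory is what turns the "locate all your machine identities" step into actionable alerts.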

Training your users about the importance of machine identities will help reduce user errors. And advances in AI and RPA will also play a factor in the future. But for now, your best bet in preventing encryption from being misused in an attack on your organization is an automated machine identity management solution that allows you to maintain full visibility and control of your machine identities. Automation will help you reduce the inherent risks of human error as well as maintain greater control over how you enforce security policies for all encrypted communications. 

(This post has been updated. It was originally published on October 15, 2020.)

Related posts

  • Marriott Data Breach: 500 Million Reasons Why It’s Critical to Protect Machine Identities
  • Breaches Are Like Spilled Milk: It Doesn’t Help to Cry
  • The Major Data Breaches of 2017: Did Machine Identities Play a Factor?


GDPR: Key cases so far

  • 7 February 2019
  • Data Protection & GDPR

Loretta Maxfield


Google fined by national French data protection regulator

On 21 January, Google LLC was fined €50 million by the Commission Nationale de l'Informatique et des Libertés (CNIL), the French data protection regulator, for various failings under GDPR.

The main failing CNIL found was that individuals using Google's services were not furnished with the requisite "fair processing information" (the information usually provided in privacy notices): Google seemingly omitted to inform individuals about why it processed their personal data and how long their data was kept. The ruling also attacked the accessibility of the information, saying that although most of the information was there, it was scattered around its site via various different "links".

The second key failing was not meeting the GDPR standard of "consent" when providing personalised advert content. Under GDPR, consent must be sufficiently informed, specific, unambiguous, granular and be gained through a form of active acceptance. In the first instance the CNIL did not consider the consent to be informed enough, as it ruled users were not given enough information about what giving their consent would mean in terms of the ad personalisation services Google would then push. The fine was also imposed in light of Google not ensuring that consent met the GDPR threshold, through using pre-ticked boxes and not separating out consents for advert personalisation from other processing by Google.

The takeaways for your organisation are to ensure it's easy for your customers or service users to understand what you do with their data. Privacy notices should be clearly signposted, and be as accurate as possible about what data is collected and why it is used. The ruling also reminds us of the strict threshold consents must reach before they are valid. Businesses are certainly becoming more savvy when it comes to making sure individuals can give consent for different purposes, but it's not uncommon to still come across the pre-ticked box! If your organisation relies on consent and would like Thorntons to review how you use it, please get in touch and we can give advice on whether you are meeting the GDPR standard.

Marriott International suffers unprecedented data breach

On 19 November last year, Marriott International announced that the personal data of 500 million of its customers had been compromised. The group, which operates hotel chains under the brands W Hotels, Sheraton, and Le Méridien among many others, said it had reason to believe that certain of its computer systems had been hacked in 2014, which has now led to this breach. The number of people affected, whose data relates to customer bookings from 2014 onwards, has since been revised: whilst Marriott still cannot state the exact number, it believes the number of customer records affected now totals around 383 million. This remains an extremely large number of affected customers, and the hackers were able to access personal details, passport numbers, and in some cases payment information.

Although a breach of this scale is rare, there are various pointers that all organisations can take from this case. Firstly, it's a reminder to continuously monitor the technical and organisational security measures protecting personal data. Testing and monitoring of your organisation's security should be subject to regular review. Secondly, it's a reminder to have in place a practical guide for how to respond to a data breach. As well as having a clear process for how to report and assess breaches internally, your guide should be clear on what kinds of breaches should be reported to the ICO, and perhaps include statements to release to the media. Lastly, this case is a reminder to conduct regular audits of the data held, so that your organisation is always aware of how much data it actually holds. Marriott's reduced forecast of the number of data subjects affected is based on the fact it has now discovered that many of the compromised accounts actually relate to the same individuals. If Marriott had held an up-to-date list of active customers, it could potentially have responded more quickly.

The ICO takes action against organisations for failing to pay the new data protection fee

At the end of September, the ICO announced that it had begun formal enforcement action against organisations for failing to pay the new data protection fee. Since 25 May, when GDPR came into force, organisations classified as data controllers have been required by the Data Protection (Charges and Information) Regulations 2018 to register with the ICO and pay the applicable fee. Whilst the specific organisations have not been named, the ICO has confirmed it has issued 900 notices of intent to fine organisations spanning "the public and private sector including the NHS, recruitment, finance, government and accounting". Of those 900, to date 100 penalty notices have been issued, ranging from £400 to £4,000, although the ICO has confirmed that the maximum could be £4,350 depending on aggravating factors. If you are unsure whether your organisation is required to pay a fee, please get in touch and we can advise accordingly.

The ICO issues its first Enforcement Notice for a breach of GDPR

The ICO has issued its first formal notice under the GDPR to AggregateIQ Data Services Ltd (“AIQ”). AIQ, a Canadian company, was involved in targeting political advertising on social media to individuals whose information was supplied to them by various political parties and campaigns (such as Vote Leave, BeLeave, Veterans for Britain, and DUP Vote to Leave).

After an investigation by the ICO, AIQ was found not to have adequately complied with its obligations as a controller under the GDPR by: (1) not processing personal data in a way that the data subjects were aware of; (2) not processing personal data for purposes which data subjects expected; (3) not having a lawful basis for processing; (4) not processing the personal data in a way which was compatible with the purposes for which it was originally collected; and (5) not issuing the appropriate fair processing information to those individuals (commonly communicated through a privacy notice).

As well as those practical failings, the ICO also considered that the processing was likely to cause damage or distress to the individuals whose information was passed to AIQ and used for targeted advertising, because they were not given the opportunity to understand how their personal information would be used.

The most interesting point about this case is that, although the company is based in Canada, the ICO has still exercised its authority over organisations which process the data of individuals in the UK, ordering that AIQ must now erase all the personal data it holds on individuals in the UK. For a company which mainly deals in data and analytics, this could have a detrimental impact on its business operations in the UK. Although AIQ was passed the personal data by other organisations, this enforcement action demonstrates that it was still AIQ's responsibility to ensure that its use of the data was not incompatible with any of the purposes for which it was originally collected, and still incumbent on it to ensure individuals were aware of what it was doing with their data. In addition, while there has been and continues to be a lot of emphasis in the media on the risk of large fines under the GDPR, it is notable that no monetary penalty has been issued by the ICO, although the ICO has reserved its ability to do so should AIQ not comply with this notice.

Morrisons held liable for the wrongful acts of its rogue employee by the Court of Appeal (England)

The circumstances of this interesting case centre on an employee whose rogue actions were still considered by the court to be attributable to the employer as a breach of the Data Protection Act 1998. The employee was employed by Morrisons Supermarkets as an internal IT auditor and, in 2014, knowingly copied the personal data of around 100,000 of Morrisons' employees onto a USB stick. At home, the employee then posted the personal data, which included names, addresses and bank details, onto the internet under the name of another Morrisons employee in an attempt to cover his tracks.

In finding that Morrisons was vicariously liable for the actions of the rogue employee, the Court concluded that there was a sufficiently close link between the employee’s job role, and the wrongful action. That the wrongful event occurred outside the workplace was irrelevant, as the Court found that the employee in question was acting “within the field of activities assigned”. Because the employee had access to the compromised personal data in the course of carrying out his role in facilitating payroll, he was specifically entrusted with that kind of information in order to do his job, so the Court decided that there was a sufficient link between the job role and the wrongful disclosure.

The key, striking message from this case is that it is possible for employers to be held liable for rogue actions taken by their employees. Although this particular employee was obviously not acting within the expected confines of his job role, it is interesting that the Court still determined that employers may be liable for acts they would normally reasonably consider out of their control. Although this incident occurred in 2014 and was therefore decided under the Data Protection Act 1998, the case demonstrates how vital it is that organisations put in place technical and organisational security measures appropriate to the type of data being held, taking into account the risk of disgruntled employees and what they may do with their access to that information. The case also acts as a reminder to ensure your staff are trained and aware of data protection and the role they personally play in protecting data, rather than focusing only on the technical computer security to which many organisations pay more attention. As remarked in the judgment, it also serves as a reminder to have adequate insurance in place in the event of a major data breach.

The ICO receives notification of thousands of breaches

Although organisations could report data breaches to the ICO under the Data Protection Act 1998, you will be aware that under the GDPR reporting of breaches to the ICO is mandatory where there is a "risk to the rights and freedoms of individuals". The ICO has now reported that it has received notification of more than 8,000 breaches in the six months since the GDPR came into force. Last summer the ICO observed that many of the breaches being reported did not necessarily meet the risk threshold, but it welcomes the honesty and transparency shown by organisations under legislation designed to strengthen the rights of individuals.

With breaches required to be reported to the ICO within 72 hours of an organisation becoming aware of them, it is vital that internal mechanisms are in place for employees to understand how to report a breach and to complete a risk assessment within the appropriate time-frame to determine whether it is reportable. If you would like any help compiling a data breach policy or risk assessment framework tailored to your organisation, please get in touch.


About the author

Loretta Maxfield

Data Protection & GDPR, Intellectual Property


The International Forum for Responsible Media Blog


Top 10 Privacy and Data Protection Cases of 2018: a selection


  • Cliff Richard v. The British Broadcasting Corporation [2018] EWHC 1837 (Ch).

This was Sir Cliff Richard's privacy claim against the BBC and was the highest profile privacy case of the year. The claimant was awarded damages of £210,000. We had a case preview and case reports on each day of the trial, along with posts from a number of commentators including Paul Wragg, Thomas Bennett (first and second) and Jelena Gligorijević. The BBC subsequently announced that it would not seek permission to appeal.

  • ABC v Telegraph Media Group Ltd [2018] EWCA Civ 2329.

This was perhaps the second most discussed privacy case of the year. The Court of Appeal allowed the claimants' appeal and granted an interim injunction to prevent the publication of confidential information about alleged "discreditable conduct" by a high profile executive. Lord Hain subsequently named the executive as Sir Philip Green. We had a case comment from Persephone Bridgman Baker, as well as comments criticising Lord Hain's conduct from Paul Wragg, Robert Craig and Tom Double.

  • Ali v Channel 5 Broadcast [2018] EWHC 298 (Ch).

The claimants had featured in a "reality TV" programme about bailiffs, "Can't Pay? We'll Take It Away!". Their claim for misuse of private information was successful and damages of £20,000 were awarded. We had a case comment from Zoe McCallum. An appeal and cross appeal were heard on 4 December 2018 and judgment is awaited.

  • NT1 and NT2 v Google Inc [2018] 3 WLR 1165.

This was the first "right to be forgotten" claim in the English courts, with claims in both data protection and privacy. Both claimants had spent convictions; one was successful and the other was not. We had a case preview from Aidan Wills and a comment on the case from Iain Wilson.

  • Lloyd v Google LLC [2018] EWHC 2599 (QB).

This was an attempt to bring a "representative action" in data protection on behalf of all iPhone users in respect of the "Safari Workaround". The representative claimant was refused permission to serve Google out of the jurisdiction. We had a case comment from Rosalind English, and there was a Panopticon Blog post on the case. The claimant has been given permission to appeal and it is likely that the appeal will be heard in late 2019.

  • TLU v Secretary of State for the Home Department [2018] EWCA Civ 2217.

The Court of Appeal dismissed an appeal in a "data leak" case on the issue of liability to individuals affected by a data leak but not specifically named in the leaked document. We had a case comment from Lorna Skinner and further comment from Iain Wilson. There was also a Panopticon Blog post.

  • Stunt v Associated Newspapers [2018] EWCA Civ 170.

The Court of Appeal referred to the CJEU the question of whether the "journalistic exemption" in section 32(4) of the Data Protection Act 1998 is compatible with the Data Protection Directive and the EU Charter of Fundamental Rights. There was a Panopticon Blog post on the case.

  • Various Claimants v W M Morrison Supermarkets plc [2018] EWCA Civ 2339.

The Court of Appeal upheld the decision of Langstaff J that Morrisons were vicariously liable for a mass data breach caused by the criminal act of a rogue employee. We had a case comment from Alex Cochrane, and there was a Panopticon Blog post on the case.

  • Big Brother Watch v. Secretary of State [2018] ECHR 722.

An important case in which the European Court of Human Rights held that secret surveillance regimes including the bulk interception of external communications violated Articles 8 and 10 of the Convention. We had a post by Graham Smith as to the implications of this decision for the present regime.

  • ML and WW v Germany [2018] ECHR 554.

This was the first case in the European Court of Human Rights on the "right to be forgotten": an application under Article 8 in respect of the historic publication by the media of information concerning a murder conviction. The application was dismissed. We had a case comment from Hugh Tomlinson and Aidan Wills, and there was also a Panopticon blog post on the case.


The Privacy Perspective

Legal blogging on the protection of privacy in the 21st century

Top 10 Privacy and Data Protection Cases 2022

Inforrm covered a wide range of data protection and privacy cases in 2022. Following my posts in 2018, 2019, 2020 and 2021, here is my selection of notable privacy and data protection cases from 2022.

  • ZXC v Bloomberg  [2022] UKSC 5

This was the seminal privacy case of the year, decided by the UK Supreme Court. It considered whether, in general, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation.

The case concerned ZXC, a regional CEO of a PLC which operated overseas. An article was published concerning the PLC's operations for which ZXC was responsible. The article focused almost exclusively on the contents of a letter sent to a foreign law enforcement agency by a UK law enforcement agency which was investigating the PLC's activities in the region.

ZXC claimed a reasonable expectation of privacy in relation to the fact and details of a criminal investigation into his activities, disclosed by the letter, and that the publication of the article by Bloomberg amounted to a misuse of that private information. He argued that the details of the agency's investigation into him, the fact that it believed he had committed criminal offences and the evidence that was sought were all private.

At first instance Nicklin J found for the claimant, a finding which was upheld by the Court of Appeal. There were three issues before the UK Supreme Court hearing a further appeal by Bloomberg:

(1) Whether the Court of Appeal was wrong to hold that there is a general rule, applicable in the present case, that a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation.

(2) Whether the Court of Appeal was wrong to hold that, in a case in which a claim for breach of confidence was not pursued, the fact that information published by Bloomberg about a criminal investigation originated from a confidential law enforcement document rendered the information private and/or undermined Bloomberg’s ability to rely on the public interest in its disclosure.

(3) Whether the Court of Appeal was wrong to uphold the findings of Nicklin J that the claimant had a reasonable expectation of privacy in relation to the published information complained of, and that the article 8/10 balancing exercise came down in favour of the claimant.

The Court dismissed the appeal on all three grounds. The precedent is therefore established that, as a legitimate starting point, there is an assumption of a reasonable expectation of privacy in relation to the fact and details of a criminal investigation at the pre-charge stage.

There was an Inforrm case comment on the case; see also the Panopticon Blog and 5RB case comments.

  • Driver v CPS [2022] EWHC 2500 (KB)

My second case also concerns law enforcement investigations, this time the passing of a file from the CPS and the disclosure of that fact to a third party. Whilst the disclosure did not include the name of the claimant, it was found that  “personal data can relate to more than one person and does not have to relate exclusively to one data subject, particularly when the group referred to is small.”

In this case, the operation in question, Operation Sheridan, concerned only eight suspects, of which the claimant was one. It should be noted that the claim was one under the Data Protection Act 2018, not the GDPR.

In finding for the claimant on the data protection grounds, but dismissing the claim for misuse of private information, the judge made a declaration and awarded £250 in damages, noting that the "data breach was at the lowest end of the spectrum."

See the Panopticon Blog post on the case.

  • AB v Chief Constable of British Transport Police [2022] EWHC 2740 (KB)

The respondent, an individual with autistic spectrum disorder of the Asperger's type, claimed that the retention by the police of his information in relation to 2011 and 2014 accusations that he touched women inappropriately was unlawful. The respondent stims by rubbing fabric between his fingers. In both cases no prosecution was brought against AB.

The respondent’s claim was based on the fact the data retained was inaccurate and that its retention was a disproportionate inference with his right to respect for his private life under Article 8 of the European Convention of Human Rights.

In December 2017, Bristol County Council was contacted with safeguarding concerns about AB, in particular that he was suffering ongoing trauma due to the appellant maintaining ongoing false allegations against him.

As to the claims for inaccuracy “he complained that the records retained by the police inaccurately record that AB put his hands between the legs, and under the dress, of the 2011 complainant. He also implicitly complained that the records of the 2014 incident were inaccurate insofar as they suggested that AB had placed his hand over the complainant’s jeans in the area of her vagina.”

At first instance it was found that the police records were inaccurate and that their retention was a disproportionate interference with AB's Article 8 rights; AB was awarded £15,000 for distress, £15,000 for loss of earnings, and £6,000 in aggravated damages.

It was found that "the police records in this case are intended to reflect the information that was provided to the police, rather than the underlying facts as to what happened. On this issue I have reached a different conclusion from the judge, with the result that I have concluded that the OSRs are accurate. To this narrow extent, the appeal succeeds." [95]

However, the article 8 finding for the claimant was upheld, as was, accordingly, the judge’s declaration that retention was unlawful and the assessment of damages.

  • Chief Constable of Kent Police v Taylor  [2022] EWHC 737 (QB)

A breach of confidence claim relating to a series of videos which the defendant had been provided by Berrymans Lace Mawer LLP ("BLM"). The videos were said to contain sensitive information relating to a vulnerable minor, KDI, who was the subject of an anonymity order in civil proceedings. The videos themselves were particularly sensitive, comprising police interviews of KDI in relation to criminal allegations against them.

The defendant had sued the Chief Constable of Kent Police for damage to his front door, which occurred in the course of a police entry to his property to search for child pornography. BLM acted for the Chief Constable in relation to that matter, and it was during those proceedings that the defendant was given access to the videos, which related to an unrelated claim.

The defendant refused to delete the videos upon request or to explain his dealings with them, instead demanding payment of thousands of pounds for his cooperation with the requests.

The judge accordingly ordered the defendant to disclose matters relating to his dealings with the videos, to ensure confidentiality had not been breached. A further, unusual, order was granted for independent permanent deletion of the videos. Notably, the order took account of the defendant's privacy in the course of such an independent assessment, with the judge stating: "I have built in a safeguard in the order I propose to make to limit the nature of the independent IT expert's role to protect Mr Taylor's privacy interests".

  • Various Claimants v MGN  [2022] EWHC 1222 (Ch)

A case in the ongoing phone hacking litigation against Mirror Group Newspapers ("MGN"), in which MGN issued and served applications for summary judgment in 23 individual claims. The judge grouped the claims, with this judgment considering six claimants.

The judge considered whether the claimants should have been put on notice at various times up to and following the first primary trial in the scandal on 21 May 2015. He found that such matters were not "clear-cut" for the purposes of determining whether summary judgment could be entered; they were more appropriately settled at trial. There was a comment on the case on the JMW blog. On 11 August 2022 Andrews LJ refused MGN permission to appeal.

  • Brake v Guy  [2022] EWCA Civ 235

The claimants appealed an order dismissing their claim for a final injunction and damages for misuse of private information and breach of confidence. The claim was made in relation to a series of emails sent to and received by the first claimant, Mrs Brake, via a business general enquiries email account. The Court reviewed “the judge’s evaluation of the evidence which led him to conclude that they had no reasonable expectation of privacy in respect of the contents of the enquiries account and that the information was not imparted to the Guy Parties in circumstances which gave rise to an obligation of confidence.”

Only two of the tranche of 3,149 emails were produced for the judge to consider; he was, understandably, not inclined to accept that there was a reasonable expectation of privacy in relation to the emails on the basis of those two alone. The burden of proof was considered “a very substantial hurdle”, and the claimants had “fallen well short of surmounting it”.

The arguments for breach of confidence were advanced on the same grounds and dismissed. The judge concluded “the claimants have put forward no argument before this Court which persuades me that the judge was wrong to conclude that the personal information in the enquiries account was not “imparted in circumstances imparting an obligation of confidence.””

The case is instructive as to the method and approach to be taken when claiming a reasonable expectation of privacy or an obligation of confidence in relation to a high volume of documents. It also provides a tacit reminder of the difficulty of overcoming first instance privacy decisions on appeal. There was a DLA Piper case comment.

  •  TU and RE v Google LLC  [2022] EUECJ C-460/20

A case concerning two claimants applying for the delisting of search results under Article 17 of the GDPR.

The case is instructive as to the pleading of inaccuracy of data in erasure requests: where the issue arises and, where it does, how such a request should be dealt with:

  • The case states at [72 and 73]:  “where the person who has made a request for de-referencing submits  relevant and sufficient evidence capable of substantiating  his or her request and of establishing the manifest inaccuracy of the information found in the referenced content or, at the very least, of a part – which is not minor in relation to the content as a whole – of that information,  the operator of the search engine is required to accede to that request  for de-referencing. The same applies where the data subject submits a judicial decision made against the publisher of the website, which is based on the finding that information found in the referenced content – which is not minor in relation to that content as a whole – is, at least prima facie, inaccurate” , and
  • “By contrast, where the inaccuracy of such information found in the referenced content is  not obvious, in the light of the evidence provided by the data subject , the operator of the search engine is not required, where there is no such judicial decision, to accede to such a request for de-referencing. Where the information in question is likely to contribute to a debate of public interest, it is appropriate, in the light of all the circumstances of the case, to place particular importance on the right to freedom of expression and of information” .
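
The two-pronged test quoted above can be sketched as a simple decision function. This is purely an illustrative model of my own (all names are invented), and it deliberately omits the public-interest balancing exercise the Court also requires where inaccuracy is not obvious:

```python
from dataclasses import dataclass

@dataclass
class DelistingRequest:
    """Illustrative model of an Article 17 de-referencing request.
    Field names are invented for this sketch, not taken from the judgment."""
    evidence_shows_manifest_inaccuracy: bool  # relevant and sufficient evidence supplied
    inaccurate_part_is_minor: bool            # minor relative to the content as a whole
    judicial_finding_of_inaccuracy: bool      # prima facie judicial finding v. publisher

def operator_must_delist(req: DelistingRequest) -> bool:
    """Return True where, on the Court's test, the search engine operator
    is required to accede to the de-referencing request."""
    # Limb 2: a judicial decision establishing (at least prima facie)
    # inaccuracy of a non-minor part of the referenced content.
    if req.judicial_finding_of_inaccuracy and not req.inaccurate_part_is_minor:
        return True
    # Limb 1: manifest inaccuracy of a non-minor part, substantiated by
    # relevant and sufficient evidence from the data subject.
    return (req.evidence_shows_manifest_inaccuracy
            and not req.inaccurate_part_is_minor)
```

Where neither limb is satisfied, the operator is not required to de-reference, and (per paragraph 73) the freedom-of-expression balancing applies to public-interest content; that weighing is fact-specific and is not captured in this sketch.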

For further analysis please see the  Panopticon Blog’s excellent analysis of this case .

  • SMO  v TikTok Inc.   [2022] EWHC 489 (QB)

The former Children’s Commissioner for England’s case against TikTok for data protection infringements and misuse of private information was discontinued this year. This was the result of the myriad procedural issues arising in the case, including permission to serve out of the jurisdiction, extension of time, and permission to serve on UK lawyers instead. The case serves as a warning to claimants seeking to issue data protection claims against defendants outside the jurisdiction to do so in proper time and with consideration of matters such as service out of the jurisdiction.

See the Panopticon Blog on the case and on the discontinuance of the claim.

  • Smith & Other v TalkTalk Telecom Group Plc  [2022] EWHC 1311 (QB)

A claim under the Data Protection Act 1998 and tort of misuse of private information, following a mass data breach. The case concerned three applications:

  • For strike out of the misuse of private information claim and references to unconfirmed breaches in the particulars;
  • For permission to amend the particulars of claim in light of the case  Warren v DSG Retail Ltd  [2021] EWHC 2168 (QB); and
  • An application for further information.

The misuse of private information claim was dismissed. Although the claim had been repleaded to focus on “acts” rather than “omissions” (in an attempt to avoid the consequences of the Warren decision), the Judge followed his own decision in Warren, holding that the action was, in substance, a claim in negligence and that creating a situation of vulnerability to third party data theft was not a claim in misuse of private information. There was an Inforrm post on the case and a two part discussion of the issues here and here. See also the Panopticon Blog on the case.

This case was the final nail in the coffin of mass data breach claims on CFAs supported by ATE insurance (as these are not available in data protection cases).  Unless forming part of group litigation, data breach claims are likely to be transferred to the small claims track (see  Stadler v Currys Group Limited  [2022] EWHC 160 (QB) ).

  • Owsianik v. Equifax Canada Co. ,  2022 ONCA 813

An appeal arising out of three separate class actions in which the plaintiffs sought to apply the tort of intrusion upon seclusion in “data breach” cases. The Ontario Court of Appeal held that, on the facts as pleaded, the defendants did not do anything that could constitute an act of intrusion or invasion into the privacy of the plaintiffs. The intrusions alleged were committed by unknown third-party hackers, acting independently from, and to the detriment of, the interests of the defendants. The defendants’ alleged fault was their failure to protect the plaintiffs from unknown hackers, which could not be transformed into an invasion by the defendants of the plaintiffs’ privacy.

This decision in Ontario is consistent with the approach of the English court in Case No.9.  There were case comments by  Blakes  and  McCarthy Tetrault.





Reed Smith LLP


10 January 2023 Reed Smith In-depth

Data, distress, and damage: UK data protection and privacy case law in 2022

Authors: Elle Todd Jonathan J. Andrews

Stadler v. Currys Group Limited [2022] EWHC 160 (QB)

This long-running case concerned claims brought against Currys Group Limited (Currys). Currys sold Mr Stadler’s used smart TV to a third party (after he had returned it to Currys without logging out of various installed apps), resulting in a movie being purchased through Mr Stadler’s Amazon Prime account. Despite Currys reimbursing him the balance (£3.49) and giving him a £200 goodwill voucher, Mr Stadler chose to pursue Currys for misuse of private information, breach of confidence, negligence and breaches of the UK GDPR and the Data Protection Act 2018 (DPA 2018), seeking damages totalling £5,000.

It was held that:

  • In line with the decision in Lloyd v. Google that damages for non-trivial breaches were not recoverable under the Data Protection Act 1998 (DPA 1998) unless there was proof of material damage (or distress), the same “appeared to apply equally” to equivalent claims under the UK GDPR; and that, per Rolfe & Ors v. Veale Wasbrough Vizards LLP [2021] EWHC 2809, a de minimis threshold needed to be passed before claims for distress alone could be successfully brought. Consequently, these claims were dismissed.
  • Following case law such as Warren v. DSG Retail Ltd [2021] EWHC 2168 ( Warren v. DSG ), the High Court was not the appropriate forum for low-value data claims, with Lewis J also criticising attempts to overcomplicate what was at its heart a simple claim in order to justify this.
  • Upholding the precedents set in Warren v. DSG , the claims for misuse of private information and breach of confidence were struck out (as these must involve active “use” or “misuse” of information by a defendant, not just omissions), as was the claim for negligence (given that, where statutory duties are in place, there is no need to impose a duty of care).

Key takeaways:

  • The judgment provides precedent for applying Lloyd v. Google’s requirements for bringing a successful compensation claim under the DPA 1998 to equivalent claims under the UK GDPR (though unlike Lloyd v. Google, this is not a Supreme Court case and so higher courts could rule otherwise in future).
  • The judgment also supports the precedent set by Rolfe & Ors v. Veale Wasbrough Vizards LLP [2021] EWHC 2809 regarding de minimis thresholds for distress claims (as an aside, a similar decision has also been reached preliminarily in the EU by Advocate General Campos Sánchez-Bordona in the CJEU case of UI v. Österreichische Post AG (Case C-300/21) in October 2022, holding that harm alleged in data breach claims must go beyond “mere upset” to be actionable).
  • Attempts to ‘augment’ what should be a clear claim for breach of data protection law with various other heads of claim are even less likely to be successful, with multiple decisions now finding against this practice. This also further limits the recovery of after-the-event (ATE) insurance premiums. Such premiums had been common for claimants in low-value data claims (typically for breach of confidence and misuse of private information claims), both to cover their costs and to pressure defendants into settling (and into paying more to settle) by forcing them to factor ATE premiums into their costs liability. As such premiums may well no longer be recoverable in these cases, claimants will need to give more thought to purchasing them, which may well reduce the number of similar claims brought in practice.
  • Further increases the likelihood of similar claims, which have often recently been commenced in the Media and Communications Claims List of the High Court, instead being allocated/re-allocated to the small claims track of the relevant county court (where it is not generally possible to recover costs).

Bloomberg LP (Appellant) v. ZXC (Respondent) [2022] UKSC 5

In this case, Bloomberg LP (Bloomberg) obtained a confidential letter of request sent to ZXC by a legal enforcement body regarding a criminal investigation and published an article that referred to the fact that information had been requested of ZXC and the issues it was being investigated for. ZXC succeeded in a High Court claim for misuse of private information against Bloomberg, which Bloomberg appealed first to the Court of Appeal (which was dismissed) and then to the Supreme Court.

The Supreme Court held that ZXC had a reasonable expectation of privacy in a police investigation up to the point of charge and that, in this case, the right to freedom of expression did not outweigh this. Consequently, it found in favour of ZXC’s claim, awarded £25,000 in damages, and granted an injunction preventing Bloomberg from further publishing the information in question within the jurisdiction.

  • Amongst a range of case law this year emphasising the dangers of pursuing claims for misuse of private information without sufficient grounds, this case is a useful reminder that, in the right circumstances, misuse of private information claims can still be successfully brought – and may also require the payment of non-trivial damages sums.
  • The Supreme Court also noted (with respect to comments made by the Court of Appeal) that, although information may be both private and confidential, the causes of action for misuse of private information and breach of confidence are distinct. It will be interesting to see how this affects claims in which both heads of claim are pursued (and particularly where both are brought alongside further claims and without clearly differentiating between the grounds for each different head of claim).
  • Interestingly, the ICO’s Draft Journalism Code of Practice cites the High Court judgment as a case which is useful for data controllers in considering the lawful use of personal data under data protection laws, despite the case itself not concerning a data protection claim. The draft ICO code refers to the case in emphasising that the starting point should be that “a suspect has a reasonable expectation of privacy regarding investigations, including the fact that there is an investigation”. It is useful to remember that a data regulator will look at related privacy case law at least in providing guidance, even where it is not an action brought specifically under the laws which it regulates.

Smith v. TalkTalk Telecom Group Plc [2022] EWHC 1311 (QB)

This case concerned data breaches occurring in 2014 and 2015 that resulted in the ICO fining TalkTalk. The claimants (of which there were 385 in total, constituting both actual and potential TalkTalk customers) brought claims for misuse of private information and for compensation under the DPA 1998, on the grounds that TalkTalk’s measures to protect their personal data were insufficient and enabled third parties to access and fraudulently use this. Following Warren v. DSG , the claimants also sought to amend their particulars of claim to argue that TalkTalk’s security failures themselves constituted “acts”.

Saini J struck out and dismissed the claim for misuse of private information, since (per Warren v. DSG ) this was based on TalkTalk’s alleged security failures as opposed to its “positive act” of “use” or “misuse”, and characterised their claim as “a negligence action masquerading as a claim for MPI”. The court did concede that a data breach had occurred on the facts, but this did not constitute a breach of data protection law.

  • Further precedent making clear that claimants should think carefully about the appropriate heads of claim to bring as opposed to trying to bring multiple (or the wrong) heads of claim – with the same implications for ATE insurance as mentioned above.
  • Demonstrates the importance of being able to evidence “use” or “misuse” of information by the defendant before attempting to bring a claim for misuse of private information (and further 2022 judgments have re-emphasised this – see, for example, Underwood & Anor v. Bounty UK Ltd & Hampshire Hospitals NHS Foundation Trust [2022] EWHC 888 (QB) , with very similar findings).
  • Also shows the importance of being able to evidence and establish actual breaches of data protection law – part of the difficulty with the claimants’ DPA 1998 claim was that it was based on “unconfirmed breaches” (with the claimants arguing that these must have occurred at some unspecified point in time), which did not find favour with the court.

Bennett & others v. Equifax Ltd [2022] EWHC 1487 (QB)

This case in fact concerned several claims arising from a data breach by Equifax Ltd (Equifax) involving 700,000 data subjects, for which Equifax was issued in 2017 with an ICO fine under the DPA 1998 totalling the maximum amount possible (£500,000). Of those 700,000, over 100,000 had issued claims, and the claimants consequently sought a group litigation order (GLO) (a method of litigating multiple claims distinct from representative actions such as that dismissed in Lloyd v. Google). Equifax opposed this, arguing that preliminary causation and loss issues should first be determined, as it would be disproportionate to proceed with a GLO if most of the claims in question had little to no worth.

Although the key issue was not decided (and instead was referred for consideration by a judge at a Case Management Conference), the senior master did make obiter comments seeming to sympathise with concerns raised by Equifax and suggesting that “it may be unlikely that the entirety of the Claimant cohort will be able to establish either financial loss or distress to enable compensation to be awarded”.

  • Suggests that the issues with multiple data claims where it cannot be evidenced that each individual has suffered damage (one of the reasons for the failure of the representative action claim in Lloyd v. Google) may also apply to other forms of group litigation such as GLOs (though it should be emphasised that these comments were non-binding).

Driver v. Crown Prosecution Service [2022] EWHC 2500 (KB)

This case, one of the first data cases in the King’s Bench Division (as the Queen’s Bench Division became in September), concerned the former leader of Lancashire County Council who, having been informed that he was no longer a suspect in a police investigation into local government corruption (and making press statements stating this) then became the subject of investigation again. The CPS subsequently emailed a third party (and political opponent of Mr Driver), stating that a charging file had been referred to it for consideration (but did not mention Mr Driver’s name in its email). The recipient shared the email more widely, and Mr Driver brought claims under the UK GDPR and DPA 2018 and for misuse of private information (as well as claims under the Human Rights Act 1998 and of negligence which were ultimately not pursued) on the basis that the email had caused him distress, seeking damages of up to £2,000.

This case is notable because it is one of the few to involve an actual award of damages as compensation for a breach of data protection legislation. The court dismissed the claim for misuse of private information but did find (as the CPS originally admitted, although they then attempted to deny at trial) that a personal data breach had occurred and that this constituted a breach of the DPA 2018, awarding Mr Driver £250 in damages.

  • A useful indication of the likely size of damages that courts will order where compensation is found to be payable under data protection laws. Given that the claimant had sought “damages not exceeding £2,000”, the decision to make an award totalling just 12.5 per cent of that total is noteworthy.
  • It should be noted that the court described this breach as being “at the lower end of the spectrum” (and so more serious breaches may result in higher sums being payable). Equally, the CPS’s decision to attempt to deny at trial that a breach had occurred despite it having previously admitted to this may have also influenced the award of damages on this occasion.
  • The failure of the claim for misuse of private information (due to the relevant information already being in the public domain) evidences additional hurdles beyond those covered in the above cases to successfully bringing such a claim.

In-depth 2023-007i


The Normative Power of the GDPR: A Case Study of Data Protection Laws of South Asian Countries

  • Original Research
  • Open access
  • Published: 07 March 2022
  • Volume 3, article number 183 (2022)


  • Vibhushinie Bentotahewa   ORCID: orcid.org/0000-0002-3155-6496 1 ,
  • Chaminda Hewage   ORCID: orcid.org/0000-0001-7593-6661 1 &
  • Jason Williams   ORCID: orcid.org/0000-0003-1788-3455 1  


The increased dependency on technology brings national security to the forefront of 21st-century concerns. It creates many challenges for developing and developed nations in their efforts to counter cyber threats, and adds to the inherent risk factors associated with technology. Failure to securely protect data could give rise to far-reaching catastrophic consequences. It is therefore crucially important to have national, regional, and global data protection policies and regulations to penalise those who engage in the unethical use of technology and abuse its system vulnerabilities. This research paper aims to analyse GDPR inspired Bills in the South Asian region and to identify their appropriateness for developing a global level data protection mechanism, given that Asian nations are far more diverse than European nations. Against that background, the objectives of this paper are to identify GDPR inspired Bills in the South Asian region, identify their similarities and disparities, and identify the barriers to developing a regional level data protection mechanism, thereby addressing the need for a global level mechanism. This research is qualitative in nature; the researcher conducted an extensive literature survey of previous research papers, journal articles, survey reports, and government publications on the above content, and critically analysed the important parameters identified in the literature review. The key findings of this research indicate that many countries in the South Asian region are in the process of reviewing their current data protection mechanisms in line with the GDPR.
In concluding, the researcher emphasised the need to develop adequate data protection mechanisms and argued that the appropriate and practical way forward is to develop a consensus-based regional mechanism that would ultimately enable the development of a lasting global level data protection mechanism.


Introduction

The nations of the world have become an integrated community, and people are having to adapt to rapidly evolving changes in lifestyle; the dependency on the progressive advancement of technology is one of them. IT systems and advanced computer technology feature prominently across many sectors, including commerce, governance, shopping, travel, and banking, and the extensive use of IT systems is rapidly becoming a way of life for many. The generation of vast amounts of data is a by-product of advanced technology, and this phenomenon continues to grow at an unprecedented rate. Technologies like Artificial Intelligence, the Internet of Things, and Big Data are used to collect data, and the collected data is processed and stored, unbeknown to the individual or the public at large. The processing of data in this way continues to challenge the legal framework in every jurisdiction.

In the digitalised world, the right to privacy goes hand in hand with data protection, making the right to privacy an essential element of democratic values. The increasing dependency on technology brings national security to the forefront of 21st-century concerns, and developing and developed nations face many challenges in their attempts to counter cyber threats, mitigate the risks, and find solutions to the likely impact of the use of cyberspace. For instance, the United States Army recognises cyberspace as the most important battle space after land, water, and sea [1]. A former CIA director and senior-ranked general has stated that cyber threats across the world are similar, and that nations should recognise cyberspace as an important battle space and a critical part of national security [1].

The absence of a strong response to malicious behaviour would be taken as a weakness in the eyes of the law, and naming and shaming of offenders may not have the desired effect [2]; besides, bringing them to account in such situations will be a challenging task. Therefore, the member states of the EU enacted the GDPR to provide a legal framework setting out guidelines for collecting and processing the personal information of individuals [3]. In May 2018, the General Data Protection Regulation (GDPR), a legal measure aimed at providing a set of standardised data protection laws across the EU member states, came into force [4].

This initiative has encouraged countries outside the EU to revisit their own data protection mechanisms, modelling them on the GDPR, and has prompted several nations to begin enacting Personal Data Protection laws to bring them on par with the GDPR. Following the trend, the member countries of the Association of Southeast Asian Nations (ASEAN) refined and implemented their laws to uphold the data protection mandate [5]. However, in the South Asian Association for Regional Cooperation (SAARC) region, most countries are still in the process of developing data privacy laws [6], and no visible progress is being made towards producing region-relevant laws resulting from the SAARC agreements [7]. The Asian nations are far more diverse than the European nations, and their governance systems are shaped by the colonial experience; the wide variations in ideological attitudes, such as diverse political and religious beliefs, and the multilingualism inherent in the fabric of society make collaboration challenging.

The United Nations (UN) General Assembly, in its Resolution on the Right to Privacy in the Digital Age, noted that the rapid pace of technological development has attracted users all over the world to modern Information and Communication Technologies (ICT) [8]. Governments and companies have made use of this trend to increase their capacity to undertake surveillance, interception, and data collection, processes that risk violating privacy rights [8]. Given the potentially controversial nature of the issues involved, the UN General Assembly and the Office of the High Commissioner for Human Rights stressed the need to protect privacy rights when users are connected to online services [8]. The case for undertaking an extensive review to develop a consistent legal framework is therefore beyond doubt, and is more important now than ever before.

However, several countries lack sufficient data protection mechanisms, or even any means to protect their citizens’ personal privacy [9], owing to a lack of resources and a shortage of professionals with sufficient understanding of the issues [10]. At the national level, each country, faced with its own internal difficulties, is compelled to grapple with different challenges, which makes it even more important to understand such inhibiting factors and to allow flexibility in adapting the framework to emerging scenarios. Despite all the challenges, the case for developing an international strategy for data security and privacy is compelling [11]. The need for a robust and meaningful data protection mechanism should not be underestimated; it is the most effective, or even the only, way forward to safeguard personal privacy and national security.

This research paper provides an understanding of the GDPR inspired bills developed by countries in the South Asian region. The researcher seeks to determine whether the mechanisms developed at the national level would contribute to the development of a global level data protection mechanism. To that end, this paper aims to analyse the GDPR inspired Bills in the South Asian region and identify the course of action they would take towards the development of a global level data protection mechanism. In pursuing that aim, the objectives are to identify GDPR inspired Bills in the South Asian region, identify their similarities and dissimilarities, and identify the barriers to developing a regional level data protection mechanism that would meet the requirement for developing a global level mechanism.

In addressing the research aim, the researcher reviewed available literature on the GDPR and GDPR inspired bills in the South Asian region to analyse the contribution of national level data protection mechanisms towards developing a global level data protection mechanism. Except for the article authored by Greenleaf on GDPR inspired bills in the South Asian region, research papers covering such bills in the region are minimal. Greenleaf focused only on Sri Lanka, Pakistan, and Nepal, and did not discuss the challenges and barriers those countries faced in accepting and implementing data protection mechanisms. Therefore, the researcher believes that the analysis of national-level mechanisms and the barriers faced by countries in the South Asian region highlighted in this paper makes a valuable contribution to the existing literature.

General Data Protection Regulations (GDPR)

The European Union (EU) enacted the GDPR [12], governing personal data protection, to promote the establishment of a regional strategy for information security based on fundamental rights underpinned by democracy. Amongst other factors, personal data protection is an important element of people’s rights; it offers sanctuary to individuals and makes them feel secure from unethical intrusions into their personal data. The existence of privacy protection regulations will encourage governments to recognise and acknowledge the differences in privacy interests amongst countries. That makes it important to include appropriate provisions in the legal framework to protect victims affected by privacy breaches.

The collection, use, and disclosure of personal information of individuals are concerning issues in a climate of rapidly developing information processing technology, and an increasing number of people are becoming concerned about their privacy being compromised in the process. Therefore, the overriding concern is how secure collecting, disclosing, processing, and managing personal data is. That leads to emphasising the crucial importance of having adequate data privacy laws around the world.

The GDPR prescribes seven data protection principles: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability [13]. Some changes have been introduced in data protection by harmonising the data privacy laws across Europe. The European Union (EU) prohibits data transfers from an enterprise in the EU region to countries that do not match the level of EU data protection regulation [14]. The implication is that an organisation or individual from any part of the world handling the information of EU citizens, even if not based in an EU member state, comes under the purview of the GDPR. The new rules also provide EU citizens with a set of rights, including the right to access their personal information and the right to be forgotten [14].

Enterprises that undertake activities relevant to the processing of personal data are required to employ a data protection officer, and reporting of data security breaches is mandatory as recommended by the commission [ 14 ]. Enterprises are obliged to alert both their data protection authority and the people affected by a data breach within 72 h of detection and to provide a detailed report of the incident, including a proposed recovery plan for mitigating its effects [ 14 ]. Organisations found to violate the rules set by the GDPR would be liable for substantial fines. The maximum penalty for a GDPR violation is 20 million euros or 4 percent of a company's annual global revenue from the year before, whichever is higher [ 15 ]. The European Union, by endorsing the GDPR, has taken the lead in instituting data privacy regulations. It is incumbent on other countries to follow suit and develop a robust, meaningful legislative framework for data protection worldwide.
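The fine ceiling described above is simply the higher of a fixed amount and a revenue percentage. As a minimal illustration (the function name is our own, not from any statute or library):

```python
def gdpr_max_penalty(annual_global_revenue_eur: float) -> float:
    """Upper bound of a GDPR fine: EUR 20 million or 4% of the
    preceding year's global revenue, whichever is higher [15]."""
    return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

# A firm with EUR 300 million global revenue: 4% = EUR 12 million,
# so the EUR 20 million floor applies.
print(gdpr_max_penalty(300e6))  # 20000000.0

# A firm with EUR 1 billion global revenue: 4% = EUR 40 million,
# which exceeds the fixed amount.
print(gdpr_max_penalty(1e9))    # 40000000.0
```

Note that this is only the statutory ceiling; actual fines are set case by case by the supervisory authorities.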

Some also argue that, instead of harmonisation, the GDPR would create more national discrepancies and inconsistencies in current policies [ 16 ]. Nevertheless, the GDPR has set standards that no data controller would risk ignoring, and other governments will be compelled to level up to gain unhindered access to the EU's single digital market. There are visible signs of this: Japan, for instance, has expressed its intention to introduce provisions similar to the GDPR [ 16 ]. The commercial sector in the UK is making every effort to make the GDPR the norm in post-Brexit Britain, and the UK remains committed to the privacy principles enshrined in the EU regulations. The UK Government has also pledged to introduce a new 'digital charter' to ensure the UK remains the safest place to use online facilities [ 17 ].

The emerging modern technologies generate vast volumes of data, and it is important to ensure that the information is securely collected, processed, transmitted, stored, and accessed. However, given the enormity of the data generated daily, there will also be a tendency for conflicts to occur in the process of gathering and protecting data, particularly in terms of privacy. In the next section, the researcher focuses on the data protection mechanisms in the South Asian region.

Actions Taken by Countries in the South Asian Region

The eight states of the South Asian region, India, Sri Lanka, Bangladesh, Pakistan, Bhutan, Nepal, Maldives, and Afghanistan, make up the SAARC [ 18 ]. The region is a developing hub for data privacy regulation, and there is a strong possibility that South Asia will emerge with several laws that match existing international standards, but the indicators are that some countries are well advanced whilst others have made little progress towards establishing privacy protection ethics. At present, there are no emerging signs of a SAARC regional initiative materialising in the near future, and the achievement of a successful outcome seems some distance away [ 6 ].

In 2005, the Pakistan Ministry of Information Technology circulated a draft law on data protection, but it was not presented to the parliament [ 19 , 20 ]. It appeared the legislation had been drafted primarily to meet the needs of the country's software industry to conduct international business rather than to address actual privacy issues [ 20 ]. This draft legislation therefore seemed a half-baked red herring, falling short of applicability to the processing of personal or corporate data by federal, provincial, or local government institutions [ 20 ].

The Personal Data Protection Bill 2020 was later introduced by the Ministry of Information Technology and Telecommunications (MOITT), but it has not yet been tabled before the National Assembly or presented to the Senate for approval [ 21 ]. The bill encompasses many provisions that are in line with the international data protection regulatory framework. The legal obligations for data controllers and processors are broadly on par with other international laws, including the GDPR, and encapsulate the requirements concerning notice of consent, retention, disclosure, breach notification, and cross-border transfers [ 22 ]. Similarly, the rights of individuals are broadly aligned with those in other jurisdictions and include the rights to access and amend data, to withdraw consent, to request erasure of data, and to request that a data controller cease processing their data [ 22 ]. However, certain aspects of the bill remain out of alignment with widely accepted privacy norms, including a potential data localisation requirement [ 23 ]. One notable element of the bill is the omission of a provision to appoint a Data Protection Officer, although the personal data protection authority of Pakistan is empowered to formulate the responsibilities of the Data Protection Officer [ 24 , 25 ].

The bill states that a data controller shall not process personal data, including sensitive personal data, unless the data subject has given consent to the processing of the personal data [ 21 ]. The bill contains provisions allowing a data subject to give notice in writing to withdraw his/her consent to the processing of personal data, and the data controller, upon receiving such notice will have to stop the processing of personal data [ 21 ]. There are exceptions to the rule in cases of public interest, freedom of expression, and the security of the state as and when it becomes paramount. The bill also specifies that critical personal data shall only be processed in a server or data centre located in Pakistan, which indicates that Pakistan is to some extent shadowing the data localisation policies [ 26 ].

The transfer of personal data, collated by banks, insurance companies, hospitals, defence establishments and other sensitive institutions, to any individual or organisation is conditional on assurance of confidentiality and obtaining prior consent from the data subject [ 22 ]. Also, the bill categorically stipulates that the country receiving transferred data has personal data protection provisions that are at least equivalent to those provided in the bill, and the data so transferred should be processed in accordance with the bill where applicable [ 22 ].

The bill provides guidance and follow-up action in the event of a personal data breach. The data controller shall, without undue delay where reasonably possible, and within 72 h of a reported personal data breach, notify the relevant authority, except where the personal data breach is unlikely to result in a risk to the freedom and rights of the data subject [ 25 ]. The notification should be in writing, and the incident report should include the nature of the personal data breach; the name and contact details of the data protection officer or other contact points where more information can be obtained; the likely consequences of the breach; and the measures in place or proposed to be adopted by the data controller to address it [ 27 ]. The bill states that anyone found to violate any of its provisions, such as unlawfully processing, disseminating, or disclosing personal data, shall be prosecuted and incur a fine of up to PKR 15 million [ 21 ]. For subsequent offences involving unlawful processing of personal and sensitive data, the fines would rise to as high as PKR 25 million [ 21 ]. Furthermore, the bill states that anyone failing to adopt the security measures necessary to ensure data security, or failing to comply with the orders of the personal data protection authority of Pakistan, shall be punished and incur a fine of up to PKR 5 million [ 21 ].
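The 72-hour window in the bill (mirroring the GDPR's) amounts to a simple deadline computation from the time a breach is detected. A minimal sketch, with illustrative function names of our own:

```python
from datetime import datetime, timedelta

# Notification window per the bill's breach-reporting provision [25].
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(detected_at: datetime) -> datetime:
    """Latest time by which the data controller must notify the
    relevant authority of a reportable breach."""
    return detected_at + NOTIFICATION_WINDOW

def is_overdue(detected_at: datetime, now: datetime) -> bool:
    """True once the 72-hour reporting window has elapsed."""
    return now > notification_deadline(detected_at)

detected = datetime(2021, 3, 1, 9, 0)
print(notification_deadline(detected))                    # 2021-03-04 09:00:00
print(is_overdue(detected, datetime(2021, 3, 3, 9, 0)))   # False
print(is_overdue(detected, datetime(2021, 3, 5, 9, 0)))   # True
```

In practice the clock starts at detection, so breach-response procedures typically log the detection timestamp as a first step.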

The summary outlined in the table (see Table 1 ) suggests that Pakistan provides adequate data protection, in line with the GDPR. However, the bill has discrepancies in terms of the need to appoint a Data Protection Officer and liability in the form of fines.

Informational privacy has won the recognition of the Supreme Court of India, which has held the right to privacy to be a fundamental right under the Constitution, underscoring the right to life and personal liberty [ 29 ]. This is the first time the Supreme Court has pronounced on the right of individuals to their personal data, and privacy and data protection have been placed high on the national agenda, as data use is considered a key element in growth and economic development. However, India is not a party to any convention on the protection of personal data, though it is a signatory to other international declarations and conventions, such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, which recognise the right to privacy [ 30 ].

The Information Technology Act 2000 governs the protection of personal information, specifically electronic data and transactions [ 31 ]. Since 2011, various iterations of the Privacy Bill have been released, the Data Privacy Bill 2017 being the latest. The draft of the Personal Data Protection Bill (PDPB) 2018 was intended to replace the Data Privacy Bill 2017 and is still awaiting approval [ 32 ]. As cited, the Indian parliament was expected to vote on the Personal Data Protection Bill of 2019 during the 2020 budget session. On parliamentary approval, India would become the third-largest entity to implement a formal legal framework governing the use and sharing of personal data [ 33 ]. The same criteria and standards would apply to all enterprises, including technology companies, e-commerce platforms, real-estate firms and brokers, banking business correspondents, auto dealers, hotels, and restaurants [ 34 ].

The Data Protection Bill regulates the use of personal data collected, disclosed, shared, or processed in India, or by associated businesses within India, and makes cross-border transfers conditional by requiring data fiduciaries to store data in India. The bill also sets out obligations that would bind all entities processing data to a host of requirements such as data minimisation, notice-and-consent, transparency, security safeguards, and localisation [ 34 ]. Mandatory data breach notification, obtaining prior consent to collect data, and individual privacy rights also remain stringent requirements of the bill [ 31 ]. These are clearly stipulated as obtaining consent in advance and in writing from the data subject, specifying the purpose for which the data would be used, before the collection of the data [ 32 ]. There is also a provision for collecting sensitive information for lawful purposes connected with a function or purpose of the corporate entity on a necessity basis, whilst ensuring that the information so collected is used for the intended purposes only [ 32 ]. There is no specific time frame for retaining sensitive personal information [ 32 ], but the retention period should not be longer than necessary.

There are also conditional exceptions included in the PDPB, specifically on the processing of personal data for national security, law enforcement, legal proceedings, delivery of medical or health services in emergency situations and epidemics, provision of assistance during disasters and breakdowns of public order, research and archive purposes, or where processing is by small entities [ 34 ]. The PDPB is also applicable to entities outside the territories of India to the extent that the central government may regulate any cross-border data transfers outwards from India. The government has powers to permit such transfers subject to the provision of an adequate level of personal data protection, adherence to laws and international agreements, and the effectiveness of enforcement by authorities with appropriate jurisdiction [ 35 ].

There is no specific level of penalties set for data security breaches in the current legal framework, and the appointment or role of a data protection officer is not mentioned in the IT rules. However, should the PDPB come into force, the data fiduciary would be required to appoint a data protection officer and set out his or her functional roles as specified in the PDPB, along with any other specific functions deemed necessary [ 36 ]. Should a data fiduciary fail to take appropriate action promptly in response to a data security breach, it shall be liable to a penalty of up to either 2% of its total worldwide turnover in the preceding financial year or fifty million Indian rupees (INR 50 million), whichever is higher [ 36 ]. The same applies where a data protection officer fails to fulfil his or her responsibilities [ 36 ].

The table comparing the Draft Personal Data Protection Bill of India with the GDPR (see Table 2 ) infers that the draft bill contains guidelines very similar to the GDPR, except for the level of fines specified in the bill.

It is only recently that legal protection against infringement of personal data has become available in Bangladesh [ 37 ]; prior to that, a specific statute was non-existent in the country. Data privacy and the underlying protection rights and requirements also appear to be new concepts there. In many instances, the country seems to have been on the verge of facing major threats to privacy and personal data leakage [ 38 ]. Therefore, in the absence of a legal framework to curb future challenges to protecting citizens' privacy, developing data protection laws became an imperative priority for the country.

The Information and Communication Technology Act of 2006 (the Technology Act) and the Digital Security Act address issues relating to wrongful disclosure, misuse of personal data, and violation of contractual terms in respect of personal data [ 39 ]. The Information and Communication Technology (ICT) Act of 2006 has provisions to bring prosecutions against perpetrators for unauthorised intrusions and access to personal data, but inherent loopholes allow offenders to evade prosecution for crimes committed anonymously [ 38 ]. Under this Act, those responsible for the offence of disclosing confidential and private information could be liable for imprisonment of up to two years, with or without a fine extendable up to BDT 200,000 [ 39 ].

According to the Constitution of the People’s Republic of Bangladesh, every citizen shall have the right to privacy in correspondence and other means of communication [ 40 ]. In that respect, the basic framework for data protection and privacy sets out the rights of privacy granted under the Constitution of Bangladesh, alongside the Information Communication Technology Act 2006 and the newly enacted Digital Security Act 2018 [ 39 ].

The enactment of the Digital Security Act of 2018 has enabled Bangladesh to take a step in the right direction towards a data and information protection regime. Its purpose is primarily to promote the confidentiality, integrity, and availability of public and private information systems and networks and, also, to protect individuals' rights and privacy, economic interests, and security in cyberspace [ 41 ]. The Act explicitly requires obtaining consent or authorisation from data subjects before collecting, storing, and processing personal information [ 41 ]. However, Bangladesh recognises that implementing GDPR-mandated requirements for data protection officers, data protection impact assessments and audits, breach notifications, and record-keeping would prove difficult and costly for many small companies in Bangladesh [ 42 ]. The rules specify that anyone attempting to illegally access a computer or digital system and to interfere by making changes to, or transferring, any data or information owned by any organisation will be legally liable for a punishable offence, in the form of imprisonment not exceeding five years and/or a fine not exceeding BDT 1 million [ 39 ].

In Bangladesh, the Digital Security Act sets out provisions very similar to the GDPR (see Table 3 ). However, despite the close alignment, differences exist in terms of the requirement to appoint a data protection officer and to issue data breach notifications.

The privacy issues in Bhutan have not been sufficiently articulated in the literature, policies, or guidelines. However, reports suggest that seven of the ten second-generation principles in the 1995 EU Data Protection Directive have been included in Bhutan's Information, Communications and Media Act, which came into force in 2018, though with limited coverage of privacy and privacy law [ 6 ]. The Social Media Strategy and Guideline Policy of Bhutan (2011), the Information and Communications Technology Policy and Strategies (2004), and the Bhutan e-Government Master Plan (2014) also place limited emphasis on privacy issues [ 43 ]. Thus, these instruments are considered only moderately strong for the Asian region. As a consequence, non-EU businesses are losing European partnership contracts because adequate protection of data in compliance with the GDPR cannot be guaranteed. Yet not all businesses or government organisations in Bhutan have been affected by the GDPR, simply because they either have no commercial links with European companies or do not require access to the personal data of EU citizens.

Nepal's laws evidently contain a comprehensive set of regulatory features relevant to personal data that could be expected of a data privacy act for the public sector. These are clearly specified as: the right of access; the right of correction; protection against unauthorised access; restrictions on use and disclosure by government agencies; limitations on additional usage by third parties when obtaining access; both offence and compensation provisions for breaches; an independent authority to investigate complaints and resolve disputes; and a right of appeal to the courts.

The right to privacy as a fundamental right featured for the first time in the 1990 constitution of the Kingdom of Nepal, as did the right to information, and the right to privacy was later retained in the 2007 interim constitution [ 44 ]. However, no reference was made to a state authority empowered to receive complaints of privacy rights violations; the public instead had the freedom to submit such reports to the National Human Rights Commission (NHRC), as well as the option to take legal action in the Nepalese courts against violations of the right to privacy [ 44 ].

Nepal became a federal republic in 2015 with the promulgation of the new Constitution, and substantial changes were made to the country's legal system [ 45 ]. The critical element is the right to privacy and protection of information as a fundamental right stipulated in Article 28 of the Constitution, alongside the Criminal Code and the Individual Privacy Act 2018 [ 46 ]. The Criminal Code has a separate chapter on laws covering privacy violations, under which breaches of confidentiality, taking and editing photos of a person without consent, and breaches of private information in electronic media are considered criminal acts [ 47 ].

Nepal enacted the Privacy Act of 2018 [ 48 ], but notably, it was not considered a data privacy law due to the exclusion of basic principles. However, private sector bodies operating in Nepal were obliged to pay careful attention to many provisions in the Act. For example, personal data collected by corporate entities might only be used for the purpose for which such data was collected, and collection and disclosure were prohibited without consent [ 48 ]. That is an endorsement of the need to obtain consent before collecting private information and the restrictions on collecting data and using it only for the purposes for which it was collected.

These obligatory requirements place greater responsibility on businesses, as commercial activities conducted online need to collect users' personal data and restrict data sharing with third parties. However, in terms of collecting or using personal information belonging to a Nepalese resident from outside the territory of Nepal, or involving an offshore entity within Nepal, the enforcement of the Act appears vague [ 49 ]. Therefore, the purpose of collecting data and information, and its intended use, should be revealed with clarity. If the intention behind information gathering is a particular need, such as academic study, specific research objectives, or public opinion polling, then the nature of the collection, the purpose of the collection, and the methodology and mode of information processing, along with an assurance not to breach the privacy of individual information, must be presented.

The Act requires public authorities or corporate bodies to obtain consent from the individual(s) before disclosing personal information collected, stored, or retained by them [ 46 ]. Violation of the Act is a criminal offence, and legal action commensurate with the offence may be taken by either an individual or the State; if proven liable, the offender would incur imprisonment of up to 3 years, a fine of up to NPR 30,000, or both. The offender could also be liable to pay compensation to the affected party (victim) for the violation of the provisions of the Act [ 46 ].

The Privacy Act, however, fails to address some important aspects. The existing definition limits any broader interpretation of 'personal data.' Another important shortcoming is that the Privacy Act does not define or specify some vital concepts of data protection, such as 'controller' and 'processor.' This will make data management difficult and, in practice, will hamper the legal enforcement of punitive action against breaches in Nepal (Table 4 ).

Afghanistan

The use of information and communication technologies in Afghanistan has been growing rapidly [ 50 ], and the popularity of modern technology has made its way into all aspects of citizens' lives. In the absence of specific laws or regulations to manage data protection in Afghanistan [ 51 ], it is important to put in place legal frameworks that will safeguard private and enterprise data flowing through ICT-based infrastructures. The Constitution of Afghanistan guarantees its citizens the right of confidentiality and privacy across a broad spectrum of communications systems [ 51 ]. It provides for freedom and confidentiality of correspondence between individuals by letter, telephone, and telegraph, as well as other means [ 51 ]. Some laws do contain data protection provisions addressing other aspects, but there is also a need to develop regulations to implement the privacy laws.

Sri Lanka has a growing population, and the use of information technology and associated services is growing at an even faster rate [ 52 ], as is the use of cyberspace. A vast majority of the population use mobile phones to manage their daily lives, and amongst them a new generation of professionals, youth, and students are adopting modern technology and associated IT systems through the online network facilities available right across the country [ 52 ]. This trend will continue as the developing country becomes increasingly dependent on advanced technology and the benefits it offers its citizens.

The concerning factor on the consumer front, however, is the inevitable intrusion into the privacy of users of information technology and modern digital systems. That exposes the users, and there is a pressing need to develop laws to safeguard against the challenges of cyberspace crime faced by the state and the users. The most important of these is the legal protection of individuals' personal information, which cannot be ignored or treated lightly. In forming legislation, the importance of the law and its guidelines should be considered to ensure they are sound, unambiguous, and enforceable.

On the economic front, Sri Lanka needs data protection and information security laws, as they are crucial to attracting foreign direct investment (FDI); as economists have pointed out, in the absence of adequate legal mechanisms, foreign investors will be reluctant to invest in the country [ 53 ]. In a different context, however, Sri Lankan entities that process data of European residents face stringent obligations. The Computer Crimes Act 2007 appears to have addressed the issue of data privacy to some extent by specifying penalty clauses for the unlawful acquisition and illegal interception of data and the unauthorised disclosure of information [ 54 ]. The right to privacy has also been recognised by the judiciary under the common law of Sri Lanka [ 54 ]. This indicates that, despite the absence of specific constitutional or legislative recognition, the right to privacy is recognised by the Sri Lankan judiciary in a variety of legal contexts under common law.

Chapter III of the Sri Lanka Constitution (1978) provided adequate guarantees for the fundamental rights of its citizens, but not specifically for the right to privacy [ 55 ]. The proposed drafts of the Constitution in 1997 and 2000 had stipulated the right to privacy and family life as a fundamental right [ 55 ]. The proposed October 1997 Constitution specifically stated that every person has the right to have his or her private and family life, home, correspondence and communications respected, and shall not be subjected to unlawful attacks on his or her honour and reputation [ 55 ].

The 19th Amendment to the Constitution makes only minimal reference to privacy. It states that the fundamental right to information cannot be exercised where an individual's privacy would be infringed [ 9 ]. This was not an express provision making the right to privacy a separate fundamental right of the citizens of Sri Lanka, and if this right were to be exercised against private organisations, it would need to be separately encapsulated in statute. To remove any ambiguity in the reference made to the constitutional right to privacy, the Minister for Telecommunications confirmed that a Personal Data Protection Bill would be introduced in Parliament in 2019 [ 56 ]. The Data Protection Drafting Committee of the Ministry of Digital Infrastructure and Information Technology (MDIIT) and the Legal Draftsman's Department have initiated the drafting of legislation on data protection [ 57 ]. The drafted bill aims to cover the fundamental principles of privacy and data protection, following legislative models introduced by similar countries.

The bill prescribes measures to protect the personal data of individuals held by banks, telecom operators, hospitals, and other entities engaged in processing personal data [ 57 ]. It aims to regulate the processing of personal data, designate a data protection authority, and safeguard the rights of citizens [ 58 ]. Under the terms of the bill, data may be processed for specified purposes only, with provisions allowing data to be processed for purposes in the public interest, to respond to an emergency, and for scientific, historical, research, or statistical purposes [ 58 ].

The rights of data subjects provided in the bill include the right to withdraw the consent given to controllers, the right to access, rectify, and erase data without undue delay, and to object to the processing of data [ 58 ]. Consent is now required before collecting private information, and even if consent is obtained, the collected data should only be used for the purposes for which it was collected [ 58 ]. The final draft stipulates that every controller, unless exempted from this Act or any written law, is obliged to appoint a Data Protection Officer to ensure compliance [ 58 ]. The data protection authority shall be responsible for all matters relating to personal data protection in Sri Lanka and for the implementation of the provisions of the bill. The penalties for failure to comply with the provisions of the bill shall not exceed a sum of LKR 10,000,000 in any given case [ 57 ].

The bill stipulates that public authorities may process personal data only within Sri Lanka, and the processing of classified data overseas is subject to permission being granted by the DPA and any relevant supervisory body [ 6 ]. The private sector is not subject to conditional data localisation stipulations, except for transfers of personal data to a third country prescribed by the Ministers [ 6 ].

The framework for the proposed Personal Data Protection Bill was first released on 12 June 2019 for stakeholder comments, and the final draft was released on 24 September 2019 by the Ministry of Digital Infrastructure and Information Technology [ 53 ]. The bill comprehensively covers both the public and the private sector. The legislation was to be implemented in stages, with the bill scheduled to become operational within 3 years of ratification by Parliament, allowing the Government and private sector time to prepare for its implementation [ 53 ].

Sri Lanka deserves credit for its commitment to developing a data protection mechanism, despite its developing country status. The summarised information in the table (see Table 5 ) compares the draft legal mechanisms with the GDPR.

The criteria for data protection in the Maldives fall under the right to privacy, which is embedded in the 2008 Constitution of the Republic of Maldives and the state's Penal Code [ 59 ]. The Penal Code prohibits obtaining private or highly secured information without a licence or authority to do so, and disclosing any such information to a third party [ 59 , 60 ].

In 2016, the Ministry of Economic Development of the Government of Maldives announced the drafting of a new data protection bill, which was circulated to the public, but it has not yet become law [ 59 ]. The purpose of the Act was to promote small and medium enterprises, encourage e-commerce, and establish procedures to store, manage, and protect customers' confidential information [ 61 ]. The apparent shortcoming of the Act was the absence of one important element: a provision to punish non-compliance; instead, compliance with the Act was left to each party's discretion [ 61 ]. The primary beneficiaries of the Act are identified as the commercial sector, as they need an efficient system to manage, use, and store confidential information in accordance with international standards and thereby boost customer confidence in their enterprises [ 61 ].

This analysis of the South Asian region shows that some countries appear to have no mechanisms of any kind; however, those with at least a draft data protection mechanism intend to develop legal mechanisms matching the GDPR. Given these positive trends at the national level, it is optimistically conceivable that the SAARC region will maintain a constructive dialogue between nations, consolidate its influence in the region to move forward with a consensus-based approach to developing a regional-level data protection mechanism, and sustain the momentum ultimately to achieve the goal of a global-level data protection mechanism.
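For quick side-by-side comparison, the maximum monetary penalties quoted in this section can be tabulated in a small sketch. Amounts are in local currency as quoted (not converted or updated), and where a regime also imposes a turnover-based cap (EU, India) only the fixed component is listed here:

```python
# Maximum fines as quoted in this section, in local currency units.
max_fines = {
    "EU (GDPR)":    ("EUR", 20_000_000),  # or 4% of global revenue [15]
    "Pakistan":     ("PKR", 25_000_000),  # subsequent offences [21]
    "India (PDPB)": ("INR", 50_000_000),  # or 2% of turnover [36]
    "Bangladesh":   ("BDT",  1_000_000),  # Digital Security Act [39]
    "Nepal":        ("NPR",     30_000),  # Privacy Act 2018 [46]
    "Sri Lanka":    ("LKR", 10_000_000),  # draft bill [57]
}

for country, (currency, amount) in max_fines.items():
    print(f"{country:14s} {currency} {amount:>12,}")
```

Because the figures are in different currencies and attach to differently defined offences, the table is only a rough indicator of relative enforcement weight, not a direct comparison.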

Research Methodology

This research is qualitative in nature, which enables the researcher to gain a deeper understanding of experiences, phenomena, and context [ 62 ]. Qualitative research stimulates questions by way of observations, in-depth interviews, focus groups, existing documents, paper surveys with open-ended questions, and online surveys, and it produces subjective knowledge [ 63 ]. The characteristics of the qualitative data collection method fit well with the aim of this research. The researcher decided to use qualitative data collection to gain an in-depth understanding of the available data protection mechanisms and the challenges and barriers faced by countries in the South Asian region. To that end, the researcher used journals, books, newspapers, websites, government reports, and constitutions to collect data, and found books and journals excellent sources for extracting background information that widened the scope of the study. The main data sources used in the literature review were government websites, publications from companies, Google Scholar, and ResearchGate. Several keyword searches were used to find studies relevant to answering the research questions.

The aim of qualitative studies is to gain a greater level of understanding of the subject [64]; this approach specifically answers questions such as 'how' and 'why'. The use of qualitative analysis in this research has therefore enabled the researcher to analyse the findings and provide detailed answers to the research questions, chiefly why it is important to develop data protection mechanisms and what challenges the countries would face in developing GDPR inspired data protection mechanisms. The researcher used government websites to access legal documents relevant to data protection mechanisms, understand the existing ones, and ascertain the differences between them. Newspapers provided contemporary data about new changes incorporated into existing legal mechanisms. Considering the findings, the researcher critically analysed the important parameters identified in the literature review.

The main limitation of this research is that the literature search pointed to contradictory information on data privacy and security policies, which affected the progress of the literature review. An unknown number of countries are in the process of reviewing and modernising their legal mechanisms to protect personal privacy in a climate of evolving technologies, which quickly dates literature published in previous years. It was therefore a matter of selecting the most appropriate articles to ensure the quality of the research outputs.

Similarities and Disparities Between GDPR Inspired Bills in South Asian Countries

States and state apparatus, organisations, and individuals face sophisticated, complex cyber-security threats designed to cause significant damage to the economy and to infrastructure-dependent essential services. This has become a frequent occurrence, especially in the countries of the Asian region where, unlike in the West, internet use has expanded rapidly in line with the internet revolution. That has invariably aroused growing concerns in the community about cybercrimes, ranging from data breaches to the transfer of personal data. To allay these concerns and fears, most Asian countries have taken steps to introduce new data protection legislation or enhance existing cybercrime countermeasures.

The South Asian region has no regional level data protection mechanism in place yet, but having one would encourage progress towards a global level data protection mechanism. To reach that point, it is important to identify the ambiguities in the GDPR inspired bills that the countries in this region already possess. An understanding of these discrepancies would help to resolve implicit issues between the countries and bring them together to establish a common stance on developing a unified regional level data protection mechanism, which could eventually scale up towards a global level data protection mechanism.

The commercial sector in the South Asian region is growing in line with modern technology, becoming increasingly digitalised and moving to online platforms to conduct business [56]. In the light of these changing environments, public, private, and non-profit entities are all introducing Information and Communication Technology (ICT) to improve their computing capabilities in a continuous effort to keep up with the Western world. The number of ICT users is growing at an unprecedented rate, and users are constantly attracted to ICT capabilities, but many lack technical knowledge of cyber security and of their privacy rights [65]. That exposes commercial enterprises and individual users to greater risks from cyber-attacks originating anywhere in the world. It is therefore imperative for both the private sector and individual users of cyberspace to be sufficiently aware of their exposure to cyber threats and of their privacy rights. However, South Asian countries face the daunting task of striking a balance between privacy and the right to information. Lack of awareness is only one of many factors: language barriers, limited education, lack of opportunities, and limited access to a high-tech environment also play a part. Moreover, many users are not aware of the cyber laws their governments have put in place, and more must be done to make such information available to citizens and to raise awareness of the consequences of cyberspace crime.

The rise in data breaches and privacy-related incidents has stimulated discussion about how much control people should have over their personal information. There is now greater recognition of the right to privacy in the digital age and increased public awareness of how individuals can access or control their data. On another positive note, there has been a push for comprehensive individual rights, such as the right to request consent for processing and the right to be forgotten, and governments have responded by strengthening their privacy law frameworks. In addition, organisations are required to seek consent from individuals when collecting personal information and when processing or transferring it to a third party. Organisations engaged in processing personal data are also required to employ a data protection officer.

At the global level, the rise in data breaches in terms of frequency and volume has put pressure on governments to introduce data breach notification requirements making reporting of data security breaches mandatory. The notification should include complete details of the breach, the name and contact details of the data protection officer, a description of the likely consequences of the breach and an incident recovery plan proposal for mitigating its effects. Those organisations violating the rules will become liable and incur a heavy fine.

Since the GDPR came into effect, many commercial enterprises have been obliged to re-examine their stance on privacy rights. The European Commission enables the free flow of data between the EU and countries considered to have 'adequate' regulations in place, and many countries in the South Asian region are currently seeking to strengthen their laws to obtain adequacy status (Table 6).

There is a visible lack of literature on national level GDPR inspired data protection mechanisms in the South Asian region and on their adequacy relative to the GDPR. The discussion in this paper therefore contributes to the prevailing knowledge and will help future researchers working on regional level data protection mechanisms in the South Asian region. In the absence of a regional level mechanism, the findings presented here serve as a valuable source for such researchers and will help them track the progress of its development.

Barriers to Developing Data Protection Mechanisms

The apparent hesitancy and reluctance of South Asian states to participate, and the prevalence of external and internal issues affecting collective decision making, appear to be contributory factors hindering progress towards the development of a cyber-specific regional level mechanism.

Social Differences

The assessment of social impacts is essential to determine what difference a policy will make to people’s lives. It enables the researcher to analyse the social impacts and consider the widest range of impacts that policies would have on individuals, communities, and society. A summary of social differences within the countries is outlined in Fig.  1 .

figure 1

Social differences

Many of the wealthiest countries record the highest development scores and enjoy political stability, freedom of expression, and low levels of corruption [66]. That gives countries in this category the space to focus on developing data protection mechanisms, in comparison to less developed countries that are still struggling to reconcile their internal differences.

A country’s overall economic strength influences internet diffusion and the resources and capital required to expand technology [67], and developing data protection mechanisms likewise demands capital investment. Economically stable, developed countries therefore have the capacity to allocate funding towards protecting the privacy of their citizens and to assist in reviewing the existing policies of developing and underdeveloped countries.

The knowledge of individuals may also influence the spread of communication technology [67]. This knowledge takes many forms, ranging from knowledge of how to use communication technology to awareness of the privacy and security threats associated with it, and it gives users an insight into technology and the necessity of established data protection mechanisms. In addition, the literature suggests that some languages have greater recognition than others and dominate certain areas of life, for example, the use of the English language in the computer industry [67]; hence the language barrier could be an obstacle to developing policies. Literature should therefore be translated into multilingual formats and made easily available and accessible to all countries developing data protection mechanisms.

Furthermore, cultural background can also affect the development of data protection mechanisms. In Asia, the perception of obscenity and pornography/erotica varies from country to country [68]. For example, reports suggest that Japanese people have a higher tolerance of erotic materials than those in China, Taiwan, and Hong Kong [68], and that Islamic countries take a less tolerant approach to obscene materials. Hence, countries such as China, Singapore, and Pakistan scrutinise social networking sites or even block web access to filter out such sensitive material [68].

These social differences and beliefs are among the biggest barriers to developing a global level data protection mechanism. However, a report suggests that an increasing number of Europeans living in the border regions of the EU claim that social and economic differences are not a problem affecting cooperation between their home country and the neighbouring country [69].

Mistrust Between Countries

Mistrust amongst the countries has emerged for a variety of reasons, such as political differences, border disputes, and persistent ongoing conflicts. The tensions arising from the border disputes between India and Pakistan, for instance, illustrate why countries embroiled in conflicts find it difficult to come together for the common good [70]. Likewise, the ongoing conflict between Nepal and India involving a section of the Madheshi community in Nepal led to the closure of the India–Nepal border in 2015, and given the socio-cultural proximity of the Madheshi community with India, the blockade impacted bilateral ties between the two nations [70]. Such tensions make it challenging to bring everyone together to develop a global level data protection mechanism.

Legal Differences

The ways in which countries adhere to international or regional conventions differ, and these differences tend to influence the specific initiatives taken to develop their laws. Legal disparities make it difficult to respond to and investigate incidents and to enforce the law, and they hinder international collaboration [71].

Internet Penetration

Reports produced by the World Bank highlight marked differences between the South Asian countries in terms of internet penetration: amongst them, Bangladesh (58.4%) has the highest internet penetration and Pakistan (32.4%) the lowest [72]. These variations demand that countries increase financial resources and take prompt action to protect the privacy of their citizens.

Identification of Difficulties

One of the most significant and concerning difficulties is the attribution problem [73]. Perpetrators are becoming increasingly effective at concealing their identities and operational locations, and identifying the origin of a cyberattack is extremely difficult, even impossible, without international cooperation. Furthermore, jurisdictional limitations make investigation a complex and challenging process, which can render the prosecution of cyber perpetrators a futile effort.

Delays in the Enactment of Laws

The enactment of laws in different countries is driven by decisions made at the national level, based on various factors in different circumstances; these may be political, economic, or social. For example, ratification of the Budapest Convention took too long in most countries for varying reasons, delayed development of the law being one of them [71]. In another scenario, UN negotiations of a new treaty on cybercrime will require an intense diplomatic effort over a considerable timescale, with no guarantee of a successful outcome [74].

Laws and Basic Principles Overlap

The internet has no physical borders and is freely available to users, yet it is governed by national legislation, so constitutional or legal conflicts can arise on the grounds of privacy and freedom of expression. This can lead to contentious privacy and security disputes that drag on unabated.

Differences in National Legislation

There are apparent differences between the national legislation of countries. For example, the definition of a data breach and the time limit for notifying individuals and/or the authorities of a breach vary significantly [75]. In the EU, even a minor data access breach is, in most cases, notifiable within 72 h of being detected [76]. In China, the discovery of security flaws and vulnerabilities in network products and services necessitates informing the relevant government agencies and network users [77]. In Japan, the only requirement is to 'make the effort' to notify the incident of a breach [78], and even that requirement is considered vague.

In the South Asian region, under Pakistan's draft bill the data controller should notify data breaches to the relevant authority within 72 h of the incident becoming known, except where the breach is unlikely to affect the freedom and rights of the data subject [27]. In India, data breach notification is mandatory, but no time limit has been specified [36]. In Bangladesh, no reference is made to a data breach notification requirement [42]. Thus, even though South Asian countries have developed GDPR inspired bills, some differences between countries remain.
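The practical consequence of these legislative differences can be illustrated with a small sketch. The following hypothetical Python model is the author's own illustration (the class, function, and field names are assumptions, and the values are simplifications of the cited provisions, not legal guidance); it shows why a multi-jurisdiction compliance routine cannot assume a single notification deadline:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BreachNotificationRule:
    """Hypothetical, simplified model of one jurisdiction's breach rule."""
    jurisdiction: str
    notification_required: bool
    deadline_hours: Optional[int]  # None = required but no fixed deadline

# Illustrative values paraphrasing the comparison above; not legal advice.
RULES = {
    "EU (GDPR)": BreachNotificationRule("EU (GDPR)", True, 72),
    "Pakistan (draft bill)": BreachNotificationRule("Pakistan (draft bill)", True, 72),
    "India (draft bill)": BreachNotificationRule("India (draft bill)", True, None),
    "Bangladesh": BreachNotificationRule("Bangladesh", False, None),
}

def notification_overdue(rule: BreachNotificationRule,
                         hours_since_detection: float) -> bool:
    """True only when a fixed deadline exists and has already passed."""
    if not rule.notification_required or rule.deadline_hours is None:
        return False
    return hours_since_detection > rule.deadline_hours
```

Under this sketch, the same 80-hour-old breach would be overdue in the EU but trigger no fixed-deadline violation in India or Bangladesh, which is precisely the kind of disparity a unified regional mechanism would remove.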

Reliance on new technologies and the IoT generates a large volume of information, and any data breach will impact personal privacy; to safeguard privacy, it is important to have data protection mechanisms in place. To meet that requirement, most countries have revisited and developed national level data protection mechanisms in line with the GDPR, although some countries are yet to make up ground. There is also a need to develop regional and global level mechanisms to bring perpetrators to account. It is therefore essential to identify the challenges countries face when developing such mechanisms.

There is insufficient literature on the barriers to developing unified data protection mechanisms. The barriers and challenges listed in this research paper therefore contribute to the existing literature and will benefit future researchers seeking to develop a unified data protection mechanism. It is important to bring all the countries together to develop such a mechanism, but that is easier said than done. To reap the benefits of collaboration, it is important to identify and address the disparities and to support those in need by sharing know-how from those conversant with policy development.

In the light of increased reliance on technologies, the world faces unprecedented challenges from cybercriminals. The reality of increasing cyber-related threats has become a major concern for the intelligence and security services and, with cyberthreats spreading beyond borders, finding solutions has become a high priority. The heightened risks to the privacy of individuals have brought to the forefront the need to address the challenges faced by organisations and the security services. Cyberspace-related threats are stealthy in nature, and the adversary is characteristically invisible, difficult to trace, and dangerous, making urgent action to protect people and nations a necessity. Against that background, developing a global level policy framework to fill the gaps in existing legislation became an urgent necessity. In this research, therefore, the researcher selected the South Asian region, explored its existing data protection mechanisms, and identified the challenges and barriers the countries faced in developing them. The purpose of identifying the barriers is to influence and encourage those countries to develop a unified mechanism that would serve the interests of all.

The available literature suggests that most countries in the South Asian region have at least a draft data protection mechanism in place at the national level. Some are in the process of implementing legislation designed to deter unethical activities, with some success, but countries without adequate data protection and privacy acts have failed to successfully prosecute cyber-criminals. The analysis in this research showed that, except for differences in the requirement to appoint a data protection officer and in the size of the imposable fines, the GDPR inspired bills provide an adequate level of data protection to citizens in the South Asian region. The discussion of the General Data Protection Regulation (GDPR) suggests that it remains the only credible source of guidance, and there is no visible collective approach to developing robust unified data protection mechanisms at the regional level. These findings answer the research questions, which sought to identify the GDPR inspired data protection mechanisms in the South Asian region and their adequacy relative to the GDPR. The researcher believes it is essential to have purposeful, unified data protection mechanisms developed collectively by the nations in order to overcome security and privacy challenges and prevent data breach perpetrators escaping with impunity. In pursuit of that aim, it is important to identify the factors that affect progress towards developing data protection mechanisms.

The literature-based evidence shows a general disparity in privacy policy and data protection legislation amongst states, though the degree of disparity varies when the national and regional levels are examined separately. These disparities are attributable to several internal and external factors and are influenced by the specific laws of the states. It is also important to identify the gaps in cyber legislation that allow cybercriminals to act with impunity, owing to weaknesses in law enforcement and inconsistency in the laws themselves. The identification of these challenges and barriers fulfils the research question that sought to ascertain the barriers countries face in developing data protection mechanisms. These gaps, limitations, and disparities in the regulatory frameworks, when scrutinised in real situations, make a case for a unified global level privacy policy and strategic data protection laws to prevent states and organisations from taking arbitrary actions and to prevent perpetrators walking away without proper punishment for their actions. That said, there is an exception to the rule: the 'national security' of the state overrides any emphasis on privacy protection, but it must essentially be invoked on a need basis.

The absence of a collectively established universal data protection mechanism indicates a clear gap at the global level. It is fair to say that the lack of data protection mechanisms at the national level is hindering and slowing progress towards regional data protection mechanisms. To overcome this, the respective countries need to revisit and develop their national level data protection mechanisms; having proper mechanisms in place at the national level will make it easier to facilitate a constructive dialogue towards a meaningful unified regional mechanism. Currently, most nations have individually developed data protection mechanisms matching the GDPR, which sets the benchmark for the other nations. In the negotiating process, the diplomatic route would be the preferable option for influencing and urging the countries to take collaborative, cohesive action in a participatory manner. That would give each nation the opportunity to open discussion and express its views, resolve any misgivings, and add impetus to developing a consensus-based data protection mechanism. Failure to do so is likely to jeopardise the chances of ratification and implementation.

The existing literature covers data protection mechanisms in the South Asian region, but it does not demonstrate the variations between the GDPR inspired bills in that region and the EU GDPR. The researcher's findings on the adequacy or inadequacy of the GDPR inspired bills in the South Asian region therefore add new knowledge to the current literature. In addition, the literature indicated that whilst some countries in the South Asian region were drafting their national level data protection mechanisms in parallel with the GDPR, others failed to keep up. To understand the barriers countries faced in developing data protection mechanisms, the researcher conducted a thorough literature review, the outcome of which also contributes to the existing literature. It would be beneficial to have the necessary mechanisms in place to overcome those challenges and bring countries together to develop a unified data protection mechanism.

Future researchers undertaking research to find common factors that would bring all the countries together to develop a unified data protection mechanism would also benefit from these research findings. Identifying the barriers would help them, in their research, to address those barriers and support efforts to overcome them, helping to develop a lasting, sustainable unified data protection mechanism that fulfils the interests of all the countries.

Recommendations

The researcher recommends that the SAARC support the South Asian countries in taking the initiative to develop national and regional level data protection mechanisms. To make this happen, the SAARC should take the lead in raising awareness of the importance of protecting personal data and should provide appropriate guidance and support. The recommendation is to set up a project team, preferably facilitated by the SAARC, consisting of like-minded professionals with expertise in negotiating with and influencing those at the highest levels of government, the judiciary, and policy development. The team must have clear terms of reference and delegated authority to seek assurances from the governments that they will disclose and share their plans for legislation on data security and privacy protection. Realistically, this may not be easy in the absence of consensus amongst the nations of the region, and while they remain engrossed in uncompromising ideologies and their respective stances on statehood. These attitudes will have to be overcome if world order means anything, and it would be daunting to set a timescale for the outcome, as the process itself is likely to be an evolving one whose success depends on the good faith and willingness of the countries to play their part in the interest of the global community.

Dhungel, R. Cyber Security For National Security. 2019. https://risingnepaldaily.com/opinion/cyber-security-for-national-security . Accessed 12 Apr 2020.

Hollis, D. A brief primer on international law and cyberspace. Carnegie Endowment for International Peace [Online], p.3-4. 2021. Available at https://carnegieendowment.org/files/Hollis_Law_and_Cyberspace.pdf . Accessed: 13 July 2021.

European Union. Legislative acts. Official Journal of the European Union [Online], p.88. 2016. Available at https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679 . Accessed 15 Feb 2019.

Burgess, M. What is GDPR? The summary guide to GDPR compliance in the UK. 2020 https://www.wired.co.uk/article/what-is-gdpr-uk-eu-legislation-compliance-summary-fines-2018 . Accessed 20 May 2020.

Raghunath P. Human security in a datafying South Asia: approaching data protection. Int J Med Stud. 2019. https://doi.org/10.2139/ssrn.3438583 .


Greenleaf, G. Advances in South Asian data privacy laws: Sri Lanka, Pakistan and Nepal. Privacy Laws and Business International Report. 2019. https://ssrn.com/abstract=3549055 . Accessed 2 Jan 2020.

Greenleaf, G W. Asian Data Privacy Laws: trade and human rights perspectives. First edition. Google book. 2014. https://books.google.co.uk/books?id=3yfSBAAAQBAJ&pg=PA438&lpg=PA438&dq=data+privacy+and+security+policies+in+nepal&source=bl&ots=HJCoGnm1xK&sig=ACfU3U2iMecNAbiaFx4aGdgMXh2OxAVk0Q&hl=en&sa=X&ved=2ahUKEwjQrqKKzr7qAhXUnFwKHVq_DSk4ChDoATAFegQIChAB#v=onepage&q=data%20privacy%20and%20security%20policies%20in%20nepal&f=false . Accessed 15 Apr 2018.

De Soysa, S. The right to privacy and a data protection act: Need of the hour. 2017. http://www.ft.lk/article/606874/The-right-to-privacy-and-a-data-protection-act:-Need-of-the-hour . Accessed 23 Feb 2018.

UNCTAD. Data protection and privacy legislation worldwide [Online]. 2020. Available at https://unctad.org/page/data-protection-and-privacy-legislation-worldwide . Accessed: 15 June 2021.

Tovi MD, Muthama MN. Addressing the challenges of data protection in developing countries. Eur J Comput Sci Inf Technol 2013;1(2):5 [Online]. Available at: https://www.eajournals.org/wp-content/uploads/ADDRESSING-THE-CHALLENGES-OF-DATA-PROTECTION-IN-DEVELOPING-COUNTRIES.pdf . Accessed 18 Nov 2019.

Government of UK. National Cyber Security Strategy 2016-2021. Government of UK [Online], p.63-64. 2016. Available at https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/567242/national_cyber_security_strategy_2016.pdf . Accessed 5 Sept 2019.

Wolford B. What is GDPR, the EU’s new data protection law? 2022. https://gdpr.eu/what-is-gdpr/ . Accessed 22 Nov 2020.

ICO. The principles. 2022. https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/principles/ . Accessed 13 Mar 2018.

ICO. Guide to the General Data Protection Regulation (GDPR). 2018. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/711097/guide-to-the-general-data-protection-regulation-gdpr-1-0.pdf . Accessed: 14 Feb 2019.

GDPR.eu project. What are the GDPR Fines?. ND. https://gdpr.eu/fines/ . Accessed 15 Aug 2018.

Albrecht JP. How the GDPR will change the world. Eur Data Prot Law Rev. 2016;2:3. https://doi.org/10.21552/EDPL/2016/3/4 .

Government of UK. The Queen’s speech 2017. Prime minister’s office, London. 2017. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/620838/Queens_speech_2017_background_notes.pdf . Accessed 05 Jan 2020.

Greenleaf, G. Privacy in South Asian (SAARC) States: reasons for optimism, UNSW Law Research Paper No. 18–20. 2017. https://ssrn.com/abstract=3113158 . Accessed 15 March 2018.

OneTrust Technology. Pakistan: Revised draft Personal Data Protection Bill v. GDPR. 2019. https://www.dataguidance.com/opinion/pakistan-revised-draft-personal-data-protection-bill-v-gdpr . Accessed 3 July 2019.

Privacy International and the Digital Rights Foundation. State of privacy Pakistan. 2019. https://privacyinternational.org/state-privacy/1008/state-privacy-pakistan . Accessed 20 Feb 2020.

Rehman, S. Pakistan-data protection overview. 2020. https://www.dataguidance.com/notes/pakistan-data-protection-overview . Accessed 23 August 2020.

OneTrust Company News. What is the Pakistan data protection bill 2018?. 2018. https://www.onetrust.com/what-is-the-pakistan-data-protection-bill-2018/ . Accessed 4 November 2018.

IFEX. Pakistan’s new draft of data protection law contains ‘draconian and anti-democratic’ sections. 2020. https://ifex.org/pakistans-new-draft-of-data-protection-law-contains-draconian-and-anti-democratic-sections/ . Accessed 10 July 2020.

DLA Piper. Data protection laws of the world. 2020. https://www.dlapiperdataprotection.com/index.html?t=transfer&c=PK . Accessed 3 Jan 2021.

The global legal group. Pakistan: Data Protection Laws and Regulations 2020. 2020. https://iclg.com/practice-areas/data-protection-laws-and-regulations/pakistan . Accessed 13 Oct 2020.

Panakal, D. D. Pakistan’s data protection bill includes localization and registration provisions. 2020. https://www.natlawreview.com/article/pakistan-s-data-protection-bill-includes-localization-and-registration-provisions . Accessed 5 July 2020.

Government of Pakistan. Personal data protection bill. 2018. https://moitt.gov.pk/SiteImage/Downloads/Personal%20Data%20Protection%20Bill%20without%20track%20changes.pdf . Accessed 10 January 2019.

Intersoft Consulting. General data protection regulation. 2022. https://gdpr-info.eu/chapter-1/ . Accessed 22 March 2019.

Jyoti P. India's Supreme Court upholds right to privacy as a fundamental right—and it's about time. 2018. https://www.eff.org/deeplinks/2017/08/indias-supreme-court-upholds-right-privacy-fundamental-right-and-its-about-time#:~:text=The%20one%2Dpage%20order%20signed,Part%20III%20of%20the%20Constitution Accessed 10 Aug 2018

Talwar Thakore and Associates. Data Protected–India. 2020. https://www.linklaters.com/en/insights/data-protected/data-protected---india#:~:text=India%20is%20not%20a%20party,or%20the%20Data%20Protection%20Directive.&text=India%20has%20also%20not%20yet%20enacted%20specific%20legislation%20on%20data%20protection . Accessed 20 June 2020.

Deloitte. The Asia Pacific Privacy Guide. 2019. https://www2.deloitte.com/content/dam/Deloitte/in/Documents/risk/in-ra-Deloitte_AP-PrivacyGuide_Interactive-noexp.pdf . Accessed 12 Dec 2019.

Subramaniam A, Das S. The Privacy, Data Protection and Cybersecurity Law Review: India. 2020. https://thelawreviews.co.uk/edition/1001546/the-privacy-data-protection-and-cybersecurity-law-review-edition-7 . Accessed 14 Oct 2020.

Cloen, T. South Asia: The road ahead in 2020. 2020 https://www.atlanticcouncil.org/commentary/feature/south-asia-the-road-ahead-in-2020/#India . Accessed 12 May 2020.

Burman, A. Privacy and Promote Growth. 2020. https://carnegieindia.org/2020/03/09/will-india-s-proposed-data-protection-law-protect-privacy-and-promote-growth-pub-81217 Accessed 12 Aug 2020.

Deloitte. India draft personal data protection bill, 2018 and EU General Data Protection Regulation a comparative view. 2019. https://www2.deloitte.com/content/dam/Deloitte/in/Documents/risk/in-ra-india-draft-personal-data-protection-bill-noexp.pdf . Accessed 20 Sept 2020.

Walia H, Chakraborty S. India: Data protection laws and regulations 2020 . 2020. https://iclg.com/practice-areas/data-protection-laws-and-regulations/india . Accessed 10 November 2020.

Moniruzzaman M. Personal data protection in Bangladesh and GDPR. 2019. https://bdjls.org/personal-data-protection-in-bangladesh/ . Accessed 23 Sep 2020.

Hossain K, Alam K, Khan SU. Data privacy in Bangladesh a review of three key stakeholders perspectives. 2018. https://www.researchgate.net/publication/329275065_Data_Privacy_in_Bangladesh_A_Review_of_Three_Key_Stakeholders_Perspectives . Accessed 12 Dec 2018.

Doulah N. Bangladesh-data protection overview. 2020. https://www.dataguidance.com/notes/bangladesh-data-protection-overvie . Accessed 3 Oct 2020.

Molla, M. S. and Nahar, S. Need of Personal Data Protection Laws in Bangladesh: A legal Appraisal. ND. https://www.hg.org/legal-articles/need-of-personal-data-protection-laws-in-bangladesh-a-legal-appraisal-48450 . Accessed: 12 February 2020.

Mishbah ABMH. Bangladesh steps into the data protection regime. 2019. https://www.thedailystar.net/opinion/human-rights/news/bangladesh-steps-the-data-protection-regime-1726351 . Accessed 22 May 2018.

Goswami S. Bangladesh to propose a privacy law. 2021. https://www.bankinfosecurity.asia/bangladesh-to-propose-privacy-law-a-15898 . Accessed 3 Mar 2021.

Author unknown. Digital privacy: issues and challenges in Bhutan. 2015. https://kuenselonline.com/digital-privacy-issues-and-challenges-in-bhutan/ . Accessed 23 Jan 2018.

Pradhan K. Nepal. 2014. https://www.giswatch.org/en/country-report/communications-surveillance/nepal . Accessed 24 Feb 2018.

National Forum of Parliamentarians on Population and Development. Nepal's Constitution and Federalism Vision and Implementation. 2020. https://asiafoundation.org/wp-content/uploads/2020/10/Nepals-Constitution-and-Federalism_Vision-and-Implementation_English.pdf . Accessed 10 Aug 2020.

Pradhan D. Nepal-data protection overview. 2020. https://www.dataguidance.com/notes/nepal-data-protection-overview . Accessed 14 July 2020.

Neupane A, Karki S. Nepal: an introduction to the Individual Privacy Act 2018. 2019. https://www.dataguidance.com/opinion/nepal-introduction-individual-privacy-act-2018 . Accessed 13 Mar 2020.

Neupane Law Associates. Introduction to the Privacy Act 2018. 2019. https://www.neupanelegal.com/news-detail/introduction-to-the-privacy-act-2018.html . Accessed 10 June 2019.

Upreti, R A. Individual Privacy Act, 2018. 2018. http://www.pioneerlaw.com/news/individual-privacy-act-2018-2075 . Accessed 3 Jan 2019.

The World Bank. From transition to transformation: the role of the ICT Sector in Afghanistan. 2013. https://www.infodev.org/infodev-files/final_afghanistan_ict_role_web.pdf . Accessed 21 Jan 2018.

Kraemer T. Afghanistan-data protection overview. 2020. https://www.dataguidance.com/notes/afghanistan-data-protection-overview . Accessed 23 Dec 2020.

Gunawardana K. Current status of information technology and its issues in Sri Lanka. 2018. https://www.researchgate.net/publication/316383091_Current_Status_of_Information_Technology_And_Its_Issues_in_Sri_Lanka . Accessed 3 Mar 2018.

The morning. Data Protection Bill further delayed. 2020. http://www.themorning.lk/data-protection-bill-further-delayed/ . Accessed 24 Jan 2020.

Madugalla KK. Right to Privacy in Cyberspace: Comparative Perspectives from Sri Lanka and other Jurisdictions. 2016. http://repository.kln.ac.lk/bitstream/handle/123456789/15625/2829.pdf?sequence=1&isAllowed=y . Accessed 27 Apr 2018.

Berry L. Data protection law an E-business and E-government perception. 2017 https://silo.tips/download/data-protection-law-an-e-business-and-e-government-perception . Accessed: 2 Aug 2019.

Deloitte. Unity in Diversity; the Asia Pacific Privacy Guide. 2019. https://www2.deloitte.com/content/dam/Deloitte/in/Documents/risk/in-ra-Deloitte_AP-PrivacyGuide_Interactive-noexp.pdf . Accessed 5 Mar 2020.

Sirimane M. Sri Lanka: Proposed Bill on Personal Data Protection. 2020. https://www.dataguidance.com/opinion/sri-lanka-proposed-bill-personal-data-protection . Accessed: 13 Apr 2020.

Ikigai Law. Introduction to Digital Security Laws in Nepal, Sri Lanka, and Bangladesh. 2019. http://www.mdiit.gov.lk/images/Legal_framework_for_proposed_DP_Bill_11th_June_2019_-_revised_FINAL_ver3.pdf . Accessed 10 Jan 2020.

Ameen D. Maldives-data protection overview. 2020. https://www.dataguidance.com/notes/maldives-data-protection-overview . Accessed 5 Nov 2020.

Robinson HP. Final report of the maldivian penal law and sentencing codification project: text of draft code (volume 1) and official commentary (Volume 2). 2006. https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1289&context=faculty_scholarship . Accessed 23 Jan 2019.

Sun Media Group. Privacy and Data Protection Act under compilation. 2016. https://en.sun.mv/40808 . Accessed 13 Feb 2018.

Cleland J. A The qualitative orientation in medical education research. Korean medicine Education. 2017. https://doi.org/10.3946/kjme.2017.53 .

Dudovskiy, J. Data collection methods. 2022. https://research-methodology.net/research-methods/data-collection/#:~:text=Data%20collection%20is%20a%20process,primary%20methods%20of%20data%20collection . Accessed 14 Jan 2021.

Sobh R, Perry C. Research design and data analysis in realism research. Eur J Mark. 2006;40(11/12):1194 [Online]. Available at https://www.researchgate.net/publication/228953893_Research_design_and_data_analysis_in_realism_research . Accessed 10 Aug 2020.

Subedi, R. Cyber Security Situation in Nepal. ND. https://www.enepalese.com/2015/07/32099.html . Accessed 10 Oct 2020.

Beal D, Rueda-Sabater E, Santo TE. Comparing socioeconomic developmenacross nations. 2012. https://www.bcg.com/publications/2012/public-sector-globalization-comparing-socioeconomic-development . Accessed 12 Nov 2020.

Hargittai E. Weaving the western web explaining differences in internet connectivity among OECD countries. Telecommunications Policy. 1991. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.79.7637&rep=rep1&type=pdf . Accessed: 3 Mar 2020.

Liu J, Hebenton B, Jou S. Handbook of Asian Criminology. Google book. 2013. https://books.google.co.uk/books?id=5QFw0WHPJD8C&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false . Accessed 12 May 2020.

Makszimov V. Social and economic differences across EU less important for citizens–survey. 2020. https://www.euractiv.com/section/economy-jobs/news/social-and-economic-differences-across-eu-less-important-for-citizens-survey/ . Accessed 12 Feb 2021.

Avis W. Border disputes and micro-conflicts in south and southeast Asia. 2020 https://gsdrc.org/publications/border-disputes-and-micro-conflicts-in-south-and-southeast-asia/ . Accessed 13 Oct 2021.

Mendoza MA. Challenges and implications of cybersecurity legislation. 2017. https://www.welivesecurity.com/2017/03/13/challenges-implications-cybersecurity-legislation/ . Accessed 13 Oct 2020.

Statista. Internet penetration in Asia as of June 2020, by country or region. 2020. https://www.statista.com/statistics/281668/internet-penetration-in-southeast-asian-countries/ . Accessed 10 Sep 2021.

Yannakogeorgos PA. Strategies for resolving the cyber attribution challenge, Air University Press, Alabama. 2016. https://media.defense.gov/2017/May/11/2001745613/-1/-1/0/CPP_0001_YANNAKOGEORGOS_CYBER_TTRIBUTION_CHALLENGE.PDF . Accessed 13 Sep 2020.

Hakmeh J. Building a Stronger International Legal Framework on Cybercrime [Online]. 2017. https://www.chathamhouse.org/2017/06/building-stronger-international-legal-framework-cybercrime Accessed 12 Dec 2020.

Bevitt A, Retzer K, Łopatowska J. Dealing with data breaches in Europe and beyond. 2020. https://uk.practicallaw.thomsonreuters.com/6-505-9638?transitionType=Default&contextData=(sc.Default)&firstPage=true . Accessed 3 July 2020.

European Data Protection Supervisor. Personal Data Breach. 2021. https://edps.europa.eu/data-protection/our-role-supervisor/personal-data-breach_en . Accessed 22 Feb 2021.

Luo D, Wang Y. China-Data Protection Overview. 2020. https://www.dataguidance.com/notes/china-data-protection-overview . Accessed 17 Feb 2021.

Hounslow D. Japan-Data Protection overview. 2020. https://www.dataguidance.com/notes/japan-data-protection-overview . Accessed 11 Dec 2020.

Download references

Author information

Authors and affiliations

School of Technologies, Cardiff Metropolitan University, Llandaff Campus, Western Avenue, Cardiff, CF5 2YB, UK

Vibhushinie Bentotahewa, Chaminda Hewage & Jason Williams

Correspondence to Vibhushinie Bentotahewa.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Cyber Security and Privacy in Communication Networks” guest edited by Rajiv Misra, R K Shyamsunder, Alexiei Dingli, Natalie Denk, Omer Rana, Alexander Pfeiffer, Ashok Patel and Nishtha Kesswani.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Bentotahewa, V., Hewage, C. & Williams, J. The Normative Power of the GDPR: A Case Study of Data Protection Laws of South Asian Countries. SN COMPUT. SCI. 3, 183 (2022). https://doi.org/10.1007/s42979-022-01079-z

Received: 29 September 2021

Accepted: 24 February 2022

Published: 07 March 2022

  • General Data Protection Regulation (GDPR)
  • Data protection
  • Information communication technology

DKLM

Data Protection Breaches - Recent Cases

In a recent case, Plymouth Hospital NHS Trust was ordered to pay compensation to a patient after one of its employees unlawfully gained access to the man’s medical records. The nurse who accessed the data was the man’s partner at the time. The patient claimed that the breach of the Data Protection Act 1998 (DPA) and the way his subsequent complaint was handled had exacerbated a pre-existing paranoid personality disorder and prevented him from working. He was awarded damages of £12,500 for exacerbation of his pre-existing medical condition and £4,800 for loss of earnings.

In a second case, a former health worker at the Royal Liverpool University Hospital pleaded guilty to unlawfully obtaining patient information by accessing the medical records of five members of her ex-husband’s family so that she could obtain their new telephone numbers. The matter came to light when a man contacted the hospital after receiving nuisance calls which he suspected had been made by his former daughter-in-law. He had previously changed his phone number following unwanted calls from her and was immediately concerned that there had been a breach of patient confidentiality. Checks by the hospital revealed that none of the patients whose details had been compromised were at any time under the woman’s care, and she had no work-related reason to access their records. She had accessed the information for her own purposes without the consent of her employer. She was fined £500 for breach of the DPA and ordered to pay £1,000 towards prosecution costs and a £15 victim surcharge.

Meanwhile, the European Commission has announced proposals for significant reform of data protection legislation. The Information Commissioner’s initial response to the proposals can be found on the website of the Information Commissioner’s Office.


Recent trends in data breach litigation

When the Data Protection Act 2018 (DPA 2018) came into force, bringing the General Data Protection Regulation (GDPR) into English law, there was speculation that the floodgates were about to open for data breach claims from individuals against businesses misusing personal data.


Published 20 February 2023

Patrick Wheeler

Partner - Head of IP & Data Protection


Recent cases indicate that only data breaches with a serious impact on an individual will be worth litigating, and that care is needed both in the choice of court and the costs incurred.

The right to receive compensation for damage suffered as a result of a breach of the UK GDPR has its statutory basis in Article 82. Cases decided under the previous law – the Data Protection Act 1998 (DPA 1998) – need to be viewed with caution but many principles have carried through to the current law. This includes the issue of compensation for distress as well as material damage – a position confirmed and clarified in  Vidal-Hall v Google Inc  [2015] EWCA Civ 311. In that case the claim related to distress and anxiety alone. The court held that there was no requirement to prove financial loss in order for damages to be payable.

The effect of the decision in Vidal-Hall was to encourage an increase in the number of claims brought for damages for distress alone. Recently, however, the courts appear to have begun to take a more stringent approach both to liability and the assessment of damages in such claims, in order perhaps to discourage claims where there has been a technical breach, but the ‘damage’ (distress) is arguably not significant. The courts have also been keen to channel claims towards the county court and away from the High Court, and have criticised the level of costs incurred in this type of litigation.

In  Rolfe v Veale Wasbrough Vizards LLP [2021] EWHC 2809, the defendant had sent a letter regarding unpaid school fees to a mistakenly typed email address. The recipient quickly notified the sender of the error, and, on the defendant’s request, then confirmed that the email had been deleted. The claimants (parents and daughter) brought a claim for misuse of confidential information, breach of confidence, negligence and damages pursuant to Article 82 of the GDPR and section 169 of the DPA 2018. The defendants applied for summary judgment, arguing that any damage or distress to the claimants was de minimis and that therefore the claim had no prospect of success. Master McCloud granted the defendant’s application, saying ‘no person of ordinary fortitude would reasonably suffer the distress claimed arising in these incidents in the 21st century, in a case where a single breach was quickly remedied. … In the modern world it is not appropriate for a party to claim (especially in the High Court) for breaches of this sort which are, frankly, trivial’. The claimants were ordered to pay the defendant’s costs on the indemnity basis.

Similar conclusions were reached in  Johnson v Eastlight Community Homes  [2021] EWHC 3069, a judgment handed down very shortly after Rolfe. The breach related to the incorrect addressing of an email (subsequently confirmed to have been deleted) containing financial information. The damages claim was limited to £3,000 but was issued in the High Court, with the claimant’s budgeted costs amounting to over £50,000. Master Thornett refused the defendant’s application for strike-out but confirmed the application of the de minimis principle and ordered the case be transferred to the county court, commenting: ‘The presentation and processing of this case to-date in this forum has, I am satisfied, constituted a form of procedural abuse.’

Even where claims survive in the High Court, and the decision favours the claimant, actual damages awarded may be very low. In  Geoffrey Driver v Crown Prosecution Service  [2022] EWHC 2500 (KB), the claimant (a prominent local politician) sought damages not exceeding £2,000 (along with declaratory relief) after a member of staff at the CPS sent an email to a third party (who had apparently requested the information) regarding his involvement in a local government corruption scandal and criminal investigation. Applying the provisions of the DPA 2018 (the data processing having been held to be for law enforcement processes), Knowles J held that there had been a data breach but that it was at the ‘lowest end of the spectrum’ and accordingly awarded damages of £250.

The judicial direction of travel seems to discourage individual claimants seeking to bring low-level claims for breaches of data protection legislation – especially when brought in the High Court along with allied claims for breach of confidence or misuse of private information. While this may be welcome news for data controllers, limiting potential liability for minor or inadvertent breaches which are quickly remedied, it poses challenges for claimants who have suffered more significant distress as a result of a breach in bringing and managing a complaint in a cost-effective way.

Group litigation was seen as an opportunity for large numbers of people affected by a common data breach. However, this avenue was dealt a blow by the Supreme Court’s ruling in  Lloyd v Google  [2021] UKSC 50 that, under the relevant provisions of the DPA 1998, compensation was not available for pure ‘loss of control’ of personal data where damage (material or otherwise) could not be shown. It would therefore be difficult to pursue such claims as part of group litigation, as individual assessments of damages would be required. The courts seem likely to interpret the corresponding provisions of the DPA 2018 in the same way.

It might seem from these decisions and judicial guidance that there is no value in a claim for the consequences of a data breach. That is probably a misreading of the position, since recent cases seem to have been either factually or procedurally defective, but it will require claimants and their advisers to give careful thought to what kind and level of damage should be pleaded and the most appropriate forum for the dispute to be heard. Costs significantly in excess of the level of damages will also be most unlikely to be recoverable.


This article was first published in The Law Society Gazette in February 2023.

© 2024 Collyer Bristow LLP

Download Brochure

breach of data protection act case study

  • Find out about: Private Wealth services
  • Services for wealthy individuals and families
  • Employment law for employees
  • CB Clarity: pre-nuptial & post-nuptial agreements
  • Child Arrangements Orders
  • Cohabitation agreements
  • Family law online tool: consider your options
  • International Divorce
  • What are nuptial agreements?
  • What is the divorce process?
  • Deed of variation
  • International probate
  • Probate property sales
  • What is probate & how does it work?
  • Residential property & conveyancing
  • Personal disputes
  • Reputation & privacy disputes
  • IR35 investigations
  • Proving tax position
  • Tax enquiries & HMRC disputes
  • Tax-related penalties
  • Capacity disputes // Disputes around Lasting Powers of Attorney
  • Estate & Inheritance disputes
  • Trust disputes
  • Wealth management
  • Charity & philanthropy
  • Family offices
  • Personal Insolvency
  • Duties of trustees
  • Family trusts
  • International estate planning
  • Lasting power of attorney
  • Trust administration
  • Will trusts
  • What are Lifetime Gifts & what are the rules around making them?
  • What is a Lasting Power of Attorney?
  • Business wills & inheritance tax
  • Drafting a will
  • Updating your will
  • Wills for business owners
  • CB Entrust: will writing service
  • US/UK tax & estate planning
  • Citizenship
  • High Potential Individual Visa
  • Permanent Residence/ Indefinite Leave To Remain
  • Personal immigration
  • UK Ancestry visa
  • Resealing foreign grants of probate
  • Legal solutions for individuals and families

breach of data protection act case study

  • Private Wealth

Support for the day to day, support for the complex.

Tailored support for wealth individuals and families.

DOWNLOAD BROCHURE

breach of data protection act case study

CB Clarity: Pre and post nuptial agreement services

The purpose of a nuptial agreement is to agree a fair financial settlement between a couple in the event they get divorced/the civil partnership is dissolved, and it seeks to protect any pre- acquired assets such as inheritance, businesses or property and also seeks to deal with any future inheritance. Put simply, pre and post- nuptial agreements help to provide security, clarity and certainty in the future, for both parties.

breach of data protection act case study

CB Entrust: The personalised will writing service from Collyer Bristow

CB Entrust is not an off the shelf will such as those available on the high street, but a fixed price expert approach to writing one of the most significant documents of your life.

breach of data protection act case study

  • Find out about: Real estate services
  • Services supporting the growth, management and disposal of property
  • Commercial occupiers
  • Investors & property managers
  • Mixed-use developers
  • Private rented sector
  • Construction contracts & disputes
  • Breach of property contract
  • Commercial rent arrears recovery
  • Covenant enforcement
  • Dilapidation claims
  • Early neutral evaluation
  • Landlord & tenant disputes
  • Party wall disputes
  • Possession of land claims
  • Property litigation
  • Real estate insurance disputes
  • Right to light claims
  • Service charge disputes
  • Real estate finance
  • Legal solutions for Real Estate
  • CB Restore: Landlord support for tenancy breach & repossesion

breach of data protection act case study

  • Real Estate

Support across all your real estate requirements.

We work with property owners, investors, developers, funders and both landlords and tenants on the financing, growth, management and disposal of their property and portfolios.

Download brochure

breach of data protection act case study

  • For individuals and families
  • For businesses
  • Financial services
  • Manufacturing & logistics
  • International services
  • Business & Banking Litigation Network

CREATING A LEGACY FOR YOU AND YOUR FAMILY.

We recognise that as each family office is unique in terms of culture, structure and scope, it is crucial to work with a multi-disciplinary team of trusted advisers with the ability to provide you with pragmatic and truly bespoke advice. Our multi-disciplinary team has extensive experience advising clients on managing their investment portfolio whether this covers real estate, heritage property or private equity. When it comes to deploying capital, we take a commercial approach whilst striving to minimise the risk involved.

Family moving into a new house

Helping those in the financial sector respond rapidly to market changes.

Now more than ever, your business needs to work with a legal team with a deep understanding of the changing financial markets which can provide you with practical, incisive guidance. Our work ranges from corporate finance, lending, and restructuring, to funding, regulation, high-value banking litigation and complex claims against financial institutions.

stock-charts

ENSURING THE BEST OUTCOME FOR BOTH SIDES OF THE TABLE.

Whether you are an investor or dealing with a private equity fund, our team delivers tailored solutions with a commitment to ensuring the best possible outcome for both sides of the table.

meeting-room-colleagues

SUPPORTING THE ENGINE OF GLOBAL COMMERCE & INDUSTRY.

We have a long history of supporting the manufacturing and logistics sectors. Our clients come from such diverse sub-sectors as FMCG, automotive, medical devices, digital technology, food & drink, construction, beauty and fashion. Whether you source or process raw materials, manufacture parts or products, or transport and store the goods, we have the sector and legal expertise you need.

breach of data protection act case study

An international focus.

Across our various specialisms we work with clients and contacts based in countries all around the world. However as a firm we maintain a particular focus upon; have a cultural awareness and language expertise in; and have built considerable professional contacts in Italy, USA, Turkey, Channel Islands, and Switzerland.

airport-terminal

  • Lifetime gifting
  • Digitalisation

lifetime gifting

A lifetime gift is any gift that you make, without strings, during your lifetime. In the UK, there are strict rules around gifting to stop people from avoiding IHT by giving away their possessions as gifts before they die.

breach of data protection act case study

The digitalisation of our everyday lives

All of us have at least part of our lives online and in digital assets such as emails, social media profiles, cryptocurrency and online bank accounts, to name just a few. The law recognises that digital assets can be owned. However, there is no consistency between assets and you may find that some of your most valuable assets are mere licences to use a third-party provider’s service. This has significant consequences when attempting to access, manage and transfer digital assets after death.

breach of data protection act case study

Flexible working is the future of the workplace.

As well as the obvious benefits to both employers and employees of continuing to combine working from home with going into the office and more flexible hours, there are employment law and other legal implications that employers will need to consider. Each organisation’s requirements will be slightly different.

breach of data protection act case study

Supporting businesses in digital transformation.

The metaverse. Artificial intelligence. E-sports. Cryptocurrency. Traditional business models and industries have either been or are being disrupted by digital innovation, paving the way for new opportunities and changing “the way that things are done.”

breach of data protection act case study

  • Insights & news
  • Videos & Podcasts
  • Publications

UK USA: Crossing the Pond flyer

Listen to the latest in our UK/USA podcast series

We are Collyer Bristow - The law firm for those that value individuality , creativity and collaboration .

Legal & Pricing info Regulatory Information Accessibility Sitemap Pricing and service information

Listen to our latest podcast: US/UK Estate Planning – What you need to know (We’re the Brits in America) (Part 2)

Read our latest article: Trends in International Arbitration: A review of the 2023 casework statistics for the LCIA and ICC

Listen to our latest podcast: US/UK Estate Planning – What you need to know (We’re the Brits in America) (Part 1)

Listen to our latest podcast: BHS Judgment #2 – Trading Misfeasance

Listen to our latest podcast: BHS Judgment #1 – Wrongful trading

  • I'm looking for someone
  • I need information on a specific area
  • I want to contact you

If it's urgent, please call +44 20 7242 7363

I have an issue and need your help

Scroll to see our A-Z list of expertise

  • A About Collyer Bristow
  • B Banking & Financial Disputes
  • Business & Banking Litigation Network
  • Business Life cycle
  • C CB Counsel
  • Charity & philanthropy
  • Cohabitation Agreements
  • Commercial contracts
  • Commercial Litigation & Dispute Resolution
  • Commercial real estate
  • Company secretarial
  • Confidential Information
  • Construction
  • Contesting a will
  • Corporate Law
  • Corporate recovery, restructuring & insolvency
  • Corporate reputation management
  • D Data protection
  • Divorce FAQs
  • E Employment law for employees
  • Employment lawyers
  • Equality, Diversity & Inclusion
  • Equitable Claims to Property
  • F Family & Divorce
  • Family Trusts
  • Financial settlements
  • H Horse racing contracts
  • Housebuilders & developers
  • I Immigration
  • Immigration FAQs
  • Inheritance Act Claims
  • Intellectual property
  • Intellectual property disputes
  • International Commercial Disputes
  • International trusts, tax & estate planning
  • Investors & property managers
  • J Join Collyer Bristow
  • L Lasting Power of Attorney FAQs
  • Lifetime Gifts
  • Lifetime Giving FAQs
  • M Manufacturing & Logistics
  • Media & Privacy
  • Media, arts & culture
  • Mergers & Acquisitions
  • Mixed-use and housing developers
  • N Nuptial agreements FAQs
  • P Party Wall Disputes
  • Performers’ Rights
  • Possession of Land Claims
  • Postnuptial Agreements
  • Prenuptial Agreements
  • Private wealth disputes
  • Probate FAQs
  • R Real estate
  • Residential property
  • T Tax disputes & investigations
  • Trust & Estates Disputes
  • Trusts, tax & estate planning
  • U Unfair Dismissal Claims
  • US/UK Tax Advice
  • W Will Trust
  • Wills & succession planning

I'd like to just search for something

Get in touch

Get in touch using our form below.

First name *

Email address *

Phone number *

How did you hear about Collyer Bristow? How did you hear about Collyer Bristow? Search Engine (e.g. Google) Social media content Social media advert Email Newspaper/print article Word of mouth recommendation Professional referral Other

Please tick this box to indicate your consent to providing the above information to Collyer Bristow so that you can be contacted about relevant Collyer Bristow's services. As detailed in our privacy policy , you may withdraw this consent at any time by contacting [email protected] .

  • Our People About us
  • Individuals & Families
  • Insights, news & events
  • Open in maps Contact us
  • Message on WhatsApp Give us a call
  • Corporate recovery, restructuring & insolvency
  • Data protection
  • Employment law for employers
  • Manufacturing supply chain
  • CB Comply: DSAR & data breach response
  • Employer knowledge hub: Hybrid working
  • Banking & financial disputes
  • Commercial disputes
  • Construction disputes
  • Tax disputes & investigations
  • Business & Banking Litigation Network
  • Find out about Private Wealth
  • Charity & philanthropy
  • Family & Divorce
  • Immigration
  • International trusts, tax & estate planning
  • Trust & Estates Disputes
  • Trusts, tax & estate planning
  • Wills & succession planning
  • US/UK Tax & Estate Planning
  • Media, arts & culture
  • CB Clarity: pre-nuptial & post-nuptial agreements
  • Family law online tool: Consider your options
  • Knowledge Hub: Lifetime Gifting
  • Construction contracts
  • Investors & property managers
  • CB Restore: Landlord support for tenancy breach & repossesion
  • Art Work: CB support for the arts
  • Shorter Reads
  • View all insights
  • Videos and Podcasts

The ICO exists to empower you through information.

Personal data breach examples


To help you assess the severity of a breach, we have selected examples taken from various breaches reported to the ICO. These also include helpful advice about next steps to take or things to think about.

Case study 1: Failure to redact personal data

Reporting decision: notifying the ICO and data subjects.

What happened?

A data controller sent paperwork to a child’s birth parents without redacting the adoptive parents’ names and address. After discovering the breach, the data controller did not inform the adoptive parents.

Why was this a problem?

The breach presented a high risk to the adoptive parents’ safety. The birth parents visited the adoptive parents’ address and had to be removed by the police, and the adoptive parents and their children had to relocate.

What should have happened?

The controller should have notified the adoptive parents as soon as the breach was discovered. This would have allowed the adoptive parents to take steps to minimise the risk, for example by moving into alternative accommodation or putting additional safeguarding measures in place.

The incident also needed to be reported to the ICO, as there was likely to be a risk to individuals.

The controller should also investigate why the incident occurred and take steps to prevent a similar incident occurring in the future.

Case study 2: Emailing a file in error

Reporting decision: documenting the breach on the internal breach log only.

A debt insolvency agent emailed a vulnerable new client’s file in error to a colleague in a different department. The colleague who received the file immediately deleted the email and informed the sender of the error.

The file contained a list of the client’s outstanding debts, their contact details, basic financial history, information about their mental health and reasons for seeking support with their financial situation. The client was vulnerable due to their mental state.

What did the data controller do?

The sender and recipient work for the same organisation in similar roles, but in different departments. Both work to the same data security measures and have completed training on working with vulnerable people.

The recipient correctly deleted the email and informed the sender. As a result, it is very unlikely that there would be any risk of harm or detriment to the data subject, despite special category personal data being involved. Therefore, there is no legal obligation to report the breach to the ICO or inform the affected data subject.

The organisation documented the breach internally and provided guidance to staff about checking contact details when sending emails, to minimise the risk to their data subjects. If the email had been sent to a member of the public, the risk to the data subject would have been higher.

Case study 3: Working on an unencrypted laptop

Reporting decision: initially not reportable, but then reportable to both the ICO and data subjects.

An employee lost his briefcase, containing work on an unencrypted laptop and unredacted paper files relating to a sensitive court case – including information on criminal convictions and health information.

Initially, the employee told his manager that he believed the laptop was encrypted and the paper files were redacted. The manager reported the incident to the IT department, who remotely wiped the laptop.

At that point, the data controller did not report the breach to the ICO as they believed there was little or no risk to data subjects, though they did record the incident on their breach log.

After being informed by the IT department that the laptop was unencrypted, and after the employee discovered the paper files had not been redacted, the controller reported the breach to the ICO and informed the data subjects.

The paper files were unredacted and not secured, so somebody could have accessed sensitive data. As the laptop was unencrypted, there was no way for the controller to know whether the data had been accessed. Therefore, they could not be certain that a risk to the data subjects would not occur.

They updated the internal breach log to reflect the new information and documented the developing situation, including the way the breach changed from being not reportable to reportable. On discovering the possibility of a risk to data subjects, the controller correctly reported the breach to the ICO and informed the data subjects.

The controller was then able to use their internal breach log to explain the delay in reporting the breach to the ICO, outside the required 72 hours.

Case study 4: Sending medication to the wrong patient

A courier, delivering medication for a Scottish pharmacy, delivered one set of medication to the wrong patient (Patient A).

Patient A called the pharmacy to complain. The pharmacist then realised the prescription was for a different patient with a similar name (Patient B). After the pharmacist contacted the courier, the unopened medication was collected and delivered to Patient B.

Patient A and Patient B both complained to the pharmacist. Patient B felt their medical information and address had been shared inappropriately with Patient A.

The pharmacist decided that any risk to Patient B was unlikely, due to the actions of Patient A, the pharmacy and the courier. However, they decided to report the breach to the ICO in case Patient B subsequently complained to the ICO about how their personal data had been handled.

Did the data controller need to report the breach?

As the pharmacy had concluded it was unlikely there was a risk to Patient B, the breach did not need to be reported to the ICO.

There would be no further action for the pharmacy to take, assuming they had documented the details of the breach, their decision not to report and any safeguards put in place to prevent a recurrence. The threshold for informing data subjects is higher than for informing the ICO. Therefore, the pharmacy didn’t need to tell data subjects about the breach either. Informing individuals about minor breaches that are unlikely to cause risk or harm can cause unnecessary worry to data subjects and can also result in data subjects becoming fatigued if informed of numerous breaches.

The pharmacist should have had confidence in their decision making and taken responsibility for it. If, having received a complaint from the data subject, the ICO wanted to know why the pharmacy had not reported the breach, they would be able to refer to the rationale recorded on the internal breach log.

Note that if the pharmacy had been in England, it would have reported the incident via the Data Security and Protection Incident Reporting tool, regardless of the threshold for reporting to the ICO.

Case study 5: A phishing attack

A law firm employee failed to recognise a phishing attack. They received an email, clicked a link to download a document, then inadvertently entered login credentials into what they believed was a legitimate website.

A while later, the employee contacted the company’s IT department as they noticed they were no longer receiving emails.

The data controller discovered the employee’s email account had been compromised when they entered their login details. A forwarding rule had also been set up, diverting the employee’s emails to a third party.

Additionally, the third party had responded to several emails using a spoofed email account, advising the recipients of a change in bank details. This resulted in two clients making significant payments to the third party.

The controller also discovered that the compromised email account contained scanned copies of client ID documents.

The controller reported the breach to the ICO and notified affected clients about the breach.

The controller identified a high risk to affected clients’ rights and freedoms, partly due to the financial detriment that two clients experienced after making payments to the third party. It is also likely that other clients will have received emails asking for payments.

Also, the controller identified that there was a high risk of identity theft or fraud, due to scanned copies of ID documents being held on the compromised account.

Data Protection Act Punishment

Principles, GDPR and failure to comply.

The UK’s Data Protection Act 2018, which incorporates the European Union’s General Data Protection Regulation (GDPR), has been a major step forward for both the rights of individuals and the obligations of organisations handling personal data. What is the punishment for breaking the Data Protection Act? Read on to find out.


Updating the original DPA from twenty years prior, this modernised piece of legislation sought to elevate data protection standards, ensuring rights and enforcing obligations at a pivotal point in time when personal information has become a greatly sought-after commodity for both legitimate businesses and international criminal enterprises.

Like its preceding legislation, both the DPA 2018 and the GDPR allow substantial fines to be levied against offending organisations for contraventions. Depending on the severity of the breach in question, fines are applied in a tiered approach, with authorities considering the circumstances that led to the breach, the response to the incident and the overall impact.

Given the publicity surrounding these substantial fines, organisations are justified in asking: what is the punishment for breaking the Data Protection Act, and what exactly does the DPA mean for my business?

What is Data Protection?

The DPA governs how your personal information is used by businesses and other organisations. It ensures that personal data is used fairly and lawfully, is accurate and up to date, is used only for explicitly stated purposes, and is handled in an appropriately secure manner. By incorporating the Europe-wide GDPR, the DPA provides a greater level of protection for the digital and data rights of citizens.

The requirements of the GDPR apply not only to EU-based organisations, but also to any organisation that collects information within EU territories. For example, although Facebook is an American company, the data it collects on European citizens must be managed as carefully as data held by any German company.

Personal Data

For the purposes of the GDPR, personal data is defined as: “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”

Breaking the Data Protection Act – Case Study

Doorstep Dispensaree Ltd Fined £275,000

In late 2019, the Information Commissioner’s Office announced a fine levied against a London-based pharmacy. The penalty came as a result of the pharmacy’s failure to ensure the security of special category data, which was kept in unlocked containers at the back of its premises.

Around 500,000 documents containing medical and other sensitive information were found unprotected, not only from prying eyes but also from the elements, with many files discovered to be rain-damaged. This negligence resulted in the pharmacy being issued a £275,000 fine and being ordered to improve its data protection processes within three months or face further consequences.


In the event of a breach.

As well as asking what the punishment for breaking the Data Protection Act is, it’s worth understanding a little about what constitutes a breach. A ‘data breach’ doesn’t just refer to a ‘stolen data’ incident; it legally encompasses a variety of incidents, being defined as “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data.”

For example, the loss of a strongly encrypted USB drive containing personal data may not necessarily qualify as a breach, whereas the loss of that same drive without any encryption would qualify as an incident in violation of the DPA.

Should your organisation be unfortunate enough to fall victim to an information security attack that affects the personal information with which you work, there are specific legal responsibilities with which you must comply.

  • In the event of a personal data breach, an organisation is required to report the breach to the relevant supervisory authority within 72 hours of becoming aware of it (where feasible).
  • It is also important to bear in mind that if the breach is likely to result in adverse effects upon the rights and freedoms of data subjects, then those individuals must be informed “without undue delay.”
  • You must also ensure you have “robust breach detection, investigation and internal reporting procedures in place.”
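
The decision logic in the obligations above can be sketched as a simple triage helper. This is an illustrative sketch only, not an official ICO tool: the function name, parameters and risk flags are hypothetical, and a real assessment is a judgment call made case by case.

```python
# Hypothetical triage sketch of the breach-notification obligations above.
# The boolean risk flags are simplifications of what is, in practice,
# a case-by-case risk assessment.

def breach_actions(risk_to_individuals: bool, high_risk: bool) -> list[str]:
    """Return the minimum actions required for a personal data breach."""
    # Every breach must be documented internally, even if never reported.
    actions = ["document the breach on the internal breach log"]
    if risk_to_individuals:
        # Likely risk to individuals: notify the supervisory authority.
        actions.append("report to the supervisory authority within 72 hours")
    if high_risk:
        # Higher threshold: likely high risk means data subjects must be told.
        actions.append("inform affected data subjects without undue delay")
    return actions
```

Note that the threshold for informing data subjects is higher than that for informing the supervisory authority, which is why the two conditions are separate.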

An unreported incident may have a range of effects on individuals, including discrimination, damage to reputation, financial loss, and social or economic disadvantage.

Any incident must also be assessed on a case-by-case basis, and organisations are required to give reasonable justification for their decision to report (or not report) a breach to the supervisory authority, which in the UK is the Information Commissioner.

Organisations are also required to keep a record of any personal data breaches, regardless of whether they are required to, or decide to, notify the authorities.

The Information Commissioner has the power to issue fines for infringing on data protection law, including the failure to report a breach. The specific failure to notify can result in a fine of up to 10 million euros or 2% of an organisation’s global turnover, referred to as the ‘standard maximum’.

The most serious data protection violations can result in a maximum fine of 20 million euros (or the equivalent in sterling) or 4% of the total annual worldwide turnover in the preceding financial year, whichever is higher.

Organisations wishing to avoid these fines should also be aware that this ‘higher maximum’ amount can apply to failure to comply with “any of the data protection principles, any rights an individual may have or in relation to any transfers of data to third countries.” Security controls such as security awareness training and phishing simulation play a part in this as well.
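Because both tiers are “whichever is higher” caps, the maximum exposure depends on turnover. A minimal sketch of that arithmetic (the function name and structure are illustrative, not drawn from any official source):

```python
# Sketch of the GDPR's two fine tiers described above.
# Each tier is a "whichever is higher" cap: a fixed euro amount
# or a percentage of global annual turnover.

STANDARD_MAX_EUR = 10_000_000  # or 2% of worldwide annual turnover
HIGHER_MAX_EUR = 20_000_000    # or 4% of worldwide annual turnover

def max_fine(annual_turnover_eur: float, higher_tier: bool) -> float:
    """Return the maximum possible fine (EUR) for the given tier."""
    if higher_tier:
        return max(HIGHER_MAX_EUR, 0.04 * annual_turnover_eur)
    return max(STANDARD_MAX_EUR, 0.02 * annual_turnover_eur)

# A firm with EUR 500m turnover: standard cap EUR 10m, higher cap EUR 20m.
# A firm with EUR 50bn turnover: higher cap is 4%, i.e. EUR 2bn.
```

For small organisations the fixed amount dominates; for large multinationals the turnover percentage does, which is why headline figures for companies like Facebook run into the billions.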

Facebook/Cambridge Analytica Scandal

The data protection violation, which occurred in 2015, resulted in the maximum possible fine of £500,000. The incident became public in early 2018; had Facebook’s improper protection of user data occurred after the introduction of the GDPR, the fine levied by the ICO could have been 4% of Facebook’s 2018 global revenue: around £1.7 billion.

Deputy Commissioner of the ICO James Dipple-Johnstone noted regarding the fine: “Protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals… we expect that Facebook will be able to move forward and learn from the events of this case”.

The Principles of the DPA

How Has DPA Changed?

Under the UK’s Data Protection Act 1998, eight data protection principles sat at the centre of the legislation. By 2018, these principles had been developed and advanced by the European Union’s GDPR and made part of UK law within the DPA 2018.

With a great deal of crossover between the DPA 1998 and 2018, many of the now seven principles of data protection are only slight augmentations of the previous law, having been incorporated and developed by the GDPR.

So, what is the punishment for breaking the Data Protection Act? All organisations operating under either the DPA 2018 or the GDPR should familiarise themselves with the information provided directly by the Information Commissioner’s Office regarding fines and penalties; on this question, the ICO is an invaluable resource.



Case Studies

The following is a list of case studies that have not been featured in the DPC's Annual Reports. These case studies provide an insight into some of the issues that this Office investigates on a day-to-day basis.

  • Inaccurate Information held on a banking system
  • Failure to respond fully to an access request
  • Use of CCTV in the workplace
  • Access to CCTV footage
  • Obligation to give reasons when refusing to provide access to personal data
  • Processing of Special Category Data
  • Further processing for a compatible purpose
  • Appropriate security measures when processing medical data
  • Appropriate security measures
  • Processing that is necessary for the purpose of legitimate interests pursued by a controller
  • Processing that is necessary for the purpose of performance of a contract
  • Confidential expressions of opinion and subject access requests
  • Processing of health data
  • Access requests and legally privileged material
  • Processing in the context of a workplace investigation
  • Amicable resolution - proof of identification and data minimisation
  • Amicable resolution - Right to erasure and user generated content
  • Amicable resolution in cross-border complaint - right to erasure
  • Amicable resolution - right to erasure
  • Disclosure and unauthorised publication of a photograph
  • Legal basis for processing and security of processing
  • Erasure request and reliance on Consumer Protection Code
  • Debt collector involvement 
  • Appropriate security measures for emailed health data
  • Access to employee's email on a corporate email service
  • Disclosure by a credit union of a member's personal data to a private investigations firm
  • Data accuracy 
  • Retention of data by a bank relating to a withdrawn loan application
  • Access to information relating to a bank's credit assessment
  • Use of employee's swipe-card data for disciplinary purposes
  • Disclosure of a journalist's name and mobile phone number by a public figure
  • Fair and lawful processing of CCTV images of a customer
  • Disclosure of personal and financial data to a third party and erasure request
  • Unlawful processing and disclosure of special category data
  • Unlawful processing and erasure request
  • Disclosure, withdrawing consent for processing and subject access request
  • Unlawful processing of special category data
  • Disclosure of personal data
  • Fair processing of personal data
  • Unlawful processing of photograph and erasure request under Article 17 of GDPR
  • Technical and organisational measures
  • Access request following account suspension
  • Right to be forgotten - removal of online news article and photograph
  • Unlawful processing of personal data by a waste management company
  • Request for erasure of biometric data from employer database
  • Complaint of excessive personal data requested by a letting agent
  • Excessive CCTV cameras in the workplace complaint
  • Rectification request regarding inaccurate information in a Section 20 report
  • Hospital refuses erasure request of special category data

1) Case Study 1: Inaccurate Information held on a banking system

The complainant in this instance held a mortgage over a property with another individual. The complainant and the other individual left the original property and each moved to separate addresses. Despite being aware of this, the complainant’s bank sent correspondence relating to the complainant’s mortgage to the complainant’s old address, where it was opened by the tenants in situ.

In response, the complainant’s bank noted that its mortgage system was built on the premise that there would be one correspondence address and, in situations where joint parties to the mortgage no longer had an agreed single correspondence address, this had to be managed manually outside the system, which sometimes led to errors.

It was apparent that the data controller for the purposes of the complaint was the complainant’s bank, as it controlled the complainant’s personal data for the purposes of managing the complainant’s mortgage. The data in question consisted of (amongst other things) financial information relating to the complainant’s mortgage with the data controller. The data was personal data because it related to the complainant as an individual and the complainant could be identified from it.

Data Protection legislation, including the GDPR, sets out clear principles that data controllers must comply with when processing a person’s personal data. Of particular relevance to this complaint were the obligation to ensure that the data is accurate and kept up to date where necessary, and the obligation to have appropriate security measures in place to safeguard personal data.

In applying these principles to the facts of this complaint, by maintaining an out-of-date address for the complainant and sending correspondence for the complainant to that address, the data controller failed to keep the complainant’s personal data up to date (Article 5(1)(d)). In addition, given the multiple pieces of correspondence that were sent to the wrong address, the data controller’s security measures failed to appropriately safeguard the complainant’s data (Article 5(1)(f)). The obligation to implement appropriate security measures under Article 5(1)(f) is to be interpreted in accordance with Article 32 of the GDPR, which sets out considerations that must be taken into account by a data controller when determining whether appropriate security measures are in place.

2) Case Study 2: Failure to respond fully to an access request

This complaint concerned an access request made by the complainant. The complainant was dissatisfied that his request for access to a copy of any information kept about the complainant by the data controller in electronic and in manual form was refused by the data controller, a County Council. The data controller instead advised the complainant that the requested files were available online or for viewing at the data controller’s premises.

During the course of the investigation of this complaint, the complainant alleged that the files made available to the complainant by the data controller at its premises did not constitute all the personal data concerning the complainant that was held by the data controller.

However, the data controller was of the view that the access request made by the complainant was limited to personal data held in relation to two planning applications due to the reference numbers for the planning applications being quoted by the complainant on the complainant’s access request. Accordingly, the data controller sought to distinguish between personal data relating to the publicly available planning files, which were supplied to the complainant at a public viewing, and personal data created following the refusal of the complainant’s planning application, which the data controller considered to be outside the scope of the access request.

While the complainant mentioned two specific planning applications, the access request was expressed in general terms and sought access to “any information you keep about me electronically or in manual form”. Accordingly, it was considered that the personal data sought by the complainant included all data that arose in the context of the complainant’s engagement with the data controller prior to submitting the two identified planning applications and all data that arose after those applications were refused.

The data controller, due to the specific circumstances of the case, contravened its data protection obligations when it failed to supply the complainant with a complete copy of the complainant’s personal data in response to the access request within the statutory period. Under GDPR, Article 15 relates to the right of access by the data subject to personal data relating to them that the controller holds. Article 12(3) sets out the condition under which a controller must provide said personal data. There is an onus on a controller to provide information on the action taken under such a request without undue delay and in any event within one month of receipt of the request. There are also conditions set out in this article that provide for this timeframe to be extended.

3) Case Study 3: Use of CCTV in the workplace

We received a complaint that concerned the use of CCTV cameras by the data controller in the complainant’s work premises, and the viewing of that CCTV footage (which contained personal data of the complainant, consisting of, among other things, images of the complainant) for the purpose of monitoring the complainant’s performance in the course of his employment with the data controller.

At the time of the complaint, the data controller had a CCTV policy in place, which stated that the reason for the CCTV system was for security and safety. This was also stated on signage in place in areas where the CCTV cameras were in operation. The facts indicated that the purposes for which the complainant’s personal data was initially collected were security and safety. However, during a meeting with the complainant, a manager informed the complainant that CCTV footage containing the complainant’s personal data had been reviewed solely for the purposes of monitoring the complainant’s performance in the course of the complainant’s employment with the data controller. This purpose was not one of the specified purposes of processing set out in the CCTV policy and signage. The controller acknowledged that the use of the complainant’s personal data in this way was a contravention of its policies.

Where personal data is processed for a purpose that is different from the one for which it was collected, the purposes underlying such further processing must not be incompatible with the original purposes. In relation to the use of the complainant’s personal data, the purpose of monitoring their performance was separate and distinct from the original purposes of security and safety for which the CCTV footage was collected.  On that basis, the processing of the complainant’s personal data contained in the CCTV footage for the purpose of monitoring performance was further processing for a purpose that was incompatible with the original purposes of its collection.

A further issue arose regarding the security around the manner in which the CCTV system and CCTV logs were accessed. In written responses to the DPC, the controller stated that, at the time of the complaint, access to CCTV footage was available on a standalone PC in the department, which did not require log-in information.  The responses from the controller indicated that access to CCTV footage was not logged either manually or automatically. The absence of an access log for the CCTV footage was a deficiency in data security generally. Data controllers must implement appropriate security and organisational measures, in line with Article 32 of the GDPR, in relation to conditions around access to personal data.

The CCTV policy has since been substantially revised and replaced by a new policy. The controller confirmed that the PC utilised has now been deactivated and removed. Access to CCTV recordings is now limited to a single individual in the specific unit and recordings are reviewed only in the event of a security incident or accident.

Of particular relevance in this type of situation are the obligations to process personal data fairly (Article 5(1)(a)), and to obtain such data for specific purposes and not further process it in a manner that is incompatible with those purposes (Article 5(1)(b)). Further, appropriate security measures should be in place to ensure the security of the personal data (Article 5(1)(f) and Article 32).

4) Case Study 4: Access to CCTV footage

This complaint concerned an alleged incomplete response to a subject access request for CCTV footage made by the complainant to an educational institution. The complainant advised that they were the victim of an alleged attempted assault. The complainant requested access to CCTV footage from the time the alleged assault happened, in particular in relation to a specific identified time period from two different camera angles.

In response to the request by the organisation, a select number of stills from the CCTV footage relating to one camera were provided to the complainant. The complainant requested to be provided with a still for every second of the recording in which the complainant’s image appeared. The response received from the educational institution was that all “significant” footage, in the opinion of the controller, had been provided and, as the CCTV cameras were on a 30-day recording cycle, the footage had since been recorded over. The controller clarified that it did not store any footage unless there was a “lawful requirement” to do so.

The DPC noted that, when a valid access request is made to a data controller, the request must be complied with by the data controller within a certain period (under Article 12(3) of the GDPR, this is generally set at one month). The right of access to personal data is one of the key fundamental rights provided for in data protection legislation. In the context of access requests to CCTV footage, the data controller’s obligation to provide a copy of the requester’s personal data usually requires providing a copy of the CCTV footage in video format. Where this is not possible, such as where the footage is technically incapable of being copied to another device, or in other exceptional circumstances, it may be acceptable to provide a data subject with stills as an alternative to video footage. However, in such circumstances where stills are provided, the data controller should provide the data subject with a still for every second of the recording in which the data subject’s image appears and an explanation of why the footage cannot be provided in video format. The controller should also preserve all footage relating to the period specified until such time as the requester confirms that they are satisfied with the response provided.

As the data controller had not provided the complainant with either the CCTV footage requested or a complete set of the stills relating to the specified period, the data controller failed to comply with its obligations in relation to the right of access, both from a time perspective (Article 12(3)) and regarding the provision of a full and complete set of personal data processed by the controller (Article 15).  

5) Case Study 5: Obligation to give reasons when refusing to provide access to personal data

This complainant previously owned a property in a development managed by a management company. The complainant made a data access request to the management company but was of the view that the data controller failed to provide all of the complainant’s personal data in its response.

The management company was determined to be the data controller, as it controlled the contents and use of the complainant’s personal data for the purposes of its role as a management company in respect of a development in which the complainant had owned a property. The data in question consisted of (amongst other things) the complainant’s name and address. The data was personal data as the complainant could be identified from it and it related to the complainant as an individual.

During the course of the DPC’s examination of the complaint, the data controller provided a description of a document containing the complainant’s personal data that was being withheld on the basis that it was legally privileged. This document had not been referred to in the data controller’s response to the complainant’s access request. It was noted that the data controller should have referred to this document and the reason(s) for which it was refusing to provide the document to the complainant in its response to the complainant’s access request.

The DPC also considered whether the data controller had supplied the complainant with all of their personal data, as required by legislation. The DPC noted that the complainant had provided specific and detailed descriptions of data they believed had not been provided. In response, the data controller stated that it did not retain data relating to matters that it considered to be closed and had provided the complainant with all of their personal data held by the data controller at the date of the access request. The DPC was of the view that it was credible that the data controller would not retain personal data on an indefinite basis. The DPC was satisfied that the data controller had provided the complainant with all of their personal data (with the exception of the document over which the data controller had asserted legal privilege, as set out above). For that reason, no further contravention of the legislation had occurred.

Under Article 15 of the GDPR, a data subject has a right to obtain from a data controller access to personal data concerning him or her which are being processed. However, this right does not apply to personal data processed for the purpose of seeking, receiving or giving legal advice, or to personal data in respect of which a claim of privilege could be made for the purpose of or in the course of legal proceedings (Section 60(3)(a)(iv) of the Data Protection Act 2018). Where a data controller refuses to comply with a request for access to personal data, however, it is required under Article 12 of the GDPR to inform the data subject without delay of the reasons for this refusal.

6) Case Study 6: Processing of Special Category Data

This complaint concerned the processing of the complainant’s personal data (in this case, details about the nature of the complainant’s medical condition) by his employer, for the purpose of administering the complainant’s sick leave and related payments. In particular, the complainant raised concerns regarding the sharing of his medical records by the data controller (the employer), including with staff at the local office of the data controller where the complainant worked. The complainant highlighted his concerns to a senior official in the organisation. However, the view of the senior official was that the minimum amount of information necessary had been shared.

When a person’s personal data is being processed by a data controller, there are certain legal requirements that the data controller must meet. Of particular relevance to this complaint are the obligations (1) to process personal data fairly; (2) to obtain such data for specific purposes and to not further process it in a manner that is incompatible with those purposes; (3) that the data be relevant and adequate and the data controller not process more of it than is necessary to achieve the purpose for which it was collected; and (4) to maintain appropriate security of the personal data. As well as the rules that apply when personal data is being processed, because the personal data in this case concerned medical information, (which is afforded even more protection under data protection legislation), there were additional requirements that had to be met by the data controller.

It was considered that the initial purpose of the processing of this personal data by the data controller was the administration of a statutory illness payment scheme. The DPC also found that the further processing of the complainant’s personal data for the purpose of managing employees with work-related stress or long-term sick leave and the monitoring of sick pay levels was not incompatible with the purpose for which the data was initially collected. Moreover, the DPC concluded that processing for the purpose of managing work-related stress and long-term sick leave and monitoring sick pay was necessary for the performance of a contract to which the data subject was a party, for compliance with a legal obligation to which the controller was subject, and for the purpose of exercising or performing a right or obligation which is conferred or imposed by law on the data controller in connection with employment.

It was, however, considered that the data processed by the local HR office (i.e. the specific nature of the complainant’s medical illness) was excessive for the purpose of managing long-term sick leave and work-related stress leave and for monitoring sick-pay levels. Moreover, the DPC concluded that, on the basis that excessive personal data was disclosed by the shared services provider to the local HR office and further within that office, the level of security around the complainant’s personal data was not appropriate. Finally, it was considered that, in these circumstances, the data controller did not process the complainant’s personal data fairly. Therefore, the data controller was found to have contravened its data protection obligations.

Under the GDPR, special category personal data (such as health data) must be processed fairly in line with Article 5(1)(a).  It must be collected for a specified, explicit and legitimate purpose and not further processed in a manner incompatible with those purposes in line with Article 5(1)(b). It may be processed only in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing, in line with Article 5(1)(f). When processing special category data, controllers need to be conscious of the additional requirements set out in Article 9 of the GDPR.

7) Case Study 7: Further processing for a compatible purpose

The complainant was a solicitor who engaged another solicitor to represent them in legal proceedings. The relationship between the complainant and the solicitor engaged by the complainant broke down and the solicitor raised a grievance about the complainant’s behaviour to the Law Society. In this context, the solicitor provided certain information about the complainant to the Law Society. The complainant referred the matter to the DPC, alleging that the solicitor had contravened data protection legislation.

It was established that the complainant’s solicitor was the data controller, as it controlled the contents and use of the complainant’s personal data for the purpose of providing legal services to the complainant. The data in question consisted of (amongst other things) information relating to the complainant’s legal proceedings and was personal data because the complainant could be identified from it and it related to the complainant as an individual.

The DPC noted the Law Society’s jurisdiction to handle grievances relating to the misconduct of solicitors (by virtue of the Solicitors Acts 1954-2015). It also accepted that the type of misconduct that the Law Society may investigate includes any conduct that might damage the reputation of the profession. The DPC also noted that the Law Society accepts jurisdiction to investigate complaints made by solicitors about other solicitors (and not just complaints made by or on behalf of clients) and its code of conduct requires that, if a solicitor believes another solicitor is engaged in misconduct, it should be reported to the Law Society. The DPC therefore considered that the complaint made by the data controller to the Law Society was properly made and that it was for the Law Society to adjudicate on the merit of the complaint.

The DPC then considered whether the data controller had committed a breach of data protection legislation. In this regard, the DPC noted that data controllers must comply with certain legal principles that are set out in the relevant legislation. Of particular relevance to this complaint was the requirement that data must be obtained for specified purposes and not further processed in a manner that is incompatible with those purposes. The DPC established that the reason the complainant’s personal data was initially collected/processed was for the purpose of providing the complainant with legal services. The DPC pointed out that when the data controller made a complaint to the Law Society, it conducted further processing of the complainant’s personal data. As the further processing was for a purpose that was different to the purpose for which it was collected, the DPC had to consider whether the purpose underlying the further processing was incompatible with the original purpose.  

The DPC confirmed that a different purpose is not necessarily an incompatible purpose and that incompatibility should always be assessed on a case-by-case basis. In this case, the DPC held that, because there is a public interest in ensuring the proper regulation of the legal profession, the purpose for which the complainant’s data was further processed was not incompatible with the purpose for which it was originally collected. On this basis, the data controller had acted in accordance with data protection legislation.

The DPC then noted that, in addition to other legal requirements, a data controller must have a lawful basis for processing personal data. The lawful basis that the data controller sought to rely on in this case was that the processing was necessary for the purposes of the legitimate interests pursued by the data controller. In this regard, the DPC held that the data controller had a legitimate interest in disclosing to the Law Society any behaviour that could bring the reputation of the legal profession into disrepute. Further, the data controller was required by the Law Society’s Code of Conduct to report serious misconduct to the Law Society. As a result, the DPC was of the view that the data controller had a valid legal basis for disclosing the complainant’s personal data and had not contravened the legislation.

Under Article 6 of the GDPR, a data controller must have a valid legal basis for processing personal data. One such legal basis, in Article 6(1)(f) of the GDPR, provides that processing is lawful if and to the extent that it is necessary for the purpose of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights or freedoms of the data subject. However, Article 6(4) of the GDPR provides that where processing of personal data is carried out for a purpose other than that for which the data were initially collected, this is only permitted where that further processing is compatible with the purposes for which the personal data were initially collected.

In considering whether processing for another purpose is compatible with the purpose for which the personal data were initially collected, data controllers should take into account (i) any link between the purposes for which the data were collected and the purposes of the intended further processing, (ii) the context in which the data were collected, (iii) the nature of the personal data, (iv) the possible consequences of the intended further processing for data subjects, and (v) the existence of appropriate safeguards.

8) Case Study 8: Appropriate security measures when processing medical data

The background to this complaint was that the complainant’s wife made a Freedom of Information (“FOI”) request to a GP who had been involved in the care of the complainant’s son. The GP subsequently wrote to another doctor who had also treated the complainant’s son, and had separately also treated the complainant, to inform them of the FOI request. That doctor replied to the GP’s letter and, in the reply, disclosed medical information concerning the complainant, who was not a patient of the GP.

In order to determine who the data controller was, the DPC sought confirmation of the capacity in which the complainant had consulted the doctor who disclosed the information in question. It was confirmed that the doctor only saw patients publicly and, on this basis, the DPC determined that the data controller was the HSE.

In response to the complaint, the data controller admitted that the personal data regarding the complainant was disclosed in error because the doctor mistakenly believed the complainant was also a patient of the GP. However, the HSE advised that the GP recipient would have been bound by confidentiality obligations in respect of the data received. The data controller also indicated that, because the doctor in question had retired, the issue could not be addressed with them personally. The HSE confirmed that its internal policies regarding data processing had been updated and improved since the incident involving the complainant.

The DPC noted that, when personal data is being processed by a data controller, there are certain legal requirements that the data controller must meet. Of particular relevance to this complaint were the obligations to process the personal data fairly and to have appropriate security measures in place to protect against unauthorised processing (disclosure). The DPC further noted that, because the personal data was of a medical nature (and thus benefitted from increased protection under the legislation), the standard to be met in terms of what was appropriate security was higher than that applicable to personal data generally. In addition, the DPC confirmed that, because of the increased protection afforded to health data under data protection legislation, it can be processed only if certain specified conditions are met.

It was apparent that appropriate security measures were not in place when the unauthorised disclosure to the GP took place. The DPC noted that the disclosure was to a GP who was not involved in the complainant’s medical care, and further, that the letter in which the disclosure was made had a heading referring to the complainant’s son but contained medical information relating to the complainant in the body of the letter. The mistake was therefore evident on the face of the letter itself. The DPC noted the data controller’s argument that the GP was bound by confidentiality obligations; however, it held that while this was relevant in terms of the consequences of the unauthorised disclosure, it did not address whether the data controller had appropriate security measures in place. The DPC also highlighted that the data controller was not able to address control measures related to the disclosure as the doctor in question had retired. The DPC held that this suggested that a general framework for the security of personal data was not in place at the time of the disclosure.

The DPC then looked at whether the requisite conditions to permit the processing of data regarding health had been met. The DPC decided that, because the data controller had failed to put forward any lawful basis for disclosing the personal data, it had also contravened data protection legislation in this regard.

The obligation to ensure security of personal data is evident in Article 5(1)(f) of the GDPR and is further specified in Article 32, which requires that a controller and a processor implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk. In considering appropriate security measures, data controllers and processors must take into account, amongst other things, the nature, scope, context and purpose of processing, as well as the risk of varying likelihood and severity for data subjects. In this regard, the GDPR recognises that health data, which is a “special category of personal data” under Article 9 of the GDPR, are by their nature particularly sensitive in relation to fundamental rights and freedoms and merit specific protection.

Data controllers should also be aware that, where a breach of security occurs leading to the accidental or unlawful unauthorised disclosure of personal data (a “personal data breach”), it must be notified to the DPC without undue delay in accordance with Article 33 of the GDPR. Where the personal data breach is likely to result in a high risk to the rights and freedoms of the data subject, it must also be communicated to the data subject without undue delay.
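The notification duties just described can be captured in a short decision helper. This is an illustrative sketch only, not legal advice; the function and enum names are hypothetical, and real breach assessments turn on the facts of each case. (Article 33(1) also sets a benchmark of notifying the supervisory authority within 72 hours of becoming aware of the breach, where feasible.)

```python
from enum import Enum

class BreachRisk(Enum):
    """Assessed risk to data subjects' rights and freedoms (hypothetical labels)."""
    UNLIKELY = "unlikely to result in a risk"
    RISK = "likely to result in a risk"
    HIGH_RISK = "likely to result in a high risk"

def notification_duties(risk: BreachRisk) -> dict:
    """Who must be notified of a personal data breach, per the logic above."""
    return {
        # Article 33: notify the supervisory authority (here, the DPC)
        # without undue delay, unless the breach is unlikely to result
        # in a risk to data subjects.
        "supervisory_authority": risk is not BreachRisk.UNLIKELY,
        # Article 34: communicate the breach to the data subject only
        # where it is likely to result in a *high* risk.
        "data_subject": risk is BreachRisk.HIGH_RISK,
    }
```

For example, a breach assessed as `BreachRisk.RISK` would require notifying the DPC but not the data subject under this sketch.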

9) Case Study 9: Appropriate security measures

This complaint concerned the alleged loss by the complainant’s bank of several items of correspondence relating to the complainant’s bank account, which had been hand-delivered to the bank by the complainant’s partner.

It was established that the bank was the data controller as it controlled the contents and use of the complainant’s personal data in connection with its provision of banking services to the complainant. The data in question consisted of (amongst other things) the complainant’s name, address and bank account information and was personal data as the complainant could be identified from it and it related to the complainant as an individual.

During the course of the examination of the complaint, the data controller maintained that the relevant documents had been misplaced within the bank and not externally and therefore argued that no personal data breach had occurred. The DPC noted that maintaining appropriate security measures for personal data is a key requirement under data protection law. It considered the nature of the personal data that was contained in the correspondence that went missing (the complainant’s name, address and bank account information) and noted that misplacing this information had the potential to cause significant risk to the complainant and the complainant’s financial affairs. The security measures that were in place in the data controller were not sufficient to ensure an appropriate level of security, given the nature of the personal data being processed. As regards the data controller’s argument that the correspondence was lost internally, the DPC’s view was that a data controller’s technical and organisational measures to safeguard the security of personal data must take account of the fact that internal as well as external loss of personal data, or unauthorised access to it, can give rise to risks to data subjects such as the complainant.

Based on the above, the DPC considered that the data controller had failed to have appropriate security and organisational measures in place to safeguard the complainant’s personal data, and that it had therefore failed to act in accordance with the data protection legislation.

Under Article 5(1)(f) of the GDPR, personal data must be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful disclosure, using appropriate technical or organisational measures. The obligation to ensure security of personal data is further specified in Article 32, which requires that a controller and a processor implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk. In considering appropriate security measures, data controllers and processors must take into account, amongst other things, the nature, scope, context and purpose of processing, as well as the risk of varying likelihood and severity for data subjects.

Data controllers should also be aware that, where a breach of security occurs leading to the accidental or unlawful unauthorised disclosure of personal data (a “personal data breach”), this must be notified to the DPC without undue delay in accordance with Article 33 of the GDPR. Where the personal data breach is likely to result in a high risk to the rights and freedoms of the data subject, it must also be communicated to the data subject without undue delay.

10) Case Study 10: Processing that is necessary for the purpose of legitimate interests pursued by a controller

This complainant was an employee of a shop located in a shopping centre and was involved in an incident in the shopping centre car park regarding payment of the car park fee. After the incident, the manager of the car park made a complaint to the complainant’s employer and images from the CCTV footage were provided to the complainant’s employer. The complainant referred the matter to the DPC to examine whether the disclosure of the CCTV images was lawful.

It was established that the shopping centre was the data controller as it controlled the contents and use of the complainant’s personal information for the purposes of disclosing the CCTV stills to the complainant’s employer. The data in question consisted of images of the complainant and was personal data because it related to the complainant as an individual and the complainant could be identified from it.

The data controller argued that it had a legitimate interest in disclosing the CCTV images to the complainant’s employer, for example, to prevent people from exiting the car park without paying and to withdraw the agreement it had with the complainant’s employer regarding its staff parking in the car park. The DPC noted that a data controller must have a lawful basis on which to process a person’s personal data. One of the legal bases that can be relied on by a data controller is that the processing is necessary for the purposes of legitimate interests pursued by the data controller. (This was the legal basis that the data controller sought to rely on here.) The DPC acknowledged that the data controller had in principle a legitimate interest in disclosing the complainant’s personal data for the reasons that it put forward. However, it was not “necessary” for the data controller to disclose the CCTV stills to the complainant’s employer for the purposes of pursuing those legitimate interests. This was because the car park attendant employed by the data controller had discretion to take steps against the complainant, in pursuit of the legitimate interests, without the need to involve the complainant’s employer. For example, the car park attendant had discretion to ban the complainant from using the car park without involving the complainant’s employer. On this basis, the DPC determined that it was not necessary for the data controller to notify the complainant’s employer of the incident and provide it with CCTV stills. Accordingly, the data controller had no legal basis for doing so and had contravened data protection legislation.

Under Article 6 of the GDPR, personal data can be processed only where there is a lawful basis for doing so. One such legal basis is under Article 6(1)(f), which provides that processing is lawful if and to the extent that it is necessary for the purpose of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject. Data controllers should be aware, however, that it is not sufficient merely to show that there is a legitimate interest in processing the personal data; Articles 5(1)(c) and 6(1)(f) require data controllers to be able to show that the processing in question is limited to what is “necessary” for the purpose of those legitimate interests.
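The reasoning in this case study follows the cumulative limbs of Article 6(1)(f), which can be sketched as a simple predicate. This is a hypothetical illustration, not legal advice; the function and parameter names are my own.

```python
def lawful_under_article_6_1_f(
    pursues_legitimate_interest: bool,
    processing_is_necessary: bool,
    overridden_by_data_subject_rights: bool,
) -> bool:
    """Three-limb test under Article 6(1)(f): a legitimate interest alone
    is not enough; the processing must also be necessary for that interest
    and not overridden by the data subject's interests or fundamental
    rights and freedoms."""
    return (
        pursues_legitimate_interest
        and processing_is_necessary
        and not overridden_by_data_subject_rights
    )
```

In the car park scenario above, a legitimate interest existed in principle, but the disclosure to the employer was not necessary, so the test fails at the second limb.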

11) Case Study 11: Processing that is necessary for the purpose of performance of a contract

This complainant was involved in an incident in a carpark of a building in which they worked.  A complaint was made by the manager of the car park to the complainant’s employer and images from the CCTV footage of the incident were subsequently obtained by the complainant’s employer. Disciplinary proceedings were then taken against the complainant arising out of the car park incident. The complainant’s manager and other colleagues of the complainant viewed the CCTV stills in the context of the disciplinary proceedings.

The complainant’s employer was the data controller in relation to the complaint, because it controlled the contents and use of the complainant’s personal data for the purposes of managing the complainant’s employment and conducting the disciplinary proceedings. The data in question consisted of images of the complainant and was personal data because it related to the complainant as an individual and the complainant was identifiable from it.

In response to the complaint, the data controller maintained that it had a lawful basis for processing the complainant’s personal data under the legislation because the CCTV images were used to enforce the employee code of conduct, which formed part of the complainant’s contract of employment. It also stated that, because of the serious nature of the incident involving the complainant, it was necessary for the data controller to investigate the incident in accordance with the company disciplinary policy, which was referred to in the complainant’s employment contract. The data controller also argued that the CCTV stills were limited to the incident in question and that only a limited number of personnel involved in the disciplinary process viewed them.

The DPC noted that data protection legislation permits the processing of a person’s personal data where the processing is necessary for the performance of a contract to which the data subject (the person whose personal data is being processed) is a party. The DPC noted the data controller here sought to argue that the use of the CCTV images was necessary for the performance of the complainant’s employment contract. However, the DPC was of the view that it was not ‘necessary’ for the data controller to process the complainant’s personal data contained in the CCTV images to perform that contract. For this argument to succeed, the data controller would have had to show that it could not have performed the complainant’s employment contract without processing the complainant’s personal data. As the data controller had failed to satisfy the DPC that this was the case, the data controller was judged to have infringed the data protection legislation.

The DPC also noted that, in addition to the requirement to have a lawful basis for processing, there are also certain legal principles that a data controller must comply with, when processing personal data. It highlighted that the processing must be adequate, relevant and limited to what is necessary in relation to the purposes for which the data is processed. The DPC noted the data controller’s argument that the CCTV stills were limited to the incident in question and that only a limited number of personnel involved in the disciplinary process viewed the stills. However, the DPC was of the view that the data controller had failed to show why it was necessary to use the CCTV images. On this basis, there had been a further infringement of the legislation by the data controller.

Under Article 6 of the GDPR, personal data can be processed only where there is a lawful basis for doing so. One such legal basis is under Article 6(1)(b), which provides that processing is lawful if and to the extent that it is necessary for the performance of a contract to which the data subject is a party. Data controllers should be aware, however, that it is not sufficient merely to show that there is a contractual basis for processing the personal data; Articles 5(1)(c) and 6(1)(b) require data controllers to be able to show that the processing in question is limited to what is “necessary” for the purpose of performance of the contract. 

12) Case Study 12: Confidential expressions of opinion and subject access requests

This complainant made a data subject access request to their employer. However, the complainant alleged that their employer omitted certain communications from its response, wrongfully withheld data on the basis that it constituted an opinion given in confidence and did not respond to the request within the required timeframe as set out in the legislation.

The complainant’s employer was the data controller as it controlled the contents and use of the complainant’s personal data for the purposes of managing the complainant’s employment. The data in question consisted of the complainant’s HR file and data regarding the administration of the complainant’s employment. The data was personal data because the complainant could be identified from it and the data related to the complainant as an individual.

During the course of the examination of the complaint, the data controller identified additional documents containing the complainant’s personal data and provided these to the complainant. In relation to the document which the data controller had asserted constituted an opinion given in confidence, during the course of the investigation of this complaint, the individual who had expressed the opinion in question consented to the release of the document to the complainant, and so the document was provided by the data controller to the complainant.

Data protection legislation provides a right of access for a data subject to their personal data and, further, that access must be granted within a certain timeframe. Having investigated the complaint, the DPC was satisfied that the data controller had carried out appropriate searches and had provided the complainant with all of the personal data that the complainant was legally entitled to receive. However, the documents provided by the data controller to the complainant during the course of the examination of this complaint should have been furnished to the complainant within the timeframe provided for in the legislation.

Under Article 15 of the GDPR, a data subject has a right to obtain from a data controller access to personal data concerning him or her, which are being processed. The data controller must respond to a data subject access request without undue delay and in any event within one month of receipt of the request. However, section 60 of the Data Protection Act 2018 provides that the right of access to personal data does not extend to data which consist of the expression of opinion about the data subject by another person given in confidence or on the understanding that it would be treated as confidential to a person who has a legitimate interest in receiving the information.
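The one-month response deadline can be computed as a calendar-month offset from the date of receipt. The sketch below is illustrative only and the helper names are hypothetical; it clamps to the last day of the target month where the receipt date has no counterpart (for example, 31 January plus one month). Note also that Article 12(3) separately permits an extension of up to two further months for complex or numerous requests.

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month
    (e.g. 31 January + 1 month -> 28/29 February)."""
    total = d.month - 1 + months
    year, month = d.year + total // 12, total % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def access_request_deadline(received: date) -> date:
    # Article 12(3): respond without undue delay and in any event
    # within one month of receipt of the request.
    return add_months(received, 1)
```

For example, a request received on 31 January 2024 would fall due by 29 February 2024 under this sketch.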

13) Case Study 13: Processing of health data

The complainant was a member of an income protection insurance scheme and had taken a leave of absence from work due to illness. The income protection scheme was organised by the complainant’s employer. In order to claim under the scheme, the complainant was required to attend medical appointments organised by an insurance company. Information relating to the complainant’s illness was shared by the complainant with the insurance company only. However, a third party company (whose involvement in the claim was not known to the complainant) forwarded information to the complainant’s employer regarding medical appointments that the complainant was required to attend. The information included the area of specialism of the doctors in question.

It was established that the insurance company was the data controller as it controlled the contents and use of the complainant’s personal data for the purposes of managing and administering the complainant’s claim under the insurance scheme. The data in question included details of the complainant’s illness, scheduled medical appointments and proposed treatment and was deemed to be personal data because the complainant could be identified from it and it related to the complainant as an individual.

During the course of the investigation, the data controller argued that the complainant had signed a form, which contained a statement confirming that the complainant gave consent to the data controller seeking information regarding the complainant’s illness. When asked by the DPC to clarify why it had shared the information regarding the complainant’s medical appointments with the third party company (who was the broker of the insurance scheme), the data controller advised it had done so to update the broker and to ensure that matters would progress swiftly.

The data controller stated that it had a legislative obligation to provide the complainant with certain information; in particular, that it was obliged to inform the complainant as to the recipients or categories of recipients of the complainant’s personal data. The DPC pointed out that, while the data controller had notified the complainant that it might seek personal data relating to them, it had failed to provide sufficient information to the complainant as regards the recipients of the complainant’s personal data.

Data protection legislation also requires that data kept by a data controller be adequate, relevant and limited to what is necessary in relation to the purposes for which the data were collected. The DPC examined the reason given by the data controller for disclosing information about the nature of the complainant’s medical appointments (i.e. to update the broker and to ensure matters progressed swiftly). The DPC was of the view that it was excessive for the data controller to disclose information regarding the specific nature of the medical appointments, including the specialisms of the doctors in question, to the third party company.

The DPC pointed out that, under data protection legislation, data concerning health is afforded additional protection.  The DPC was of the view that, because the information disclosed by the data controller included details of the specialisms of the doctors involved, it indicated the possible nature of the complainant’s illness and thus benefitted from that additional protection. The DPC confirmed that, because of the additional protection, there was a prohibition on processing the data in question, unless one of a number of specified conditions applied. For example (and of relevance here), the personal data concerning health could be legally processed if the complainant’s explicit consent to the processing was provided to the data controller. The DPC then considered whether the complainant signing the claim form (containing the paragraph about consent to the data controller seeking information, as described above) could be said to constitute explicit consent to the processing (disclosure) of the information relating to the complainant’s medical appointments. The DPC noted that it could be said that the complainant’s explicit consent had been given to the seeking of such information by the data controller. However, the complainant had not given their explicit consent to the giving of such information by the data controller to third parties. On this basis, the DPC held that a further contravention of the legislation had been committed by the data controller in this regard.

Under Article 13 of the GDPR, where personal data are collected from a data subject, the data controller is required to provide the data subject with certain information at the time the personal data are obtained, such as the identity and contact details of the data controller and, where applicable, its Data Protection Officer, the purpose and legal basis for the processing and the recipients of the data, if any, as well as information regarding the data subject’s rights. This information is intended to ensure that personal data are processed fairly and transparently. Where the personal data have been obtained otherwise than from the data subject themselves, additional information is required to be provided to the data subject under Article 14 of the GDPR. This information must be given in a concise, transparent, intelligible and easily accessible form.

Additionally, the data minimisation principle under Article 5(1)(c) requires that personal data be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed. This means that the period for which personal data are stored should be limited to a strict minimum and that personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means.

Finally, data controllers should note that personal data concerning health is considered a “special category of personal data” under Article 9 of the GDPR and is subject to specific rules, in recognition of its particularly sensitive nature and the particular risk to the fundamental rights and freedoms of data subjects which could be created by the processing of such data. The processing of medical data is only permitted in certain cases as provided for in Article 9(2) of the GDPR and sections 45 to 54 of the Data Protection Act 2018, such as where the data subject has given explicit consent to the processing for one or more specified purposes.

14) Case Study 14: Access requests and legally privileged material

This complaint concerned an alleged incomplete response to a data subject access request. The background to this complaint was that the complainant had submitted an access request to the trustees of a pension scheme (the “Trustees”). As part of its response to the access request, the Trustees referred to a draft letter relating to the complainant; however, this draft letter was not provided to the complainant.

It was established that the Trustees were the data controller as they controlled the contents and use of the complainant’s personal data for the purposes of the complainant’s pension. The data in question consisted of (amongst other things) information about the complainant’s employment and pension and was personal data because it related to the complainant as an individual and the complainant could be identified from it.

The data controller sought to argue that the draft letter was legally privileged and that therefore the data controller was not required to provide it to the complainant. The DPC sought further information from the data controller regarding the claim of legal privilege over the draft letter. In response, the data controller did not clarify the basis on which privilege was asserted over the draft letter; however, it agreed to provide the data to the complainant.

It was therefore decided that the data controller had failed to establish an entitlement to rely on the exemption in respect of legally privileged data. Accordingly, the letter should have been provided to the complainant in response to the complainant’s access request within the timeframe set out in the legislation.

Under Article 15 of the GDPR, a data subject has a right to obtain from a data controller access to personal data concerning him or her, which are being processed. The data controller must respond to a data subject access request without undue delay and in any event within one month of receipt of the request. However, the right of access to one’s personal data does not apply to personal data processed for the purpose of seeking, receiving or giving legal advice or personal data in respect of which a claim of privilege could be made for the purpose of or in the course of legal proceedings. Where a data controller seeks to assert privilege over information sought by a data subject under Article 15, the DPC, examining a complaint in relation to the refusal, will require the data controller to provide considerable information, including an explanation as to the basis upon which the data controller is asserting privilege, so that the validity of the claim can be properly evaluated.

15) Case Study 15: Processing in the context of a workplace investigation

The complainant was involved in a workplace investigation arising out of allegations made by the complainant against a colleague. The complainant’s employer appointed an independent consultancy firm (the “Consultancy Company”) to carry out the investigation and the findings of the Consultancy Company were subject to a review by an independent panel.

After the conclusion of the workplace investigation, the complainant made a data access request to their employer and a number of documents were provided in response to this request. However, the complainant was of the view that the request was not responded to fully. For example, the complainant claimed that the witness statements taken during the investigation, as provided to the complainant, were factually incorrect, and that certain documents (such as access logs to the complainant’s personnel files) were not provided to the complainant. The complainant further alleged that their employer had disclosed details of the complainant’s work performance, sick leave arrangements and copies of the complainant’s pay slips to the complainant’s colleagues. Finally, the complainant claimed that their employer had failed to comply with the complainant’s requests for rectification of the witness statements (which the complainant alleged were factually incorrect).

It was established that the complainant’s employer was the data controller as it controlled the complainant’s data in the context of the workplace investigation. The data in question consisted of the complainant’s payroll information, information relating to the complainant’s sick leave and witness statements relating to the complainant. The data was personal data because it related to the complainant as an individual and the complainant could be identified from it.

In response to the complainant’s allegation that their access request was not responded to fully, the data controller stated that, in relation to the witness statements, the complainant was provided with the copies of the original witness statements that were held on the complainant’s file. In relation to the access logs, the data controller was of the view that these did not constitute personal data (because they tracked the digital movement of other employees on the data controller’s IT systems). In relation to other miscellaneous documents that the complainant alleged had not been received, the data controller indicated that, if the complainant could specify details of these documents, it would consider the complainant’s allegation further.

Regarding the complaint that the data controller had disclosed details of the complainant’s work performance to colleagues of the complainant, the data controller argued that the complainant’s performance would have been discussed with the complainant’s managers and therefore was disclosed for legitimate business reasons. Regarding the complaint around disclosure of details regarding the complainant’s sick leave, the data controller noted that it was not aware of any such disclosure. Finally, in relation to the allegation that the complainant’s payslips were disclosed, the data controller argued that they were provided to an employee of the data controller to be reviewed in the context of a separate case taken by the complainant.

The complainant also made a request for rectification of the witness statements which, the complainant alleged, were factually incorrect. However, the data controller advised that what was recorded in the witness statements represented the views of the people involved and, on this basis, refused to amend the witness statements.

The DPC was of the view that there were five issues to be examined by it in relation to the complaint. The DPC’s view on each of these issues is summarised below (under headings representing each of the five issues).

Access request

The DPC noted that the complainant had made a valid access request. However, having considered the matter, on balance, the DPC was of the view that there was no evidence available to suggest that the data controller unlawfully withheld information. The DPC noted, however, that the complainant’s data access request had not been dealt with in the timeframe required under the legislation. In this regard, the data controller had committed a data protection breach.

Under Article 15 of the GDPR, a data subject has a right to obtain from a data controller access to personal data concerning him or her, which are being processed. Under Article 12(3), the data controller must respond to a subject access request without undue delay and in any event within one month of receipt of the request.

Alleged unauthorised disclosure of the complainant’s personal data

Controllers must have a lawful basis under data protection legislation to process personal data, including the disclosure of that data to a third party. In relation to the disclosure of details regarding the complainant’s work performance, the DPC was of the opinion that such processing was lawful as it was for legitimate business reasons. Regarding the issue of disclosure of sick leave details, the DPC concluded that it did not have sufficient information relating to the alleged incident in order to determine whether a breach of the legislation had occurred. In relation to the disclosure of the complainant’s payslips, the DPC was of the view that the disclosure was lawful. This was because the payslips were disclosed in order to assist the data controller in defending a separate legal claim brought by the complainant against it.

Under Article 6 of the GDPR, a data controller is required to have a legal basis for processing (including disclosing) any personal data. The available legal bases for processing include (a) that the data subject has given consent, (b) that the processing is necessary for the performance of a contract to which the data subject is a party, (c) that the processing is necessary for compliance with a legal obligation to which the data controller is subject, (d) that the processing is necessary in order to protect the vital interests of an individual, (e) that the processing is necessary for the performance of a task carried out in the public interest, or (f) that the processing is necessary for the purposes of legitimate interests pursued by the data controller or by a third party.
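The six legal bases listed above can be represented as a small enumeration, with a helper that flags any disclosure for which no basis has been identified. This is an illustrative sketch (the names are hypothetical), mirroring how the DPC assessed each alleged disclosure in this case separately.

```python
from enum import Enum
from typing import Dict, List, Optional

class Article6Basis(Enum):
    """The six Article 6(1) legal bases, labelled by sub-paragraph."""
    CONSENT = "6(1)(a)"
    CONTRACT = "6(1)(b)"
    LEGAL_OBLIGATION = "6(1)(c)"
    VITAL_INTERESTS = "6(1)(d)"
    PUBLIC_TASK = "6(1)(e)"
    LEGITIMATE_INTERESTS = "6(1)(f)"

def disclosures_without_basis(
    assessed: Dict[str, Optional[Article6Basis]]
) -> List[str]:
    """Return the disclosures for which no legal basis was identified;
    each such disclosure would lack a lawful basis under Article 6."""
    return [name for name, basis in assessed.items() if basis is None]
```

For instance, mapping the work-performance and payslip disclosures in this case to `LEGITIMATE_INTERESTS` and leaving an unassessed disclosure as `None` would flag only the latter.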

Fair processing

There is an obligation on data controllers to process personal data fairly. During the course of its investigation, the DPC asked the data controller to confirm how it complied with its obligations to process the complainant’s data in a fair manner, in relation to each of the alleged disclosures of the complainant’s personal data. The data controller failed to provide the information required and, in these circumstances, the DPC considered that the data controller had failed to process the complainant’s data in line with fair processing obligations.

Under the GDPR, personal data must be processed lawfully, fairly and in a transparent manner in relation to the data subject. That principle requires that the data subject be provided with certain information under Articles 13 and 14 of the GDPR in relation to the existence of the processing operation and its purposes. Data subjects should be made aware of the risks, rules, safeguards and rights in relation to the processing of their personal data. Where personal data can be legitimately disclosed to another recipient, data controllers should inform the data subject, when the personal data are first disclosed, of the recipient or categories of recipients of the personal data.

Right to rectification

Under Data Protection legislation, there is a right to rectification of incorrect personal data. However, here the data controller had confirmed that what was recorded in the witness statements represented the views of the people involved. The view was taken that where an opinion is correctly recorded, and where the opinion is objectively based on matters that the person giving the opinion would reasonably have believed to be true, the right to rectification does not apply.

Under Article 5 of the GDPR, personal data being processed must be accurate and, where necessary, kept up to date and data controllers are required to ensure that every reasonable step is taken to ensure that personal data that are inaccurate, having regard to the purpose for which they are processed, are erased or rectified without delay. Under Article 16 of the GDPR, a data subject has the right to obtain from a data controller without undue delay the rectification of inaccurate personal data concerning him or her. However, under section 60 of the Data Protection Act 2018, this right is restricted to the extent that the personal data consist of an expression of opinion about the data subject by another person given in confidence or on the understanding that it would be treated as confidential to a person who has a legitimate interest in receiving the information.

Retention of the complainant’s personal data

The DPC asked the data controller to outline the legal basis for the retention (i.e. processing) of the complainant’s personal data relating to the workplace investigation. The data controller advised that this data was being retained in order to deal with the complainant’s requests and appeals under various statutory processes. On this basis, the DPC was of the view that the retention of the complainant’s personal data was lawful as it was for legitimate business reasons.

Under the GDPR, not only must a data controller have a lawful basis for initially obtaining an individual’s personal data, but it must also have an ongoing legal basis for the retention of those data in accordance with Article 6, as set out above. Under Article 5(1)(e) of the GDPR, personal data which is in a form permitting the identification of data subjects must be kept for no longer than is necessary for the purposes for which they are processed.

16) Case study 16: Proof of identification and data minimisation

The DPC received a complaint, via the Berlin Data Protection Authority, from an individual regarding a request they made to a data controller to have the email address associated with their customer account changed. The complainant had made the request via the data controller’s online chat function and was subsequently informed that a copy of an ID document to authenticate account ownership would be required in order to proceed with the request. The complainant refused to provide this information and their request was therefore not progressed by the data controller at that time.  

Following receipt of the complaint, the DPC engaged with the data controller, in the course of which it was established that the data controller does not require individuals to provide an ID document in order to change the email address associated with an account. Rather, the customer service agent had used an incorrect operating procedure when responding to the complainant's request. The data controller's standard procedure directs customer service agents to advise customers that they can change their email address by signing into their own account and making the change directly within their 'Account' settings page. The data controller also advised that if a customer does not wish, or is not able, to change their email address on their own, its procedure directs customer service agents to request limited information from the customer which is already held by the data controller, in order to verify the account holder.

In light of the complaint, the data controller agreed to provide clear instructions on how the complainant could change their email address associated with their account information without providing any additional personal data. The data controller also conducted a thorough review of its customer service systems and provided further refresher training to all of its customer service agents on the correct standard operating procedures to follow in such instances.

The DPC then engaged with the complainant, via the Berlin Data Protection Authority, to provide the information it had received from the data controller in an attempt to facilitate an amicable resolution to the complaint. The complainant subsequently confirmed to the DPC that they had successfully changed the email address on their account with the data controller.  

This case study demonstrates the benefits to both data controllers and to individual complainants of engaging in the amicable resolution process in a meaningful way. In this case, the positive actions taken by the data controller, including providing detailed information to the complainant on how to proceed themselves with changing the email address associated with their account, resulted in a good outcome for both parties.

17) Case study 17: Amicable resolution - right to erasure and user generated content

This complaint concerned an initial refusal by the data controller to comply with an erasure request made by the complainant, pursuant to Article 17 GDPR. The complainant first lodged their complaint via the Spanish Data Protection Authority, the AEPD, who then transferred the complaint to the DPC as the Lead Supervisory Authority.

The complainant stated that they were named, and therefore identified, in a negative review relating to their place of employment. The review, accompanied by a partial image of the complainant, had been posted online. The complainant had sought the removal of their name and any associated images from the review.

During its engagement with the DPC on the matter, the data controller advised that they had reviewed the content in question in the context of their own privacy guidelines for the removal of content from the website and that they considered the content did not infringe upon same. 

The DPC requested that the data controller review the matter again, in the spirit of amicably resolving the complaint. The data controller subsequently reverted to advise that after a further assessment of the content in question they had made the decision to remove the review posting in its entirety.

This case study demonstrates the benefits, to individual complainants, of the DPC’s intervention by way of the amicable resolution process. In this case, this led to the complainant being able to effect their right of erasure over their personal data, as afforded to individuals under Article 17 of the GDPR.

18) Case study 18: Amicable resolution in a cross-border complaint - right to erasure

The DPC received a complaint from an individual regarding an erasure request made by them to a data controller, a platform for booking accommodation, pursuant to Article 17 GDPR. The complainant had begun creating an account on the data controller’s platform but chose to abandon the process before it was complete. The complainant then communicated his erasure request to the data controller by email and telephone. In response to the erasure request, the data controller informed the complainant that they required an identity document in order to comply with the erasure request.

The complaint was identified as potentially being capable of amicable resolution under Section 109 of the Data Protection Act 2018, and the data controller agreed to work with the DPC to attempt to amicably resolve the complaint. The data controller provided the DPC with its replies to the complainant relating to the matters raised in the complaint thus far, and confirmed that, in response to the complainant’s erasure request, the data controller had requested an identity document.

In the course of the DPC’s investigation of the complaint, the data controller also confirmed that the account in question had never been used to book or host accommodation or to use the service in any way. Following intervention by the DPC, the data controller undertook to delete the complainant’s account without requesting that the complainant provide any additional documentation. 

The DPC communicated these developments to the complainant. The complainant responded by confirming that they accepted the proposed action and that erasure of the account would resolve their complaint. The DPC engaged further with the data controller, which provided confirmation to the DPC that it had erased the complainant’s account. The data controller also conveyed this erasure confirmation to the complainant directly.

The complaint was amicably resolved in accordance with section 109 of the Data Protection Act 2018. This case study demonstrates the benefits, to individuals, of the DPC’s intervention by way of the amicable resolution process. In particular, this case study brings to the fore the manner in which the DPC can assist a complainant through the amicable resolution process. This includes explaining the complainant’s individual concerns to the data controller, where initial engagement between them and data controller has not led to a resolution of their concerns. In this case, the DPC’s involvement resulted in deletion of the complainant’s personal data by the data controller, in accordance with Article 17, without requiring any further action on the part of the individual.

19) Case study 19: Amicable resolution - right to erasure

This complaint concerned the alleged non-response to an erasure request made by the complainant to a data controller pursuant to Article 17 GDPR.

Following receipt of the complaint from the complainant, the DPC engaged with both parties in relation to the subject matter of the complaint. Further to this engagement, it was established that, during the week in which the complainant sent their erasure request by email to the data controller, a new process to manage personal data erasure requests was being implemented by the data controller.

The data controller informed the DPC that it was during this transitional period from the old system to the new system that the erasure request was received from the data subject. The data controller further advised that while new personnel were being trained on how to manage these types of requests during this period, it appeared a response to the erasure request was missed. The data controller stated that this was an oversight, possibly due to a technical issue or human error and that it regretted the error.

In the circumstances, the data controller agreed to comply with the erasure request and sincerely apologised for the error. The data controller also subsequently confirmed to the DPC that it had deleted the complainant’s personal data.

The DPC informed the complainant of the outcome of its engagement with the data controller, noting that the positive actions taken by the data controller appeared to deal with the concerns raised in their complaint.

The complainant subsequently confirmed to the DPC that they agreed to the amicable resolution of their complaint as their concerns were now resolved and that their complaint was now withdrawn.

In this circumstance, the complaint was deemed to be amicably resolved and withdrawn, in accordance with section 109 of the Data Protection Act 2018.

This case study demonstrates the benefits to both data controllers and to individual complainants of engaging in the amicable resolution process in a meaningful way. In this case, the data controller’s detailed explanation of how the oversight occurred, their offering of an apology and an undertaking to resolve the matter for the complainant, resulted in a good outcome for both parties. Most importantly, the complainant was able to exercise their right to obtain from the controller the erasure of personal data concerning them, as afforded to them under the GDPR.

20) Case study 20: Disclosure and unauthorised publication of a photograph

A data subject made a complaint to the DPC regarding the publication of their child’s image, name and partial address in a religious newspaper. The image used in the publication was originally obtained from a religious group’s Facebook page. The data subject informed the DPC that consent was not given for the wider use of the image through the publication in the newspaper. The concern was for the child’s privacy arising from the use of the image, name and partial address by the newspaper. In correspondence sent directly between the data subject and the newspaper, the data subject cited Article 9 of the GDPR, concerning special category personal data, arguing that it applied to their complaint because the image disclosed information regarding the child’s religious beliefs.

As part of its examination, the DPC engaged with the data controller and asked for a response to the complaint. The data controller informed the DPC that they never intended any distress to the data subject or their family. A reporter had seen the image on the group’s Facebook page and asked a leading member of the religious group for permission to use it; that member subsequently granted permission. The newspaper stated that the image was already available online through the group’s Facebook page, that it was taken at a public event, and that the address used was that of the religious group and not the child’s personal address.

In further response to the DPC’s queries, the newspaper informed the DPC that it was their normal practice to seek consent to take and use images. Although in this instance the image was available on an open Facebook page, the newspaper still contacted the religious group and queried whether permission had been obtained to use the image. The leading member of the religious group they had contacted advised them that another person in loco parentis (acting in the place of a parent) had given permission. The newspaper stated to the DPC that this person “was acting in loco parentis as far as [the newspaper] was concerned and consent had been therefore given.” The newspaper also informed the DPC that they relied on Articles 9(2)(a) and 9(2)(e) of the GDPR for the processing of special category personal data. The newspaper concluded that they had the required legitimate interest in publishing the photograph; that the photograph was in the public domain through the open Facebook page; that they took steps to ensure consent had been obtained to publish the photograph; and that the consent furnished was adequate and they were entitled to rely on it. The newspaper said they were satisfied they had complied with their obligations, but they had nonetheless reviewed and amended their internal policies on this issue.

The DPC provided the data subject with the response to the complaint and asked the data subject whether they considered their data protection concerns adequately addressed and amicably resolved. In addition to this the data subject was invited to make their observations on the response from the data controller. The data subject responded to inform the DPC the matter was not amicably resolved and that explicit consent should have been obtained. The DPC proceeded to conclude the examination and provide an outcome to both parties as required under section 109(5) of the Data Protection Act 2018 (the 2018 Act).

The DPC advised the data subject under section 109(5)(c) of the 2018 Act that the explanation put forward by the data controller concerning the processing of the child’s personal data in the circumstances of this complaint was reasonable. At the same time, the DPC wrote to the religious newspaper and, under section 109(5)(f) of the 2018 Act, recommended that it consider the Code of Practice from the Press Council, in particular principle 9 therein, ensuring that the principle of data minimisation is respected, and that it conduct and record the balancing exercise between the public interest in publication and the rights and interests of data subjects.

21) Case study 21: Legal basis for processing and security of processing

A data subject lodged a complaint with the DPC against a data controller following a delayed response to a subject access request. The data subject was concerned about the processing of their personal data between the data controller and a third party, an HR investigator (the investigator). Such concerns related to the legal basis for processing the data subject’s personal data and the security of that processing, as the investigator was using a Gmail account during the course of the investigation.

The data subject had exercised their right under Article 15 of the General Data Protection Regulation (GDPR) by requesting access to their personal data. However, they had not received a response to their request within one month as required by Article 12(3) of the GDPR. After two months had passed with still no response, the data subject informed the data controller that a complaint would be lodged with the DPC. Following the DPC’s engagement, the data controller provided the personal data relevant to the subject access request and explained that the delay was due to a technical error in the email system. At this stage the data subject was satisfied that they had received all of the personal data requested; however, the response also included some additional data, which did not relate to the data subject and was unredacted.

Upon review of the personal data received, the data subject raised concerns in relation to the processing of their personal data between the data controller and the investigator. As part of its examination, the DPC engaged with the data controller on this matter. The data controller cited section 46 of the Data Protection Act 2018 (the 2018 Act) and Articles 6(1)(c) and 9(2)(b) of the GDPR as their lawful basis for processing the personal data. In addition, as the data subject was in fact an employee, the data controller highlighted their legal obligations under the Safety, Health and Welfare at Work Act 2005, as set out in their Employee Handbook. The data subject challenged this lawful basis as they had not previously been made aware of it.

With regard to the investigator, the data subject explained that no consent was sought for the processing of the personal data between the data controller and the investigator. The data controller explained that consent was not the only lawful basis under the GDPR and cited Article 6(1)(b) as their lawful basis. The data subject contested this lawful basis, stating that the processing of personal data by the investigator was not necessary for the performance of the employment contract. The data subject also raised transparency concerns, as when signing the employment contract they would not have anticipated the processing of their personal data by an investigator. When questioned on the investigator’s use of a Gmail account, the data controller stated that email would be encrypted between the data controller and the Gmail account and that no evidence was available of the data subject’s personal data being compromised.

During the examination of the complaint, the issue arose as to whether the investigator was a joint controller or a data processor. The data subject took the view that the investigator was a data processor, while the data controller stated the investigator was a data controller in their own right and that, as a result, there were no requirements under Article 28 of the GDPR. The DPC examined the facts in this complaint and established that the investigator was provided with a list of individuals to interview in order to compile their report and that, in the terms of reference, interviews were listed as the primary means of gathering information for the report. The DPC also noted that the investigator was precluded from deciding on or implementing any sanction arising from the findings of the report. Based on this information, the DPC found that the investigator was a data processor acting on behalf of the data controller and noted that the data controller had failed to provide a contract between them and the investigator, as required under Article 28(3) of the GDPR.

Due to the failure of the data controller to comply with the one-month obligation under Article 12(3) of the GDPR, the DPC reminded the data controller of their obligations under Article 24 to implement appropriate technical and organisational measures to ensure compliance with the GDPR. In doing so, the data controller should also ensure they provide only the personal data relevant to the subject access request at hand and redact the personal data of third parties. Secondly, with regard to the lawful basis relied upon by the data controller, the DPC was satisfied that such lawful bases were reasonable; however, it recommended that the data controller inform staff members, in their staff data protection policies, that they may rely on section 46 of the 2018 Act and Articles 6(1)(c) and 9(2)(b) of the GDPR for the processing of staff personal data. In addition, under section 109(5)(f) of the 2018 Act, the DPC recommended that the data controller ensure there is a contract in place when an investigator is involved, that they engage in regular testing of organisational and technical processes, and lastly that they provide the investigator with an organisational email address.

22) Case study 22: Erasure request and reliance on Consumer Protection Code

Following an unsuccessful application for a credit card, the data subject in this case sought to have their personal data erased under Article 17 of the General Data Protection Regulation (GDPR). When the erasure request was refused by the data controller, the data subject raised concerns with the DPC that their personal data was being unlawfully retained. The DPC engaged with the data controller in order to assess the reasoning for such refusal.

In response to the data subject’s initial erasure request, the data controller stated that, in line with provision 11.6 of the Consumer Protection Code 2012 and their Privacy Policy and Cookies Statement, they had a legal obligation to retain the information provided. The data controller went further to explain that the personal data provided in the application would be retained for a period of six years from the date on which the service was provided.

As part of its examination, the DPC engaged with the data controller and requested a response to the complaint. The data controller stated that they were relying on Article 6(1)(c) of the GDPR to retain the personal data, whereby processing is necessary for compliance with a legal obligation to which the data controller is subject. The data controller in this case was also subject to the Consumer Protection Code 2012 (CPC), and accordingly relied on this lawful basis to refuse the erasure request. Under Article 17(3)(b) of the GDPR, a data subject’s right to erasure does not apply and may be restricted where the processing is necessary for compliance with a legal obligation.

For reference, the CPC is a set of rules and principles that all regulated financial services firms must follow when providing financial products and services to consumers and was published by the Central Bank of Ireland in compliance with section 117 of the Central Bank Act 1989. Under section 117(4) of the Central Bank Act 1989, it is an offence for a regulated financial firm to fail to provide the Central Bank with information to demonstrate compliance with the CPC.

Provisions 11.5 and 11.6 of the CPC require data controllers to retain the records of a consumer for six years after the date on which a particular transaction is discontinued or completed. The required records include, but are not limited to: all documents required for consumer identification; the consumer’s contact details; all correspondence with the consumer; and all documents completed or signed by the consumer. The data subject contested this reliance on the basis that no service had been provided; they were therefore of the view that they were not a consumer and that the data controller had no legal right to retain the personal data. However, the CPC’s definition of a consumer includes, where appropriate, a potential consumer. In addition, the data controller stated that when the data subject applied for a credit card, the consideration of the application and the subsequent decision was itself deemed a service.

Under section 109(5)(c) of the 2018 Act, the DPC advised the data subject that, within the meaning of the CPC, they were classified as a potential consumer. As a result, the data controller was legally obliged to retain the personal data for a period of six years. The DPC did not consider any further action necessary at the time of issuing the outcome.

23) Case study 23: Debt collector involvement

A data subject contacted the DPC as they were not satisfied with the responses to a subject access request and an erasure request. This case was against a debt collector, and the data subject raised concerns about how their personal data had been obtained. The data subject explained that the debt had been cleared but they had still received a letter from a debt collector. This letter referred to an outstanding amount owed to a third party.

The data subject outlined to the DPC that their subject access request had been made through an online platform. The data subject did not receive a response to either their Article 15 access request or their erasure request under Article 17 of the General Data Protection Regulation (GDPR). Prior to the DPC’s involvement, both parties had engaged directly. In their correspondence to the data subject, the debt collector explained that the personal data had been obtained from a third party. The personal data was then uploaded to their online system and a letter was issued to the data subject.

As part of its examination, the DPC engaged with the debt collector and requested that they outline their relationship with this third party. The debt collector informed the DPC that they were acting as a data processor on behalf of the third party and that a data processor agreement, in line with Article 28(3) of the GDPR, was in place at the time they processed this personal data. The debt collector advised the DPC that this contract had since been terminated and that they would not be acting on behalf of the third party going forward. The DPC accepted this response and identified the debt collector as a data processor and the third party as the data controller. The data processor stated that debt collection is in the public interest and that, as such, they had a legitimate interest in processing personal data where a data subject’s account has been legally assigned to them, or when they are acting under a legal contract. The data processor stated that the processing of the data subject’s personal data was necessary to collect the debt and is permitted even where the data subject does not consent to the processing; in other words, the data processor relied on Articles 6(1)(b) and 6(1)(f) of the GDPR for processing the personal data.

The data processor in this case accepted that the data subject may have paid the outstanding debt, but stated they could not be held responsible if the data subject pays the data controller directly and the data controller fails to notify the data processor to close the outstanding debt on their systems. The DPC highlighted that there appeared to be an error in the letter the data subject received: in this correspondence, the debt collector referred to themselves as a data controller. The debt collector accepted this error and stated that it should have read “data processor”; the error was caused by an oversight when using a template letter.

With regard to the subject access request, due to their data processor relationship, the data processor did not respond directly to the data subject’s access request but did share it with the third party, the data controller. In terms of the erasure request, the data processor informed the data subject that they would be required to retain the personal data for six months for taxation, financial and auditing purposes. The six months had passed prior to the DPC’s involvement, and the data processor assured the DPC that the personal data had now been erased. The data processor apologised directly to the data subject and offered a payment as a gesture of goodwill.

The DPC advised the data subject under section 109(5)(c) of the 2018 Act that the data processor and data controller had a legitimate interest to collect debts and disclose personal data in order to collect the debts. The DPC acknowledged the errors in the correspondence provided to the data subject and under section 109(5)(f) of the 2018 Act recommended that the data processor engage in regular testing of organisational and technical processes to ensure compliance with the GDPR in order to comply with Article 28 of the GDPR.

24) Case Study 24: Appropriate security measures for emailed health data

The DPC received a complaint from the parent of a child whose health data was mistakenly disclosed to an unknown third party. The data was contained in a document attached to a misaddressed email that had been sent by an employee of a public body.

The child was the subject of a health-related assessment by a therapist employed by the public body. The therapist prepared a draft report, which was to be sent to a senior professional. Before sending it, the therapist decided to ask a colleague for a second opinion. The colleague was not in the office, so the therapist chose to send the draft report to the colleague’s personal email address. Soon after doing so, the therapist realised that the email address was incorrect. The public body’s IT service was not able to recall the misaddressed email. The recipient’s email service provider confirmed that the recipient’s account was active, but emails from the public body asking the recipient to delete the misaddressed email were not answered. The public body contacted the parent by telephone, in person and in writing to inform them of the error and apologise for it. It also notified the DPC of a personal data breach. The parent subsequently lodged a complaint with the DPC.

As part of its examination of the complaint, the DPC asked the public body to explain the steps taken to secure deletion of the misaddressed email, its policy concerning the sending of work-related emails to staff members’ personal addresses, and the measures being adopted to prevent a recurrence of the breach.

In its response, the public body confirmed the sequence of events described above, including its attempts to recall the email and its interactions with the email service provider. It advised the DPC that it had reissued a copy of its data protection policy to all members of the team on which the therapist worked, and had written to the team reminding its members that they are not permitted to send any information to personal email addresses, regardless of whether they are asked to do so. It was made clear that this included reports and other work-related documentation. Data protection was added as a fixed item on the agenda of the team’s bi-monthly meetings, and all team members were scheduled for data protection awareness training.

In assessing the matter, the central issue identified by the DPC was the obligation of a data controller to take appropriate security measures against risks including unauthorised disclosure of personal data. Appropriate security measures were to be identified having regard to factors including the technology available, the harm that could be caused by disclosure, and the nature of the data. Further, controllers must take all reasonable steps to ensure that their employees are aware of and comply with those measures.

The DPC’s view was that sending a draft report to a personal email address was clearly inappropriate having regard to the required level of security, and was contrary to the public body’s own data protection policies. However, the mere existence of those policies was not enough to satisfy the obligation to take reasonable steps to ensure that employees were aware of and complied with them; the public body had taken such steps only after the breach had occurred.

This case highlights the risk-based approach of data protection legislation. Article 32 of the GDPR requires controllers (and, where applicable, processors) to implement technical and organisational measures to ensure appropriate security of the personal data they process. Persons who process personal data on behalf of the controller must do so only on the controller’s instructions, and therefore must be aware of relevant technical and organisational measures.

The appropriateness of security measures will be determined by reference to risks: the risk that a breach could pose to individuals’ rights and freedoms, and the possibility of various types of breach, such as the loss of, disclosure of or unauthorised access to the data. Special category data, such as health data, has heightened protection under Article 9 of the GDPR. Security measures that are appropriate for these categories of data are therefore likely to be more stringent. Controllers must also bear in mind that risks often change over time; security measures must likewise be adapted to the circumstances.

25) Case study 25: Access to employee's email on a corporate email service

The complainant was an employee who maintained that their employer had infringed their data protection rights by searching for, retrieving and reviewing a number of emails on their corporate account.

During an investigation involving other persons, the employer had come across emails that raised concerns regarding several employees, including the complainant. The employer then searched its corporate servers for emails of the complainant dating to a particular four-week period and involving specific individuals. The employer then monitored the complainant’s use of corporate email for communication with a specific person and retrieved several further emails. Based on these, the employer started disciplinary proceedings against the complainant on grounds that the complainant’s use of the corporate email service had breached applicable rules and policies.

It was not disputed that the emails comprised personal data, that the employer was the data controller, or that the employer’s actions in searching for, retrieving and viewing them constituted processing. The employer maintained that the processing was fair, lawful and proportionate, and that it had balanced the legitimate interests of both itself and the complainant in doing so.

The complainant’s employment contract expressly required the complainant to comply with the employer’s resolutions, regulations and directions. The employer’s corporate policies included an ‘Acceptable Use Policy’ relating to the corporate email service, a disciplinary policy that outlined types of conduct that could lead to disciplinary proceedings, and a ‘Code of Conduct’ that dealt with topics such as loyalty, ethics and integrity. All were in effect when the relevant emails were sent.

The Acceptable Use Policy allowed occasional and limited personal use of corporate email that was in line with the employer’s values and did not contravene its corporate policies. It said that the employer could and would monitor email for compliance with the Acceptable Use Policy and for other legitimate business purposes. Further, the employer reserved the right to examine information stored on its systems or networks, and it said that the employer “may monitor information stored on [its] systems or equipment, whether created for business or personal purposes, at any time”.

The DPC examined whether the processing was fair and whether the employer had a legal basis for processing the data in the way it did.

In relation to fairness of processing, the DPC first considered the controller’s obligation to give data subjects sufficient information to make clear the types of data that will be processed and the purposes of the processing. The DPC took the view that the Acceptable Use Policy made clear that all information on the employer’s systems, including employees’ personal emails, was liable to be examined to ensure compliance with the Acceptable Use Policy and for other legitimate purposes. In that regard, the DPC considered that ensuring compliance with the employer’s disciplinary procedures and Code of Conduct were legitimate purposes. Based on this, the DPC’s position was that the purposes had been made clear to the complainant.

The DPC then considered whether the processing in this case had in fact been for the purposes provided to the complainant. The DPC noted that the search of the complainant’s corporate email account had been prompted by a separate investigation that had raised concerns about the complainant relevant to the Acceptable Use Policy, the Code of Conduct and the disciplinary policy. The resulting search of the complainant’s email account was limited both as to the period covered and the individuals involved, as was the subsequent monitoring of the complainant’s use of email.

The DPC’s view was that the employer’s processing of the complainant’s emails during the initial investigation (which was not focused on the complainant) fell within the purposes stated in the Acceptable Use Policy. Similarly, its search for, retrieval and reading of emails from the four-week period, and its subsequent monitoring and reading of the complainant’s emails, all came within those purposes. Because the period and range of persons specified in the search and the monitoring were limited to those relevant to the investigation, the stated purposes were not exceeded.

The employer relied on its legitimate interest as the legal basis of the processing. The DPC noted that this required three elements: the processing must be for the pursuit of the legitimate interest, it must be necessary for that purpose, and the fundamental rights and freedoms of the person concerned – the complainant in this case – must not take precedence over the controller’s interest.

Concerning the first element, the DPC considered the terms of the Acceptable Use Policy, the Code of Conduct and the disciplinary policy. The DPC’s view was that these were legitimate and that the initial investigation fell within the interest of the employer in upholding them. In light of information disclosed by the initial investigation, it was within the employer’s legitimate interest to search for emails created during the four-week period and to monitor and retrieve certain specific emails.

Regarding the second element – that the processing be necessary for the pursuit of the legitimate interest – the DPC’s view was that the Acceptable Use Policy expressly concerned use of the corporate email service, and it would not be possible to investigate a potential breach and enforce its terms without the types of processing carried out in this case. The DPC noted in this regard that the employer’s search and monitoring of email was strictly limited in terms of the period and individuals involved.

The third element of the legitimate interest basis of processing was the balancing of the controller’s interest against the rights and freedoms of the person concerned. The DPC noted that the complainant considered the relevant emails to be personal, and that the Acceptable Use Policy allowed limited and occasional use of the email service for personal email. Further, the complainant did not consent to the monitoring and had not been informed of it until the employer notified them of the investigation. Against that, the DPC noted that the emails concerned aspects of the employer’s business and so could be considered relevant to the complainant’s work. The complainant had stated to the DPC that the contents of the emails were not inappropriate and, had the employer asked, the complainant would have allowed access to them. The employer considered the emails to be evidence of potential breaches by the complainant of the Acceptable Use Policy and Code of Conduct, and the possibility of monitoring had been notified in that Policy. The DPC’s view was that, on balance, the processing by the employer was limited in nature and did not infringe on respect for the complainant’s private life.

The DPC’s position was, in conclusion, that the processing was fair and that the employer had a legal basis for performing it.

This case demonstrates a number of important points. Employers may have a legitimate interest that justifies monitoring, retrieving or reading employees’ personal data on their systems, whether created for personal or work purposes. However, it is not enough simply to inform employees that their employer reserves the right to monitor communications: processing must conform to the principles of data protection set out in Article 5 of the GDPR, and controllers such as employers must observe the transparency requirements of Chapter III (Articles 12 to 23). They must also consider the legal basis for processing on which they rely: Article 6 imposes a ‘necessity’ test for all but one of the available legal bases. This means in essence that the processing must be the only practical way to achieve the desired purpose, not just a more convenient means of doing so. (The exception is consent, and controllers should bear in mind that a data subject may withdraw consent as easily as they gave it.)

Accordingly, before invoking a power to monitor employees’ communications, employers must consider carefully whether all data protection principles have been and will be observed, the precise legal basis relied on, and the nature and amount of processing required. The necessity of the processing and the effects on the rights and freedoms of data subjects must be carefully evaluated.

Employees should be aware of how their employer may collect and process their personal data, and the legal basis on which it relies. (Articles 12 to 15 of the GDPR provide for data controllers to provide information about these matters to data subjects.) Policy statements, employee handbooks and similar documents can provide important information about data protection rights.

26) Case study 26: Disclosure by a credit union of a member's personal data to a private investigations firm

The complainant in this case was a borrower from a credit union and was alleged to be in arrears on a loan. The credit union claimed to be unable to contact the complainant. The credit union disclosed personal data of the complainant to a private investigations firm with the intention of locating and communicating with the complainant. The data disclosed included the complainant’s name, address, former address, family status and employment status. Approximately four years later, the complainant became aware of that disclosure and complained to the DPC.

The private investigations firm had ceased to trade several years before the complaint and so was not in a position to assist the DPC’s investigation. The DPC asked the credit union to explain the legal basis on which it had disclosed the data, and why it considered it necessary to do so. The credit union informed the DPC that it did not have a written contract with the private investigations firm, so the DPC asked it to provide details of any internal policy or procedure concerning when it was appropriate to liaise with that firm.

Concerning the legal basis for the disclosure, the credit union claimed that the disclosure was necessary for the purposes of pursuing a legitimate interest and for the performance of its contract with the complainant. It also referred to a provision of section 71(2) of the Credit Union Act 1997 that allows a credit union to disclose a member’s account information where the Central Bank of Ireland (previously, the Registrar of Credit Unions) is of the opinion that doing so is necessary to protect shareholder or depositor funds or to safeguard the interests of the credit union. (The credit union was unable to say whether the Central Bank had expressed such an opinion in relation to this case.)

The credit union maintained that the disclosure was necessary because it had been unable to communicate with the complainant by letter, telephone or through the complainant’s solicitor. In its view, the complainant was seeking to evade its efforts to update its records and discuss the outstanding loan. (The complainant strongly disputed that, pointing out that they had made repayments shortly before the credit union contacted the private investigations firm.)

The credit union told the DPC that its credit control policy dealt with cases where it was proposed that a member’s non-performing loan should be written off as a bad debt. Before doing so, the relevant provisions directed that the credit union should make “every effort…to communicate with the member, including the assistance of a third party” to try and continue with agreed arrangements and assist collection of the debt.

The DPC identified the legal basis for the disclosure and the existence of a data processing contract as the central issues in the complaint.

In light of all the facts presented, and on the basis of applicable legislation, the DPC concluded that the credit union had a legitimate interest in seeking to obtain up-to-date contact details in order to re-establish contact with the complainant with a view to discussing the repayment of the loan. The processing of personal data was necessary for the purposes of pursuing that legitimate interest. The DPC accepted that the disclosure could affect the complainant’s fundamental rights and legitimate interests. Against that, however, fulfilling the important social function provided by credit unions required that they be able to take action to engage with members whose loans fall into arrears. For that reason, the disclosure was warranted despite the potential prejudice to the complainant’s fundamental rights and freedoms or legitimate interests. The credit union could therefore assert the pursuit of its legitimate interest in contacting the complainant and seeking repayment of the loan as the legal basis for disclosing personal data to the private investigations firm.

The DPC also considered whether section 71(2) of the Credit Union Act 1997 provided a legal basis for the disclosure in this case. The DPC noted that compliance with a legal obligation, such as under a court order or provision of a statute, can provide a legal basis for processing. However, section 71(2) (including the provision mentioned by the credit union in its submissions to the DPC) was permissive rather than mandatory in its effect: while it allowed credit unions to disclose information in certain circumstances, it did not require them to do so. Accordingly, the section did not justify the disclosure for the purposes of applicable data protection legislation.

The DPC noted that processing by a processor on behalf of a controller must be conducted under the terms of a contract in writing or in equivalent form that complies with applicable data protection legislation, and in particular ensures that the processing meets the obligations imposed on the controller. In the DPC’s opinion, the credit union’s credit control policy was not sufficient to meet this requirement, so the credit union had failed to meet its statutory obligation in this regard.

This case highlights several important issues for data controllers. Whenever a controller engages a processor to process data on its behalf, there is an unambiguous requirement to have a processing contract or equivalent measure that complies with Article 28(3) of the GDPR or other applicable legislation. These contracts benefit both controllers and processors by making clear what processing is required and how it is to be done. They also protect data subjects by providing clarity on how and by whom their data is being processed, and for what purposes.

The case also shows the importance of being clear as to the legal basis for processing. Where the basis claimed is a legal obligation, it is not sufficient simply to show that the controller can legally choose to act in a particular way: the processing must be required by law for this legal basis to apply. Where a controller claims that processing is for the purpose of pursuing a legitimate interest, it must be able to show that the processing is necessary for that purpose, and that it has carefully balanced that interest against the rights and freedoms of persons who may be affected by it. If the interest does not outweigh those rights and freedoms, it does not provide a legal basis for the processing.

27) Case study 27: Data accuracy

The complainant in this case had made a complaint to a professional regulatory body about the conduct of a regulated person. That complaint was not upheld by the professional regulatory body. In their complaint to the DPC, the complainant alleged that the professional regulatory body had inaccurately recorded personal data relating to them in the minutes of its meeting. The complainant also alleged that the professional regulatory body had inaccurately recorded the same personal data relating to the complainant in a letter from it to a third party.

Before commencing an investigation into this complaint, the DPC reviewed the information provided and established that the professional regulatory body was identified as the relevant data controller in relation to the complaint, as it controlled the contents and use of the complainant’s personal data for the purposes of investigating the complaint. The data in question was personal data relating to the complainant, the complainant could be identified from it and the data related to the complainant as an individual. The DPC was therefore satisfied that the complaint should be investigated to determine if a contravention of data protection legislation had occurred.

During the course of the investigation of this complaint, the professional regulatory body accepted that the personal data in question had been recorded inaccurately and, in relation to the data recorded in the minutes, corrected the data by way of the insertion of a clarification. On this basis, this Office considered that the personal data recorded in the meeting minutes and the letter to the third party had been recorded inaccurately, in contravention of data protection legislation.

This Office also examined whether the professional regulatory body had processed the complainant’s personal data fairly, as required by data protection legislation. In order to comply with the requirement to process personal data fairly, data controllers must ensure that data subjects are provided with or have made readily available to them certain information. This Office reviewed the information that the professional regulatory body stated was available to individuals about making a complaint, in the form of the information booklet. This booklet did not contain, in particular, any details about individuals’ right of access to personal data relating to them and individuals’ rights to rectify inaccurate data concerning them. Since the information booklet did not contain all of the information that was required to be provided to data subjects under data protection legislation, and since the professional regulatory body did not provide any other details regarding other measures that it had in place at the relevant time to address its fair processing obligations, the DPC was not satisfied that the professional regulatory body had complied with its fair processing obligations.

Under the GDPR, data controllers must ensure that personal data are accurate and, where necessary, kept up to date, and every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay. Under Article 16 of the GDPR, a data subject has the right (subject to certain exceptions) to obtain from the data controller without undue delay the rectification of inaccurate personal data concerning him or her.

The GDPR also requires that personal data be processed fairly and in a transparent manner. A data controller should provide a data subject with any information necessary to ensure fair and transparent processing, taking into account the specific circumstances and context in which the data are processed. In particular, where personal data are collected from a data subject, Article 13 of the GDPR requires that the data controller provide the data subject with, amongst other things, information as to the identity and contact details of the controller and its data protection officer (where applicable), the purpose of the processing, the recipients or categories of recipients of the data and information as to the rights to rectification and erasure of personal data.

28) Case study 28: Retention of data by a bank relating to a withdrawn loan application

The complainant in this case had made a loan application to a bank. The complainant subsequently withdrew the loan application and wrote to the bank stating that they were withdrawing consent to the processing of any personal data held by the bank relating to the loan application and requesting the return of all documents containing the complainant’s personal data. In response, the bank informed the complainant that it had stopped processing all of the complainant’s personal data, with the exception of data contained in records which the bank stated it was required to retain and process under the Central Bank of Ireland’s Consumer Protection Code. The complainant was not satisfied with this response, and argued, in their complaint to this Office, that in circumstances where the bank had obtained the complainant’s personal data on the basis of the complainant’s consent, the bank was not permitted to continue to process these data on a different legal basis (i.e. processing which is necessary for compliance with a legal obligation to which the bank is subject). The complainant also argued that the continued processing by the bank of their personal data was for a purpose which was not compatible with the purpose for which the data were originally obtained, in contravention of data protection legislation.

This office established that the bank was identified as the relevant data controller in relation to the complaint, as it controlled personal data which the complainant had provided to the bank when making a loan application. The data in question were personal data relating to the complainant (consisting of, amongst other things, a completed loan application form and supporting documentation) as the complainant could be identified from it and the data related to the complainant as an individual. This office was therefore satisfied that the complaint should be investigated to determine if a breach of data protection legislation had occurred.

During the course of the investigation of this complaint, this Office reviewed the bank’s loan application form, which provided that, by signing the form, a person consented to the bank storing, using and processing their personal data for a range of purposes, including to process applications for credit or financial services. However, this Office noted that the purposes for which the complainant had given their consent did not include processing for the purpose of compliance with the bank’s legal obligations generally, and specifically did not include the processing of the complainant’s personal data for the purpose of compliance with the Consumer Protection Code. Accordingly, this Office considered that at the time of collection of the complainant’s personal data the bank did not claim to rely on consent as the legal basis for the collection and processing of the complainant’s personal data in order to comply with its legal obligations. Rather, this Office considered that the bank could validly rely on the lawful basis that the processing was necessary in order to take steps at the request of the data subject prior to entering into a contract.

This Office noted that where a loan application is subsequently withdrawn or unsuccessful and the bank does not enter into a contract with the applicant, the retention of personal data relating to the loan application can no longer be on the basis that the processing was necessary in order to take steps at the request of the data subject prior to entering into a contract, as there is no longer the possibility of entering into a contract with the data subject. As such, the bank identified a separate legal basis for the retention of the complainant’s personal data relating to the loan application, namely that this processing was necessary for compliance with a legal obligation to which the bank was subject.

This Office noted that the Consumer Protection Code obliged regulated entities to retain details of “individual transactions” for six years after the date on which the particular transaction is discontinued or complete. This Office considered, however, that a loan application which is subsequently withdrawn or ultimately unsuccessful is not a ‘transaction’ for the purpose of the Consumer Protection Code. This Office then noted that the Consumer Protection Code also obliged regulated entities to retain “all other records” for six years from the date on which the regulated entity ceased to provide any product or service to the consumer, including potential consumer, concerned. However, this Office did not consider that records relating to a loan application which is subsequently withdrawn fell within the scope of this requirement under the Consumer Protection Code either. Accordingly, this Office considered that it was not necessary for the bank to retain personal data relating to the complainant’s withdrawn loan application for the purpose of compliance with its legal obligations under the Consumer Protection Code, and considered that the bank had not identified a lawful basis under data protection legislation for the retention of the complainant’s personal data relating to their loan application.

Under Article 6 of the GDPR, data controllers must have a lawful basis for any processing of personal data. The available lawful bases include that the data subject has given consent to the processing of their personal data for one or more specific purposes, that the processing is necessary for the performance of a contract to which the data subject is a party or in order to take steps at the request of the data subject prior to entering into a contract, and that the processing is necessary for compliance with a legal obligation to which the data controller is subject. Data controllers should note also that the processing of personal data for purposes other than those for which the personal data were originally collected is only allowed where the processing is compatible with the purposes for which the data were initially collected.

29) Case study 29: Access to information relating to a bank's credit assessment

The complainant in this case made a request to a bank under data protection legislation to supply the complainant with a copy of all personal data relating to them held by the bank. The complainant alleged, in particular, that the bank had failed to provide them with any internal analyses which used the complainant’s personal data to assess the amount of credit the bank would extend to them.

This Office established that the bank was identified as the relevant data controller in relation to the complaint, as it controlled personal data which the complainant provided to the bank when making a loan application. The data in question was personal data relating to the complainant (consisting of, amongst other things, a completed loan application form and supporting documentation) as the complainant could be identified from it and the data related to the complainant as an individual. This Office was therefore satisfied that the complaint should be investigated to determine if a breach of data protection legislation had occurred.

During the course of the investigation of this complaint, this Office engaged with the bank regarding the nature of any personal data to which the complainant might have been entitled. The bank took the view that the complainant was not entitled to details of its internal analysis and algorithms or any internal decision thresholds upon which it based its lending decision as, in the view of the bank, this information was not personal data, and, in addition, was market sensitive and was the intellectual property of the bank. In particular, the bank did not provide the complainant with details of the complainant’s credit score or the bank’s calculation of the complainant’s net disposable income which form part of its credit assessment criteria.

This Office considered the explanations provided by the bank and took the view that the complainant’s net disposable income figure and credit score both constituted personal data relating to the complainant, as the complainant could be identified from the details and they related to the complainant as an individual. Furthermore, as the bank had not identified a relevant exception under data protection legislation on which it could rely to withhold this data from the complainant, this Office considered that the bank had failed to comply with the complainant’s request for access to their data. However, this Office agreed that the credit scoring models used by the bank in its credit assessment process were not personal data relating to the complainant and that, as such, the complainant was not entitled to a copy of this information.

Finally, this Office considered that the bank had further contravened its obligations under data protection legislation by failing to respond to the request made by the complainant within the applicable statutory time limit.

Under Article 15 of the GDPR, data subjects have a right to obtain from data controllers confirmation as to whether or not personal data concerning them are being processed and, where that is the case, access to that personal data. This right only extends to the personal data of the data subject, meaning any information relating to that data subject by which the data subject is identified or identifiable. The data controller must respond to a data subject access request without undue delay and in any event within one month of receipt of the request. However, the right of access to personal data is subject to a number of exceptions under the GDPR and the Data Protection Act 2018 (in particular, sections 59 to 61), such as where compliance with the request for access would adversely affect the rights and freedoms of others.

30) Case study 30: Use of employee's swipe-card data for disciplinary purposes

The complainant in this case was an employee who was the subject of disciplinary proceedings by their employer. An aspect of those proceedings concerned the complainant’s time-keeping, and the employer sought to rely on swipe-card data derived from the complainant’s entry into and exit from the workplace during the relevant period. As a result of an internal appeal process, the employer subsequently agreed not to use the data for this purpose and removed it from the complainant’s disciplinary record. However, the complainant asked the DPC to continue its investigation of the complaint.

The DPC’s investigation focused on the data protection principle that data must be obtained and processed fairly. This includes an obligation to give data subjects information including the purpose or purposes for which the data are intended to be processed.

In this case, the employer had not informed the complainant of the use of swipe-card data for the purpose of disciplinary proceedings. (During the investigation, the employer informed the DPC that the complainant’s case was the only one in which it had used swipe-card data for disciplinary purposes.) Similarly, the employer had not informed the complainant or other employees that swipe-card data collected in the workplace was intended to be used for time-keeping purposes.

The employer had failed to inform the complainant about the use of swipe-card data for time-keeping and disciplinary purposes. The DPC therefore concluded that the employer had not obtained and processed that data fairly.

This case demonstrates the importance of fairness and transparency in protecting data protection rights. Controllers such as employers may have valid legal bases for processing personal data, whether on grounds of performance of contract, legitimate interest or otherwise. However, the principles of data protection set out in Article 5 of the GDPR must be observed regardless of the legal basis that is relied on.

31) Case study 31: Disclosure of a journalist's name and mobile phone number by a public figure

The complainant in this case was a journalist who emailed a public figure to ask questions about decisions that the public figure had taken in relation to their work. The public figure used their Twitter account to publish a copy of the email. The journalist’s name, work email address and mobile phone number were legible in the published copy of the email. The journalist reported receiving a number of threatening text messages afterwards.

The journalist asked the public figure to delete the published copy of the email. The public figure did so, but also published a Tweet saying that the journalist’s mobile phone number was available online. This included a link to a discussion board message posted by the journalist six years previously, while a student, which included the same mobile number. The journalist complained to the DPC.

As part of its investigation, the DPC asked the public figure to identify the legal basis for disclosing the journalist’s data. The public figure’s response queried whether the journalist’s name and contact details constituted personal data. It also asserted that, because the journalist had previously made that information available on the internet, the journalist had impliedly consented to its publication by the public figure. The journalist rejected that assertion.

The DPC took the position that the journalist’s name, email address and mobile phone number were personal data because the journalist was clearly identifiable by them. Concerning the legal basis for disclosing them, the DPC noted that, while data protection law provided for several possible legal bases for processing, the only basis raised by the public figure had been consent. The DPC’s view was that a media enquiry to a public figure from a journalist acting in that capacity did not amount to valid consent to the sharing of any personal data in the enquiry. For those reasons, the public figure’s disclosure of the data breached data protection law.

This case highlights several important issues. Article 6 of the GDPR provides six legal bases on which a controller can justify processing personal data. Consent is one of these, but the GDPR sets out important requirements, including as to how consent is given, the right to withdraw consent and the need for controllers to be able to demonstrate that data subjects have given consent. While other legal bases exist, controllers must bear in mind that these are all subject to a ‘necessity’ test and to their own specific requirements.

32) Case study 32: Further processing for a compatible purpose

The complainant in this case owned a rental property that was managed by a letting agency on the complainant’s behalf. The building in which the rental property was located was managed by a management company, which was responsible for collecting management fees from the owners of the properties in the building. The management company sent an email to the complainant’s letting agent regarding management fees outstanding on the complainant’s property. However, the letting agent did not act on the complainant’s behalf in relation to the payment of management fees.

The Commission determined that the data controller was the management company because it controlled the contents and use of the complainant’s personal data for the purposes of acting as the management company of the building where the complainant’s property was located. The data in question was deemed to be personal data consisting of the status of payment on the complainant’s account, because the complainant could be identified from it and it related to the complainant as an individual. Therefore, the Commission was satisfied that an investigation should be carried out to determine if a breach of the relevant legislation had occurred.

In response to the complaint, the data controller argued that it acted as a management company in respect of many properties and many developments and that the role of letting agents varies considerably from property to property. It indicated that, in some cases, the letting agent is appointed on behalf of a property owner to collect management fees and, on this basis, it thought it was appropriate to contact the complainant’s letting agent regarding outstanding fees.

The Commission noted that when personal data is processed by a data controller, there are certain legal obligations that the data controller must comply with. Of particular relevance to this complaint were the obligations: (1) to obtain such data for specific purposes and not to further process it in a manner that is incompatible with those purposes; (2) to ensure that the data are relevant and adequate and not to process more of the data than is necessary to achieve the purpose for which it was collected; and (3) to have appropriate security measures in place to protect the personal data. In addition, the Commission noted that the legislation also requires that there be a lawful basis for processing the personal data.

In relation to the first obligation (not to process personal data in a manner that is incompatible with the purposes for which it was collected), the Commission noted that personal data regarding the payment of the complainant’s management fee was collected for the purposes of financial record keeping and the collection of fees. However, because the letting agent had no role in the payment of the management fees, the data controller shared the complainant’s personal data with a party with whom it had no reason to share it. On this basis, the Commission held that the disclosure of the complainant’s personal data was for a purpose that was incompatible with the purpose for which it was collected.

In circumstances where it was not necessary for the data controller to share the personal data in question with the letting company, the Commission determined that the personal data was not relevant or adequate and was excessive for the purposes for which it was processed.

In relation to whether appropriate security measures were in place, the Commission held that, because the data controller misunderstood the letting agent’s role (and the complainant’s personal data were disclosed as a result), it had failed to meet its obligations in this regard.

Finally, the Commission determined that the data controller had no lawful basis for making the disclosure to the letting agent, because none of the legal bases for processing could be said to apply. This meant that the data controller had committed a further breach of the legislation in this regard.

Under Article 6 of the GDPR, a data controller must have a valid legal basis for collecting personal data. Article 6(4) of the GDPR provides that where personal data is processed for a purpose other than that for which the data were initially collected, the further processing is only permitted where it is compatible with the purposes for which the personal data were initially collected. In addition, under Article 5(1)(c) of the GDPR, personal data which are processed must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed. Under Articles 5(1)(f) and 32 of the GDPR, personal data must also be processed in a manner that ensures appropriate security of the data, including security against unauthorised disclosure; for this purpose, data controllers are required to have in place appropriate technical and organisational measures to ensure the confidentiality of personal data.

33) Case study 33: Fair and lawful processing of CCTV images of a customer

This complaint concerned the processing of the complainant’s personal data in the form of a still image from CCTV footage taken in a betting shop, by distributing that image to various betting shops in the chain with a warning note to staff in order to prevent the complainant from placing bets.

The Commission determined that the betting shop was the data controller because it controlled and processed the personal data in question. The data were (amongst other things) an image of the complainant and internal notes circulated to staff of the data controller about the complainant. The data were personal data because they related to the complainant as an individual and the complainant could be identified from the data.

In response to the complaint, the data controller put forward a number of reasons for processing the complainant’s personal data and sought to argue that there was a valid legal basis for each purpose, as provided for in data protection legislation.

The reasons and corresponding legal bases presented by the data controller included the following:

  • Legal and Regulatory Obligations: The data controller argued that it is required to retain and use personal data in order to comply with certain legal and regulatory obligations, such as to detect suspicious betting activity and fraudulent transactions under applicable criminal justice legislation. The legal basis put forward by the data controller was that the processing was lawful because it was necessary for the data controller to comply with a legal obligation.
  • Risk Management: The data controller claimed that it records personal data relating to customers for commercial risk management. The legal basis put forward in this regard was that the processing was lawful because it was necessary for the purposes of the legitimate interests pursued by the data controller.
  • Profiling: The data controller confirmed that it carries out profiling of customer betting activity to (amongst other things) improve customer experience. The data controller argued that such processing is lawful as it is necessary for compliance with legal obligations and for the purposes of the legitimate interests pursued by the data controller.

The Commission decided that the data controller had identified an appropriate lawful basis for each purpose for which it processed personal data relating to its customers.

The Commission then considered whether the obligation to process personal data fairly had been complied with by the data controller. In this context, the Commission noted that the data controller is obliged to provide the complainant with information in relation to the key elements of the collection and use of the complainant’s personal data. The data controller here had provided the complainant with an internal company document and confirmed that the complainant’s personal data had been processed in accordance with this document. However, the document was dated after the date on which the complainant’s personal data was processed. On this basis, the Commission noted that it was not clear that the required information had been provided to the complainant and therefore the data controller had failed to process the complainant’s personal data fairly.

Finally, the Commission considered the period of time for which the personal data had been retained. In this regard, it noted that the relevant legislation requires that a data controller keep personal data for no longer than is necessary for the purposes for which the data are processed. The complainant’s personal data had been kept for approximately seven years. The Commission considered that, because the data controller had a legitimate interest in retaining the complainant’s data (for commercial risk management), the data controller had acted in accordance with the legislation in this regard.

Under Article 6 of the GDPR, a data controller must have a valid lawful basis for processing personal data. Amongst the available lawful bases are that the processing of personal data is necessary for the purpose of the legitimate interests pursued by the data controller or that the processing is necessary for compliance with a legal obligation to which the data controller is subject. The data controller must have a lawful basis not just for the initial obtaining of the personal data, but also for their ongoing processing, including storage, and the data must not be kept for longer than is necessary for the purpose for which they are processed (Article 5(1)(e) GDPR).

In addition to having a valid lawful basis for processing personal data, however, a data controller must comply with a number of further obligations in relation to the personal data being processed. In particular, personal data must be processed fairly and transparently. To this end, a data controller is required to provide a data subject with certain information under Article 13 or 14 of the GDPR, in accordance with the requirements of Article 12 of the GDPR. The information required to be provided to the data subject includes the identity and contact details of the controller and the controller’s data protection officer, where applicable, the purposes of the processing, and the recipients or categories of recipients of the data, if any. The information must be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

34) Case study 34: Disclosure of personal and financial data to a third party and erasure request

A data subject provided their personal and financial data to an organisation (the data controller) as part of their relative’s application for a scheme. The application was unsuccessful and the applicant was issued with a refusal letter, which included a breakdown of the data subject’s personal and financial data. The data subject made a complaint to the Data Protection Commission (DPC) regarding the lack of transparency in the application process and the disclosure of their personal and financial data to their relative. The data subject requested the return of their personal data from the data controller. The data subject also requested that their personal data be erased by the data controller under Article 17 of the General Data Protection Regulation (GDPR), and if erasure was not an option, their legal basis for retaining their data. 

Prior to the commencement of an examination by the DPC, the data subject made suggestions to amicably resolve their complaint, which included, among other things, a ‘goodwill gesture’ from the data controller. However, due to the role of the organisation, the data controller was not in a position to facilitate this request.

As part of its examination, the DPC engaged with the data controller and requested a response to the data subject’s complaint. The data controller stated that, while it is part of its procedure to inform applicants of the reasons for refusal, only a partial disclosure should be made in its decision letters where information was gathered from a third party. With regard to the data subject’s erasure request, the data controller advised that the personal data provided would be retained for the lifetime of the applicant plus 10 years. The data controller explained that the data is retained for this period because it may affect any future applications by the applicant.

Subsequently, the data controller refused the data subject’s erasure request, advising that it was relying on Article 17(3)(b) of the GDPR, which disapplies the obligation on data controllers to erase personal data where the personal data is required for compliance with a legal obligation. The data controller also relied on Article 23(1)(e) of the GDPR, which states that a data subject’s rights may be restricted for:

“Important objectives of general public interest of the Union or of a Member State, in particular an important economic or financial interest of the Union or of a Member State, including monetary, budgetary and taxation matters, public health and social security.”

An apology was issued to the data subject by the data controller as a result of the disclosure of their personal data in the refusal letter issued to their relative, the applicant. The data subject queried whether this disclosure was reported to the DPC as a breach. Under Article 33 of the GDPR, a data controller is required to report a personal data breach to the relevant competent authority without undue delay, unless the data breach is unlikely to result in a risk to the rights and freedoms of natural persons. A data breach is described in Article 4(12) of the GDPR as: “A breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. The DPC found that the disclosure was a result of human error and was not identified as a systemic issue.

Through its examination, the DPC found that the refusal letter which resulted in the disclosure of the data subject’s personal data, could be distinguished from other records retained by the data controller as it did not directly follow their guidelines. As such, the DPC invited the data controller to erase or redact the data subject’s personal data from the decision letter held on file. In addition, an amended letter could be issued to the applicant redacting the data subject’s personal data. The data controller advised they would reissue the refusal letter and request the applicant return the initial letter sent. The data controller also advised they would delete the initial letter from their records.

Under section 109(5)(c) of the 2018 Act, the DPC advised the data subject that the explanation put forward by the data controller in the circumstances of their complaint was reasonable. The data controller acknowledged the disclosure of the data subject’s personal data to their relative, the applicant, issued an apology for same, and indicated that the original refusal letter would be amended on its system and an updated letter issued to the applicant.

Further, under section 109(5)(f) of the 2018 Act, the DPC recommended the data controller provide updated training to their staff regarding their guidance for decision letters.

35) Case study 35: Unlawful processing and disclosure of special category data

A data subject submitted a complaint to the Data Protection Commission (DPC) against their bank (the data controller) as they believed their personal data had been processed unlawfully. The data subject explained that they held a mortgage with the data controller and that this mortgage was sold to another bank as part of a loan sale agreement. The data subject complained that this sale was processed without their prior knowledge or consent and was specifically concerned that the data controller had shared their personal email address and mobile phone number with the other bank, which they deemed an excessive disclosure of personal data. While the data subject did not object to their name, address or landline number being shared, they believed their email address and mobile phone number were “sensitive” personal data and that the disclosure of same was disproportionate.

Prior to contacting the DPC, the data subject engaged with the data controller directly regarding their complaint. The data controller responded to the data subject and advised that their lawful basis for processing their personal data was Article 6(1)(f) of the General Data Protection Regulation (GDPR) which states: “Processing is necessary for the purposes of the legitimate interests pursued by the controller.”

Upon commencing its examination, the DPC shared the data subject’s complaint with the data controller and requested a detailed response. The data controller informed the DPC that its Data Privacy Notice, a copy of which is provided to its customers, details that the data controller may sell assets of the company in order to manage its business. This is also further detailed in the loan offer letter to mortgage applicants.

In relation to the sharing of excessive personal data, the data controller outlined that they do not consider an email address or a mobile phone number to be sensitive information nor do they fall under special categories of personal data under Article 9 of the GDPR.

The DPC advised that, while consent is one of six lawful bases for processing personal data, it is lawful to process personal data without prior consent once one of the five other bases listed in Article 6 of the GDPR is met. In this instance the data controller was relying on Article 6(1)(f) and, as such, it was required to conduct a balancing test to ensure that the legitimate interests pursued by the controller were not overridden by the interests, rights or fundamental freedoms of the data subject. The data controller confirmed to the DPC that it had conducted a balancing test, which confirmed that the processing of personal data in this instance did not override the interests, rights or fundamental freedoms of the data subject.

The data controller further explained that it was necessary to share the data subject’s contact information with the other bank because the other bank was the new data controller for the data subject’s loan. The data controller also clarified that it does not differentiate between different types of contact information (i.e. landline and mobile numbers), as this information was provided to the data controller for the purpose of contacting customers. As such, this information is required by the bank managing the loan.

Article 9 of the GDPR describes special category personal data as:

“personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation.”

As such, the DPC clarified to the data subject that mobile numbers and email addresses do not fall into this category. Under section 109(5)(c) of the 2018 Act, the DPC advised the data subject that, having examined their complaint, it found no evidence that their personal data had been processed unlawfully. The data controller had relied on a legitimate basis to process the data, had done so in a transparent manner, and had kept the data subject fully informed at all key stages of the sale, so the sale was conducted with the data subject’s prior knowledge. The DPC did not consider any further action necessary at the time of issuing the outcome.

36) Case study 36: Unlawful processing and erasure request

Following their trip to a leisure facility (the data controller), a data subject submitted a complaint to the Data Protection Commission (DPC) as they were unhappy with how the data controller processed their personal data. The data subject also wanted to exercise their rights under Article 17 of the General Data Protection Regulation (GDPR) and have their, and their family’s, data deleted by the organisation. Prior to contacting the DPC, the data subject requested the erasure of their data directly from the data controller and this request was refused.

The data subject explained to the DPC that, during their stay at the leisure facility, they believed their personal data was processed unlawfully, as they were repeatedly asked to provide details of their booking to staff in order to gain access to facilities on site such as restaurants and activities. The data subject believed this to be excessive processing and stated that, at the time, they were given no choice: either they accepted such processing or they could not receive full access to the facilities.

In line with its examination of the complaint, the DPC contacted the data controller and shared the details of the data subject’s complaint. The data controller advised the DPC that its lawful basis for processing personal data is Article 6(1)(f) of the GDPR, commonly referred to as legitimate interest. The data controller further explained that it requests customers’ details prior to their accessing facilities or making a purchase in order “to understand patterns and to improve the range of services and facilities available to guests”. This is also detailed in its privacy policy, which is available on its website.

On foot of the data subject’s complaint, the data controller reviewed its policies and identified a training gap among its staff. It then briefed staff to ensure they were aware that customers were not obliged to provide details of their booking when accessing certain facilities. The data controller also advised that it had updated its Data Protection Regulation Department Operating Procedure to reflect this procedure more clearly.

As regards the data subject’s erasure request, the data controller advised the DPC that it had removed the data subject from all direct marketing communications. However, it was unable to erase any other personal data relating to the data subject and their family, as that data is held in accordance with its retention policy. The data controller’s retention policy states that all personal data is held on file because it may be required in defence of a legal claim, and is only deleted after the youngest member of the booking reaches the age of 21, in accordance with statutory limitation periods.
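A retention cut-off of the kind described in this case study is ultimately a simple date calculation. As a purely hypothetical sketch (the rule, the function name and the dates are illustrative and are not taken from the controller's actual policy), it could be computed as follows:

```python
from datetime import date

def earliest_deletion_date(youngest_member_dob: date) -> date:
    """Earliest date on which a booking's personal data may be deleted
    under a hypothetical 'youngest member reaches 21' retention rule.

    Illustration only: a real retention policy would also need to
    handle 29 February birthdays (date.replace raises ValueError for
    an invalid date) and any additional limitation period.
    """
    return youngest_member_dob.replace(year=youngest_member_dob.year + 21)

# A guest born on 15 June 2010: data retained until 15 June 2031.
print(earliest_deletion_date(date(2010, 6, 15)))  # → 2031-06-15
```

The point of expressing the rule this way is that the retention period is tied to a fact about the data subject (date of birth) rather than to the date of collection, which is why the controller could not simply erase on request.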

Under section 109(5)(f) of the 2018 Act the DPC recommended that the data controller continue to provide training to all its employees on its obligations and the rights of data subjects under data protection legislation and to keep this training up to date.

The DPC further recommended under section 109(5)(f) of the 2018 Act that the data controller delete all personal data in accordance with their retention period.

The DPC did not consider any further action necessary at the time of issuing the outcome as they noted that the data controller had retrained all staff, apologised to the data subject and offered them compensation as a result of their complaint.

37) Case study 37: Disclosure, withdrawing consent for processing and subject access request

A data subject brought a complaint to the Data Protection Commission (DPC) against their former employer (the data controller). The data subject had a number of data protection concerns namely:

  • The disclosure of their personal email address in a group email, by its inclusion in the Carbon Copy (CC) field;
  • The inclusion of their image on the data controller’s social media; and
  • Their dissatisfaction with the response received from the data controller regarding a subject access request.

In line with its examination of the complaint, the DPC contacted the data controller and shared the details of the complaint. The data controller informed the DPC that the data subject had previously signed a settlement agreement which waived their right to make any complaints or claims against the company under the Data Protection Acts 1988, 2003 and 2018. In response, the DPC advised the data controller that the DPC was not a party to that agreement and that it has a statutory obligation to examine complaints to the extent appropriate. The enforcement of any settlement agreement is a matter between the data controller and the data subject.

In relation to the disclosure of the data subject’s email address in a group email, the data controller acknowledged that the Blind Carbon Copy (BCC) function should have been used in this instance. The data controller also advised that the incident had been reported to the DPC as a breach under Article 33 of the General Data Protection Regulation (GDPR) and that additional measures had been put in place to prevent it recurring. Staff training has been rolled out and the data subject’s email address has been removed from the auto-collected email addresses on file. The DPC noted that the breach arose as a result of human error and had not been identified as a systemic issue.
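The CC/BCC distinction at the root of this breach can be enforced in code rather than left to the sender's discretion. As a minimal, hypothetical sketch using Python's standard `email` library (the function name and addresses are illustrative, not from the case), recipient addresses are kept out of the visible headers entirely and supplied only as envelope recipients at send time:

```python
from email.message import EmailMessage

def build_group_email(sender: str, subject: str, body: str,
                      recipients: list[str]) -> tuple[EmailMessage, list[str]]:
    """Build a group email that blind-copies its recipients.

    The recipient addresses are returned separately for use as SMTP
    envelope recipients (e.g. smtplib.SMTP.send_message(msg,
    to_addrs=recipients)) instead of being written into a To or Cc
    header, so no recipient can see the other recipients' addresses.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # common convention: the sender addresses itself
    msg["Subject"] = subject
    msg.set_content(body)
    # Deliberately no "Cc" or "Bcc" header is set on the message.
    return msg, recipients

msg, envelope = build_group_email(
    "info@example.com", "Update", "Hello all,",
    ["alice@example.com", "bob@example.com"],
)
print("Cc" in msg, "Bcc" in msg)  # → False False
```

Structuring the send path this way makes the breach in this case study impossible by construction: no list of addresses ever reaches the message headers that other recipients can read.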

Under Article 17 of the GDPR, the data subject requested the removal of their image from the data controller’s social media outlets without undue delay. The data subject withdrew their consent for the processing of their personal data under Article 17(1)(b) of the GDPR. The data controller conducted a search of their social media and removed any posts, which identified the data subject. The data controller advised that where third parties further used these images, the data subject would have to submit an erasure request to these organisations directly.

The data subject also made a subject access request under Article 15 of the GDPR to the data controller. The data controller complied with the request; however, restrictions were applied under section 162 of the 2018 Act to withhold correspondence between the data controller and its legal advisers. The DPC noted that, while an individual’s right of access to personal data is a fundamental right and any restriction must be interpreted narrowly, section 162 of the 2018 Act does not contain a requirement that the restriction of data subjects’ rights be necessary and proportionate. Accordingly, not all access requests can be complied with in full and, based on the information provided to it, the DPC found that the correspondence between the data controller and its legal advisers did not have to be released in response to the subject access request.

Further to the above, the DPC noted that the data controller had failed to comply with its obligations under Article 12(3) of the GDPR, which requires data controllers to respond to data protection requests from data subjects within one month of receipt. That period may be extended, but the data controller must inform the data subject of any such extension, together with the reasons for the delay, within one month of receipt of the request. Here, the data controller extended the response period of the subject access request only after the initial one-month period had lapsed.

As such, under section 109(5)(f) of the 2018 Act, the DPC wrote to the data controller and reminded it of its obligations under Articles 12(3) and 33 of the GDPR.

38) Case study 38: Unlawful processing of special category data

A data subject issued a complaint to the Data Protection Commission (DPC) against their employer (the data controller) regarding the processing of their health data under Article 9 of the General Data Protection Regulation (GDPR). The data subject explained to the DPC that they had been signed off work by their GP and had presented their medical certificate to their employer in an envelope addressed to the organisation’s Medical Officer. A staff member in an acting-up manager role opened the medical certificate; however, that person was not the medical officer. Before contacting the DPC, the data subject contacted their employer to raise their concern that their sensitive personal data had been unlawfully processed; however, they did not receive a response to their complaint.

As part of its examination, the DPC engaged with the data controller and shared the details of the data subject’s complaint. The data controller responded to the DPC and explained that, as per their organisation’s Standard Operating Procedures, as there was no medical officer on duty on the day in question, the responsibility and authority for granting leave, sick or otherwise, automatically falls to the manager on the day, who in this instance was the manager who processed the medical certificate.

The data subject did not accept the explanation provided by the data controller, contending that a medical certificate should not be processed by anyone other than the designated medical officer.

Through its examination the DPC found that, under Articles 6(1)(b), (c), (f) and 9(2)(b) of the GDPR, the data controller had legitimate bases to process the data subject’s sensitive personal data under the GDPR and so no unlawful processing had occurred. No further action against the data controller was considered necessary in relation to the data subject’s complaint.

39) Case study 39: Disclosure of personal data (Applicable Law – GDPR & Data Protection Act 2018)

A data subject issued a complaint to the Data Protection Commission (DPC) against their owner management company (data controller) regarding the disclosure of their personal data under the General Data Protection Regulation (GDPR). The data subject explained to the DPC that an email containing their personal data was circulated by a property management company on behalf of an owner management company (OMC) and contained information regarding the payment of annual service charges.

Before contacting the DPC, the data subject contacted the OMC to raise their concerns about the disclosure of their personal data. The OMC responded that its policy was to include such personal data in emails to all clients. The data subject confirmed that they had neither seen nor signed this policy.

Following engagement with the DPC, the data controller cited a clause in its Memorandum of Association which allowed for the disclosure of payment or non-payment of service charges to other unit owners.

The DPC provided both parties with its guidance, “Data Protection Considerations Relating to Multi-Unit Developments and Owners’ Management Companies”, for consideration. The guidance indicated that any such disclosure must be justified as both necessary and proportionate to achieve a specific, explicit and legitimate purpose, in accordance with data protection law.

The data controller informed the DPC that a balancing test was conducted and highlighted that the processing of the personal data was necessary to achieve the legitimate interest of the management company to obtain payment of service charges.

Under section 109(5)(c) of the 2018 Act, the DPC advised that the data controller had not provided an adequate lawful basis for the processing of personal data outlined in the complaint.

In its outcome, the DPC reminded the data controller of their obligations under Articles 5, 6 and 24 of the GDPR and, under section 109(5)(f) of the 2018 Act, recommended that the data controller: review their Memorandum of Association to ensure compliance with the DPC guidance; consider alternative methods to resolve the non-payment of service charges; and consider and balance any legal obligation or legitimate interest against the rights and interests of the data subject.

40) Case study 40: Fair processing of personal data (Applicable Law – GDPR & Data Protection Act 2018)

A data subject issued a complaint to the Data Protection Commission (DPC) against their employer (data controller) regarding the processing of their personal data under the General Data Protection Regulation (GDPR). The data subject explained to the DPC that details of a confidential matter were disclosed to a third party (a prospective employer) as part of a reference. Before contacting the DPC, the data subject contacted the data controller to address their concerns that their personal data had been unlawfully processed; however, they did not receive a satisfactory response to their complaint.

The DPC notes that the provision of a reference about a staff member by a present or former employer to a third party, such as a prospective employer, will generally involve the disclosure of personal data. The data subject stated that the data controller disclosed a confidential matter in the reference provided to the prospective employer.

As part of its examination, the DPC engaged with the data controller and shared the details of the data subject’s complaint. The data controller responded to the DPC and explained that it was relying on consent and legitimate interests for disclosing the confidential matter. The data controller outlined that, in balancing the data subject’s rights against the interests of the third party (and those to whom it provides care), it determined that it had a duty of care to ensure that the recipient of the reference (the prospective employer) received a reference which was true, accurate, fair and relevant to the role for which the data subject had applied.

The data controller was satisfied that the data was processed fairly and in a transparent manner. It further stated that, due to the nature of the employment, it had a duty of care not only to the people it supports and its staff members, but also to prospective employers who provide support services to the same category of clients.

It is important to consider whether the status of the data controller and the applicable legal or contractual obligations (or other assurances made at the time of collection) could give rise to reasonable expectations of stricter confidentiality and stricter limitations on further use. The DPC took into consideration whether the data controller could have achieved the same result without disclosing the confidential details to the prospective employer. The statements made in the reference were based on facts which could be proven and were necessary to achieve the data controller’s legitimate interests and its duty of care to its clients.

The DPC was satisfied that, despite the duty of confidence, and in circumstances where the data subject had nominated the data controller to provide the reference (thereby consenting to the sharing of their relevant personal data with a prospective employer), the prospective employer’s legitimate interest and the wider public interest justified the disclosure of the confidential matter.

Having examined the matter thoroughly, under section 109(5)(c) of the 2018 Act the DPC advised the data subject that the explanation put forward by the data controller in the circumstances of this complaint was reasonable and that no unlawful processing had occurred. Accordingly, no further action against the data controller was considered necessary in relation to the data subject’s complaint.

41) Case study 41: Unlawful processing of photograph and erasure request under Article 17 of GDPR (Applicable Law – GDPR & Data Protection Act 2018)

A data subject submitted a complaint to the Data Protection Commission (DPC) regarding the publication of a historical image of them in a newspaper (data controller). The data subject explained to the DPC that the article was published without their knowledge and without their consent. Before contacting the DPC, the data subject contacted the data controller to address their concerns that their personal data had been unlawfully processed and to request erasure of the image from the newspaper under Article 17 of the General Data Protection Regulation (GDPR); however, the data controller rejected all elements of the data subject’s request.

As part of its examination, the DPC engaged with the data controller and asked for a lawful basis under Article 6 of the GDPR for processing the data subject’s personal data in the manner outlined in this complaint. The data controller informed the DPC that it was not relying on Article 6 of the GDPR for processing the data subject’s personal data; rather, it advised that it was relying on section 43 of the Data Protection Act 2018 (the 2018 Act) (data processing and freedom of expression and information), under which the processing of personal data for the purpose of exercising the right to freedom of expression and information, including processing for journalistic purposes or for the purposes of academic, artistic or literary expression, is exempt. The data controller further explained that the data subject was not the subject of the news article in question and that, as a significant number of years had passed since the photograph was taken, the data subject was not readily identifiable.

In relation to the data subject’s erasure request, the data controller relied on section 43 of the 2018 Act as its basis for refusing to erase the image from the article.

Having considered all the elements of this complaint, the DPC found that the newspaper had a lawful basis under section 43 of the 2018 Act and Article 85 of the GDPR to publish the data subject’s historical image in a news article.

The DPC notes that the journalistic exemption does not exempt a data controller from the whole of the GDPR and the Data Protection Act 2018; a data controller must have regard to its remaining obligations under both. The DPC found the processing of the data subject’s personal data by the data controller to be proportionate, considering that the image in question is historical and it can reasonably be assumed that the data subject is no longer readily identifiable from it. The DPC also acknowledged that a third party is the main person of interest, and is directly quoted, within the article, and that the data subject is therefore not the subject of discussion.

The DPC advised the data subject under section 109(5)(c) of the 2018 Act that the explanation put forward by the data controller concerning the processing of their personal data in the circumstances of this complaint was reasonable.

42)  Case study 42: Technical and organisational measures

In this case, the complainant’s family were members of a sports club staffed by volunteers. Following a dispute with the sports club, in which they made a complaint about a member of the club, the complainant made a subject access request (SAR) to the body that governed the league in which the sports club participated. From the copy of personal data they received in response to the SAR, the complainant noticed that details of their complaint had been shared with the sports club by the overseeing body and that, as part of their regular communications for club purposes, both organisations had volunteer members who were using their work or personal email addresses for club-related purposes; details of their complaint had been sent to these email addresses. The complainant believed that this amounted to the unauthorised sharing of their personal data with third parties (in the form of the employers of the members using work email accounts) and represented a data protection breach.

The DPC contacted both the local club and the overseeing body to examine their legal bases under Article 6 of the GDPR for processing the complainant’s personal data in this way and to enquire how each organisation fulfilled its personal data integrity and confidentiality obligations under Article 5(1)(f). The sports club responded that, as part of its complaints investigation procedure, it was obliged to share details of the complaint with specified members of the executive, and that it did so under Article 6(1)(b) of the GDPR as the complainant’s membership of the club constituted a contract with the club. It also stated that the club did not require volunteers to use a dedicated club email address for conducting business, as some members preferred to use the email client they were familiar with. With regard to technical and organisational measures to ensure personal data security and integrity, the club stated that it had no specific measures in place due to the associated costs.

The league body responded that, as part of its complaint investigation policy, it was obliged to share details of the complaint with the sports club, and acknowledged it had done so through the use of a personal email address of a club member despite having an official club email address for that member on file. It stated that it had processed the complainant’s personal data under Article 6(1)(f) of the GDPR as it had a legitimate interest in sharing the outcome of the complaint with specific officers in the sports club executive in order to make them aware of the findings and recommendations arising out of the complaint made against the club.

The DPC found that both the sports club and the league body had demonstrated an appropriate legal basis for the processing of the complainant’s personal data. However, it determined that neither organisation had met the requirements of Article 5(1)(f) in terms of having appropriate technical and organisational measures in place for the processing of personal data, given that a number of free email service providers are available which could have been used for official purposes.

The community and voluntary sectors perform a valuable role in our society through the provision of social and leisure-based activities. The processing of club members’ personal data is a necessary part of club participation, and voluntary groups often do not have significant resources allocated to data protection. Notwithstanding this, every data controller must fulfil its obligations under data protection legislation and ensure it has appropriate administrative procedures in place to protect the personal data of its members and staff. The use of personal or work email addresses for a voluntary organisation’s data processing does not meet the required standard of data security: the data controller loses control of the personal data it is responsible for, as it no longer has access to, or control over, the personal data of its members. This practice also means the organisation is unable to service subject access requests in the event that a particular member leaves the organisation or changes jobs.

Data controllers must ensure they have appropriate technical and organisational measures in place regardless of the voluntary nature of the organisation. The use of data protection policies and procedures does not always involve financial cost, and data controllers are obliged to ensure that every reasonable effort has been made to adhere to appropriate security controls. For further information and resources, visit the ‘Know Your Obligations’ section for organisations on our website.

43)  Case study 43: Access request following account suspension

The data subject’s complaint related to their dissatisfaction with the response of Meta Platforms Ireland Limited (the data controller) to their access request made in accordance with Article 15 of the General Data Protection Regulation (GDPR). The data subject had made their access request following the suspension of their messaging account, and specifically sought a copy of the personal data associated with that account.

The data subject had initially contacted the data controller through their child’s account, as they were unable to submit a request directly through their messaging application due to their suspension and the fact that the data controller’s contact form only accepted queries related to users’ Facebook accounts and not Messenger-only accounts. Despite further engagement with the data controller, the data subject was unable to access their data because they could not verify their ownership of the Messenger account to the satisfaction of the data controller.

The DPC wrote to the data controller, outlining the data subject’s complaint and requesting that it address their access request. Following its engagement with the DPC, the data controller agreed to provide the data subject with a link to download a copy of their personal data, on condition that the data subject supplied a secure email address for verification.

The data subject provided the DPC with a secure email address, which the DPC communicated to the data controller. The data controller contacted the data subject directly with a link to download their data while also providing the DPC with evidence of same.

The DPC wrote to the data subject to notify them of the data controller’s response above, and emphasised that the link provided was time sensitive and would expire later that day. The data subject responded stating that they had successfully retrieved their personal data and were satisfied with the outcome of their complaint.  

44)  Case study 44: Right to be forgotten - removal of online news article and photograph

An individual contacted an online news outlet after they found themselves to be the subject of a news article, which included a photograph of them and their family. The individual believed that the photograph and the contents of the article were provided to the news outlet through a third party and stated that they were unaware their image or story would be published.

The individual contacted the news outlet directly and made a ‘Right to be Forgotten’ request, under Article 17 of the GDPR, for the removal of the article and photograph. The news outlet responded to the individual and advised that it would remove the online article; however, the removal process could take up to two weeks.

The news article remained online for a number of months and, as a result, the individual submitted a complaint to the DPC, including a copy of their original complaint to the news outlet and the outlet’s response.

The DPC contacted the news outlet directly and queried whether there were any overriding legitimate grounds on which it continued to process the individual’s data in light of their erasure request, taking into account the news outlet’s previous response.

The news outlet responded directly to the DPC and advised that it had de-indexed the article at the time of the original request; however, due to a technical fault, some articles had reappeared online. Following the DPC’s intervention, the news outlet de-indexed the article permanently and advised that a new procedure had been put in place to ensure this would not occur again.

The DPC contacted the individual and shared a copy of the news outlet’s response, advising that the news article had been de-indexed permanently. The DPC concluded its case file on this complaint and deemed it amicably resolved between the individual and the organisation.

45)  Case study 45: Unlawful processing of personal data by a waste management company

An individual submitted a complaint to the DPC in relation to the alleged unlawful processing of their personal data, in the form of contact information, by a waste management company. The individual explained that they had received multiple text messages from the organisation regarding its services.

The individual contacted the organisation to inform it that they were not a customer of this service. However, the organisation advised that the contact details had been provided for the residence in which the individual was currently residing. The organisation was unable to confirm how it had originally obtained the individual’s contact details; however, it advised that it had since removed them from its database.

The DPC questioned the organisation regarding how it obtained the individual’s contact details. In its response, the organisation advised that it had taken over a different waste management company and that the individual was a previous customer of the former waste management company. Therefore, it was through this changeover between the two companies that the individual’s contact details were obtained.

The organisation advised that it obtained the individual’s contact details under Article 6(1)(b) of the GDPR, which states that processing shall be lawful if “necessary for the performance of a contract to which the data subject…”

The organisation further advised that all text messages issued to the individual related to a service which it was providing to the individual as part of their contract.

The DPC informed the individual of the organisation’s explanation. In response, the individual informed the DPC that they were neither a customer of the former waste management company nor of the current waste management company, and that the explanation put forward by the organisation was therefore inaccurate.

The organisation came back to the DPC with two further possible explanations as to how it had obtained the individual’s contact details. It stated that either:

  • The individual was a former client of a second former waste management company which it had taken over; or
  • The organisation, through human error, had entered one of its customer’s contact details into its database incorrectly and those incorrect details matched that of the individual.

The organisation advised that the latter was most likely the cause for obtaining the individual’s data. The organisation apologised to the individual and offered to make a charitable donation as a goodwill gesture to a charity of the individual’s choice.

The organisation was unable to confirm exactly how it had obtained the individual’s contact details; however, it had deleted them from its database. As a gesture of goodwill, a donation was made to two charities of the individual’s choice. As the DPC does not have the power to award compensation, this was accepted as an amicable resolution between the individual and the organisation.

46)  Case study 46: Request for erasure of biometric data from employer database

An individual working in the hospitality sector contacted their employer requesting the erasure of their biometric data from its database. The organisation had introduced a biometric fingerprint scanner to record its employees’ attendance.

Article 9 of the GDPR refers to processing of “special category” personal data. Article 9(1) of the GDPR states that: “…processing of…biometric data for the purpose of uniquely identifying a natural person…shall be prohibited”. However, there are exemptions that are applicable to this article.

In their correspondence to their employer, the individual stated that they believed that it did not have a lawful basis for the processing of their special category data and that it was unlawfully obtained by the organisation. The individual received no response to their request.

The DPC engaged with the organisation and queried if it had any overriding legitimate grounds to continue processing the individual’s personal data in light of their erasure request made in accordance with Article 17 of the GDPR. The organisation responded to the DPC and provided a copy of its response to the individual, which was dated prior to the individual making a complaint to the DPC.

The DPC noted that this response was also within the statutory timeframe provided in Article 12(3) of the GDPR. In its response, the organisation stated that it was no longer using the biometric clocking system and that it had contacted the software provider to erase all data registered with the biometric system. The organisation also provided a copy of its correspondence with the software provider confirming that the data of all employees had been erased.

The DPC shared this response with the individual and the individual thanked the DPC for its assistance with their complaint. As there were no outstanding issues relating to this complaint, the complaint was concluded by amicable resolution.

47)  Case study 47: Complaint of excessive personal data requested by a letting agent

An individual lodged a complaint with the DPC after they had viewed a rental property. They alleged that the letting agent requested excessive personal data from them during the application process.

The individual complained that the organisation unfairly obtained and processed their personal data in relation to a prospective tenant application form. The individual was unsuccessful in their application and submitted an erasure request under Article 17 of the GDPR to the organisation for the deletion of all of their personal data.

The DPC contacted the organisation requesting its lawful basis for obtaining and processing the individual’s personal data. The organisation stated it was relying on Article 6(1)(f) of the GDPR as its lawful basis for processing the individual’s data as it had a legitimate interest for the processing.

The organisation advised that it requested the individual’s personal data to progress their application for a potential tenancy at one of the properties the organisation represents. It further stated that this was in line with Article 5(1)(b) of the GDPR, which states that personal data shall be “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes…”

As part of the application process, the organisation had requested the following information from the individual:

  • Copies of identification
  • Proof of current address
  • Contact details
  • Employment references
  • Current/previous landlord references
  • Two months’ bank statements
  • Reason for moving
  • Intended move-in date

The individual stated that they believed this was an excessive amount of personal data to provide to the letting agent. However, they believed that if they did not comply with the request, their application would not be processed further. The letting agent stated that the information was required to ensure that the applicant is who they claim to be and that the applicant can afford the property.

The DPC examined the complaint and found that the organisation did not meet the principle of data minimisation under Article 5(1)(c) of the GDPR, which states: “personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”. The DPC determined that the volume of personal data requested from the individual as a prospective tenant was excessive for the initial stage of an application process.

This volume of personal data is generally not required until a prospective tenant has been chosen by the landlord, at which point such information is reviewed to ensure they meet the requirements, that is, the ability to pay. Further, it was determined that if the organisation requests the same volume of personal data from all prospective tenants who are interested in a property, this level of processing will generally not be consistent with its obligation under the GDPR to collect only relevant personal data and to limit it to what is necessary in relation to the original purpose.

The DPC found that the organisation had neither sufficiently demonstrated its lawful basis under Article 6(1)(f) of the GDPR nor met the data protection principle of data minimisation under Article 5(1)(c) of the GDPR, in that it requested excessive personal data as part of its application process.

As part of their initial complaint, the individual made an erasure request under Article 17 of the GDPR for the deletion of their personal data by the organisation. The organisation responded to the individual and advised it had erased all personal data relating to the individual and that it had never shared the individual’s personal data with a third party. The organisation fulfilled the Article 17 request within the prescribed timeframe as set out in Article 12(3) of the GDPR and so the DPC found that there was no contravention of the GDPR in relation to the individual’s erasure request.  

48)  Case study 48: Excessive CCTV cameras in the workplace complaint

An individual raised a concern with their employer regarding what they believed to be an excessive number of CCTV cameras in the workplace. The individual stated that they were never informed that the cameras were being installed and had concerns that the cameras were recording both audio and video. The organisation advised the individual that the cameras were installed for the safety of staff and that no audio was recorded.

The individual submitted a complaint to the DPC as they were dissatisfied with the response received from the organisation. As part of its examination, the DPC queried the organisation on the alleged audio recordings by the CCTV cameras. The organisation confirmed that no audio was recorded and provided the DPC with a letter from the CCTV system supplier which further clarified that no audio was recorded by the cameras.

The organisation informed the DPC that it initially installed the cameras following a series of security issues, including theft, in the workplace. However, it also stated that the cameras were installed for the safety of staff when working alone. Whilst the individual argued that they were unaware the cameras had been installed, the organisation stated that the cameras had been in place for a number of years and that training had been provided to all staff in relation to same.

Article 6(1)(d) of the GDPR states that the processing of personal data shall be lawful if “processing is necessary in order to protect the vital interests of the data subject or of another natural person”. The organisation cited Article 6(1)(d) as its lawful basis, stating that the cameras are necessary to protect the vital interests of its staff. It further cited Article 6(1)(f), which states that processing shall be lawful if “processing is necessary for the purposes of the legitimate interests pursued by the controller...”, as the organisation has a legitimate interest in the security of the workplace, the safety of staff and the prevention of crime.

The DPC informed the organisation that recital 46 of the GDPR states that: “Processing of personal data based on the vital interest of another natural person should in principle take place only where the processing cannot be manifestly based on another legal basis”. This lawful basis may be relied upon by an organisation where the processing of personal data is necessary to protect a person’s life or to mitigate a serious threat to a person. As such, the DPC advised the organisation that it could not rely on Article 6(1)(d) as its lawful basis for the use of CCTV cameras in the workplace.

The organisation confirmed that, in line with Article 6(1)(f) of the GDPR, it had conducted a legitimate interest balancing test prior to the installation of the CCTV cameras. It further stated that the processing is limited to what is necessary, citing its safety requirements. Footage is retained for a period of 20 days before being overwritten by new footage, and only management have access to the footage, which is password-protected.

Following its examination of the complaint, the DPC found that the organisation had demonstrated its lawful basis for the processing of data by means of CCTV cameras under Article 6(1)(f) of the GDPR. The DPC shared its guidance on Data Protection in the Workplace with the organisation and reminded it that access to the CCTV footage should be limited to authorised personnel for the purpose of responding to an incident.

For more information, read the DPC’s Data Protection in the Workplace: Employer Guidance (PDF, 1.119mb) .

49)  Case study 49: Rectification request regarding inaccurate information in a Section 20 report

An individual submitted a rectification request to a Government agency, in accordance with Article 16 of the GDPR, for details of a Section 20 report to be rectified, as they believed it contained inaccurate data relating to them and their child. The organisation acknowledged the request; however, it did not respond to it for a number of months. The individual contacted the DPC as they believed they had not received a sufficient response to their request.

The DPC contacted the organisation in relation to the complaint. The organisation advised the DPC that it was not the data controller for the Section 20 report or its contents, as this report had been ordered by and prepared for the courts. Therefore, the organisation was not in a position to action a rectification request for this report.

A Section 20 report is a court-ordered document requested under Section 20 of the Child Care Act 1991, as amended, under which a court can direct an investigation of a child’s circumstances. The presiding judge retains control of, and makes the decisions regarding, this report, thereby making the court the data controller of the report.

In this complaint, the Government agency did not refuse the rectification request; rather, as it was not the data controller of the report, it could not action the request. In its response to the individual, it outlined the circumstances in which a rectification request can be made. However, it failed to inform the individual that it was not the data controller, or to whom their request should correctly be directed.

The DPC is prohibited, by law, under Article 55(3) of the General Data Protection Regulation (GDPR) from supervising the data processing activities of the court when it is acting in a judicial capacity. Article 55(3) of the GDPR states that:

“Supervisory authorities shall not be competent to supervise processing operations of courts acting in their judicial capacity.”

As such, the DPC did not have jurisdiction to examine this complaint further and redirected the individual to the correct data controller. Notwithstanding the above, the DPC noted that whilst the Government agency was not the data controller of the report and could not action a rectification request for same, it did not comply with its obligations under Article 12(3) of the GDPR as it failed to respond to the individual regarding their request within the statutory timeframe.

50)  Case study 50: Hospital refuses erasure request of special category data

An individual submitted a complaint to the DPC after a hospital refused their erasure request, made in accordance with Article 17 of the GDPR.

The individual initially submitted a request to the hospital for the erasure of their special category personal data, as they believed the data held was inaccurate — the individual stated they were previously misdiagnosed by the hospital. The individual had since received a different diagnosis from a different medical facility.

Article 17(1)(a) of the GDPR states that a data controller shall erase personal data that is no longer necessary for its original purposes. However, Article 17(3)(c) excludes the application of Article 17(1) in circumstances where the processing is necessary, “for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9(2) as well as Article 9(3)” .

The DPC contacted the hospital and requested its lawful basis for the continued processing of the individual’s special category data. Special category data, also referred to as ‘sensitive data’, is defined under Article 9(1) of the GDPR as including, but not limited to, “… data concerning health …”

Under Article 9 of the GDPR, there is a general prohibition on the processing of special category data. However, Article 9(2) of the GDPR sets out certain circumstances in which special category data can be processed. In this instance, the hospital stated that it continued to process the data under Articles 9(2)(h) and (i) of the GDPR.

Article 9(2)(h) of the GDPR states: “processing is necessary for the purposes of preventive or occupational medicine, medical diagnosis…” while Article 9(2)(i) states: “processing is necessary for reasons of public interest in the area of public health…”

The DPC noted that a medical diagnosis is an opinion given at a point in time. The fact that the individual received a second diagnosis a number of years later does not eradicate the fact that the individual was, at a point in time, diagnosed with a different illness. Nor does it in turn make the initial diagnosis inaccurate or incorrect for the purposes of the GDPR. The hospital refused the individual’s erasure request, and the DPC found that this refusal was permitted pursuant to Article 17(3) of the GDPR.

The DPC found that the hospital had demonstrated a lawful basis for the processing of the special category data under Articles 6 and 9 of the GDPR. However, the DPC noted that the hospital responded to the initial erasure request outside of the permitted timeframe and therefore did not comply with its obligations under Article 12(3) of the GDPR.


O’Reilly Stewart Solicitors

Case Studies: Healthcare

Breach of data privacy case settles for £5,000.

O’Reilly Stewart Healthcare recently acted for a client who, in the course of medical treatment being provided by a local NHS Trust, suffered a breach of privacy.

Our client was being treated by a local NHS Trust when her medical records were disclosed to another patient without her consent. These records contained highly sensitive personal information, detailing the nature of our client’s medical history. This caused considerable upset and distress to our client.

The law in respect of data protection has progressed following the General Data Protection Regulation (GDPR) enacted by the European Union, which is transposed into UK law via the Data Protection Act 2018. The GDPR states that “any person who has suffered material or non-material damage as a result of an infringement of this Regulation shall have the right to receive compensation from the controller or processor for the damage suffered.” Moreover, the disclosure of medical records without the patient’s consent represents a breach of the patient’s privacy rights under Article 8 of the European Convention on Human Rights.

After taking instructions from our client, we obtained medical evidence from a Consultant Psychiatrist to report on the extent of psychiatric upset and distress caused to our client by the unauthorised disclosure. Following negotiations with the Trust, the case settled in the sum of £5,000 without admission of liability.

Cases of this nature are becoming more and more common because of the impact of the Data Protection Act 2018.

Our client had the following kind words to say:

“Throughout this traumatic legal ordeal the solicitors at O’Reilly Stewart made every effort to reassure me and were always kind and sensitive to my needs. They were so professional and always went that extra mile.”



Cash App customers can now claim more than $2,500 each in a $15 million settlement. Here's how.

By Aimee Picchi

Edited By Alain Sherter

Updated on: August 15, 2024 / 12:22 PM EDT / CBS News

Cash App customers may be able to claim more than $2,500 each as part of a $15 million class-action settlement for data and security breaches at the mobile payment service. 

People whose accounts were accessed without their authorization or who had fraudulent withdrawals or transfers can file claims, provided they had or currently have an account between August 23, 2018, and August 20, 2024, according to the settlement website . 

The class-action suit pointed to a 2021 incident, disclosed by the company in 2022, in which a former employee downloaded reports on some U.S. users without permission. It also noted another breach, disclosed in 2023, in which an unauthorized user accessed some Cash App accounts using phone numbers linked to them.

Plaintiffs also alleged that Cash App and Block, the app's parent company, failed to install controls to block unauthorized users and that the company then mishandled customer complaints about the security breaches and fraudulent transactions.

Cash App and Block also agreed to take steps toward strengthening data security as part of the settlement. 

Cash App and Block have denied any wrongdoing. But to settle the litigation, they agreed to pay $15 million. Beyond attorneys fees and administration costs, that money will go to impacted customers who submit eligible claims.

How to file a Cash App claim

Customers can file a claim at the settlement website. You'll either need to enter a notice ID and confirmation code from a mailed or emailed notice, or you can file a claim if you haven't received such a notice.

What if I have more than one Cash App account?

The settlement website says that each claimant should only submit one claim form. If you have multiple accounts, you should list your $Cashtag identifier — which is the unique username created for each account — and information about your claims on one claim form.

What is the deadline for filing a Cash App claim?

Customers have until November 18 to file a claim under the settlement. 

How much money can customers get in the settlement? 

It's unclear for now because the amount will depend on how many people file claims. 

However, people who were impacted by the data and security breaches may submit claims for up to $2,500 for reimbursement of out-of-pocket losses. They must have third-party documentation to back up their claim, according to the settlement website.

Out-of-pocket expenses can include:

  • Costs for credit monitoring or identity theft insurance, requesting a credit report or a credit freeze
  • Costs incurred from canceling a payment card or getting a replacement card
  • Costs related to closing a bank account and opening a new bank account
  • Overdraft fees that haven't been refunded
  • Late or missed payment fees or charges that haven't been refunded

Customers can also claim for up to three hours of lost time, at a rate of $25 per hour, the site added. 

Additionally, Cash App users can file a claim to get reimbursed for transaction losses. Those claims also require submitting documentation such as a copy of a police report.
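The per-claim figures above (documented out-of-pocket losses capped at $2,500, plus up to three hours of lost time at $25 per hour, plus documented transaction losses) can be sketched as a simple calculation. This is only an illustration of the caps as described in this article, not the settlement administrator's actual formula, and the function name and inputs are hypothetical:

```python
# Illustrative sketch of the settlement's stated per-claim caps.
# Not the administrator's actual calculation.
OOP_CAP = 2500.00        # out-of-pocket losses capped at $2,500
LOST_TIME_RATE = 25.00   # $25 per hour of lost time
LOST_TIME_MAX_HOURS = 3  # up to three hours claimable

def claim_total(out_of_pocket: float, lost_hours: float,
                transaction_losses: float = 0.0) -> float:
    """Apply the stated caps to a single hypothetical claim."""
    oop = min(out_of_pocket, OOP_CAP)
    lost_time = min(lost_hours, LOST_TIME_MAX_HOURS) * LOST_TIME_RATE
    return round(oop + lost_time + transaction_losses, 2)

# e.g. $3,000 documented losses (capped) plus 5 hours (capped at 3)
print(claim_total(out_of_pocket=3000, lost_hours=5))  # 2575.0
```

Note that even before any pro-rata reduction, the capped components alone can exceed the headline "$2,500" figure once lost time and transaction losses are added.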

What happens if the settlement fund can't fully pay every claim?

If there's not enough money in the settlement to pay every approved claim in full, payments will be made on a reduced pro rata basis. 
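A pro-rata reduction means each approved claim is scaled down by the same factor so the payouts sum to the available fund. The sketch below, with hypothetical figures, shows the arithmetic:

```python
# Illustrative pro-rata reduction: if approved claims exceed the fund,
# every payment is scaled by the same factor. Figures are hypothetical.
def pro_rata_payments(approved_claims, fund):
    total = sum(approved_claims)
    if total <= fund:
        return list(approved_claims)       # fund covers everyone in full
    scale = fund / total                   # uniform reduction factor
    return [round(c * scale, 2) for c in approved_claims]

# $4,000 in approved claims against a $2,000 remaining fund -> 50% each
print(pro_rata_payments([2500, 1000, 500], 2000))  # [1250.0, 500.0, 250.0]
```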

What if I move after filing a claim? 

The settlement website says people who change their mailing address after submitting a claim are responsible for alerting the claims administrator about their new contact information. To do that, you'll have to call 1-866-615-9740 or send your new address in writing to:

Cash App Security Settlement Administrator
1650 Arch Street, Suite 2210
Philadelphia, PA 19103

What other options do I have?

If you want to exclude yourself from the settlement, class members have the option to "opt out" before November 1. This allows you to sue, or be part of another related lawsuit against, the defendants down the road. You can also object to the settlement agreement by writing to the court before November 1.

Finally, you also can choose to do nothing. But if you opt for no action, you will not get any payments and potentially also give up the right to pursue another lawsuit with claims covered in the settlement.

Is it safe to use payment apps? 

Payment apps like Cash App, Zelle and Venmo can be convenient and safe, but they are also frequent targets of scammers and fraudsters, which is why the American Bankers Association and others  urge extra caution  around using them. 

Scammers sometimes pretend to be someone you know and say they need money for an emergency, or claim to want to send you a prize or payment, as long as you send them money first, the Federal Trade Commission says . But once you send someone money through a payment app, it's almost impossible to get the funds back, the ABA cautions.

It's safest to confirm that you know to whom you're sending money, and avoid clicking on any links in unexpected emails, texts or message requests.

— With reporting by the Associated Press.

Aimee Picchi is the associate managing editor for CBS MoneyWatch, where she covers business and personal finance. She previously worked at Bloomberg News and has written for national news outlets including USA Today and Consumer Reports.


New & Notable

Understanding healthcare data breach lawsuit trends

Lawsuits often follow a healthcare data breach, but understanding what drives litigation trends can help healthcare organizations prepare.

Enzo Biochem pays $4.5M for health data security failures

State attorneys general from New York, Connecticut and New Jersey issued a $4.5 million penalty to Enzo Biochem, Inc. following a 2023 ransomware attack that resulted in health data security failures.

Fix for Azure Health Bot vulnerabilities prevents exploitation

Researchers disclosed two Azure Health Bot vulnerabilities to Microsoft for which fixes were deployed before the flaws could be exploited.

Latest healthcare cyberattacks highlight operational risks

Recent cyberattacks against OneBlood and McLaren Health Care shed light on the operational challenges that targeted organizations face.


Latest News

HHS settles HIPAA right of access case with EMS company

HHS imposed a $115K civil monetary penalty against American Medical Response over alleged HIPAA right of access failures.

Ransomware attack hits blood donation nonprofit

Blood donation nonprofit OneBlood is operating at a "significantly reduced capacity" due to a ransomware attack affecting its software system.

Average cost of a healthcare data breach sits at $9.77M

Healthcare data breach costs fell by 10.6% in 2024 but remain higher than in any other industry, IBM found in its yearly report.

Pharmacy group sues UHG over Change Healthcare data breach

The National Community Pharmacists Association and dozens of providers sued UnitedHealth Group and its subsidiaries over losses suffered due to the Change Healthcare data breach.

OIG audit: HHS secretary must improve cloud security controls

HHS-OIG auditors recommended that the HHS Office of the Secretary address gaps in its cloud security controls to better safeguard its cloud information systems.

Global IT outage forces hospitals to cancel appointments

A global IT outage resulting from a faulty update to CrowdStrike's threat detection platform forced hospitals to cancel non-urgent appointments and surgeries.

What health IT pros can learn from the CrowdStrike outage

Following the CrowdStrike outage, experts recommended that health IT security practitioners focus on building resilience and tackling third-party risk.

What is the Health Breach Notification Rule, Who Does It Apply To?

The Federal Trade Commission’s Health Breach Notification Rule applies to vendors of personal health records, including health apps and other non-HIPAA-covered entities.

Breaking Down the NIST Cybersecurity Framework, How It Applies to Healthcare

Healthcare organizations can strengthen their overall security postures by using the NIST Cybersecurity Framework's collection of standards and best practices.

How HHS-OIG conducts cybersecurity audits

Healthcare organizations and HHS entities can use the recommendations provided in HHS-OIG cybersecurity audit reports to strengthen the security of their systems.

