
Dutch DPA Publishes Post-GDPR Complaints Report

On December 13, 2018, the Dutch Data Protection Authority (“Autoriteit Persoonsgegevens”) (the “Dutch DPA”) published a report on the complaints it has received since the EU General Data Protection Regulation (“GDPR”) became applicable on May 25, 2018 (the “Report”). The GDPR gives data subjects the right to lodge a complaint with the relevant national supervisory authority when they believe that their personal data is processed in a way that violates the GDPR (see Article 77 of the GDPR).

View the Report and the press release (in Dutch).

Facts and Figures

In the past six months (between May 25, 2018 and November 25, 2018), 22,679 individuals contacted the Dutch DPA to obtain more information about the GDPR or to file a complaint. The Dutch DPA received 9,661 complaints from data subjects, of which 44% are still pending.

The Report states that 32% of the complaints relate to infringements of data subjects’ rights, such as the right of access and the right to erasure. Fifteen percent of the complaints are grounded in what data subjects consider to be overreach in data collection—that more personal data is gathered than is necessary to achieve the purpose(s) underlying the collection. An additional 12% of complaints allege that companies impermissibly share individuals’ personal data, either without informing the data subject of such sharing or by disregarding the data subject’s wishes.

The Dutch DPA also indicated that it has been involved in 331 international complaints concerning companies with cross-border activities or with several establishments in Europe. Of these, 176 were filed directly with the Dutch DPA, which acted as lead supervisory authority for a total of 21 complaints. Separately, the Dutch DPA acted as a concerned supervisory authority in 119 international complaints; 36 complaints were transferred to the Dutch DPA by other national supervisory authorities because they related to companies established in the Netherlands.

Handling of the Complaints

The Report indicates that, in most cases, the Dutch DPA has responded to complaints by (1) sending a letter to the named company explaining the applicable requirement, (2) initiating mediation, or (3) discussing the alleged violation, and actions to remediate it, with the company. The Dutch DPA indicated that most companies then adapted their behavior. According to the Report, 11 investigations stemming from complaints have been initiated.

To date, the Dutch DPA has primarily focused on resolving alleged rights violations and obliging companies to take remediating measures. The Report indicates, however, that in the future, complaints will more often lead to investigations and sanctions.

Most Affected Sectors

According to the Report, most complaints were filed against business service providers (41% of the complaints), companies in the IT sector (12%), the government (10%), financial institutions (9%) and companies in the health care sector (9%).

ICO Notifies More Than 900 Organizations of Failure to Pay Required Data Protection Fee

EU data protection authorities (“DPAs”) are proving willing enforcers of the GDPR, not just with regard to the most serious acts of non-compliance but also for errors of a more administrative nature. Under the previous regime, DPAs typically required companies to register their processing activities with the regulator, but the GDPR now permits organizations to maintain data processing inventories internally, showing them to DPAs only when there is a particular need to do so. In the UK, the Information Commissioner’s Office (“ICO”) introduced a requirement for organizations to pay a “data protection fee,” which data controllers falling under the ICO’s scope must pay once a year. Companies that fail to pay the fee risk incurring a fine of up to £4,350 each.

Between September and November 2018, more than 900 organizations received notifications from the ICO of its intent to fine them for failure to pay the data protection fee. The notifications were delivered to organizations operating across a number of sectors, from construction to finance, and more than 100 companies have had fines levied against them, with the proceeds contributing to the UK Treasury’s Consolidated Fund. Those notified were given eight days to pay before the ICO took further legal action.

For small organizations, with no more than 10 members of staff and revenues of less than £632,000, the fee is limited to £40 per year; for larger organizations, reporting revenues of more than £36 million and employing more than 250 staff members, the required fee is a more sizeable £2,900, reflecting the increased level of risk the company and its data processing activities present. The fee supports the ICO, which now employs 670 staff members in the UK, in conducting its investigations, providing advice and preparing guidance relating to the UK’s data protection regime. The specific charges levied are set out in the UK’s Data Protection (Charges and Information) Regulations 2018.

Australia and Chinese Taipei Join the APEC CBPR System

On November 23, 2018, both Australia and Chinese Taipei joined the APEC Cross-Border Privacy Rules (“CBPR”) system. The system is a regional multilateral cross-border transfer mechanism and an enforceable privacy code of conduct and certification developed for businesses by the 21 APEC member economies.

The Australian Attorney-General’s Department recently announced that APEC endorsed Australia’s application to participate and that the Department plans to work with both the Office of the Australian Information Commissioner and organizations to implement the CBPR system requirements in a way that ensures long-term benefits for Australian businesses and consumers.

In Chinese Taipei, the National Development Council announced that Chinese Taipei has joined the system. According to the announcement, Chinese Taipei’s participation will spur local enterprises to seek overseas business opportunities and help shape conditions conducive to cross-border digital trade.

Australia and Chinese Taipei become the seventh and eighth economies to participate in the system, joining the U.S., Mexico, Canada, Japan, South Korea and Singapore. Both decisions to join further highlight the growing international status of the CBPR system, which implements the nine high-level APEC Privacy Principles set forth in the APEC Privacy Framework. Several other APEC economies are actively considering joining.

Argentina DPA Issues Guidelines on Binding Corporate Rules

The Agency of Access to Public Information (Agencia de Acceso a la Información Pública) (“AAIP”) has approved a set of guidelines for binding corporate rules (“BCRs”), a mechanism that multinational companies may use for cross-border transfers of personal data to affiliates in countries whose data protection regimes the AAIP deems inadequate.

As reported by IAPP, pursuant to Regulation No. 159/2018, published December 7, 2018, the guidelines require BCRs to bind all members of a corporate group, including employees, subcontractors and third-party beneficiaries. Members of the corporate group must be jointly liable to the data subject and the supervisory authority for any violation of the BCRs.

Other requirements include:

  • restrictions on the processing of special categories of personal data and on the creation of files containing personal data relating to criminal convictions and offenses;
  • protections such as providing for the right to object to the processing of personal data for the purpose of unsolicited direct marketing;
  • complaint procedures for data subjects that include the ability to institute a judicial or administrative complaint using their local venue; and
  • data protection training to personnel in charge of data processing activities.

BCRs also should contemplate the application of general data protection principles, especially the legal basis for processing, data quality, purpose limitation, transparency, security and confidentiality, data subjects’ rights, and restrictions on onward cross-border transfers to non-adequate jurisdictions. Companies whose BCRs do not reflect the guidelines’ provisions must submit the relevant material to the AAIP for approval within 30 calendar days from the date of transfer. Approval is not required if BCRs that track the guidelines are used.

Lisa Sotto, Head of Hunton’s Privacy and Cybersecurity Practice, Kicks Off FTC Data Security Panel

In connection with its hearings on data security, the Federal Trade Commission hosted a December 12 panel discussion on “The U.S. Approach to Consumer Data Security.” Moderated by the FTC’s Deputy Director for Economic Analysis James Cooper, the panel featured private practitioners Lisa Sotto, from Hunton Andrews Kurth, and Janis Kestenbaum, academics Daniel Solove (GW Law School) and David Thaw (University of Pittsburgh School of Law), and privacy advocate Chris Calabrese (Center for Democracy and Technology). Lisa set the stage with an overview of the U.S. data security framework, highlighting the complex web of federal and state rules and influential industry standards that results in a patchwork of overlapping mandates. Panelists debated the effect of current law and enforcement on companies’ data security programs before turning to the “optimal” framework for a U.S. data security regime. Among the details discussed was the establishment of a risk-based approach with a baseline set of standards and clear process requirements. While there was not uniform agreement on the specifics, the panelists all felt strongly that federal legislation was warranted, with the FTC taking on the role of principal enforcer.

View an on-demand recording of the hearing. For more information on the data security hearings, visit the FTC’s website.

AOL Successor Agrees to Pay $4.95 Million in COPPA Enforcement Action

On December 4, 2018, the New York Attorney General (“NY AG”) announced that Oath Inc., which was known as AOL Inc. (“AOL”) until June 2017 and is a subsidiary of Verizon Communications Inc., agreed to pay New York a $4.95 million civil penalty following allegations that it had violated the Children’s Online Privacy Protection Act (“COPPA”) by collecting and disclosing children’s personal information in conducting online auctions for advertising placement. This is the largest-ever COPPA penalty.

The NY AG alleged that AOL used its display ad exchange to conduct billions of auctions for ad space on websites that AOL knew to be directed to children under the age of 13 and subject to COPPA. AOL is said to have gained this knowledge from clients who flagged child-directed properties to AOL, and from its own internal reviews. In all, AOL is alleged to have conducted 2 billion auctions of display ad space from these websites.

The settlement requires AOL to (1) establish and maintain a comprehensive COPPA compliance program; (2) retain an objective, third-party professional to assess the privacy controls that the company has implemented; (3) implement and maintain functionality that enables website operators that sell ad inventory through AOL systems to indicate each website or portion of a website that is subject to COPPA; and (4) destroy all personal information collected from children. In a statement, Oath indicated that it is “wholly committed to protecting children’s privacy online” and agreed to make comprehensive reforms of its business practices to ensure that children are protected from improper targeted advertising online.

FTC Seeks Public Comment on Identity Theft Rules

On December 4, 2018, the Federal Trade Commission published a notice in the Federal Register indicating that it is seeking public comment on whether any amendments should be made to the FTC’s Identity Theft Red Flags Rule (“Red Flags Rule”) and the duties of card issuers regarding changes of address (“Card Issuers Rule”) (collectively, the “Identity Theft Rules”). The request for comment forms part of the FTC’s systematic review of all current FTC regulations and guides. These periodic reviews seek input from stakeholders on the benefits and costs of specific FTC rules and guides along with information about their regulatory and economic impacts.

The Red Flags Rule requires certain financial entities to develop and implement a written identity theft detection program that can identify and respond to the “red flags” that signal identity theft. The Card Issuers Rule requires that issuers of debit or credit cards (e.g., state credit unions, general retail merchandise stores, colleges and universities, and telecom companies) implement policies and procedures to assess the validity of address change requests if, within a short timeframe after receiving the request, the issuer receives a subsequent request for an additional or replacement card for the same account.

The FTC is seeking comments on multiple issues, including:

  • Is there a continuing need for the specific provisions of the Identity Theft Rules?
  • What benefits have the Identity Theft Rules provided to consumers?
  • What modifications, if any, should be made to the Identity Theft Rules to reduce any costs imposed on consumers?
  • What modifications, if any, should be made to the Identity Theft Rules to increase their benefits to businesses, including small businesses?
  • What evidence is available concerning the degree of industry compliance with the Identity Theft Rules?
  • What modifications, if any, should be made to the Identity Theft Rules to account for changes in relevant technology or economic conditions?

The comment period is open until February 11, 2019, and instructions on how to make a submission to the FTC are included in the notice.

Hunton Recognized in Chambers and Partners 2019 FinTech Guide

Hunton Andrews Kurth LLP is pleased to announce that the firm was recognized in the inaugural Chambers and Partners 2019 FinTech guide. The guide commends the firm for attaining an “excellent reputation for the strengths of its data protection and cybersecurity practice, where it counsels FinTech businesses on privacy issues in commercial contracts and transactional matters.”

In addition, Lisa Sotto, partner and chair of the Privacy and Cybersecurity practice, is one of only two lawyers ranked in the Band 1 category for USA: Legal: Data Protection & Cyber Security.

The Chambers and Partners FinTech guide provides expert legal commentary on key issues for businesses. The guide covers the important developments in the most significant jurisdictions.

CNIL Launches Public Consultation on Draft Standards on Data Processing for Managing Business Activities and Unpaid Invoices

On November 29, 2018, the French Data Protection Authority (the “CNIL”) launched an online public consultation regarding two new CNIL draft standards (“Referentials”) concerning the processing of personal data to manage (1) business activities and (2) unpaid invoices.

Background

Following the 2018 update to the French Data Protection Act for purposes of implementing the EU General Data Protection Regulation (“GDPR”), the CNIL may issue guidelines, recommendations or standards called “Referentials.” These Referentials are not compulsory: they are mainly intended as guidance for carrying out specific data processing activities under the GDPR. Each Referential lists the purposes of the data processing in question, the legal basis for that data processing, the types of personal data that may be processed for those purposes, the data retention periods and the associated security measures. By providing this information, the Referential is also intended to aid data controllers in carrying out a data protection impact assessment (“DPIA”) as necessary. Data controllers may refer to a Referential to describe the measures the controllers implement, or envision implementing, in order to comply with the necessity and proportionality requirements of the GDPR, to honor data subjects’ rights, and to address risks to data subjects’ rights and freedoms.

CNIL’s Draft Referential on Data Processing for Managing Business Activities

This draft Referential updates the CNIL’s Simplified Norm No. 48 on the management of customers and prospective customers. It therefore intends to cover standard customer data processing activities carried out by any data controller, except (1) health or educational institutions; (2) banking or similar institutions; (3) insurance companies; and (4) operators subject to approval by the French Online Gambling Regulatory Authority. It does not, however, cover the following customer data processing activities: (1) fraud detection and prevention; (2) preventing, on a temporary or permanent basis, data subjects from receiving or accessing services or goods (e.g., due to unpaid invoices); (3) profiling; (4) monitoring store traffic; and (5) enriching databases with information collected by third parties. Interestingly, the draft Referential refers to the CNIL’s December 2013 guidelines in advising how to comply with the EU/French cookie law rules, thereby confirming the validity of its previous guidelines even post-GDPR, pending the adoption of the draft ePrivacy Regulation.

CNIL’s Draft Referential on Data Processing for Managing Unpaid Invoices

This draft Referential intends to cover the processing of personal data for managing unpaid invoices. It does not cover the processing of customer data for detecting risks of non-payment, or to identify other infringements (such as discourtesy shown by customers).

The public consultation on the two draft Referentials will be open until January 11, 2019. The new Referentials will then likely be adopted by the CNIL in plenary session.

Privacy Blog Nominated for Best AmLaw Blog of 2018 – Please Vote to Win

Hunton Andrews Kurth’s Privacy & Information Security Law Blog has been nominated in The Expert Institute’s 2018 Best Legal Blog Contest for Best AmLaw Blog of 2018. For nearly 10 years, our award-winning privacy blog has provided readers with current information and legal commentary on news stories; breaking international, federal and state legislation; and other issues on privacy, data protection and cybersecurity. We appreciate your continued support and readership, and ask that you please take a moment to vote for our blog. Click here to vote.

FTC’s Upcoming Hearing Will Address U.S. Approach to Data Security

The Federal Trade Commission published the agenda for the ninth session of its Hearings on Competition and Consumer Protection in the 21st Century (“Hearings Initiative”), a wide-ranging series of public hearings. The ninth session, to take place on December 11-12, 2018, will focus on data security. Lisa Sotto, chair of Hunton Andrews Kurth’s Privacy and Cybersecurity practice, is one of five panel participants discussing “The U.S. Approach to Consumer Data Security.” The panel will be moderated by James Cooper, Deputy Director for Economic Analysis of the FTC’s Bureau of Consumer Protection.

Supreme Court of Pennsylvania Ruling on Common Law Duty to Protect Electronic Employee Data

On November 21, 2018, the Supreme Court of Pennsylvania ruled that a putative class action filed against UPMC (d/b/a The University of Pittsburgh Medical Center) should not have been dismissed.

The case arose from a data breach in which criminals accessed UPMC’s computer systems and stole the personal and financial information of 62,000 current and former UPMC employees. This information included names, birth dates, Social Security numbers, addresses, tax forms and bank account data, all of which the employees were required to provide as a condition of employment. The plaintiffs alleged that UPMC was negligent in the collection and storage of this information, and breached an implied contract in connection with the event. The trial court dismissed the case, and the intermediate appellate court affirmed.

Pennsylvania’s highest court, however, disagreed. The court held that: (1) an employer has a duty under Pennsylvania common law to use reasonable care to safeguard its employees’ sensitive personal information that it stores on Internet-accessible computer systems; and (2) Pennsylvania’s economic loss doctrine did not bar the plaintiffs’ negligence claim.

The court explained that it was not creating a new, affirmative duty. Rather, “the case is one involving application of an existing duty to a novel factual scenario.” In other words, the duty was presumed due to UPMC’s alleged risk-causing conduct. Indeed, the court stressed that due to the early procedural posture of the case, it was required to accept as true the plaintiffs’ allegations that UPMC’s conduct created the risk of the data breach. The presence of a third party’s criminal conduct also was not a superseding cause that cut off UPMC’s liability because UPMC’s alleged conduct created a situation where UPMC knew, or should have known, that a third party might try to compromise its network.

The court next found that the economic loss doctrine, as applied in Pennsylvania, did not preclude all negligence claims seeking purely “economic damages” (i.e., monetary damages that do not involve personal injury or property damage). After discussing prior Pennsylvania economic loss doctrine cases, the court concluded that the common law duty it had recognized existed independently from any contractual obligation between the parties, thus precluding application of the economic loss doctrine. As the court noted, this approach to the economic loss doctrine is not taken by all states.

CIPL Publishes Report on Artificial Intelligence and Data Protection in Tension

The Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP recently published the first report in its project on Artificial Intelligence (“AI”) and Data Protection: Delivering Sustainable AI Accountability in Practice.

The report, entitled “Artificial Intelligence and Data Protection in Tension,” aims to describe in clear, understandable terms:

  • what AI is and how it is being used all around us today;
  • the role that personal data plays in the development, deployment and oversight of AI; and
  • the opportunities and challenges presented by AI to data protection laws and norms.

The report describes AI capabilities and examples of public and private uses of AI applications in society. It also looks closely at various tensions that exist between well-established data protection principles and the requirements of AI technologies.

The report concludes with six general observations:

  • Not all AI is the same;
  • AI is widely used in society today and is of significant economic and societal value;
  • AI requires substantial amounts of data to perform optimally;
  • AI requires data to identify and guard against bias;
  • The role of human oversight of AI is likely to change, and will need to change, for AI to deliver the greatest benefit to humankind; and
  • AI challenges some requirements of data protection law.

The report is a level-setting backdrop for the next phase of CIPL’s AI project – working with data protection officials, industry leaders and others to identify practical ways of addressing challenges and harnessing the opportunities presented by AI and data protection.

After this next phase, CIPL expects to release a second report, Delivering Sustainable AI Accountability in Practice, which will address some of the critical tools that companies and organizations are starting to develop and implement to promote accountability for their use of AI within existing legal and ethical frameworks, as well as reasonable interpretations of existing principles and laws that regulators can employ to achieve efficient, effective privacy protection in the AI context. The report will also touch on considerations for developing data protection laws that are cognizant of AI and other innovative technologies.

To read the first report in detail and to learn more about the observations detailed above, please see the full report.

UK ICO Issues Warning to Washington Post Over Cookie Consent Practices

On November 19, 2018, The Register reported that the UK Information Commissioner’s Office (“ICO”) issued a warning to the U.S.-based The Washington Post over its approach to obtaining consent for the use of cookies as a condition of accessing its service.

The Washington Post presents readers with three options to access its service: (1) free access to a limited number of articles, dependent on consent to the use of cookies and tracking for the delivery of personalized ads; (2) a basic subscription consisting of paid access to an unlimited number of articles that is also dependent on consent to the use of cookies and tracking; or (3) a premium subscription consisting of paid access to an unlimited number of articles with no on-site advertising or third-party ad tracking, for a higher fee.

Responding to a complaint submitted by a reader of The Register, the ICO concluded that since The Washington Post has not offered a free alternative to accepting cookies, consent cannot be freely given and the newspaper is in contravention of Article 7(4) of the EU General Data Protection Regulation (“GDPR”). Article 7(4) provides that “when assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.”

The ICO has issued a written warning to The Washington Post to ensure access to all three subscription levels without users having to consent to the use of cookies. Although The Washington Post is a U.S.-based company, Article 3(2) of the GDPR provides that the regulation applies to the processing of personal data of individuals in the EU by a controller or processor established outside the EU where the processing activities are related to the offering of goods or services to those individuals inside the EU.

Despite issuing a warning, the ICO has noted that if the newspaper decides not to change its practices for obtaining consent for cookies, there is nothing else the regulator can do on the matter. Aside from issues around resources to pursue cross-border enforcement, there continues to be uncertainty around the GDPR’s extraterritorial applicability and its enforceability against non-EU based organizations.

As we previously reported, the FTC and ICO signed a Memorandum of Understanding (the “Memorandum”) in 2014 to facilitate mutual assistance and the exchange of information in investigating and enforcing covered privacy violations. However, the term “covered privacy violation” refers to practices that violate the applicable privacy laws of one participant country to the Memorandum and that are the same or substantially similar to practices prohibited by privacy laws in the other participant country. As U.S. privacy law does not address the issue of cookie consent, the issue is unlikely to fall under the scope of the Memorandum.

The European Data Protection Board is expected to release guidance around the GDPR’s extraterritorial applicability in the coming weeks.

UK and EU Draft Withdrawal Agreement

On November 14, 2018, the UK government and the EU agreed upon the text of a draft Withdrawal Agreement in relation to the UK’s impending exit from the European Union on March 29, 2019. The draft Withdrawal Agreement provides for a transition period, running from the UK’s departure from the EU on March 29, 2019 until December 31, 2020, during which the UK will remain subject to a number of its EU membership obligations. The draft Withdrawal Agreement provides the following in relation to data protection law:

  • EU data protection law, including the General Data Protection Regulation (“GDPR”) and the e-Privacy Directive, will continue to apply to personal data of data subjects outside the UK that are (i) processed in the UK in accordance with the GDPR before the end of the transition period on December 31, 2020, and (ii) processed in the UK after the end of the transition period on the basis of the draft Withdrawal Agreement.
  • If a declaration stating that the UK provides an adequate level of protection is issued by the European Commission during the transition period, then EU data protection law (including the GDPR and the e-Privacy Directive) will no longer apply in the UK to personal data of data subjects outside the UK. If, however, such a declaration of adequacy ceases to be applicable, the UK commits to ensuring an adequate level of protection for the processing of the relevant personal data that is essentially equivalent to that provided by EU data protection law. Although not explicitly stated in the text of the draft Withdrawal Agreement, this obligation appears to extend beyond the end of the transition period.
  • Notwithstanding the above, Chapter VII of the GDPR, relating to cooperation between supervisory authorities and the consistency mechanism, will not apply in the UK during the transition period. As such, organizations will not be permitted to designate the UK Information Commissioner’s Office (“ICO”) as lead authority for GDPR purposes. In addition, the ICO will, during the transition period, have a significantly limited role in relation to the European Data Protection Board. The ICO will be entitled to attend meetings of the European Data Protection Board in some cases, but will no longer have voting rights.

In practical terms, assuming that the draft Withdrawal Agreement is adopted in its current form, personal data flows between the EU and the UK will likely continue unrestricted during the transition period, until at least December 31, 2020. The draft Withdrawal Agreement itself does not, however, address the relationship between the UK and the EU after the end of the transition period, which will be subject to whatever final deal, if any, is agreed between the EU and the UK. As currently written, however, the draft Withdrawal Agreement appears to contemplate a declaration of adequacy in relation to the UK, which, if issued, would address transfers of personal data from the EU to the UK after the end of the transition period. As such, it appears that any immediate threat to personal data transfers between the UK and the EU has been staved off, and transfers are likely to continue unaffected during the transition period.

Before being agreed between the UK and the European Council, the draft Withdrawal Agreement must be approved by the UK Parliament. Following multiple resignations from Theresa May’s government yesterday, it looks increasingly unlikely that the draft Withdrawal Agreement will be approved in its current form. If the draft Withdrawal Agreement is not approved, then there remains the prospect of the UK leaving the EU without any transition period or immediate free trade agreement, or any arrangements in place to protect the free flow of personal data between the EU and UK. If, however, a new draft is proposed and agreed upon before the March deadline, it is possible that some of the non-contentious provisions (which may include those relating to data protection) could be carried over into that new proposal.

CIPL Publishes Legal Note on the ePrivacy Regulation and the EU Charter of Fundamental Rights

On November 12, 2018, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP published a legal note on the ePrivacy Regulation and the EU Charter of Fundamental Rights. It was written for CIPL by Dr. Maja Brkan, assistant professor of EU law at Maastricht University, David Dumont, Counsel at Hunton Andrews Kurth, and Dr. Hielke Hijmans, CIPL’s Senior Policy Advisor. 

The note contributes to an important and recurring legal discussion on the proposed ePrivacy Regulation.

The proposal aims to protect the confidentiality of communications, and in particular addresses the confidentiality of content data and metadata of individuals and legal persons, implementing Article 7 of the EU Fundamental Rights Charter (“right to privacy”). In contrast, the GDPR implements Article 8 of the Charter (“right to data protection”).

The legal note argues that the difference between Articles 7 and 8 of the Charter has limited relevance in connection with the ePrivacy Regulation. It aims to demonstrate that EU law, and in particular the Charter, precludes neither a risk-based approach nor the processing of content data and metadata on the basis of legitimate interest, provided that the necessary safeguards protecting individuals’ communications are put in place. Neither Article 7 nor Article 52.1 of the Charter enumerates the grounds for limitation of fundamental rights; they do not prescribe that the right to privacy can be limited only on the basis of particular justificatory grounds, such as the consent of the user.

The note also addresses a few related issues, such as the sensitive nature of content data and metadata, as well as the robust protection the GDPR provides individuals when an organization relies on legitimate interests as a legal basis for processing electronic communications data, due to the increased accountability measures organizations must take.

CIPL’s note also deals with the confidentiality of communications of legal persons and explains that this confidentiality is indeed not a matter of privacy, but is protected under other EU law provisions.

EU Commission Responds to NTIA Request for Comment on Developing the Administration’s Approach to Consumer Privacy

On November 9, 2018, the European Commission (“the Commission”) submitted comments to the U.S. Department of Commerce’s National Telecommunications and Information Administration (“NTIA”) in response to its request for public comments on developing the administration’s approach to consumer privacy.

In its comments, the Commission welcomes and agrees with many of the high-level goals identified by NTIA, including harmonization of the legal landscape, incentivizing privacy research, employing a risk-based approach and creating interoperability at a global level. The Commission also welcomes that the key characteristics of a modern and flexible privacy regime (i.e., an overarching law, a core set of data protection principles, enforceable individual rights and an independent supervisory authority with effective enforcement powers) are also at the core of NTIA’s proposed approach to consumer privacy. The Commission structured its specific suggestions around these key characteristics.

In particular, the Commission makes specific suggestions around:

  • Harmonization: The Commission notes that overcoming regulatory fragmentation associated with an approach based on sectoral law in favor of a more harmonized approach would create a level playing field, and provide necessary certainty for organizations while ensuring consistent protection for individuals.
  • Ensuring Trust: The Commission recommends that ensuring trust should guide the formulation of U.S. privacy policy, and notes that giving individuals more control over their data will increase their trust in organizations and in turn result in a greater willingness to share data on the part of consumers.
  • Data Protection Principles: The Commission commends NTIA on the inclusion of certain core data protection principles such as reasonable minimization, security, transparency and accountability, but suggests the further explicit inclusion of other principles such as lawful data processing (i.e., the requirement to process data pursuant to a legal basis, such as consent), purpose specification, accuracy and specific protections for sensitive categories of data.
  • Breach Notification: The Commission suggests the specific inclusion of a breach notification requirement to enable individuals to protect themselves from and mitigate any potential harm that might result from a data breach. While there are already state breach notification laws in place, the Commission believes organizations and individuals could benefit from the harmonization of such rules.
  • Individual Rights: The Commission believes that any proposal for a privacy regime should go beyond the inclusion of only traditional individual rights, such as access and correction, and should include other rights regarding automated decision-making (e.g., the right to explanation or to request human intervention) and rights around redress (e.g., the right to lodge a complaint and have it addressed, and the right to effective judicial redress).
  • Oversight and Enforcement: The Commission notes that the effective implementation of privacy rules critically depends on having robust oversight and enforcement by an independent and well-resourced authority. In this regard, the Commission recommends strengthening the FTC’s enforcement authority, the introduction of mechanisms to ensure effective resolution of individual complaints and the introduction of deterrent sanctions.

The Commission notes in its response that while this consultation only covers a first step in a process that might lead to federal action, it stands ready to provide further comments on a more developed proposal in the future.

NTIA’s request for comments closed on November 9, 2018 and NTIA will post the comments it received online shortly.

 

Privacy Advocacy Organization Files GDPR Complaints Against Data Brokers

On November 8, 2018, Privacy International (“Privacy”), a non-profit organization “dedicated to defending the right to privacy around the world,” filed complaints under the GDPR against consumer marketing data brokers Acxiom and Oracle. In the complaint, Privacy specifically requests the Information Commissioner (1) conduct a “full investigation into the activities of Acxiom and Oracle,” including into whether the companies comply with the rights (i.e., right to access, right to information, etc.) and safeguards (i.e., data protection impact assessments, data protection by design, etc.) in the GDPR; and (2) “in light of the results of that investigation, [take] any necessary further [action]… that will protect individuals from wide-scale and systematic infringements of the GDPR.”

The complaint alleges that the companies’ processing of personal data comports neither with the consent and legitimate interest requirements of the GDPR, nor with the GDPR’s principles of:

  • transparency (specifically relating to sources, recipients and profiling);
  • fairness (considering individuals’ reasonable expectations, the lack of a direct relationship, and the opaque nature of processing);
  • lawfulness (including whether either company’s reliance on consent or legitimate interest is justified);
  • purpose limitation;
  • data minimization; and
  • accuracy.

The complaint emphasizes that Acxiom and Oracle are illustrative of the “systematic” problems in the data broker and AdTech ecosystems, and that it is “imperative that the Information Commissioner not only investigate[] these specific companies, but also take action in respect of other relevant actors in these industries and their practices.”

In addition to the complaint against Acxiom and Oracle, Privacy submitted two further complaints: one against credit reference data brokers Experian and Equifax, and another against AdTech data brokers Quantcast, Tapad and Criteo.

BayLDA Publishes Review on Audits

On November 7, 2018, the Data Protection Authority of Bavaria for the Private Sector (the “BayLDA”) issued a press release describing audits completed and pending in Bavaria since the EU General Data Protection Regulation (“GDPR”) took effect.

The BayLDA initially focused on informing entities about changes brought by the GDPR. Subsequently, this year the BayLDA launched data protection investigations throughout Bavaria to check compliance, raise awareness of the risks inherent in the processing of personal data and encourage entities to protect this data effectively and adequately.

As of now, the BayLDA has audited a small number of entities. The audit structure is fairly predictable, beginning with a written examination that is followed by on-site visits to selected entities to verify the information provided. The BayLDA’s aim is to conduct active audits to explain the criteria to these entities and to detail what is expected of them. To this end, the BayLDA publishes the review letters sent to each entity to enable others to understand the requirements and how to comply.

The BayLDA has focused largely on cybersecurity issues (particularly on the security of online shops and ransomware in medical practices), the accountability of large companies, the duty of companies to disclose to job candidates the processing of their personal data during the application process and, finally, the implementation of the GDPR in small and medium-sized enterprises.

The BayLDA intends to continue its wave of audits, including through two investigative approaches it has commenced: first, auditing large, international companies to assess whether they comply with data protection regulations when selecting service providers and, in particular, whether they have implemented a reporting process in the event of a data breach; and second, focusing on the issue of “erasure of data,” particularly in connection with SAP systems.

CNIL Publishes DPIA Guidelines and List of Processing Operations Subject To DPIA

On November 6, 2018, the French Data Protection Authority (the “CNIL”) published its own guidelines on data protection impact assessments (the “Guidelines”) and a list of processing operations that require a data protection impact assessment (“DPIA”). Read the guidelines and list of processing operations (in French).

CNIL’s Guidelines

The Guidelines aim to complement guidelines on DPIA adopted by the Article 29 Working Party on October 4, 2017, and endorsed by the European Data Protection Board (“EDPB”) on May 25, 2018. The CNIL crafted its own Guidelines to specify the following:

  • Scope of the obligation to carry out a DPIA. The Guidelines describe the three examples of processing operations requiring a DPIA provided by Article 35(3) of the EU General Data Protection Regulation (“GDPR”). The Guidelines also list nine criteria the Article 29 Working Party identified as useful in determining whether a processing operation requires a DPIA, if that processing does not correspond to one of the three examples provided by the GDPR. In the CNIL’s view, as a general rule a processing operation meeting at least two of the nine criteria requires a DPIA. If the data controller considers that processing meeting two criteria is not likely to result in a high risk to the rights and freedoms of individuals, and therefore does not require a DPIA, the data controller should explain and document its decision for not carrying out a DPIA and include in that documentation the views of the data protection officer (“DPO”), if appointed. The Guidelines make clear that a DPIA should be carried out if the data controller is uncertain. The Guidelines also state that processing operations lawfully implemented prior to May 25, 2018 (e.g., processing operations registered with the CNIL, exempt from registration or recorded in the register held by the DPO under the previous regime) do not require a DPIA within a period of three years from May 25, 2018, unless there has been a substantial change in the processing since its implementation.
  • Conditions in which a DPIA is to be carried out. The Guidelines state that DPIAs should be reviewed regularly—at minimum, every three years—to ensure that the level of risk to individuals’ rights and freedoms remains acceptable. This corresponds to the three-year period mentioned in the draft guidelines on DPIAs adopted by the Article 29 Working Party on April 4, 2017.
  • Situations in which a DPIA must be provided to the CNIL. The Guidelines specify that data controllers may rely on the CNIL’s sectoral guidelines (“Referentials”) to determine whether the CNIL must be consulted. If the data processing complies with a Referential, the data controller may take the position that there is no high residual risk and no need to seek prior consultation for the processing from the CNIL. If the data processing does not fully comply with the Referential, the data controller should assess the level of residual risk and the need to consult the CNIL. The Guidelines note that the CNIL may request DPIAs in case of inspections.

CNIL’s List of Processing Operations Requiring a DPIA

The CNIL previously submitted a draft list of processing operations requiring a DPIA to the EDPB for its opinion. The CNIL adopted its final list on October 11, 2018, based on that opinion. The final list includes 14 types of processing operations for which a DPIA is mandatory. The CNIL provided concrete examples for each type of processing operation, including:

  • processing operations for the purpose of systematically monitoring employees’ activities, such as the implementation of data loss prevention tools, CCTV systems recording employees handling money, CCTV systems recording a warehouse stocking valuable items in which handlers are working, or digital tachographs installed in road freight transport vehicles;
  • processing operations for the purpose of reporting professional concerns, such as the implementation of a whistleblowing hotline;
  • processing operations involving profiling of individuals that may lead to their exclusion from the benefit of a contract or to the contract’s suspension or termination, such as processing to combat fraud involving (non-cash) means of payment;
  • profiling that involves data coming from external sources, such as a combination of data operated by data brokers and processing to customize online ads; and
  • processing of location data on a large scale, such as a mobile app that collects users’ geolocation data.

The CNIL’s list is non-exhaustive and may be regularly reviewed, depending on the CNIL’s assessment of the “high risks” posed by certain processing operations.

Next Steps

The CNIL is expected to soon publish its list of processing operations for which a DPIA is not required.

Medical Transcription Vendor Agrees to $200,000 Settlement with New Jersey Attorney General

On October 30, 2018, ATA Consulting LLC (doing business as Best Medical Transcription) agreed to a $200,000 settlement with the New Jersey Attorney General resulting from a server misconfiguration that allowed private medical records to be posted publicly online. All but $31,000 of the penalty was suspended based on the company’s financial condition. Read the settlement.

The New Jersey Attorney General’s investigation found that a patient had discovered that a Google search revealed portions of her medical records, which were viewable without a password. The patient notified her medical provider, Virtua Medical Group (“Virtua”), which used medical record transcription services provided by Best Medical Transcription. The investigation concluded that a software update changed certain security restrictions previously implemented by Best Medical Transcription and permitted anonymous access (i.e., no password required) to the site where files containing patient medical information were stored. This misconfiguration permitted anyone to conduct a Google search to locate and download the complete files. The investigation found that approximately 1,650 records were exposed on the Internet in this manner.

In addition to the settlement payment, Best Medical Transcription was enjoined from committing future violations of various privacy and security requirements, including HIPAA and its Security Rule, Breach Notification Rule and Privacy Rule. Virtua previously agreed to pay a $418,000 fine and enhance its data security practices in connection with the incident.

CNIL Details Rules On Audience and Traffic Measuring In Publicly Accessible Areas

On October 17, 2018, the French data protection authority (the “CNIL”) published a press release detailing the rules applicable to devices that compile aggregated and anonymous statistics from personal data—for example, mobile phone identifiers (i.e., media access control or “MAC” addresses)—for purposes such as measuring advertising audience in a given space and analyzing flow in shopping malls and other public areas. Read the press release (in French).

The CNIL observed that more and more companies use such devices. In shopping malls, these devices can (1) compile traffic statistics and determine how many individuals have visited a shopping mall over a limited time range; (2) model the routes that individuals take through the shopping mall; and/or (3) calculate the rate of repeating visitors. In public areas, they can (1) determine how many individuals walked past an audience measuring device (e.g., an advertising panel); (2) determine the routes taken by these individuals from one advertising panel to another; (3) estimate the amount of time individuals stand in line; (4) assess the number of vehicles driving on a road, etc.

Against that background, the CNIL identified the three following scenarios:

Scenario 1 – When data is anonymized at short notice (i.e., within minutes of collecting the data)

The CNIL defines anonymization as a specific data processing operation that renders individuals no longer identifiable. (Such processing must comply with various criteria set forth in Opinion 05/2014 of the former Article 29 Working Party on anonymization techniques. According to the CNIL, this includes ensuring a high collision rate between several individuals—for instance, in the context of MAC-based audience measurement devices, the processing must map multiple distinct MAC addresses to the same output value, so that no output corresponds to a single identifier.)

In this scenario, anonymization must be performed promptly, i.e., within minutes of collecting the data. In the CNIL’s view, this reduces the risk that anyone could access identifying data. To that end, the CNIL recommends anonymizing the data within five minutes; after that period, no identifying data should be retained.
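
By way of illustration only (the CNIL does not prescribe a particular algorithm), the following Python sketch shows one way such a deliberately collision-prone transformation might work. It assumes a hash-and-truncate approach in which each raw MAC address is reduced to one of a small number of buckets and then discarded; all names and parameters here are hypothetical.

    import hashlib

    # Illustrative salt only; a real deployment would manage and rotate this secretly.
    SALT = b"example-salt"

    def lossy_bucket(mac: str, buckets: int = 1024) -> int:
        """Map a MAC address into a small, fixed set of buckets.

        With far more devices than buckets, many distinct MAC addresses
        collide in each bucket, so a bucket value cannot single out one
        individual (the "high collision rate" the CNIL describes).
        """
        digest = hashlib.sha256(SALT + mac.encode("ascii")).digest()
        return int.from_bytes(digest[:4], "big") % buckets

    # Per time window, keep only aggregate counts; raw MACs are discarded at once.
    seen = {lossy_bucket(mac) for mac in ("aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02")}
    approximate_visitors = len(seen)  # coarse estimate, never stored per device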

The CNIL noted that data controllers may rely on their legitimate interest as a legal basis for the processing under the EU General Data Protection Regulation (“GDPR”). The CNIL recommended, however, that data controllers provide notice to individuals, using a layered approach in accordance with the guidelines of the former Article 29 Working Party on transparency under the GDPR. The CNIL provided an example of a notice that would generally satisfy the first layer of a layered privacy notice, though emphasized that notice should be tailored to the processing—particularly with respect to individuals’ data protection rights. Since the data is anonymized, individuals cannot exercise their rights of access, rectification and restriction of processing, so the notice does not have to mention these rights. However, individuals must be able to object to the collection of their data, and the notice should refer to that right of (prior) objection.

Scenario 2 – When data is immediately pseudonymized and then anonymized or deleted within 24 hours

In this second scenario, data controllers may rely on their legitimate interest as a legal basis for the processing provided that they:

  • Provide prior notice to individuals;
  • Implement mechanisms to allow individuals to object to the collection of their data (i.e., prior objection to the processing). These mechanisms should be accessible, functional, easy to use and realistic;
  • Set up procedures to allow individuals to exercise their rights of access, rectification and objection after data has been collected; and
  • Implement appropriate technical measures to protect the data, including a reliable pseudonymization process for MAC addresses (with deletion of the raw data and the use of a salt or key), as sketched below. The pseudonymized data must be anonymized or deleted at the end of the day.
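
The CNIL mandates the properties of this process (keyed or salted hashing, deletion of the raw data, anonymization or deletion by the end of the day) rather than any particular implementation. The following is a minimal Python sketch under those assumptions; the helper names and the per-day key rotation standing in for salt/key management are hypothetical.

    import hashlib
    import hmac
    import secrets
    from datetime import date

    # One secret key per calendar day; destroying a day's key renders the
    # pseudonyms stored for that day unlinkable to the original MAC addresses.
    _daily_keys: dict[date, bytes] = {}

    def pseudonymize(mac: str, day: date) -> str:
        """Return a keyed hash of the MAC address; the raw MAC is never stored."""
        key = _daily_keys.setdefault(day, secrets.token_bytes(32))
        return hmac.new(key, mac.encode("ascii"), hashlib.sha256).hexdigest()

    def end_of_day(day: date) -> None:
        """Anonymize the day's records by destroying the key (or delete them outright)."""
        _daily_keys.pop(day, None)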

Further, the CNIL recommended using multiple modalities to provide notice to individuals, such as posting a privacy notice at entry and exit points of the shopping mall, on Wi-Fi access points, on every advertising device (e.g., on every advertising panel when the processing is carried out on the street), on the website of the shopping mall, or through a specific marketing campaign.

With respect to the individuals’ data protection rights, the CNIL made it clear that individuals who pass audience measuring devices must be able to object to the collection and further processing of their personal data. Companies wishing to install such a device must implement technical solutions that allow individuals to easily exercise this right of objection both a priori and a posteriori: these solutions must not only allow individuals to obtain the deletion of the data already collected (i.e., to exercise their right of objection a posteriori) but also prevent any further collection of their personal data (prior objection). In the CNIL’s view, the right of objection can be exercised using one of the following means (a brief illustrative sketch follows the list):

  • Through a dedicated website or app on which individuals enter their MAC address to object to the processing. (The data controller is responsible for explaining to individuals how to obtain their MAC address so that they can effectively object to the processing of their data.) If an individual exercises his/her right of objection via this site or app, the data controller must delete all the data already collected and must no longer collect any data associated with that MAC address; or
  • Through a dedicated Wi-Fi network that allows the automatic collection of the devices’ MAC address for the purposes of objecting to the processing. If an individual exercises his/her right of objection via this network, the data controller must delete all the data that has been already pseudonymized and must not further collect the MAC address. The CNIL recommended using a clear and explicit name for that network such as “wifi_tracking_optout”.
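
As a purely hypothetical continuation of the earlier sketch (reusing its pseudonymize helper; none of these names come from the CNIL), the objection mechanics could be wired up as follows: a suppression set blocks any future collection, and an explicit purge removes data already collected.

    from datetime import date

    opted_out: set[str] = set()            # normalized MACs that have objected
    collected: dict[str, list[date]] = {}  # pseudonym -> recorded sightings

    def _normalize(mac: str) -> str:
        return mac.strip().lower()

    def register_opt_out(mac: str, day: date) -> None:
        """Honor an objection: purge existing data and block future collection."""
        mac = _normalize(mac)
        opted_out.add(mac)                            # prior (a priori) objection
        # A posteriori deletion: only the current day's pseudonym can be
        # recomputed; earlier days are already anonymized by key destruction.
        collected.pop(pseudonymize(mac, day), None)

    def record_sighting(mac: str, day: date) -> None:
        mac = _normalize(mac)
        if mac in opted_out:   # collect nothing for opted-out devices
            return
        collected.setdefault(pseudonymize(mac, day), []).append(day)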

According to the CNIL, data controllers should not recommend that individuals turn off the Wi-Fi feature of their phones to avoid being tracked. Such a recommendation is inadequate for purposes of enabling individuals to exercise their right of objection.

Scenario 3 – All other cases

In the CNIL’s view, if the device implemented by the data controller does not strictly comply with the conditions listed in the two previous scenarios, the processing may only be implemented with the individuals’ consent. The CNIL stated that individuals must be able to withdraw consent, and that withdrawing consent should be as simple as granting consent. Individuals should also be able to exercise all the other GDPR data protection rights. In terms of notice, the CNIL recommended providing notice using multiple modalities (as in the second scenario).

Data Protection Impact Assessment and CNIL’s Authorization

The CNIL also reported that, in all the above scenarios, the processing will require a data protection impact assessment to be carried out prior to the implementation of the audience/traffic measuring devices, in so far as such devices assist in the systematic monitoring of individuals through an innovative technical solution.

Additionally, the CNIL’s prior authorization may be required in certain cases.

New Ohio Law Creates Safe Harbor for Certain Breach-Related Claims

Effective November 2, 2018, a new Ohio breach law will provide covered entities a legal safe harbor for certain data breach-related claims brought in an Ohio court or under Ohio law if, at the time of the breach, the entity maintains and complies with a cybersecurity program that (1) contains administrative, technical and physical safeguards for the protection of personal information, and (2) reasonably conforms to one of the “industry-recognized” cybersecurity frameworks enumerated in the law.

The program must additionally be designed to (1) protect the security and confidentiality of the information, (2) protect against any anticipated threats or hazards to the security or integrity of the information, and (3) protect against unauthorized access to and acquisition of the information that is likely to result in a material risk of identity theft or other fraud to the individual to whom the information relates. In determining the necessary scale and scope of its program, a business should consider what is reasonable in light of its size and complexity, the nature and scope of its activities, the resources available to it, the sensitivity of the information to be protected, and the cost and availability of tools to improve information security and reduce vulnerabilities.

While this safe harbor will not apply to breach of contract claims or statutory violations in a breach suit, covered entities may raise this affirmative defense against tort claims that allege a failure to implement reasonable information security controls that result in a data breach. However, the covered entity will bear the burden of demonstrating that its program meets all of the requirements under the law. This may be hard for businesses to prove since many of the frameworks provide generalizations regarding what is required, but not specifics, and since these frameworks do not tend to have formal certification processes. Moreover, because such frameworks are often revised to keep up with new technologies and risks, it may be difficult for businesses to conform to the updates within the statute-mandated, one-year time limit from the revision date.

This law is the first in the U.S. to offer an incentive to businesses that take steps to ensure that there are policies and procedures in place to protect against data breaches. It remains to be seen whether other states will enact similar laws.