Category Archives: legislation

Ransomware to land cyber-crooks decades in Maryland prisons if new bill passes

Ransomware attacks have been increasing steadily for a few years, and operators gain confidence with every new strike. While cyber-experts burn the midnight oil coming up with solutions to thwart this dangerous form of malware, lawmakers in the U.S. state of Maryland are trying a shortcut – they aim to increase prison time for ransomware operators.

Experts have long insisted that caving in to ransomware operators’ demands not only encourages them to strike again, but it also doesn’t ensure you get your data back. Using a security solution to prevent attacks undoubtedly helps, but the best defences against ransomware remain vigilance and offline backups.
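
The offline-backup advice can be as simple as a periodic copy to a drive that is physically disconnected afterwards, so that malware running later cannot reach the copies. A minimal sketch (paths, naming, and the `snapshot` helper are illustrative, not a specific product’s approach):

```python
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(source: Path, backup_root: Path) -> Path:
    """Copy `source` into a timestamped folder under `backup_root`.

    The backup drive should be unmounted (taken offline) after the copy
    completes, so a later ransomware infection cannot encrypt the backups.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = backup_root / f"snapshot-{stamp}"
    shutil.copytree(source, dest)
    return dest
```

Keeping several dated snapshots rather than overwriting one backup also guards against quietly backing up already-encrypted files.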

Because of the way ransomware works, though, operators often remain at large. That’s why legislators in Maryland have decided to give future cyber-crooks a scare, by increasing slammer time to 10 years for any ransomware attack resulting in losses greater than $1,000.

Via DelmarvaNow:

Maryland Senate bill 151, cross-filed with House bill 211, would define ransomware attacks that result in a loss greater than $1,000 as a felony, subject to a fine of up to $100,000 and a maximum sentence of 10 years in prison.

Under current Maryland law, a ransomware attack that results in a loss of less than $10,000 is considered a misdemeanor, while an attack that results in a loss greater than $10,000 is a felony.

The new bill would punish any ransomware attack on any entity, regardless of the operators’ scope or intentions. But according to bill sponsor Sen. Susan Lee, the proposal mainly aims to stop attacks on hospitals – Maryland has seen a number of healthcare institutions hit heavily by ransomware in recent years.

“No industry is safe from ransomware, most importantly our hospitals,” Senator Lee said.

“Ransomware attacks on hospitals are a continuing problem across the country and often create major problems for the facilities, including loss of lives, misdiagnoses and other technological disadvantages for doctors and patients,” Lee told reporters.

The news is certainly encouraging. If the bill passes and ransomware attacks in Maryland decline as a result, legislators in other states will have a precedent when deciding their next course of action against cyber-crime.

Draft CCPA Regulations Expected Fall 2019

As we previously reported, the California Consumer Privacy Act of 2018 (“CCPA”) delays the California Attorney General’s enforcement of the CCPA until six months after publication of the Attorney General’s implementing regulations, or July 1, 2020, whichever comes first. The California Department of Justice anticipates publishing a Notice of Proposed Regulatory Action concerning the CCPA in Fall 2019.

The regulations aim to (1) establish procedures to facilitate consumers’ rights under the CCPA and (2) provide guidance to businesses regarding how to comply. As required under the CCPA, the regulations will address:

  • the categories of personal information;
  • the definition of unique identifiers;
  • any exceptions necessary to comply with state or federal law, including, but not limited to, those relating to trade secrets and intellectual property rights;
  • rules and procedures for (1) the submission of a consumer request to opt out of the sale of personal information pursuant to Section 1798.145(a)(1); (2) business compliance with a consumer’s opt-out request; and (3) the development and use of a recognizable and uniform opt-out logo or button by all businesses to promote consumer awareness of the opportunity to opt-out of the sale of personal information;
  • adjusting the monetary threshold in Section 1798.140(c)(1)(A) in January of every odd-numbered year to reflect any increase in the Consumer Price Index;
  • the establishment of rules, procedures and any exceptions necessary to ensure that the notices and information that businesses are required to provide are relayed in a manner that may be easily understood by the average consumer, are accessible to consumers with disabilities, and are available in the language primarily used to interact with the consumer; and
  • the establishment of rules and procedures related to the verification of consumer requests.
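
The CPI-adjustment item above is ordinary percentage indexing of the CCPA’s annual gross revenue threshold in Section 1798.140(c)(1)(A). A hedged sketch of the arithmetic (the CPI figures below are illustrative, not official index values):

```python
def adjusted_threshold(base_threshold: float, base_cpi: float, current_cpi: float) -> float:
    """Scale a statutory monetary threshold by CPI growth.

    The statute directs adjustment to reflect any *increase* in the CPI,
    so a falling index leaves the threshold unchanged here (an assumption
    about how the regulations will read).
    """
    if current_cpi <= base_cpi:
        return base_threshold
    return base_threshold * (current_cpi / base_cpi)

# Illustrative: the $25 million threshold with the index moving 100 -> 104.
print(round(adjusted_threshold(25_000_000, 100.0, 104.0)))
```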

Written comments may be submitted by email to privacyregulations@doj.ca.gov or by mail to the California Department of Justice, ATTN: Privacy Regulations Coordinator, 300 S. Spring St., Los Angeles, CA 90013. The deadline to submit written comments is March 8, 2019.

Testimony: There’s No Internet of Things Risk in Repair

A proposed right to repair law in New Hampshire won't make the Internet of Things one iota less secure. It will benefit consumers and the planet by extending the useful life of a wide range of connected devices, while making it easier to keep them secure throughout their useful life.

Massachusetts Amends Data Breach Law; Imposes Additional Requirements

On January 10, 2019, Massachusetts Governor Charlie Baker signed legislation amending the state’s data breach law. The amendments take effect on April 11, 2019.

Key updates to Massachusetts’s Data Breach Notification Act include the following:

  • The required notice to the Massachusetts Attorney General and the Office of Consumer Affairs and Business Regulation will need to include additional information, including the types of personal information compromised, the person responsible for the breach (if known) and whether the entity maintains a written information security program. Under Massachusetts 201 CMR § 17.03, any entity that owns or licenses personal information about a Massachusetts resident is currently obligated to develop, implement and maintain a comprehensive written information security program that incorporates the prescriptive requirements contained in the regulation.
  • If individuals’ Social Security numbers are disclosed, or reasonably believed to have been disclosed, the company experiencing a breach must offer credit monitoring services at no cost for at least 18 months (42 months, if the company is a consumer reporting agency). Companies also must certify to the Massachusetts attorney general and the Director of the Office of Consumer Affairs and Business Regulation that their credit monitoring services are compliant with state law.
  • The amended law explicitly prohibits a company from delaying notice to affected individuals on the basis that it has not determined the number of individuals affected. Rather, the entity must send out additional notices on a rolling basis, as necessary.
  • If the company experiencing a breach is owned by a separate entity, the individual notice letter must specify “the name of the parent or affiliated corporation.”
  • Companies are prohibited from asking individuals to waive their right to a private action as a condition for receiving credit monitoring services.

California DOJ to Hold Series of Public Forums on CCPA

The California Department of Justice will host six public forums on the California Consumer Privacy Act of 2018 (“CCPA”) to provide the general public an opportunity to participate in the CCPA rulemaking process. Individuals may attend or speak at the events or submit written comments by email to privacyregulations@doj.ca.gov or by mail to the California Department of Justice, ATTN: Privacy Regulations Coordinator, 300 S. Spring St., Los Angeles, CA 90013.

The forums will take place in January and February throughout the state of California. The first event will be held on January 8, 2019, at the Milton Marks Conference Center in San Francisco.  View the full schedule.

Serbia Enacts New Data Protection Law

On November 9, 2018, Serbia’s National Assembly enacted a new data protection law. The Personal Data Protection Law, which becomes effective on August 21, 2019, is modeled after the EU General Data Protection Regulation (“GDPR”).

As reported by Karanovic & Partners, key features of the new Serbian law include:

  • Scope – the Personal Data Protection Law applies not only to data controllers and processors in Serbia, but also to those outside Serbia who process the personal data of Serbian citizens.
  • Database registration – the Personal Data Protection Law eliminates the previous requirement for data controllers to register personal databases with the Serbian data protection authority (“DPA”), though they will be required to appoint a data protection officer (“DPO”) to communicate with the DPA on data protection issues.
  • Data subject rights – the new law expands the rights of data subjects to access their personal data, gives subjects the right of data portability, and imposes additional burdens on data controllers when a data subject requests the deletion of their personal data.
  • Consent – the Personal Data Protection Law introduces new forms of valid consent for data processing (including oral and electronic) and clarifies that the consent must be unambiguous and informed. The prior Serbian data protection law only recognized handwritten consents as valid.
  • Data security – the new law requires data controllers to implement and maintain safeguards designed to ensure the security of personal data.
  • Privacy by Design – the new law obligates data controllers to implement privacy by design when developing new products and services and to conduct data protection impact assessments for certain types of data processing.
  • Data transfers – the Personal Data Protection Law expands the ways in which personal data may be legally transferred from Serbia. Previously, data controllers were required to obtain the approval of the Serbian DPA for any transfers of personal data to non-EU countries. The new law permits personal data transfers based on standard contractual clauses and binding corporate rules approved by the Serbian DPA. Organizations can also transfer personal data to countries deemed to provide an adequate level of data protection by the EU or the Serbian DPA or when the data subject consents to the transfer.
  • Data breaches – like the GDPR, the new law requires data controllers to notify the Serbian DPA within 72 hours of a data breach and will require them to notify individuals if the data breach is likely to result in a high risk to the rights and freedoms of individuals. Data processors must also notify the relevant data controllers in the event of a data breach.

The new law also imposes penalties for noncompliance, but these are significantly lower than those contained in the GDPR. The maximum fines in the new Serbian law are only 17,000 Euros, while the maximum fines in the GDPR can reach up to 20 million Euros or 4% of an organization’s annual global turnover.

In 2012, Lisa Sotto, partner and chair of the Privacy and Cybersecurity practice at Hunton Andrews Kurth, advised the Serbian government on steps to enhance Serbia’s data protection framework.

Draft Bill Imposes Steep Penalties, Expands FTC’s Authority to Regulate Privacy

On November 1, 2018, Senator Ron Wyden (D-Ore.) released a draft bill, the Consumer Data Protection Act, that seeks to “empower consumers to control their personal information.” The draft bill would impose heavy penalties on organizations and their executives, and would require senior executives of companies with more than one billion dollars in annual revenue, or data on more than 50 million consumers, to file annual data reports with the Federal Trade Commission. It would subject senior executives to imprisonment for up to 20 years, fines of up to $5 million, or both, for certifying false statements on an annual data report. Additionally, like the EU General Data Protection Regulation, the draft bill proposes a maximum fine of 4% of total annual gross revenue for companies found to be in violation of Section 5 of the FTC Act.

The draft bill also proposes to grant the FTC authority to write and enforce privacy regulations, to establish minimum privacy and cybersecurity standards, and to create a national “Do Not Track” system that would allow consumers to prevent third-party companies from tracking internet users by sharing or selling data and targeting advertisements based on their personal information.

Senator Wyden stated, “My bill creates radical transparency for consumers, gives them new tools to control their information and backs it up with tough rules.”

California Enacts Blockchain Legislation

As reported on the Blockchain Legal Resource, California Governor Jerry Brown recently signed into law Assembly Bill No. 2658 for the purpose of further studying blockchain’s application to Californians. In doing so, California joins a growing list of states officially exploring distributed ledger technology.

Specifically, the law requires the Secretary of the Government Operations Agency to convene a blockchain working group prior to July 1, 2019. Under the new law, “blockchain” means “a mathematically secured, chronological and decentralized ledger or database.” In addition to including various representatives from state government, the working group is required to include appointees from the technology industry and non-technology industries, as well as appointees with backgrounds in law, privacy and consumer protection.
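
The statutory definition maps onto the familiar hash-chain structure. A minimal, illustrative sketch of the “mathematically secured, chronological” part of the definition (decentralization, i.e., replication across independent nodes, is out of scope here):

```python
import hashlib
import json
import time

def make_block(data: dict, prev_hash: str) -> dict:
    """Build a block that commits to its predecessor's hash (chronology)."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()  # mathematical securing
    return block

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampering with past entries breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```

Because each block’s hash covers the previous block’s hash, editing an old record invalidates every later block, which is the tamper-evidence property the definition gestures at.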

Under the new law, which has a sunset date of January 1, 2022, the working group is required to evaluate:

  • the uses of blockchain in state government and California-based businesses;
  • the risks, including privacy risks, associated with the use of blockchain by state government and California-based businesses;
  • the benefits associated with the use of blockchain by state government and California-based businesses;
  • the legal implications associated with the use of blockchain by state government and California-based businesses; and
  • the best practices for enabling blockchain technology to benefit the State of California, California-based businesses and California residents.

In doing so, the working group is required to seek “input from a broad range of stakeholders with a diverse range of interests affected by state policies governing emerging technologies, privacy, business, the courts, the legal community and state government.”

The working group is also tasked with delivering a report to the California Legislature by January 1, 2020, on the potential uses, risks and benefits of blockchain technology by state government and California businesses. Moreover, the report is required to include recommendations for amending relevant provisions of California law that may be impacted by the deployment of blockchain technology.

The Future of Voice, Fraud, and the Impact to CX | A Recap

Voice is growing beyond the call center and the telephone to become the next interface. In previous years, we released fraud reports focused on the call center, but with the expansion of voice, and the fraud that follows it, we have shifted our perspective to voice intelligence. After all, voice is everywhere: your digital assistant, your latest kitchen appliance, even your car.

The economy has moved through successive eras: first digitization, then the wave of mobile devices, and now voice, paving the way to the conversational economy. Each era brings its own collection of problems, and fraudsters are not letting up. Phone fraud increased 350% from 2013 to 2017, including a 47% increase over the past year alone. Banks and the insurance industry are experiencing higher levels of fraud, with year-over-year increases of 20% and 36% respectively.

So how did we get to these increased fraud rates?

The number of data breaches has risen year over year; last year, there were 1,300 data breaches. These breaches make it easy for criminals to commit fraud, ultimately feeding into the $1.5 trillion cybercrime market. Additionally, many enterprises rely heavily on KBAs, or knowledge-based authentication questions, which function as shared secrets for security. These “secrets” can be easily obtained through social engineering or purchased on the black market.

The arrival of the omnichannel has not helped contain fraud: consumers want to be able to contact a business through any channel and expect the experience to remain consistent. But the omnichannel has consequences: it allows fraudsters to use information gathered in one channel to access an individual’s details in another. Lastly, as we build more tools to stop fraud, fraudsters evolve quickly and learn to defeat these security measures.

Overall, fraud is the ultimate blow to customer experience: your customers have expectations of the businesses they deal with, and if they trust you with their data, that trust must be upheld. We are living in a world where consumers will readily switch providers if their customer experience expectations are not met.

For more information on the future of voice, fraud in the voice channel, and the impact it has on customer experience, tune into our on-demand webinar here.

APEC Cross-Border Privacy Rules Enshrined in U.S.-Mexico-Canada Trade Agreement

On September 30, 2018, the U.S., Mexico and Canada announced a new trade agreement (the “USMCA”) aimed at replacing the North American Free Trade Agreement. Notably, the USMCA’s chapter on digital trade recognizes “the economic and social benefits of protecting the personal information of users of digital trade” and will require the U.S., Canada and Mexico (the “Parties”) to each “adopt or maintain a legal framework that provides for the protection of the personal information of the users[.]” The frameworks should include key principles such as: limitations on collection, choice, data quality, purpose specification, use limitation, security safeguards, transparency, individual participation and accountability.

In adopting such a framework, Article 19.8(2) directs the Parties to consider the principles and guidelines of relevant international bodies, such as the APEC Privacy Framework and the OECD Recommendation of the Council concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, and Article 19.8(6) formally recognizes the APEC Cross-Border Privacy Rules (the “APEC CBPRs”) within their respective legal systems:

Art. 19.8(6) Recognizing that the Parties may take different legal approaches to protecting personal information, each Party should encourage the development of mechanisms to promote compatibility between these different regimes. The Parties shall endeavor to exchange information on the mechanisms applied in their jurisdictions and explore ways to extend these or other suitable arrangements to promote compatibility between them. The Parties recognize that the APEC Cross-Border Privacy Rules system is a valid mechanism to facilitate cross-border information transfers while protecting personal information.

In addition, Article 19.14(1)(b) provides that “the Parties shall endeavor to… cooperate and maintain a dialogue on the promotion and development of mechanisms, including the APEC Cross-Border Privacy Rules, that further global interoperability of privacy regimes.”

The APEC CBPRs were developed by the 21 APEC member economies as a cross-border transfer mechanism and comprehensive privacy program for private sector organizations to enable the accountable free flow of data across the APEC region. Organizations must be certified by a third-party, APEC-recognized Accountability Agent to participate in this system. The CBPRs are binding and enforceable against participating companies.

The USMCA must still pass the U.S. Congress, the Canadian Parliament, and the Mexican Senate.

California Enacts New Requirements for Internet of Things Manufacturers

On September 28, 2018, California Governor Jerry Brown signed into law two identical bills regulating Internet-connected devices sold in California. S.B. 327 and A.B. 1906 (the “Bills”), aimed at the “Internet of Things,” require that manufacturers of connected devices—devices which are “capable of connecting to the Internet, directly or indirectly,” and are assigned an Internet Protocol or Bluetooth address, such as Nest’s thermostat—outfit the products with “reasonable” security features by January 1, 2020; or, in the bills’ words: “equip [a] device with a reasonable security feature or features that are appropriate to the nature and function of the device, appropriate to the information it may collect, contain, or transmit, and designed to protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure[.]”

According to Bloomberg Law, the Bills’ non-specificity regarding what “reasonable” features include is intentional; it is up to the manufacturers to decide what steps to take. Manufacturers argue that the Bills are egregiously vague, and do not apply to companies that import and resell connected devices made in other countries under their own labels.
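
For context, one feature the statute (codified at Cal. Civ. Code § 1798.91.04) explicitly deems reasonable for devices with remote authentication is a password unique to each device, or a device that forces the user to set new credentials before first access. A hypothetical firmware-side check along those lines (the default-password list and function name are illustrative, not drawn from the Bills):

```python
# Illustrative sketch: reject shared factory defaults and require a
# credential change before the device accepts remote connections.
COMMON_DEFAULTS = {"admin", "password", "12345", "root"}  # hypothetical list

def remote_access_allowed(password: str, user_changed_password: bool) -> bool:
    """Permit remote access only with a non-default, user-set credential."""
    if not user_changed_password:
        return False
    return password.lower() not in COMMON_DEFAULTS and len(password) >= 8
```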

The Bills are opposed by the Custom Electronic Design & Installation Association, Entertainment Software Association and National Electrical Manufacturers Association. They are sponsored by Common Sense Kids Action; supporters include the Consumer Federation of America, Electronic Frontier Foundation and Privacy Rights Clearinghouse.

CIPL Submits Comments on Draft Indian Data Protection Bill

On September 26, 2018, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP submitted formal comments to the Indian Ministry of Electronics and Information Technology on the draft Indian Data Protection Bill 2018 (“Draft Bill”).

CIPL’s comments on the Draft Bill focus on several key issues that are of particular importance for any modern-day data protection law, including increased emphasis on accountability and the risk-based approach to data processing, interoperability with other data protection laws globally, the significance of having a variety of legal bases for processing and not overly relying on consent, the need for extensive and flexible data transfer mechanisms, and the importance of maximizing the effectiveness of the data protection authority.

Specifically, the comments address the following key issues:

  • the Draft Bill’s extraterritorial scope;
  • the standard for anonymization;
  • notice requirements;
  • accountability and the risk-based approach;
  • legal bases for processing, including importance of the reasonable purposes ground;
  • sensitive personal data;
  • children’s data;
  • individual rights;
  • data breach notification;
  • Data Protection Impact Assessments;
  • record-keeping requirements and data audits;
  • Data Protection Officers;
  • the adverse effects of a data localization requirement;
  • cross-border transfers;
  • codes of practice; and
  • the timeline for adoption.

These comments were formed as part of CIPL’s ongoing engagement in India. In January 2018, CIPL responded to the Indian Ministry of Electronics and Information Technology’s public consultation on the White Paper of the Committee of Experts on a Data Protection Framework for India.

Senate Commerce Committee Holds Hearing on Examining Consumer Privacy Protections

On September 26, 2018, the U.S. Senate Committee on Commerce, Science, and Transportation convened a hearing on Examining Consumer Privacy Protections with representatives of major technology and communications firms to discuss approaches to protecting consumer privacy, how the U.S. might craft a federal privacy law, and companies’ experiences in implementing the EU General Data Protection Regulation (“GDPR”) and the California Consumer Privacy Act (“CCPA”).

After introductory remarks by Senator and Chairman of the Committee John Thune (R-SD) and Senator Bill Nelson (D-FL), representatives from AT&T, Amazon, Google, Twitter, Apple and Charter Communications provided testimony on the importance of protecting consumer privacy, the need for clear rules that still ensure the benefits that flow from the responsible use of data, and key principles that should be included in any federal privacy law. A question and answer session followed, with various senators posing a variety of questions to the witnesses, covering topics such as comparisons to global data privacy regimes, the current and potential future authority of the Federal Trade Commission, online behavioral advertising and political advertising, current privacy tools and issues surrounding children’s data.

Key views expressed by the witnesses from the hearing include:

  • support for the creation of a federal privacy law and a preference for preemption rather than a patchwork of different state privacy laws;
  • agreement that the FTC should be the regulator for a federal privacy law but the authority of the FTC under such a law should be discussed and examined further;
  • concern around a federal privacy law attempting to copy the GDPR or CCPA. A federal privacy law should seek to avoid the difficulties and unintended consequences created by these laws and the U.S. should put its own stamp on what the law should be; and
  • agreement that a federal law should not be unduly burdensome for small and medium sized enterprises.

An archived webcast of the hearing is available on the Senate Commerce Committee’s website.

The hearing marked the first of several as the U.S. debates whether to adopt federal privacy legislation. The next hearing is scheduled for early October, when Andrea Jelinek, head of the European Data Protection Board, Alastair Mactaggart, the California privacy activist, and representatives from consumer organizations will participate and answer questions on consumer privacy, the GDPR and the CCPA.

CCPA Amendment Bill Signed Into Law

On September 23, 2018, California Governor Jerry Brown signed into law SB-1121 (the “Bill”), which makes limited substantive and technical amendments to the California Consumer Privacy Act of 2018 (“CCPA”). The Bill takes effect immediately, and delays the California Attorney General’s enforcement of the CCPA until six months after publication of the Attorney General’s implementing regulations, or July 1, 2020, whichever comes first.

We have previously posted about the modest changes that SB-1121 makes to the CCPA. As reported in BNA Privacy Law Watch, the California legislature may consider broader substantive changes to the CCPA in 2019.

New Federal Credit Freeze Law Eliminates Fees, Provides for Year-Long Fraud Alerts

Effective September 21, 2018, Section 301 of the Economic Growth, Regulatory Relief, and Consumer Protection Act (the “Act”) requires consumer reporting agencies to provide free credit freezes and year-long fraud alerts to consumers throughout the country. Under the Act, consumer reporting agencies must each set up a webpage designed to enable consumers to request credit freezes, fraud alerts, extended fraud alerts and active duty fraud alerts. The webpage must also give consumers the ability to opt out of the use of information in a consumer report to send the consumer a solicitation of credit or insurance. Consumers may find links to these webpages on the Federal Trade Commission’s Identity Theft website.

The Act also enables parents and guardians to freeze their children’s credit if they are under age 16. Guardians or conservators of incapacitated persons may also request credit freezes on their behalf.

Section 302 of the Act provides additional protections for active duty military. Under this section, consumer reporting agencies must offer free electronic credit monitoring to all active duty military.

For more information, read the FTC’s blog post.

CCPA Amended: Enforcement Delayed, Few Substantive Changes Made

On August 31, 2018, the California State Legislature passed SB-1121, a bill that delays enforcement of the California Consumer Privacy Act of 2018 (“CCPA”) and makes other modest amendments to the law. The bill now goes to the Governor for signing. The provisions of the CCPA will become operative on January 1, 2020. As we have previously reported, the CCPA introduces key privacy requirements for businesses. The Act was passed quickly by California lawmakers in an effort to remove a ballot initiative of the same name from the November 6, 2018, statewide ballot. The CCPA’s hasty passage resulted in a number of drafting errors and inconsistencies in the law, which SB-1121 seeks to remedy. The amendments to the CCPA are primarily technical, with few substantive changes.

Key amendments to the CCPA include:

  • Enforcement:
    • The bill extends by six months the deadline for the California Attorney General (“AG”) to draft and adopt the law’s implementing regulations, from January 1, 2020, to July 1, 2020. (CCPA § 1798.185(a)).
    • The bill delays the AG’s ability to bring enforcement actions under the CCPA until six months after publication of the implementing regulations or July 1, 2020, whichever comes first. (CCPA § 1798.185(c)).
    • The bill limits the civil penalties the AG can impose to $2,500 for each violation of the CCPA or up to $7,500 per each intentional violation, and states that a violating entity will be subject to an injunction. (CCPA § 1798.155(b)).
  • Definition of “personal information”: The CCPA includes a number of enumerated examples of “personal information” (“PI”), including IP address, geolocation data and web browsing history. The amendment clarifies that the listed examples would constitute PI only if the data “identifies, relates to, describes, is capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.” (CCPA § 1798.140(o)(1)).
  • Private right of action:
    • The amendments clarify that a consumer may bring an action under the CCPA only for a business’s alleged failure to “implement and maintain reasonable security procedures and practices” that results in a data breach. (CCPA § 1798.150(c)).
    • The bill removes the requirement that a consumer notify the AG once the consumer has brought an action against a business under the CCPA, and eliminates the AG’s ability to instruct a consumer to not proceed with an action. (CCPA § 1798.150(b)).
  • GLBA, DDPA, CIPA exemptions: The original text of the CCPA exempted information subject to the Gramm-Leach-Bliley Act (“GLBA”) and Driver’s Privacy Protection Act (“DPPA”), only to the extent the CCPA was “in conflict” with either statute. The bill removes the “in conflict” qualification and clarifies that data collected, processed, sold or disclosed pursuant to the GLBA, DPPA or the California Information Privacy Act is exempt from the CCPA’s requirements. The revisions also exempt such information from the CCPA’s private right of action provision. (CCPA §§ 1798.145(e), (f)).
  • Health information:
    • Health care providers: The bill adds an exemption for HIPAA-covered entities and providers of health care governed by the Confidentiality of Medical Information Act, “to the extent the provider or covered entity maintains patient information in the same manner as medical information or protected health information,” as described in the CCPA. (CCPA § 1798.145(c)(1)(B)).
    • PHI: The bill expands the category of exempted protected health information (“PHI”) governed by HIPAA and the Health Information Technology for Economic and Clinical Health Act to include PHI collected by both covered entities and business associates. The original text did not address business associates. (CCPA § 1798.145(c)(1)(A)).
    • Clinical trial data: The bill adds an exemption for “information collected as part of a clinical trial” that is subject to the Federal Policy for the Protection of Human Subjects (also known as the Common Rule) and is conducted in accordance with specified clinical practice guidelines. (CCPA § 1798.145(c)(1)(C)).
  • Notice of right of deletion: The original text of the CCPA stated that a business must disclose on its website or in its privacy policy a consumer’s right to request the deletion of her PI. The bill modifies this requirement, stating that a business must disclose the right to deletion “in a form that is reasonably accessible to consumers.” (CCPA § 1798.105(b)).
  • First Amendment protection: The bill adds a provision to the CCPA stating that the rights afforded to consumers and the obligations imposed on businesses under the CCPA do not apply if they “infringe on the noncommercial activities of a person or entity” as described in Art. I, Section 2(b) of the California Constitution, which addresses activities related to the free press. This provision is designed to prevent First Amendment challenges to the law. (CCPA § 1798.150(k)).
  • Preemption:
    • The bill adds to the CCPA’s preemption clause that the law will not apply in the event its application is preempted by, or in conflict with, the U.S. Constitution. The CCPA previously referenced only the California Constitution. (CCPA § 1798.196).
    • Certain provisions of the CCPA supersede and preempt laws adopted by local entities regarding the collection and sale of a consumer’s PI by a business. The bill makes such provisions of the Act operative on the date the bill becomes effective.

The California State Legislature is expected to consider more substantive changes to the law when it reconvenes in January 2019.

California AG Voices Concern About State’s New Privacy Law

On August 22, 2018, California Attorney General Xavier Becerra raised significant concerns regarding the recently enacted California Consumer Privacy Act of 2018 (“CCPA”) in a letter addressed to the CCPA’s sponsors, Assemblyman Ed Chau and Senator Robert Hertzberg. Writing to “reemphasize what [he] expressed previously to [them] and [state] legislative leaders and Governor Brown,” Attorney General Becerra highlighted what he described as five primary flaws that, if unresolved, will undermine the intention behind and effective enforcement of the CCPA.

Most of the issues Attorney General Becerra pointed to were those he claimed impose unnecessary and/or onerous obligations on the Attorney General’s Office (“AGO”). For example, the CCPA requires the AGO to provide opinions, warnings and an opportunity to cure to a business before the business can be held accountable for a CCPA violation. Attorney General Becerra said that this effectively requires the AGO to provide unlimited legal counsel to private parties at taxpayer expense, and creates a potential conflict of interest by requiring the AGO to advise parties who may be violating Californians’ privacy rights.

In a similar vein, Attorney General Becerra noted that the CCPA gives consumers a limited right to sue if they become victims of a data breach, but otherwise does not include a private right of action for consumers to seek remedies to protect their privacy. That framework, Attorney General Becerra wrote, substantially increases the AGO’s need for enforcement resources. Likewise, the CCPA requires private plaintiffs to notify the Attorney General before filing suit. Attorney General Becerra criticized this requirement as serving no purpose, since only courts may decide the merits of a case, and as a drain on personnel and administrative resources.

Attorney General Becerra also pointed out that the CCPA’s civil penalty provisions purport to amend and modify the Unfair Competition Law’s civil penalty provision. The latter, however, was enacted by voters through a ballot proposition and thus cannot be amended through legislation. For that reason, Attorney General Becerra argued, the CCPA’s civil penalty provision is likely unconstitutional (the letter noted that the AGO has offered “corrective language” that replaces the CCPA’s current penalty provision with a stand-alone enforcement proposition).

Additionally, Attorney General Becerra took issue with the CCPA’s provision giving the AGO one year to conduct rulemaking for the CCPA. Attorney General Becerra noted that the CCPA provides no resources for the AGO to carry out the rulemaking or to implement the law thereafter; the Attorney General called the existing deadline “simply unattainable.”

California Lawmakers Consider Additional Resources For Attorney General’s Privacy Act Regulations

As reported in BNA Privacy Law Watch, a California legislative proposal would allocate additional resources to the California Attorney General’s office to facilitate the development of regulations required under the recently enacted California Consumer Privacy Act of 2018 (“CCPA”). The CCPA was enacted in June 2018 and takes effect January 1, 2020. The CCPA requires the California Attorney General to issue certain regulations prior to the effective date, including, among others, (1) regulations updating the categories of data that constitute “personal information” under the CCPA, and (2) certain additional regulations governing compliance (such as how a business may verify a consumer’s request made pursuant to the CCPA). The proposal, which was presented in two budget bills, would allocate $700,000 and five staff positions to the California Attorney General’s office to aid in the development of the required regulations. The legislature is expected to pass the relevant funding measure by August 31, 2018. California Attorney General Xavier Becerra has stated that he expects his office will issue its final rules under the CCPA in June 2019.

Ohio Law Provides Safe Harbor from Tort Claims Related to Data Breaches

On August 3, 2018, Ohio Governor John Kasich signed into law Senate Bill 220 (the “Bill”), which provides covered entities with an affirmative defense to tort claims, based on Ohio law or brought in an Ohio court, that allege or relate to a failure to implement reasonable information security controls resulting in a data breach. According to the Bill, its purpose is “to be an incentive and to encourage businesses to achieve a higher level of cybersecurity through voluntary action.” The Bill will take effect 90 days after it is provided to the Ohio Secretary of State.

Brazil’s Senate Passes General Data Protection Law

This post has been updated. 

As reported by Mundie e Advogados, on July 10, 2018, Brazil’s Federal Senate approved a Data Protection Bill of Law (the “Bill”). The Bill, which is inspired by the EU General Data Protection Regulation (“GDPR”), is expected to be sent to the Brazilian President in the coming days.

As reported by Mattos Filho, Veiga Filho, Marrey Jr e Quiroga Advogados, the Bill establishes a comprehensive data protection regime in Brazil and imposes detailed rules for the collection, use, processing and storage of personal data, both electronic and physical.

Key requirements of the Bill include:

  • National Data Protection Authority. The Bill calls for the establishment of a national data protection authority which will be responsible for regulating data protection, supervising compliance with the Bill and enforcing sanctions.
  • Data Protection Officer. The Bill requires businesses to appoint a data protection officer.
  • Legal Basis for Data Processing. Similar to the GDPR, the Bill provides that the processing of personal data may only be carried out where there is a legal basis for the processing, which may include, among other bases, where the processing is (1) done with the consent of the data subject, (2) necessary for compliance with a legal or regulatory obligation, (3) necessary for the fulfillment of an agreement, or (4) necessary to meet the legitimate interest of the data controller or third parties. The legal basis for data processing must be registered and documented. Processing of sensitive data (including, among other data elements, health information, biometric information and genetic data) is subject to additional restrictions.
  • Consent Requirements. Where consent of the data subject is relied upon for processing personal data, consent must be provided in advance and must be free, informed and unequivocal, and provided for a specific purpose. Data subjects may revoke consent at any time.
  • Data Breach Notification. The Bill requires notification of data breaches to the data protection authority and, in some circumstances, to affected data subjects.
  • Privacy by Design and Privacy Impact Assessments. The Bill requires organizations to adopt data protection measures as part of the creation of new products or technologies. The data protection authority will be empowered to require a privacy impact assessment in certain circumstances.
  • Data Transfer Restrictions. The Bill places restrictions on cross-border transfers of personal data. Such transfers are allowed (1) to countries deemed by the data protection authority to provide an adequate level of data protection, and (2) where effectuated using standard contractual clauses or other mechanisms approved by the data protection authority.

Noncompliance with the Bill can result in fines of up to two percent of gross sales, limited to 50 million reais (approximately USD 12.9 million) per violation. The Bill will take effect 18 months after it is published in Brazil’s Federal Gazette.
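For illustration only, the Bill’s penalty ceiling can be sketched in a few lines of Python. The function and variable names below are ours, not the Bill’s, and this is a simplified reading of the cap as described above:

```python
# Illustrative sketch of the Bill's penalty ceiling: up to 2% of gross
# sales, capped at 50 million reais per violation. Names are hypothetical.

BRL_CAP = 50_000_000  # 50 million reais per violation


def max_fine_brl(gross_sales_brl: float) -> float:
    """Upper bound of a fine for a single violation under the Bill."""
    return min(0.02 * gross_sales_brl, BRL_CAP)


print(max_fine_brl(1_000_000_000))  # 2% of 1 billion = 20 million, under the cap
print(max_fine_brl(5_000_000_000))  # 2% of 5 billion would be 100 million, capped at 50 million
```

The cap means the two-percent figure only controls for businesses with gross sales below 2.5 billion reais; above that, the 50-million-reais ceiling applies.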

Update: The Bill was signed into law in mid-August and is expected to take effect in early 2020.

Kenya Considers Data Protection Bill

On July 3, 2018, a draft bill (the “Data Protection Bill”) was introduced that would establish a comprehensive data protection regime in Kenya. The Data Protection Bill would require “banks, telecommunications operators, utilities, private and public companies and individuals” to obtain data subjects’ consent before collecting and processing their personal data. The Data Protection Bill also would impose certain data security obligations related to the collection, processing and storage of data, and would place restrictions on third-party data transfers. Violations of the Data Protection Bill could result in fines of up to 500,000 shillings (USD 4,960) and a five-year prison term. According to BNA Privacy Law Watch, while the Data Protection Bill is a “private member’s bill,” the Kenyan government “is working on a separate data-protection policy and bill to be published this week,” with the goal of consolidating the two proposals.

California Consumer Privacy Act Signed, Introduces Key Privacy Requirements for Businesses

On June 28, 2018, the Governor of California signed AB 375, the California Consumer Privacy Act of 2018 (the “Act”). The Act introduces key privacy requirements for businesses, and was passed quickly by California lawmakers in an effort to remove a ballot initiative of the same name from the November 6, 2018, statewide ballot. We previously reported on the relevant ballot initiative. The Act will take effect January 1, 2020.

Key provisions of the Act include:

  • Applicability. The Act will apply to any for-profit business that (1) “does business in the state of California”; (2) collects consumers’ personal information (or on the behalf of which such information is collected) and that alone, or jointly with others, determines the purposes and means of the processing of consumers’ personal information; and (3) satisfies one or more of the following thresholds: (a) has annual gross revenues in excess of $25 million, (b) alone or in combination annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes, the personal information of 50,000 or more consumers, households or devices, or (c) derives 50 percent or more of its annual revenue from selling consumers’ personal information (collectively, “Covered Businesses”).
  • Definition of Personal Information. Personal information is defined broadly as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” This definition of personal information aligns more closely with the EU General Data Protection Regulation’s definition of personal data. The Act includes a list of enumerated examples of personal information, which includes, among other data elements, name, postal or email address, Social Security number, government-issued identification number, biometric data, Internet activity information and geolocation data, as well as “inferences drawn from any of the information identified” in this definition.
  • Right to Know
    • Upon a verifiable request from a California consumer, a Covered Business must disclose (1) the categories and specific pieces of personal information the business has collected about the consumer; (2) the categories of sources from which the personal information is collected; (3) the business or commercial purposes for collecting or selling personal information; and (4) the categories of third parties with whom the business shares personal information.
    • In addition, upon verifiable request, a business that sells personal information about a California consumer, or that discloses a consumer’s personal information for a business purpose, must disclose (1) the categories of personal information that the business sold about the consumer; (2) the categories of third parties to whom the personal information was sold (by category of personal information for each third party to whom the personal information was sold); and (3) the categories of personal information that the business disclosed about the consumer for a business purpose.
    • The above disclosures must be made within 45 days of receipt of the request using one of the prescribed methods specified in the Act. The disclosure must cover the 12-month period preceding the business’s receipt of the verifiable request. The 45-day time period may be extended when reasonably necessary, provided that the consumer receives notice of the extension within the first 45-day period. Importantly, the disclosures must be made in a “readily useable format that allows the consumer to transmit this information from one entity to another entity without hindrance.”
  • Exemption. Covered Businesses will not be required to make the disclosures described above to the extent the Covered Business discloses personal information to another entity pursuant to a written contract with such entity, provided the contract prohibits the recipient from selling the personal information, or retaining, using or disclosing the personal information for any purpose other than performance of services under the contract. In addition, the Act provides that a business is not liable for a service provider’s violation of the Act, provided that, at the time the business disclosed personal information to the service provider, the business had neither actual knowledge nor reason to believe that the service provider intended to commit such a violation.
  • Disclosures and Opt-Out. The Act will require Covered Businesses to provide notice to consumers of their rights under the Act (e.g., their right to opt out of the sale of their personal information), a list of the categories of personal information collected about consumers in the preceding 12 months, and, where applicable, that the Covered Business sells or discloses their personal information. If the Covered Business sells consumers’ personal information or discloses it to third parties for a business purpose, the notice must also include lists of the categories of personal information sold and disclosed about consumers, respectively. Covered Businesses will be required to make this disclosure in their online privacy notice. Covered Businesses must separately provide a clear and conspicuous link on their website that says, “Do Not Sell My Personal Information,” and provide consumers a mechanism to opt out of the sale of their personal information, a decision which the Covered Business must respect. Businesses also cannot discriminate against consumers who opt out of the sale of their personal information, but can offer financial incentives for the collection of personal information.
  • Specific Rules for Minors. If a business has actual knowledge that a consumer is less than 16 years of age, the Act prohibits the business from selling that consumer’s personal information unless (1) the consumer is at least 13 but less than 16 years of age and has affirmatively authorized the sale (i.e., opted in); or (2) the consumer is less than 13 years of age and the consumer’s parent or guardian has affirmatively authorized the sale.
  • Right to Deletion. The Act will require a business, upon verifiable request from a California consumer, to delete specified personal information that the business has collected about the consumer and direct any service providers to delete the consumer’s personal information. However, there are several enumerated exceptions to this deletion requirement. Specifically, a business or service provider is not required to comply with the consumer’s deletion request if it is necessary to maintain the consumer’s personal information to:
    • Complete the transaction for which the personal information was collected, provide a good or service requested by the consumer, or reasonably anticipated, within the context of a business’s ongoing business relationship with the consumer, or otherwise perform a contract with the consumer.
    • Detect security incidents; protect against malicious, deceptive, fraudulent or illegal activity; or prosecute those responsible for that activity.
    • Debug to identify and repair errors that impair existing intended functionality.
    • Exercise free speech, ensure the right of another consumer to exercise his or her right of free speech, or exercise another right provided for by law.
    • Comply with the California Electronic Communications Privacy Act.
    • Engage in public or peer-reviewed scientific, historical or statistical research in the public interest (when deletion of the information is likely to render impossible or seriously impair the achievement of such research) if the consumer has provided informed consent.
    • To enable solely internal uses that are reasonably aligned with the consumer’s expectations based on the consumer’s relationship with the business.
    • Comply with a legal obligation.
    • Otherwise use the consumer’s personal information, internally, in a lawful manner that is compatible with the context in which the consumer provided the information.
  • Enforcement
    • The Act is enforceable by the California Attorney General and authorizes a civil penalty up to $7,500 per violation.
    • The Act provides a private right of action only in connection with “certain unauthorized access and exfiltration, theft, or disclosure of a consumer’s nonencrypted or nonredacted personal information,” as defined in the state’s breach notification law, if the business failed “to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information.”
      • In this case, the consumer may bring an action to recover damages up to $750 per incident or actual damages, whichever is greater.
      • The statute also directs the court to consider certain factors when assessing the amount of statutory damages, including the nature, seriousness, persistence and willfulness of the defendant’s misconduct, the number of violations, the length of time over which the misconduct occurred, and the defendant’s assets, liabilities and net worth.

Prior to initiating any action against a business for statutory damages, a consumer must provide the business with 30 days’ written notice of the consumer’s allegations and, if within the 30 days the business cures the alleged violation and provides an express written statement that the violations have been cured, the consumer may not initiate an action for individual statutory damages or class-wide statutory damages. These limitations do not apply to actions initiated solely for actual pecuniary damages suffered as a result of the alleged violation.
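The Act’s three applicability thresholds (described under “Applicability” above) amount to a simple disjunctive test. As a rough sketch, assuming a for-profit business that does business in California and collects consumers’ personal information, the test could be expressed as follows. The function and parameter names are ours, purely for illustration:

```python
# Hypothetical sketch of the Act's three applicability thresholds.
# Assumes the business is for-profit, does business in California, and
# collects consumers' personal information; names are illustrative only.

def is_covered_business(annual_gross_revenue_usd: float,
                        records_bought_or_sold_per_year: int,
                        share_of_revenue_from_selling_pi: float) -> bool:
    """Return True if the business meets at least one of the Act's thresholds."""
    return (annual_gross_revenue_usd > 25_000_000          # (a) revenues in excess of $25M
            or records_bought_or_sold_per_year >= 50_000   # (b) PI of 50,000+ consumers/households/devices
            or share_of_revenue_from_selling_pi >= 0.50)   # (c) 50%+ of revenue from selling PI


print(is_covered_business(30_000_000, 10_000, 0.10))  # True: revenue threshold met
print(is_covered_business(5_000_000, 60_000, 0.10))   # True: records threshold met
print(is_covered_business(5_000_000, 10_000, 0.10))   # False: no threshold met
```

Note that the thresholds are alternatives, so a small business that trades heavily in personal information can be covered even with modest revenues.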

California Assembly Bill Aims to Avert State Ballot Initiative Related to Privacy

On June 21, 2018, California lawmakers introduced AB 375, the California Consumer Privacy Act of 2018 (the “Bill”). If enacted and signed by the Governor by June 28, 2018, the Bill would introduce key privacy requirements for businesses, but would also result in the removal of a ballot initiative of the same name from the November 6, 2018, statewide ballot. We previously reported on the relevant ballot initiative.

The Bill expands some of the requirements in the ballot initiative. For example, if enacted, the Bill would require businesses to disclose (e.g., in its Privacy Notice) the categories of personal information it collects about California consumers and the purposes for which that information is used. The Bill also would require businesses to disclose, upon a California consumer’s verifiable request, the categories and specific pieces of personal information it has collected about the consumer, as well as the business purposes for collecting or selling the information and the categories of third parties with whom it is shared. The Bill would require businesses to honor consumers’ requests to delete their data and to opt out of the sale of their personal information, and would prohibit a business from selling the personal information of a consumer under the age of 16 without explicit (i.e., opt-in) consent.

A significant difference between the Bill and the ballot initiative is that the Bill would give the California Attorney General exclusive authority to enforce most of its provisions (whereas the ballot initiative provides for a private right of action with statutory damages of up to $3,000 per violation). One exception would be that a private right of action would exist in the event of a data breach in which the California Attorney General declines to bring an action.

If enacted, the Bill would take effect January 1, 2020.

California Ballot Initiative to Establish Disclosure and Opt-Out Requirements for Consumers’ Personal Information

On November 6, 2018, California voters will consider a ballot initiative called the California Consumer Privacy Act (“the Act”). The Act is designed to give California residents (i.e., “consumers”) the right to request from businesses (see “Applicability” below) the categories of personal information the business has sold or disclosed to third parties, with some exceptions. The Act would also require businesses to disclose in their privacy notices consumers’ rights under the Act, as well as how consumers may opt out of the sale of their personal information if the business sells consumer personal information. Key provisions of the Act include:

  • Definition of Personal Information. Personal information is defined broadly as “information that identifies, relates to, describes, references, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or device.” The Act includes a list of enumerated examples of personal information, which includes, among other data elements, name, postal or email address, Social Security number, government-issued identification number, biometric data, Internet activity information and geolocation data.
  • Applicability. The Act would apply to any for-profit business that “does business in the state of California” and (1) has annual gross revenues in excess of $50 million; (2) annually sells, alone or in combination, the personal information of 100,000 or more consumers or devices; or (3) derives 50 percent or more of its annual revenue from selling consumers’ personal information (collectively, “Covered Businesses”).
  • Right to Know. The Act would require Covered Businesses to disclose, upon a verifiable request from a California consumer, the categories of personal information the business has collected about the consumer, as well as the categories of personal information sold and/or disclosed for a business purpose to third parties. The Act would also require Covered Businesses to identify (i.e., provide the name and contact information for) the third parties to whom the Covered Business has sold or disclosed, for a business purpose, consumers’ personal information. Covered Businesses would be required to comply with such requests free of charge within 45 days of receipt, and would be required to provide this information only once within a 12-month period.
  • Exemption. Based on a carve-out in the definition of “third party” (which is defined to exclude (1) “the business that collects personal information from consumers under this Act” or (2) “a person to whom the business discloses a consumer’s personal information for a business purpose pursuant to a written contract”), Covered Businesses would not be required to make the disclosures described above to the extent the Covered Business discloses personal information to another entity pursuant to a written contract with such entity, provided the contract prohibits the recipient from selling the personal information, or retaining, using or disclosing the personal information for any purpose other than performance of services under the contract.
  • Disclosures and Right to Opt Out. The Act would require Covered Businesses to provide notice to consumers of their rights under the Act, and, where applicable, that the Covered Business sells their personal information. If the Covered Business sells consumers’ personal information, the notice must disclose that fact and include that consumers have a right to opt out of the sale of their personal information. Covered Businesses would be required to make this disclosure in their online privacy notice and must separately provide a clear and conspicuous link on their website that says, “Do Not Sell My Personal Information” and provides an opt-out mechanism. If a consumer opts out, the Covered Business would be required to stop selling the consumers’ personal information unless the consumer expressly re-authorizes such sale.
  • Liability for Security Breaches. Pursuant to the Act, if a Covered Business suffers a “breach of the security of the system” (as defined in California’s breach notification law), the Covered Business may be held liable for a violation of the Act if the Covered Business “failed to implement and maintain reasonable security procedures and practices, appropriate to the nature of the information, to protect personal information.”
  • Enforcement. The Act would establish a private right of action and expressly provides that a violation of the Act establishes injury-in-fact without the need to show financial harm. The Act establishes maximum statutory damages of $3,000 per violation or actual damages, whichever is higher. Separately, the Act also would be enforceable by the California Attorney General and would authorize a civil penalty of up to $7,500 per violation. The Act also contains whistleblower enforcement provisions.

If passed, the Act would take effect November 7, 2018, but would “only apply to personal information collected or sold by a business on or after” August 7, 2019.

Chicago Introduces Data Protection Ordinance

Recently, the Personal Data Collection and Protection Ordinance (“the Ordinance”) was introduced to the Chicago City Council. The Ordinance would require businesses to (1) obtain prior opt-in consent from Chicago residents to use, disclose or sell their personal information, (2) notify affected Chicago residents and the City of Chicago in the event of a data breach, (3) register with the City of Chicago if they qualify as “data brokers,” (4) provide specific notification to mobile device users for location services and (5) obtain prior express consent to use geolocation data from mobile applications. 

Key provisions of the Ordinance include:

  • Opt-in Consent to Use and Share Personal Information. In order to use, disclose or sell the personal information of Chicago residents, website operators and online services providers must obtain prior opt-in consent from individuals. Upon request, businesses must disclose to the individual (or their designee) the personal information they maintain about the individual.
  • Security Breach Notification. The Ordinance also imposes breach notification obligations on businesses that process personal information of Chicago residents. Businesses are generally required to notify affected residents or, if they do not own the affected personal information, the data owners within 15 days of discovering the breach. Businesses must also notify the City of Chicago regarding the timing, content and distribution of the notices to individuals and number of affected individuals.
  • Data Broker Registration. Data brokers, defined as commercial entities that collect, assemble and possess personal information about Chicago residents who are not their customers or employees in order to trade in that information, must register with the City of Chicago. Data brokers must submit an annual report to the City, including, among other items, (1) the number of Chicago residents whose personal information the brokers collected in the previous year and (2) the name and nature of the businesses with which the brokers shared personal information.
  • Mobile Devices with Location Services Functionality. Retailers that sell or lease mobile devices with location services functionality must provide notice about the functionality in the form and substance prescribed by the Ordinance.
  • Location-enabled Mobile Applications. A mobile application operator must generally obtain an individual’s affirmative express consent before collecting, using, storing or disclosing that individual’s geolocation information. This requirement is subject to various exceptions, such as allowing a parent or guardian to locate a minor child in certain instances.

Depending on the requirement, the Ordinance allows for a private right of action and specifies fines to address violations.

Vietnam Approves New Cybersecurity Law

On June 12, 2018, Vietnam’s parliament approved a new cybersecurity law that contains data localization requirements, among other obligations. Technology companies doing business in the country will be required to operate a local office and store information about Vietnam-based users within the country. The law also requires social media companies to remove offensive content from their online service within 24 hours at the request of the Ministry of Information and Communications and the Ministry of Public Security’s cybersecurity task force. Companies could face substantial penalties for failure to disclose information upon governmental request. In addition, the law bans internet users in Vietnam from organizing people for anti-state purposes and imposes broad restrictions on using speech to distort the country’s history or achievements. As reported in BNA Privacy Law Watch, the law will take effect on January 1, 2019.

Louisiana Amends Data Breach Notification Law, Eliminates Fees for Security Freezes

Recently, Louisiana amended its Database Security Breach Notification Law (the “amended law”). Notably, the amended law (1) expands the definition of personal information and requires notice to affected Louisiana residents within 60 days, and (2) imposes data security and destruction requirements on covered entities. The amended law goes into effect on August 1, 2018.

Key breach notification provisions of the amended law include:

  • Definition of Personal Information: Under the amended law, “personal information” is now defined as a resident’s first name or first initial and last name together with one or more of the following data elements, when the name or the data element is not encrypted or redacted: (1) Social Security number; (2) driver’s license number or state identification card number; (3) account number, or credit or debit card number, together with any required security code, access code or password that would permit access to the individual’s financial account; (4) passport number; and (5) biometric data, such as fingerprints, voice prints, eye retina or iris, or other unique biological characteristic, that is used to authenticate the individual’s identity.
  • Timing: The amended law requires that notice must be made to affected residents in the most expedient time possible and without unreasonable delay, but no later than 60 days from the discovery of a breach. This timing requirement also applies to third parties who are required to notify the owner or licensee of the personal information of a breach.
  • Delays: Under the amended law, entities must provide written notification to the Louisiana Attorney General within the 60-day period if notification is delayed due to (1) the entity’s determination that “measures are necessary to determine the scope of the breach, prevent further disclosures and restore the reasonable integrity of the system” or (2) law enforcement’s determination that notification would impede a criminal investigation. The Attorney General will allow an extension after receiving a written explanation of the reasons for delay.
  • Substitute Notification: The amended law lowers the bar for substitute notifications in the form of emails, postings on the website and notifications to major statewide media. Specifically, substitute notifications are permitted if (1) the cost of providing notifications would exceed $100,000 (previously the threshold was $250,000); (2) the number of affected individuals exceeds 100,000 (previously the threshold was 500,000); or (3) the entity does not have sufficient contact information.
  • Harm Threshold Documentation: Notification is not required if the entity determines that there is no reasonable likelihood of harm to Louisiana residents. The amended law requires that this written determination and its supporting documentation be maintained for five years from the date the breach was discovered. The Attorney General may request the documentation.
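Taken together, the notification provisions above reduce to a few numeric thresholds. The sketch below is an illustrative, non-authoritative encoding of the amended law’s substitute-notice and harm-threshold tests; the function and parameter names are our own, not the statute’s, and only the statute itself is authoritative.

```python
# Illustrative sketch only; consult the amended Louisiana law for the
# authoritative rules. Names and structure are hypothetical.

def la_substitute_notice_permitted(notification_cost: float,
                                   affected_count: int,
                                   has_contact_info: bool) -> bool:
    """Substitute notice (email, website posting, statewide media) is
    permitted if any one of the three statutory conditions holds."""
    return (notification_cost > 100_000   # lowered from $250,000
            or affected_count > 100_000   # lowered from 500,000
            or not has_contact_info)

def la_notice_required(reasonable_likelihood_of_harm: bool) -> bool:
    """Notice is excused only if the entity determines (and documents in
    writing, retained for five years) that there is no reasonable
    likelihood of harm to Louisiana residents."""
    return reasonable_likelihood_of_harm
```

Because both the cost and headcount thresholds dropped, substitute notice is available in more breach scenarios under the amended law than before.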

Key data security and destruction provisions of the amended law include:

  • “Reasonable” Security Procedures: The amended law creates a new requirement that entities that conduct business in Louisiana or own or license computerized personal information about Louisiana residents must maintain “reasonable security procedures and practices” to protect personal information. In addition, the security procedures and practices must be “appropriate to the nature of the information.” The amended law does not describe specifically what practices would meet these standards.
  • Data Destruction Requirement: The amended law creates a new requirement that, when Louisiana residents’ personal information owned or licensed by a business is “no longer to be retained,” “all reasonable steps” must be taken to destroy it. For instance, the personal information must be shredded or erased, or the personal information must be otherwise modified to “make it unreadable or undecipherable.”

Separately, on May 15, 2018, SB127 was signed by the governor and took immediate effect. The bill prohibits credit reporting agencies from charging a fee for placing, reinstating, temporarily lifting or revoking a security freeze.

Eleventh Circuit Vacates FTC Data Security Order

On June 6, 2018, the U.S. Court of Appeals for the Eleventh Circuit vacated a 2016 Federal Trade Commission (“FTC”) order compelling LabMD to implement a “comprehensive information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from or about consumers.” The Eleventh Circuit agreed with LabMD that the FTC order was unenforceable because it did not direct the company to stop any “unfair act or practice” within the meaning of Section 5(a) of the Federal Trade Commission Act (the “FTC Act”).

The case stems from allegations that LabMD, a now-defunct clinical laboratory for physicians, failed to protect the sensitive personal information (including medical information) of consumers, resulting in two specific security incidents. One such incident occurred when a third party informed LabMD that an insurance-related report, which contained personal information of approximately 9,300 LabMD clients (including names, dates of birth and Social Security numbers), was available on a peer-to-peer (“P2P”) file-sharing network.

Following an FTC appeal process, the FTC ordered LabMD to implement a comprehensive information security program that included:

  • designated employees accountable for the program;
  • identification of material internal and external risks to the security, confidentiality and integrity of personal information;
  • reasonable safeguards to control identified risks;
  • reasonable steps to select service providers capable of safeguarding personal information, and requiring them by contract to do so; and
  • ongoing evaluation and adjustment of the program.

In its petition for review of the FTC order, LabMD asked the Eleventh Circuit to decide (1) whether its alleged failure to implement reasonable data security practices constituted an unfair practice within the meaning of Section 5 of the FTC Act and (2) whether the FTC’s order was enforceable if it did not direct LabMD to stop committing any specific unfair act or practice.

The Eleventh Circuit assumed, for purposes of its ruling, that LabMD’s failure to implement a reasonably designed data-security program constituted an unfair act or practice within the meaning of Section 5 of the FTC Act. However, the court held that the FTC’s cease and desist order, which was predicated on LabMD’s general negligent failure to act, was not enforceable. The court noted that the prohibitions contained in the FTC’s cease and desist orders and injunctions “must be stated with clarity and precision”; otherwise, they may be unenforceable. The court found that in LabMD’s case, the cease and desist order contained neither prohibitions nor instructions to the company to stop a specific act or practice. Rather, the FTC “command[ed] LabMD to overhaul and replace its data-security program to meet an indeterminable standard of reasonableness.” The court took issue with the FTC’s scheme of “micromanaging,” and concluded that the cease and desist order “mandate[d] a complete overhaul of LabMD’s data-security program and [said] precious little about how this [was] to be accomplished.” The court also noted that the FTC’s prescription was “a scheme Congress could not have envisioned.”

Department of Energy Announces New Efforts in Energy Sector Cybersecurity

On May 14, 2018, the Department of Energy (“DOE”) Office of Electricity Delivery & Energy Reliability released its Multiyear Plan for Energy Sector Cybersecurity (the “Plan”). The Plan is significantly guided by DOE’s 2006 Roadmap to Secure Control Systems in the Energy Sector and 2011 Roadmap to Achieve Energy Delivery Systems Cybersecurity. Taken together with DOE’s recent announcement creating the new Office of Cybersecurity, Energy Security, and Emergency Response (“CESER”), DOE is clearly asserting its position as the energy sector’s Congressionally-recognized sector-specific agency (“SSA”) on cybersecurity.

Multiyear Plan for Energy Sector Cybersecurity

Under development over the last year, the Plan aligns with President Trump’s Executive Order 13800, which calls on the government to engage with critical infrastructure owners and operators to identify authorities and capabilities that agencies could employ to support critical infrastructure cybersecurity. To this end, the Plan lays out DOE’s integrated strategy to reduce cyber risks to the U.S. energy sector. The Plan seeks to leverage strong partnerships with the private sector to: (1) strengthen today’s cyber systems and risk management capabilities and (2) develop innovative solutions for tomorrow’s inherently secure and resilient systems. It identifies three goals to accomplish these priorities: (1) strengthen energy sector cybersecurity preparedness, (2) coordinate incident response and recovery and (3) accelerate game-changing research, development and demonstration of resilient delivery systems.

Office of Cybersecurity, Energy Security, and Emergency Response

Featured heavily in the Plan is CESER, which was announced by DOE Secretary Perry on February 14, 2018. The announcement stated that CESER would be led by an Assistant Secretary, whom the Administration has yet to nominate, and that President Trump’s FY 19 budget requested $96 million for the new office.

DOE Undersecretary Mark Menezes testified to Congress that “initially, the office will be comprised of the work we currently do” under existing programs. Indeed, DOE’s FY 19 budget request indicates that CESER will be formed from existing reliability programs in the Office of Electricity Delivery & Energy Reliability, which will be renamed the Office of Electricity Delivery (“OE”). OE will maintain the Transmission Reliability, Resilient Distribution Systems, Energy Storage, and Transmission Permitting and Technical Assistance programs, while CESER will inherit the Cybersecurity for Energy Delivery Systems (“CEDS”) program, currently led by Deputy Assistant Secretary Henry S. Kenchington, and the Infrastructure Security and Energy Restoration (“ISER”) program, currently headed by Deputy Assistant Secretary Devon Streit.

CEDS forms the core of DOE’s work on energy sector cybersecurity and aligns with the Plan’s goals of increasing energy cyber preparedness and developing new cybersecurity technologies. Besides conducting cybersecurity research and development, CEDS also oversees DOE’s primary programs for sharing cybersecurity information with the private sector. This includes the Cybersecurity Risk Information Sharing Program (“CRISP”), which facilitates timely bi-directional sharing of cyber threat information in order to monitor energy sector IT networks. At present, 75% of U.S. electric utilities participate in CRISP. CEDS also includes the Cybersecurity for Operational Technology Environment (“CYOTE”) pilot project, which applies lessons learned from CRISP to monitor operating technology (“OT”) networks. According to the budget request, DOE intends to improve both CRISP and CYOTE by integrating utility data into the Intelligence Community environment to enhance threat information. The request also states that DOE will create a new “Advanced Industrial Control System Analysis Center” within CEDS that will “span the DOE laboratory network and work in collaboration with private sector partners to use the analysis of energy sector supply chain component and model impacts to address system threats and vulnerabilities through technical solutions, share information about findings, and develop mitigation and response solutions.”

ISER provides technical expertise on supporting resiliency of critical infrastructure assets key to energy sector operation and addresses the Plan’s goal of coordinating incident response. ISER’s focus is operational and spans all hazards facing the energy sector. However, the DOE budget notes that in the next fiscal year, ISER will “build out its effective, timely, and coordinated cyber incident management capability” and “envisions” forming a team of at least six cyber energy responders to support incident response within the energy sector.

DOE’s Emerging Role in Energy Sector Cybersecurity

Under the Trump Administration, DOE is moving cybersecurity higher on the Department’s agenda. To be sure, the Plan and CESER are a reshuffling of already-existing resources rather than entirely new programs. But it is clear that DOE is intent on flexing its position under the Fixing America’s Surface Transportation Act (“FAST Act”) to act as the energy sector SSA on cybersecurity.

DOE’s efforts come as the Department of Homeland Security (“DHS”) is also increasing its profile on cybersecurity. Utilizing authority under the Cybersecurity Information Sharing Act, passed just weeks after the FAST Act in 2015, DHS has designated its National Cybersecurity and Communications Integration Center (“NCCIC”) as a certified portal to accept cybersecurity information. As such, entities enjoy liability protection for sharing cybersecurity information with the NCCIC, through programs like Automated Indicator Sharing (“AIS”) and the even more robust Cyber Information Sharing and Collaboration Program (“CISCP”).

Those within the energy sector can utilize both DOE’s and DHS’s information sharing programs to strengthen their cybersecurity. Coordination with the NCCIC and sharing through AIS or CISCP provides access to the government’s cross-sectoral cybersecurity activities, though reports indicate that businesses have been slow to adopt AIS. Tailored specifically to electricity, DOE’s CRISP and CYOTE programs represent a more specialized package of information sharing, particularly appropriate for electricity sub-sector stakeholders.

DHS and DOE can be expected to continue asserting jurisdictional claims over cybersecurity issues. Hopefully, this will represent little more than the traditional rivalry between government agencies, and result in complementary rather than competing federal cybersecurity programs.

Arizona Amends Data Breach Notification Law

On April 11, 2018, Arizona amended its data breach notification law (the “amended law”). The amended law will require persons, companies and government agencies doing business in the state to notify affected individuals within 45 days of determining that a breach has resulted in or is reasonably likely to result in substantial economic loss to affected individuals. The prior law required only that notification be made “in the most expedient manner possible and without unreasonable delay.” The amended law also broadens the definition of personal information and requires regulatory notice and notice to the consumer reporting agencies (“CRAs”) under certain circumstances.

Key provisions of the amended law include:

  • Definition of Personal Information. Under the amended law, the definition of “personal information” now includes an individual’s first name or initial and last name in combination with one or more of the following “specified data elements:” (1) Social Security number; (2) driver’s license or non-operating license number; (3) a private key that is unique to an individual and that is used to authenticate or sign an electronic record; (4) financial account number or credit or debit card number in combination with any required security code, access code or password that would allow access to the individual’s financial account; (5) health insurance identification number; (6) medical or mental health treatment information or diagnoses by a health care professional; (7) passport number; (8) taxpayer identification or identity protection personal identification number issued by the Internal Revenue Service; and (9) unique biometric data generated from a measurement or analysis of human body characteristics to authenticate an individual when the individual accesses an online account. The amended law also defines “personal information” to include “an individual’s user name or e-mail address, in combination with a password or security question and answer, which allows access to an online account.”
  • Harm Threshold. Pursuant to the amended law, notification to affected individuals, the Attorney General and the CRAs is not required if the breach has not resulted in, and is not reasonably likely to result in, substantial economic loss to affected individuals.
  • Notice to the Attorney General and Consumer Reporting Agencies. If the breach requires notification to more than 1,000 individuals, notification must also be made to the Attorney General and the three largest nationwide CRAs.
  • Timing. Notifications to affected individuals, the Attorney General and the CRAs must be issued within 45 days of determining that a breach has occurred.
  • Substitute Notice. Where the cost of making notifications would exceed $50,000, the affected group exceeds 100,000 individuals, or there is insufficient contact information for notice, the amended law now requires that substitute notice be made by (1) sending a written letter to the Attorney General demonstrating the facts necessary for substitute notice and (2) conspicuously posting the notice on the breached entity’s website for at least 45 days. Under the amended law, substitute notice no longer requires email notice to affected individuals and notification to major statewide media.
  • Penalty Cap. The Attorney General may impose up to $500,000 in civil penalties for knowing and willful violations of the law in relation to a breach or series of related breaches. The Attorney General is also entitled to recover restitution for affected individuals.
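As a rough illustration, the Arizona triggers above can be sketched in the same spirit. The names and structure below are hypothetical; the statute alone is authoritative.

```python
# Illustrative, non-authoritative sketch of the amended Arizona law's
# notification triggers; names are hypothetical.

AZ_DEADLINE_DAYS = 45        # notify individuals, AG and CRAs within 45 days
AZ_PENALTY_CAP = 500_000     # civil penalty cap for willful violations

def az_must_notify_regulators(affected_count: int) -> bool:
    """Breaches requiring notice to more than 1,000 individuals also
    require notice to the Attorney General and the three largest CRAs."""
    return affected_count > 1_000

def az_substitute_notice_permitted(notification_cost: float,
                                   affected_count: int,
                                   has_contact_info: bool) -> bool:
    """Substitute notice is available above the cost or headcount
    thresholds, or when contact information is insufficient."""
    return (notification_cost > 50_000
            or affected_count > 100_000
            or not has_contact_info)
```

Note that Arizona’s substitute-notice cost threshold ($50,000) is half of Louisiana’s amended threshold, so the two regimes diverge for mid-sized breaches.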

St. Kitts and Nevis Pass the Data Protection Bill 2018

On May 4, 2018, St. Kitts and Nevis’ legislators passed the Data Protection Bill 2018 (the “Bill”). The Bill was passed to promote the protection of personal data processed by public and private bodies.

Attorney General the Honourable Vincent Byron explained that the Bill is largely derived from the Organization of Eastern Caribbean States model and “seeks to ensure that personal information in the custody or control of an organization, whether it be a public group like the government, or private organization, shall not be disclosed, processed or used other than the purpose for which it was collected, except with the consent of the individual or where exemptions are clearly defined.”

Read more about the Bill.

Canada Will Require Breach Notification November 1

The Canadian government recently published a cabinet order stating that the effective date for breach notification provisions in the Digital Privacy Act would be November 1, 2018. At that time, businesses that experience a “breach of security safeguards” will be required to notify affected individuals, as well as the Privacy Commissioner and any other organization or government institution that might be able to reduce the risk of harm resulting from the breach.

In a separate cabinet order, the Canadian government said it would finalize the Breach of Security Safeguards Regulations, which were published in draft form in September 2017.

Canada has had mandatory breach notification regulations at the provincial level, and many companies have also voluntarily reported breaches to the federal Privacy Commissioner, so most organizations should be well-equipped to meet the November 1 compliance deadline.

GSA to Upgrade Cybersecurity Requirements

Recently, the General Services Administration (“GSA”) announced its plan to upgrade its cybersecurity requirements in an effort to build upon the Department of Defense’s new cybersecurity requirements, DFARS 252.204-7012, which became effective on December 31, 2017.

The first proposed rule, GSAR Case 2016-G511 “Information and Information Systems Security,” will require that federal contractors “protect the confidentiality, integrity and availability of unclassified GSA information and information systems from cybersecurity vulnerabilities and threats in accordance with the Federal Information Security Modernization Act of 2014 and associated Federal cybersecurity requirements.” The proposed rule will apply to “internal contractor systems, external contractor systems, cloud systems and mobile systems.” It will mandate compliance with applicable controls and standards, such as those of the National Institute of Standards and Technology, and will update existing GSAR clauses 552.239-70 and 552.239-71, which address data security issues. Contracting officers will be required to include these cybersecurity requirements into their statements of work. The proposed rule is scheduled to be released in April 2018. Thereafter, the public will have 60 days to offer comments.

The second proposed rule, GSAR Case 2016-G515 “Cyber Incident Reporting,” will “update requirements for GSA contractors to report cyber incidents that could potentially affect GSA or its customer agencies.” Specifically, contractors will be required to report any cyber incident “where the confidentiality, integrity or availability of GSA information or information systems are potentially compromised.” The proposed rule will establish a timeframe for reporting cyber incidents, detail what the report must contain and provide points of contact for filing the report. The proposed rule is intended to update the existing cyber reporting policy within GSA Order CIO-9297.2 that did not previously undergo the rulemaking process. Additionally, the proposed rule will establish requirements for contractors to preserve images of affected systems and impose training requirements for contractor employees. The proposed rule is scheduled to be released in August 2018, and the public will have 60 days to comment on the proposed rule.

Although the proposed rules have not yet been published, it is anticipated that they will share similarities with the Department of Defense’s new cybersecurity requirements, DFARS 252.204-7012.

UK Court of Appeal Rules DRIPA Inconsistent with EU Law

On January 30, 2018, the UK Court of Appeal ruled that the Data Retention and Investigatory Powers Act (“DRIPA”) was inconsistent with EU law. The judgment, pertaining to the now-expired act, is relevant to current UK surveillance practices and is likely to result in major amendments to the Investigatory Powers Act (“IP Act”), the successor of DRIPA.

In the instant case, the Court of Appeal ruled that DRIPA was inconsistent with EU law as it permitted access to communications data when the objective was not restricted solely to fighting serious crime. Additionally, the Court held that DRIPA lacked adequate safeguards since it permitted access to communications data without subjecting such access to a prior review by a court or independent administrative authority. The ruling follows the judgment of the Court of Justice of the European Union (“CJEU”), to which the Court of Appeal referred questions regarding the instant case in 2015.

The IP Act, which came into force in 2017, largely replicates and further expands upon the powers contained in DRIPA. Though the present judgment does not change the way UK law enforcement agencies can currently access communications data for the detection and disruption of crime under the IP Act, the UK government is currently facing a separate case challenging the IP Act in the High Court, due to be heard in February 2018.

Reacting to the 2016 ruling of the CJEU, the UK government in late 2017 published a consultation document and proposed amendments to the IP Act which aimed to address the judgment of the CJEU. The proposed changes were deemed to fall short of the CJEU ruling by Liberty, the UK human rights organization bringing the proceedings against the IP Act in the High Court.

The present case and the future ruling of the High Court on the IP Act could impact the UK significantly when Brexit negotiations turn to discussions on adequacy and data sharing between the UK and the EU. UK surveillance legislation that is incompatible with EU data protection law could bring a halt to data flows between EU and UK law enforcement agencies and organizations.

How to Minimize Leaking

I am hopeful that President Trump will not block release of the remaining classified documents addressing the 1963 assassination of President John F. Kennedy. I grew up a Roman Catholic in Massachusetts, so President Kennedy always fascinated me.

The 1991 Oliver Stone movie JFK fueled several years of hobbyist research into the assassination. (It's unfortunate the movie was so loaded with fictional content!) On the 30th anniversary of JFK's death in 1993, I led a moment of silence from the balcony of the Air Force Academy chow hall during noon meal. While stationed at Goodfellow AFB in Texas, Mrs B and I visited Dealey Plaza in Dallas and the Sixth Floor Museum.

Many years later, thanks to a 1992 law partially inspired by the Stone movie, the government has a chance to release the last classified assassination records. As a historian and former member of the intelligence community, I hope all of the documents become public. This would be a small but significant step towards minimizing the culture of information leaking in Washington, DC. If prospective leakers were part of a system that was known for releasing classified information prudently, regularly, and efficiently, it would decrease the leakers' motivation to evade the formal declassification process.

Many smart people have recommended improvements to the classification system. Check out this 2012 report for details.

Indian Supreme Court Holds That Privacy Is a Fundamental Right

Stephen Mathias of the law firm Kochhar & Co. reports from India that in a landmark judgment delivered in August 2017, the Supreme Court of India (“Court”) unanimously held that the right to privacy is a fundamental right under the Constitution of India. The Court also delivered six separate concurring judgments, with the main judgment being delivered by four of the nine judges.

The Court held that the right to privacy is part of the right to life and on the same footing as the right to human dignity. The Court has previously held that many other rights are part of the right to life. It also held that the right to privacy is part of other fundamental rights, including the “freedom rights” (e.g., right to speech, movement, etc.).

Constitutional law experts have hailed the judgment as perhaps the most significant delivered by the Court in decades, as the Court looked more deeply at how the right to privacy could apply. For example, the Court frowned on a recent two-judge bench judgment that had upheld criminal punishment for gay sex. One of the concurring judgments also stated that laws that seek to regulate what people eat could infringe on privacy. This is relevant to pending cases challenging bans on cow slaughter in some states in India. More importantly, the nine-judge bench appears to have used its rare large bench strength to shift Indian jurisprudence toward greater protection of individual rights and away from deference to state power.

India is one of the last remaining constitutional democracies not to have a comprehensive privacy law. The judgment sets India on a path to having such a privacy law—the Government announced midway through the hearings that it had appointed a committee to draft a new privacy law. The panel is headed by a retired High Court judge, but consists mostly of bureaucrats and technology academics. The head of the Data Security Council of India is a member, but there is only one lawyer on the committee. The committee head has, however, announced that he intends to consult experts and hold public hearings.

India has long had relatively weak privacy laws, but now has an opportunity to draft a modern, well-balanced and comprehensive privacy law. As the world’s largest offshoring destination, India processes a substantial share of the world’s data, and comprehensive privacy legislation would surely be a key asset in the world of outsourcing.

Stephen Mathias co-chairs the Technology Law Practice of Kochhar & Co., a leading law firm in India that does substantial work in the privacy space.

Colombia Designates U.S. as “Adequate” Data Transfer Nation

On August 14, 2017, the Colombian Superintendence of Industry and Commerce (“SIC”) announced that it was adding the United States to its list of nations that provide an adequate level of protection for the transfer of personal information, according to a report from Bloomberg BNA. The SIC, along with the Superintendence of Finance, is Colombia’s data protection authority, and is responsible for enforcing Colombia’s data protection law. Under Colombian law, transfers of personal information to countries that are deemed to have laws providing an adequate level of protection are subject to less stringent restrictions (for example, prior consent for certain international transfers of personal information may not be required if a country’s protections are deemed adequate). This development should help facilitate the transfer of personal information from Colombia to the United States.

Washington Becomes Third State to Enact Biometric Privacy Law

On May 16, 2017, the Governor of the State of Washington, Jay Inslee, signed into law House Bill 1493 (“H.B. 1493”), which sets forth requirements for businesses that collect and use biometric identifiers for commercial purposes. The law will become effective on July 23, 2017. With the enactment of H.B. 1493, Washington becomes the third state to pass legislation regulating the commercial use of biometric identifiers. Previously, Illinois and Texas enacted the Illinois Biometric Information Privacy Act (740 ILCS 14) (“BIPA”) and the Texas Statute on the Capture or Use of Biometric Identifier (Tex. Bus. & Com. Code Ann. §503.001), respectively.

H.B. 1493 defines “biometric identifier” as data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises or other unique biological patterns or characteristics that are used to identify a specific individual. Interestingly, unlike the Illinois and Texas statutes, H.B. 1493’s definition of “biometric identifier” does not reference a record or scan of face geometry (i.e., facial recognition data). The definition also explicitly excludes “physical or digital photographs, video or audio recording or data generated therefrom,” and certain health-related data processed pursuant to the Health Insurance Portability and Accountability Act of 1996. Notably, several putative class action lawsuits have been filed against social networking sites, such as Shutterfly, for allegedly using facial recognition technology to scan users’ uploaded photographs in violation of BIPA’s notice and consent requirements. Although it is unclear whether H.B. 1493 covers scans of face geometry, the lack of explicit inclusion of such data may be a response to such lawsuits.

Pursuant to H.B. 1493, a person may not “enroll” a biometric identifier in a database for a commercial purpose without first providing notice, obtaining consent or providing a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose. In contrast to the Illinois and Texas statutes, which broadly regulate the capture (or, in the case of BIPA, the possession) of biometric identifiers, Washington’s statute is limited to those persons that “enroll” biometric identifiers by capturing the data, converting it into a reference template that cannot be reconstructed into the original output image, and storing it in a database that matches the biometric identifier to a specific individual. Notably, the statute’s limitations on disclosure and retention of biometric identifiers do not apply to biometric identifiers that have been “unenrolled.”

H.B. 1493 contains detailed requirements governing the enrollment of biometric identifiers for a commercial purpose, as well as the subsequent disclosure of such data. In particular:

  • The statute makes it clear that the notice required under the law is separate from, and is not considered, “affirmative consent.”
  • Unlike BIPA, which explicitly requires a written release from the subject before obtaining his or her biometric identifier, H.B. 1493 broadly states that the exact notice and type of consent required to achieve compliance is “context-dependent.” The notice must be given through a procedure reasonably designed to be readily available to affected individuals.
  • A person who enrolls a biometric identifier for a commercial purpose or obtains a biometric identifier from a third party for a commercial purpose may not use or disclose it in a manner that is materially inconsistent with the terms under which the biometric identifier was originally provided without obtaining consent for the new use or disclosure.
  • Unless consent has been obtained, a person who has enrolled an individual’s biometric identifier may not sell, lease or otherwise disclose the biometric identifier to another person for a commercial purpose unless one of certain enumerated statutory exceptions applies, including: (1) where necessary to provide a product or service requested by the individual; or (2) where disclosed to a third party who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose that is inconsistent with the notice and consent provided.
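The enrollment and disclosure conditions above might be caricatured as follows. This is a hypothetical sketch only: H.B. 1493’s notice and consent standards are expressly “context-dependent” and cannot truly be reduced to booleans, and all names below are ours, not the statute’s.

```python
# Hypothetical sketch of H.B. 1493's enrollment/disclosure conditions;
# the statute's context-dependent standards are the only authority.

def may_enroll(notice_given: bool, consent_obtained: bool,
               prevention_mechanism: bool) -> bool:
    """Enrollment for a commercial purpose requires prior notice, consent,
    or a mechanism preventing subsequent commercial use."""
    return notice_given or consent_obtained or prevention_mechanism

def may_disclose(consent_obtained: bool,
                 statutory_exception_applies: bool) -> bool:
    """Sale, lease or other disclosure for a commercial purpose requires
    consent or an enumerated statutory exception (e.g., necessary to
    provide a requested product, or contractual non-redisclosure)."""
    return consent_obtained or statutory_exception_applies
```

The disjunctive structure is itself notable: unlike BIPA’s written-release requirement, Washington lets any one of the three enrollment safeguards suffice.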

Importantly, unlike the Illinois and Texas statutes, H.B. 1493 contains a broad “security exception,” exempting those persons that collect, capture, enroll or store biometric identifiers in furtherance of a “security purpose.”

Similar to the Illinois and Texas statutes, H.B. 1493 also contains data security and retention requirements. In particular, the statute requires (1) reasonable care to guard against unauthorized access to and acquisition of biometric identifiers and (2) retention of biometric identifiers for no longer than necessary to comply with the law, protect against fraud, criminal activity, security threats or liability, or to provide the service for which the biometric identifier was enrolled.

As with the Texas biometric law, H.B. 1493 does not create a private right of action to allow for suits by individual plaintiffs. Instead, only the Washington Attorney General can enforce the requirements. The Illinois biometric law currently is the only state biometric statute that includes a private right of action.

Although Washington is only the third state to enact a biometric privacy law, several other states are considering similar legislation as the commercial collection and use of biometric identifiers becomes more commonplace.

Amended Oregon Law Reinforces Importance of Adhering to Privacy Policies

On May 25, 2017, Oregon Governor Kate Brown signed into law H.B. 2090, which updates Oregon’s Unlawful Trade Practices Act by holding companies liable for making misrepresentations on their websites (e.g., in privacy policies) or in their consumer agreements about how they will use, disclose, collect, maintain, delete or dispose of consumer information. Pursuant to H.B. 2090, a company engages in an unlawful trade practice if it makes assertions to consumers regarding the handling of their information that are materially inconsistent with its actual practices. Consumers can report violations to the Oregon Attorney General’s consumer complaint hotline. H.B. 2090 reinforces the significance of carefully drafting clear, accurate privacy policies and complying with those policies’ provisions.

CNIL Unveils 2017 Inspection Program and 2016 Annual Activity Report

On March 28, 2017, the French Data Protection Authority (“CNIL”) published its Annual Activity Report for 2016 (the “Report”) and released its annual inspection program for 2017.

The Report presents the CNIL’s main accomplishments in 2016 and highlights its diverse activity at both the national and EU levels, marked by the adoption of two major pieces of legislation, namely:

  • The EU General Data Protection Regulation (“GDPR”), which imposes new accountability obligations, including the obligation to (1) keep records of data processing activities, (2) notify data breaches and (3) in some cases, appoint a data protection officer. The CNIL estimates that the GDPR will lead to the appointment of a data protection officer in at least 80,000 to 100,000 organizations in France.
  • The French Law of October 7, 2016 for a Digital Republic, which created new data protection rights, such as (1) the right for individuals to give instructions relating to the storage, erasure and disclosure of their personal data after their death, (2) the right to be forgotten for minors and (3) the possibility to exercise data protection rights by electronic means. This legislation strengthens the transparency requirements and increases the maximum level of fines from €150,000 to €3 million for data protection infringements.

Against that background, the Report highlights that the CNIL received a high number of complaints in 2016 (7,703 complaints, just below the record 7,900 received in 2015). These complaints mainly concerned the following issues or sectors:

  • dissemination of personal data on the Internet (e.g., blogs, websites or social networks), and in particular, the erasure or rectification of personal data (33 percent of complaints). The Report emphasizes that the CNIL received a total of 410 complaints following delisting refusals from search engines;
  • marketing issues, and in particular, direct marketing by email, telephone or regular mail (33 percent of complaints);
  • human resources issues such as excessive video surveillance and refusal to grant access to the employee file (14 percent of complaints);
  • bank and credit issues such as failure to cancel the registration in the National Database on Household Credit Repayment Incidents (9 percent of complaints); and
  • health and social sector issues such as difficulties accessing medical or social records, and the creation of pharmaceutical records without consent (3 percent of complaints).

The Report further presents the first results of the inspections conducted by the CNIL in 2016 (i.e., 430 inspections, including 100 conducted remotely). The CNIL announced that the inspections for 2017 will focus on the following topics:

  • confidentiality of health data processed by insurance companies;
  • files of French intelligence services; and
  • smart TVs.

Finally, the Report outlines some of the topics that the CNIL will further consider in 2017, including algorithms and the place of citizens in smart cities.

Australia Enacts New Data Breach Notification Law

On February 13, 2017, the Parliament of Australia passed legislation that amends the Privacy Act of 1988 (the “Privacy Act”) and requires companies with revenue over $3 million AUD ($2.3 million USD) to notify affected Australian residents and the Australian Information Commissioner (the “Commissioner”) in the event of an “eligible data breach.”

The Privacy Act defines “personal information” to include “information or an opinion about an identified individual, or an individual who is reasonably identifiable (1) whether the information or opinion is true or not; and (2) whether the information or opinion is recorded in a material form or not.”

The new legislation includes a harm threshold for determining what constitutes an “eligible data breach,” which is defined as occurring when:

  • (1) “there is unauthorized access to, or unauthorized disclosure of, the [personal] information” and (2) “a reasonable person would conclude that the access or disclosure would be likely to result in serious harm to any of the individuals to whom the information relates”; or
  • “the information is lost in circumstances where:
    • (1) unauthorized access to, or unauthorized disclosure of, the information is likely to occur; and
    • (2) assuming that unauthorized access to, or unauthorized disclosure of, the information were to occur, a reasonable person would conclude that the access or disclosure would be likely to result in serious harm to any of the individuals to whom the information relates.”
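The two-limb harm threshold above is essentially a pair of conditional tests, which can be modeled in a few lines. The function name and parameters below are illustrative only, not terms from the legislation.

```python
def is_eligible_data_breach(access_or_disclosure: bool,
                            information_lost: bool,
                            access_likely_if_lost: bool,
                            serious_harm_likely: bool) -> bool:
    """Hypothetical model of the 'eligible data breach' definition."""
    # Limb 1: unauthorized access or disclosure that a reasonable person
    # would conclude is likely to result in serious harm.
    if access_or_disclosure and serious_harm_likely:
        return True
    # Limb 2: information is lost, unauthorized access or disclosure is
    # likely to occur, and serious harm would then be likely.
    return information_lost and access_likely_if_lost and serious_harm_likely

# A lost, strongly encrypted laptop where access is unlikely: not eligible.
print(is_eligible_data_breach(False, True, False, True))   # False
# A hack exposing health records where serious harm is likely: eligible.
print(is_eligible_data_breach(True, False, False, True))   # True
```

Note that under both limbs the “serious harm” assessment is the gating factor; a breach without a likelihood of serious harm does not trigger notification.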

The new legislation does not define “serious harm,” but an official explanatory memorandum states that serious harm could include “serious physical, psychological, emotional, economic and financial harm, as well as serious harm to reputation.” In determining whether serious harm has occurred, entities may consider the sensitivity of the information involved, the kind of person who might gain access to the information and the nature of the harm that may result from the breach.

The explanatory memorandum lists the following examples of breaches that may require notification:

  • a malicious breach of the secure storage and handling of information (e.g., in a cybersecurity incident);
  • accidental loss (most commonly of IT equipment or hard copy documents); and
  • negligent or improper disclosure of information.

Pursuant to the new legislation, if an entity suspects that an eligible data breach has occurred, it must take “all reasonable steps to ensure” that it completes an assessment of the incident within 30 days following discovery. This is not a hard deadline, but a preferable timeframe that may be adjusted depending on the complexity of the incident. If the assessment determines that an eligible data breach has occurred, entities must notify the Commissioner and affected individuals “as soon as practicable.”

Notification to both the Commissioner and affected individuals must include:

  • the identity and contact details of the entity;
  • a description of the serious data breach;
  • the kinds of information possibly breached; and
  • recommendations about the steps that individuals should take in response to the serious data breach.

The explanatory memorandum states that an entity may notify affected individuals using the method of communication it normally uses to communicate with those individuals.

In addition, there is an exception to notification for situations where the entity takes remedial action before the access or disclosure results in serious harm. The new legislation also contains a “secrecy” provision exception, which states that where compliance with the notification requirement would be inconsistent with a provision under Australian law (other than the Privacy Act) that prohibits or regulates the use or disclosure of information, the notification requirement would “be limited to the extent of the inconsistency.”

A failure to notify that is found to be a serious or repeated interference with privacy under the Privacy Act can be penalized with a fine of up to $360,000 AUD ($274,560 USD) for individuals and $1.8 million AUD ($1.37 million USD) for organizations.

Although the effective date for the new legislation has yet to be set, the new notification requirements will come into force no later than one year after the legislation receives Royal Assent, which typically occurs seven to ten days after Parliament passes a bill.

House of Representatives Passes Email Privacy Act

On February 6, 2017, the House of Representatives suspended its rules and passed by voice vote H.R. 387, the Email Privacy Act. As we previously reported, the Email Privacy Act amends the Electronic Communications Privacy Act (“ECPA”) of 1986. In particular, the legislation would require government entities to obtain a warrant, based on probable cause, before accessing the content of any emails or electronic communications stored with third-party service providers, regardless of how long the communications have been held in electronic storage by such providers.

Similar legislation unanimously passed the House in the last Congress, but died in the Senate due to concerns over amendments to the bill. Even the legislation’s Senate sponsors, Sens. Patrick Leahy (D-VT) and Mike Lee (R-UT), eventually withdrew the bill from consideration due to concerns that the amendments would make electronic communications even less private than they are now.

The Email Privacy Act now moves to the Senate, where it will be considered by the Senate Judiciary Committee, which is chaired by Sen. Chuck Grassley (R-IA). However, action on the legislation may be a lower priority for the Committee, and the Senate in general, because they are currently concentrating on nominations for agencies and the Supreme Court.

Privacy Shield: Impact of Trump’s Executive Order

On January 25, 2017, President Trump issued an Executive Order entitled “Enhancing Public Safety in the Interior of the United States.” While the Order is primarily focused on the enforcement of immigration laws in the U.S., Section 14 declares that “Agencies shall, to the extent consistent with applicable law, ensure that their privacy policies exclude persons who are not United States citizens or lawful permanent residents from the protections of the Privacy Act regarding personally identifiable information.” This provision has sparked a firestorm of controversy in the international privacy community, raising questions regarding the Order’s impact on the Privacy Shield framework, which facilitates lawful transfers of personal data from the EU to the U.S. While political ramifications are certainly plausible from an EU-U.S. perspective, absent further action from the Trump Administration, Section 14 of the Order should not impact the legal viability of the Privacy Shield framework.

Adoption of the Privacy Shield in July 2016

The Privacy Shield framework was formally adopted on July 12, 2016, replacing the U.S.-EU Safe Harbor framework, which had been invalidated in October 2015 by the Court of Justice of the European Union. The timing of the Privacy Shield’s adoption coincided with other EU-U.S. diplomatic efforts regarding law enforcement access to personal data in the EU and U.S. In particular, on June 2, 2016, the EU and U.S. completed a multi-year negotiation of the so-called “Umbrella Agreement,” which is designed to ensure the protection of personal data transferred between the EU and U.S. for law enforcement purposes pursuant to existing international agreements. The Umbrella Agreement’s privacy protections are intended to apply to the many existing EU-U.S. agreements that pre-date its adoption and that contemplate transfers of personal data for law enforcement purposes, such as the Passenger Name Records Agreement, various Mutual Legal Assistance Treaties (“MLATs”) and the now defunct Safe Harbor framework.

The Interplay Between the Umbrella Agreement and the Judicial Redress Act

In relevant part, Article 19 of the Umbrella Agreement affords any citizen of the EU the right to seek judicial review in the event a U.S. law enforcement agency unlawfully discloses the individual’s personal data or denies the individual the right to access or amend his or her personal data in the possession of the agency. At the time of the Umbrella Agreement negotiations, existing U.S. law did not afford such rights of judicial review to non-U.S. citizens or permanent residents, although the Privacy Act of 1974 did extend these rights to citizens and permanent residents of the U.S. As a result, the EU would not agree to the Umbrella Agreement until the U.S. extended those protections under the Privacy Act to citizens of the EU so that the U.S. could comply with Article 19 of the Umbrella Agreement.

In response, the U.S. passed the Judicial Redress Act in February 2016, extending Privacy Act protections regarding access, amendment and disclosure to citizens of “covered countries.” The enactment of the Judicial Redress Act paved the way for the execution of the Umbrella Agreement, which occurred in June 2016. Subsequently, on January 17, 2017, then-U.S. Attorney General Loretta Lynch designated as “covered countries” under the Judicial Redress Act all EU Member States other than Denmark and the United Kingdom (which are expected to be added soon); this designation becomes effective on February 1, 2017. Notably, in accordance with the Judicial Redress Act, the Attorney General’s designation is not subject to judicial or administrative review.

The Impact of the Executive Order

The EU’s assent to the Privacy Shield framework was influenced, at least in part, by the Umbrella Agreement, which was, in turn, conditioned upon the enactment of the Judicial Redress Act. President Trump’s Executive Order calls for federal agencies in the U.S. to ensure that their privacy notices make clear that Privacy Act protections extend only to citizens and permanent residents of the U.S. Importantly, Section 14 of the Order explicitly states that the federal agencies must do so in a manner that is “consistent with applicable law.” In the context of EU-U.S. data transfers for law enforcement purposes, the Judicial Redress Act constitutes applicable law, and thus President Trump’s Executive Order, as written, should not impact the Judicial Redress Act’s extension of the Privacy Act’s protections to citizens of the EU. As a result, absent further action from the U.S. government, we do not expect this Executive Order to impact the legal viability of the Privacy Shield framework. That said, tempers are running high, and the negative perception created by Trump’s actions could have an adverse effect on the Privacy Shield’s annual review in 2017.

One issue to monitor is the process of designating “covered countries” under the Judicial Redress Act. While former Attorney General Lynch’s designation is not subject to judicial or administrative review, the Judicial Redress Act does include a process by which “covered country” designations can be removed. There are specifically enumerated criteria for such removal and if the pending designation of EU countries as “covered countries” were to be removed by the Trump Administration, that removal could negatively impact the Privacy Shield framework. If such removal occurred, it certainly would undermine the viability of the Umbrella Agreement between the EU and U.S. Although the Privacy Shield is not explicitly dependent on the Umbrella Agreement or the Judicial Redress Act, their unraveling could have far-reaching political consequences regarding U.S.-EU law enforcement data sharing efforts, including with respect to the Privacy Shield.

UK Supreme Court Rules That Parliament Must Have Brexit Vote

On January 24, 2017, the UK Supreme Court handed down its judgment in the case of R (on the application of Miller and another) (Respondents) v. Secretary of State for Exiting the European Union (Appellant) [2017] UKSC 5. The case concerned the process to be followed to effect the UK’s withdrawal from the European Union and, in particular, whether the UK government may commence the UK’s withdrawal using executive powers, or whether Parliamentary approval is required. The Supreme Court held, by majority, that the UK government cannot commence the UK’s withdrawal from the EU without the approval of Parliament.

The Supreme Court held that withdrawal from the EU fundamentally changes the UK’s constitutional arrangements, and explained that the UK Constitution requires changes of such magnitude to be made by Parliament. The UK government has, however, already announced its intention to place a bill before Parliament that would grant it the necessary approval to commence the withdrawal process. Notably, the Supreme Court’s judgment indicates that the form of such a bill is for Parliament to decide, and that the law may be “very brief.” The ruling has nevertheless been seized upon by opposition parties, who have suggested that they plan to use it to ensure that the government obtains the right deal for the UK.

Email Privacy Act Reintroduced in Congress

On January 9, 2017, Representatives Kevin Yoder (R-KS) and Jared Polis (D-CO) reintroduced the Email Privacy Act, which would amend the Electronic Communications Privacy Act (“ECPA”) of 1986. In particular, the legislation would require government entities to obtain a warrant, based on probable cause, before accessing the content of any emails or electronic communications stored with third-party service providers, regardless of how long the communications have been held in electronic storage by such providers. Although ECPA currently requires law enforcement agencies to obtain a warrant to search the contents of electronic communications held by service providers that are less than 180 days old, communications that are more than 180 days old can be obtained with a subpoena.

The Email Privacy Act previously received unanimous approval in the House of Representatives in April 2016, but failed to gain traction in the Senate Judiciary Committee after Senator John Cornyn (R-TX) added a controversial amendment that would have expanded the FBI’s ability to use “national security letters” to obtain a suspect’s information from wire or electronic communications service providers. The proposed amendment would have given the director of the FBI (or a designee) the ability to compel providers to disclose certain information about a suspect other than the contents of the suspect’s communications, including the individual’s name, physical address, contact information, payment card or bank account information, IP address, login history and length of service with the provider.

In May 2016, the U.S. Securities and Exchange Commission (“SEC”) expressed concern that the proposed Email Privacy Act would hinder its ability to obtain critical evidence of securities law violations. The SEC asserted that the bill would require all government agencies, including civil enforcement agencies like the SEC, to obtain criminal warrants when seeking electronic communications of service providers. According to the SEC, because it “does not have criminal law enforcement powers and therefore lacks the authority to apply for search warrants, the bill would inhibit the SEC in its mission of protecting investors and promoting confidence in the U.S. capital markets.” Currently, the SEC may seek electronic communications from service providers by issuing administrative subpoenas under its own statutory authority.

In a statement issued on January 10, 2017, Representative Polis noted that “[t]he Email Privacy Act will update, and bring our archaic laws into the 21st century, and protect Americans’ Fourth Amendment privacy rights, whether they’re communicating through pen-and-paper mail or email. Americans justly demand this level of privacy, and I remain confident that the bill will swiftly pass Congress.”

Chile Expected to Consider New Data Protection Legislation

On January 3, 2017, Bloomberg Law: Privacy and Data Security reported that Chilean legislators are soon expected to consider a new data protection law (the “Bill”) which would impose new privacy compliance standards and certain enforcement provisions on companies doing business in Chile. 

Chile’s existing data protection law, the Law on the Protection of Private Life (Law No. 19,628), was signed into law in 1999 and does not provide for a privacy regulator with enforcement authority. Analysts expect that the Bill, the details of which have not yet been made public, will modify, rather than replace, the existing law, and will provide for the establishment of a data protection authority. It is expected that the data protection authority would report to another government agency rather than operating entirely independently.

The Bill is expected to be submitted before the legislature’s annual recess in February, though experts doubt the Bill will become law before March 2018, when a new administration will take office. Deputy Finance Minister Alejandro Micco indicated that the Bill aims to address the negative effect of inadequate data protection legislation on the development of global technological services in Chile.

UK Parliament Approves Investigatory Powers Bill

On November 16, 2016, the UK Investigatory Powers Bill (the “Bill”) was approved by the UK House of Lords. Once the Bill receives Royal Assent, which is expected before the end of 2016, it will officially become law in the UK. The Bill has sparked controversy, as it will hand significant and wide-ranging powers to state surveillance agencies, and it has been strongly criticized by some privacy and human rights advocacy groups.

The Bill was initially proposed by the current UK Prime Minister, Theresa May, during her previous tenure as UK Home Secretary. The Bill allows intelligence and law enforcement agencies to require telecommunications service providers to retain and hand over communications data. Telecommunications service providers also will be required to store individuals’ browsing histories over the previous year in real time, and to submit bulk datasets to intelligence services and law enforcement agencies where those agencies have obtained a warrant from the Secretary of State. The Bill also will permit both targeted and bulk equipment interference (i.e., obtaining data from an electronic device) by intelligence services and law enforcement agencies where authorized by a lawfully issued warrant.

The Bill’s entry into law could potentially complicate Brexit negotiations between the UK and EU with regard to privacy and data protection issues. Revelations that U.S. intelligence agencies engaged in bulk surveillance practices similar to those contemplated by the Investigatory Powers Bill ultimately led to the demise of the U.S.-EU Safe Harbor data transfer framework. With the UK planning its exit from the EU, the Investigatory Powers Bill could create similar issues for the UK in the context of negotiating a cross-border data transfer framework with the EU.

Russia Set to Block Access to LinkedIn

This post has been updated. 

On November 10, 2016, the appellate court for Moscow’s Tagansky District upheld an August 2016 decision by the district’s lower court that LinkedIn had violated Russian data protection laws. Access to the professional networking site is now set to be blocked across Russia.

The court’s decision, which followed a complaint from the Russian data protection regulator, Roskomnadzor, found that LinkedIn violated Russian data protection law on two counts:

  • not storing data about Russians on servers located in Russian territory; and
  • processing information about individuals who are not registered on the LinkedIn website and who have not signed the company’s user agreement.

This is thought to be the first time Russia’s data localization law has been enforced since its entry into force in September 2015. The law requires that data relating to Russian citizens be stored on servers physically located inside Russia’s borders. Although LinkedIn does not have a physical presence in Russia, it operates a Russian-language version of its website, which was enough to convince Roskomnadzor and the court that the company is subject to Russian data protection legislation.

Media reports have cited Roskomnadzor’s claim that it contacted LinkedIn to inquire about its data localization practices, but did not receive a substantive response. LinkedIn, however, has argued that Roskomnadzor communicated with its U.S. office instead of LinkedIn Ireland, the entity responsible for the data of non-U.S. citizens. LinkedIn is reportedly eager to enter into dialogue with Roskomnadzor to find a solution to the issue, and also has the option to appeal the decision to the Russian Supreme Court.

Roskomnadzor has the power to block Russian individuals’ access to websites, and has stated that it plans to block access to LinkedIn. The site will be entered into a special registry of websites operating in violation of the data localization law, and will be blocked three business days after being entered into the registry.

UPDATE: On November 17, 2016, the Russian data protection regulator, Roskomnadzor, officially blocked access to LinkedIn for its alleged violation of Russian data protection law.

NHTSA Releases New Automobile Cybersecurity Best Practices

The National Highway Traffic Safety Administration (“NHTSA”) recently issued non-binding guidance that outlines best practices for automobile manufacturers to address automobile cybersecurity. The guidance, entitled Cybersecurity Best Practices for Modern Vehicles (the “Cybersecurity Guidance”), was recently previewed in correspondence with the House of Representatives’ Committee on Energy and Commerce (“Energy and Commerce Committee”).

According to the NHTSA, the Cybersecurity Guidance is “non-binding guidance” that contains “voluntary best practices” to improve motor vehicle cybersecurity. The Cybersecurity Guidance generally encourages automobile manufacturers to take a “layered approach” by adopting the National Institute of Standards and Technology (“NIST”) Cybersecurity Framework and its five principles: identify, protect, detect, respond and recover. NHTSA also recommends the use of certain industry standards, such as the ISO 27000 series, and other best practices, such as the Center for Internet Security’s Critical Security Controls for Effective Cyber Defense. While the Cybersecurity Guidance acknowledges that these standards were developed to mitigate threats against networks rather than automotive devices specifically, it contends that they can still be adapted for use in the automotive industry. As with NHTSA’s cyber guidance for autonomous vehicles, the Cybersecurity Guidance also encourages automobile manufacturers to engage in information sharing and to maintain a process for vulnerability reporting.

The month before the Cybersecurity Guidance was released, the Energy and Commerce Committee sent NHTSA a letter raising questions concerning cybersecurity risks related to On Board Diagnostics (“OBD-II”) ports, calling on NHTSA to establish an industry-wide working group on the subject. The Cybersecurity Guidance does not directly address OBD-II ports, though it does call for operational limits on “control vehicle maintenance diagnostic access” and calls on the automotive industry to consider the effects of aftermarket devices like insurance dongles and cell phones that are connected to vehicle information systems. Furthermore, in its response to the Energy and Commerce Committee, NHTSA indicated that at their request, “SAE International has started a working group that is looking to explore ways to harden the OBD-II port.”

On October 28, 2016, NHTSA published a request for public comments on the Cybersecurity Guidance and has opened a docket for those comments. Comments are due on November 28, 2016.

NHTSA Set to Release New Automobile Cybersecurity Best Practices

On October 14, 2016, the National Highway Traffic Safety Administration (“NHTSA”) indicated in a letter to Congress that it intends to issue new best practices on vehicle cybersecurity. This letter came in response to an earlier request from the House Committee on Energy and Commerce (“Energy and Commerce Committee”) that NHTSA convene an industry-wide effort to develop a plan to address vulnerabilities posed to vehicles by On-Board Diagnostics (“OBD-II”) ports. Since 1994, the Environmental Protection Agency has required that OBD-II ports be installed in all vehicles so that they can be tested for compliance with the Clean Air Act. OBD-II ports provide valuable vehicle diagnostic information and allow for aftermarket devices providing services such as “good driver” insurance benefits and vehicle tracking. Because they provide direct access to a vehicle’s internal network, however, OBD-II ports are widely cited as the central vulnerability in vehicle cybersecurity.

Although the Energy and Commerce Committee requested a plan regarding OBD-II ports specifically, the NHTSA letter reiterates previous NHTSA statements that vehicle cybersecurity should be addressed more comprehensively than “each entry port at a time.” The letter says that NHTSA’s forthcoming guidance will be based on the National Institute of Standards and Technology (“NIST”) Cybersecurity Framework’s five principles: identify, protect, detect, respond and recover.

Coming not long after NHTSA released guidance on autonomous vehicles that called for increased information sharing within the automotive sector, NHTSA’s reliance on the NIST Cybersecurity Framework in its vehicle cybersecurity guidance indicates that the agency is increasingly seeking to apply cybersecurity measures currently utilized within critical infrastructure to passenger vehicles. Indeed, the NIST Cybersecurity Framework was developed pursuant to President Obama’s E.O. 13636, Improving Critical Infrastructure Cybersecurity.

CJEU Rules That Dynamic IP Addresses Are Personal Data

On October 19, 2016, the Court of Justice of the European Union (the “CJEU”) issued its judgment in Patrick Breyer v. Bundesrepublik Deutschland, following the Opinion of Advocate General Manuel Campos Sánchez-Bordona of May 12, 2016. The CJEU followed the Opinion of the Advocate General and declared that a dynamic IP address registered by a website operator must be treated as personal data by that operator to the extent that the user’s Internet service provider (“ISP”) has – and may provide – additional data that, in combination with the IP address, would allow for the identification of the user.

The case arose in 2008 when a German citizen brought an action before the German courts seeking an injunction to prevent websites operated by German Federal institutions from registering and storing his IP addresses. Most of these websites store information on all access operations in logfiles (including the IP address of the computer from which access was sought, and the date and time when a website was accessed) for the purposes of preventing cyber attacks and making it possible to prosecute ‘pirates.’ The German citizen’s claim was initially rejected by the court of first instance. The claim was granted in part, however, by the court of appeals. Subsequently, both parties appealed the decision to the German Federal Court of Justice.

The German Federal Court of Justice suspended the proceedings and referred the following two questions to the CJEU:

  • Whether a dynamic IP address (i.e., an IP address which is different each time there is a new connection to the Internet) registered by an online media services provider (here, the German institutions) is personal data within the meaning of Article 2(a) of the EU Data Protection Directive, when only a third party (the ISP) has the additional information necessary to identify the website user.
  • Whether the ‘legitimate interest’ legal basis under Article 7(f) of the EU Data Protection Directive is contrary to a provision of the German Telemedia Act, which is interpreted by most German legal commentators as preventing the storage of personal data after the consultation of online media in order to guarantee the security and continued proper functioning of those media. According to that interpretation, personal data must be deleted at the end of the consultation period, unless the data is required for billing purposes.

The CJEU gave a positive reply to both questions. With regard to the first question, the CJEU noted that there appear to be legal channels in Germany enabling the online media services provider to contact the competent authority – in particular, in the event of cyber attacks – so that the competent authority may take the steps necessary to obtain from the ISP additional information on the website user and subsequently bring criminal proceedings. In other words, the online media services provider would have means that may reasonably be used to identify the website user – with the assistance of third parties – on the basis of the stored IP addresses. Consequently, the CJEU ruled that the dynamic IP address of a website user is personal data, with respect to the website operator, if that operator has the legal means allowing it to identify the user concerned with additional information about that user held by the ISP.

With regard to the second question, the CJEU ruled that the German legislation, as interpreted by most legal commentators, excludes the possibility of performing the ‘legitimate interest’ test (i.e., in the present case, of balancing the objective of ensuring the general operability of the online media against the interests or fundamental rights of website users). In this respect, the CJEU emphasized that German Federal institutions that provide online media services may have a legitimate interest in ensuring the continued functioning of their websites, and thus in storing certain user personal data in order to protect themselves against cyber attacks.

The German Federal Court of Justice is now required to decide on the dispute itself.

View the full text of the judgment of the CJEU. For a summary, please see the press release of the CJEU.

Department of Transportation Issues Cyber Guidance for Autonomous Cars

On September 20, 2016, the Department of Transportation, through the National Highway Traffic Safety Administration (“NHTSA”), released federal cyber guidance for autonomous cars entitled Federal Automated Vehicles Policy (“guidance”).

The guidance makes a number of recommendations, including that automated vehicles should be designed to comply with “established best practices for cyber physical vehicle systems.” To that end, the guidance recommends manufacturers follow “guidance, best practices and design principles” published by the National Institute of Standards and Technology (“NIST”), NHTSA and other industry groups, including the Automotive Information Sharing and Analysis Center (“Auto-ISAC”). Manufacturers also are encouraged to engage in information sharing – sharing data recorded during driving for the purpose of reducing crashes and improving highway safety, as well as sharing cyber threat signatures. The guidance recommends manufacturers report “any and all discovered vulnerabilities” to the Auto-ISAC as soon as possible.

Signaling a phased approach to driverless vehicle policy, the guidance is voluntary. The guidance is only the first step, however, and should not be viewed as foreclosing future federal regulations over driverless vehicles. President Obama noted in an op-ed that the guidance guides “states on how to wisely regulate these new technologies, so that when a self-driving car crosses from Ohio into Pennsylvania, its passengers can be confident that other vehicles will be just as responsibly deployed and just as safe,” but also said that “my administration is rolling out new rules of the road for automated vehicles.” Indeed, the President warned, “make no mistake: If a self-driving car isn’t safe, we have the authority to pull it off the road. We won’t hesitate to protect the American public’s safety.” Notably, the guidance comes on the heels of reports that Chinese cybersecurity researchers were able to hack into a driverless car from 12 miles away and tamper with electronically controlled features of the car, including brakes and locks.

Final Rules for the Data Privacy Act Published in the Philippines

Recently, the National Privacy Commission (the “Commission”) of the Philippines published the final text of its Implementing Rules and Regulations of Republic Act No. 10173, known as the Data Privacy Act of 2012 (the “IRR”). The IRR has a promulgation date of August 24, 2016, and went into effect 15 days after its publication in the Official Gazette.

We previously reported on the preceding draft text of the IRR. There are several points of interest that were resolved in the final text, which presents a more practical framework than had been proposed in the draft IRR. Any changes to the final IRR will require a regulatory amendment by the Commission rather than an act of the legislature.

Some points of interest that have been resolved or finalized include the following:

  • The IRR has two separate defined terms, “personal data” and “personal information,” but the potential discrepancy between the two terms has been resolved. “Personal information” refers to information which can identify a particular individual, and is consistent with the definition provided in the statute. “Personal data” is defined as all types of “personal information,” which presumably includes both “ordinary” personal information and sensitive personal information.
  • The draft IRR had used the term “personal data” to describe “personal information” that has been input into an information and communication system, which would mean “personal information” that has been digitally and electronically formatted. This definition no longer appears in the final IRR. In addition, the terms “personal information” and “personal data” are now used more consistently in relation to their definitions. This may result in less ambiguity and a lower prospect of confusion from the use of the two terms.
  • The final IRR has now been made consistent with a provision in the original statute which stated that the Data Privacy Act would not apply to personal information collected in a foreign jurisdiction (in compliance with the laws or rules of that jurisdiction) which is being processed in the Philippines. The draft IRR had provided that, in such instances, the data privacy laws of the foreign jurisdiction would apply in relation to the collection of personal information, while the Philippine Data Privacy Act would apply to processing that takes place within the Philippines. This would have entailed a complex analysis as to where collection-related obligations under the foreign jurisdiction end and where processing-related obligations under Philippine law begin, and how the two sets of legal obligations might intersect.
  • The final IRR requires that, even where personal information has been collected in a foreign jurisdiction for processing in the Philippines, the Philippine requirements to implement information security measures will still apply. This will impose some security-related costs on that portion of the information-processing operations that take place within the Philippines.
  • The final IRR requires that sharing of personal data in the private sector proceeds according to a data sharing agreement. The data sharing agreement may be subject to review by the Commission on its own initiative or following a complaint of a data subject. The draft IRR might have been interpreted to require review by the Commission in all instances, which would have imposed a substantial burden on all sharing of personal data, as well as a burden on the resources of the Commission itself.
  • The final IRR sets forth rules on the internal organizational operations and structure of personal information controllers, such as requirements to (1) appoint a privacy officer, (2) maintain records of processing activities, (3) implement physical security measures and technical security measures, and (4) carry out regular monitoring for security breaches. However, these obligations only apply “where appropriate.” The draft IRR might have been interpreted to require compliance in all instances. Where and when these potentially complicated requirements will be “appropriate” will depend on a number of factors, including the nature of personal data, the risks posed by the processing, the size and complexity of the organization and its operations, current best practices and cost of security implementation.
  • The final IRR gives the data subject an additional right to object or withhold consent to processing. This appears to be a new right that did not appear in the original text of the statute. This right is substantially retained from the draft IRR, with changes to specifically allow the data subject to object to processing for direct marketing, automated processing or profiling.
  • The final IRR provides more clarity on the notification requirements in connection with a data breach. Individuals must be notified of data breaches only when both (1) sensitive personal information or information that may be used to enable identity fraud are involved; and (2) the personal information controller believes that the breach is likely to pose a real risk of serious harm to any affected data subject.
  • If the notification requirement does apply, the notification must be made within 72 hours, though notification may be delayed in certain limited circumstances. The final IRR stipulates the categories of content that must appear in the notification.
  • The requirement under the draft IRR to notify affected individuals in the event of any breach that involves personal, sensitive or privileged information has been removed. That had been a material expansion of the circumstances under which a breach notification had to be made. By removing this requirement, the final IRR keeps the notification requirement within a relatively restricted range of circumstances. However, written reports of security incidents and personal data breaches have to be prepared and a summary has to be provided to the Commission on an annual basis. This amounts to a less onerous notification obligation.
  • In summary, the data breach notification requirement is now more clearly subject to a “risk-based approach” (i.e., the requirement to notify does not arise automatically, but arises instead on a case-by-case basis depending on an evaluation of the risk involved). Only data breaches that involve higher levels of risk must be notified.
  • The final IRR has requirements to register data processing operations and to notify the Commission of automated processing operations, but these now apply only in particular circumstances. The requirement to register with the Philippine data protection authority only applies to processing by personal information controllers and processors which employ 250 or more persons, or to processing that involves risk to the rights and freedoms of data subjects, takes place more than occasionally, or involves more than a de minimis amount (at least 1,000 individuals) of sensitive personal information. The requirement to notify individuals of data processing only applies to processing that is the sole basis of decision making that would significantly affect the data subject.
  • The draft IRR required both universal registration and notification. This would have both increased the burden of processing data and contrasted with the international trend (i.e., the new EU General Data Protection Regulation, which modifies the registration requirements of the previous EU Data Protection Directive).
  • In relation to the accountability principle, the final IRR makes generalized references to the possibility of indemnification on the basis of applicable provisions of Philippine civil law and criminal liability. The final IRR now avoids the discussion of the potential for joint liability, along with the personal information controller, on the part of personal information processors, privacy officers, employees and agents, which had appeared in the draft IRR.

The following additional items are worth noting:

  • The requirements in the final IRR to notify data subjects (at the time of the collection of their personal information) now include an obligation to provide “meaningful information” about the “logic” that will be involved in processing personal information. Requiring that this be done for each and every instance in which personal information is to be collected and processed, and in a way that would satisfy a regulatory authority and the lawyer drafting the notice, is challenging.
  • The final IRR contains a provision stating that personal data may not be retained in perpetuity in contemplation of a future use yet to be determined. This may have potential to impair the processing of “big data” in the Philippines.
  • The draft IRR had established a right of data portability. The final IRR seems to restrict the applicability of this right, by making it apply only where the data subject’s personal information is processed by electronic means and in a structured and commonly-used format. This would seem to enable data processors and controllers to avoid an obligation to comply with this right, by processing personal data using unstructured or unusual formats.
  • The draft IRR had prohibited the processing of privileged information (i.e., private communications made between an individual and his or her lawyer in preparation for litigation), unless the same requirements applicable to the processing of sensitive personal information had been satisfied. While this provision may be potentially problematic, the final IRR mitigates this by providing an exception for uses of privileged information in the context of court proceedings, legal claims and constitutional or statutory mandates. It is not clear if this exception will be adequate to cover all possible situations where an exception will be needed, but further amendments to the IRR could be made to address any shortcomings.
  • In relation to the accountability principle, the final IRR discusses the idea of liability, but does not discuss other aspects of the principle. In particular, the final IRR does not establish rules by which a personal information controller might establish that it observes good internal data handling practices and demonstrates that they comply with applicable standards, or by which the Commission would require production and review of these practices against its standards. The final IRR also does not discuss how to apply the accountability principle in the context of cross-border data transfers; while a provision of the IRR discusses data sharing, it does not appear to describe what a company must do to share data internationally in accordance with the IRR.

OMB Updates Federal Information Management Policies

The Office of Management and Budget (“OMB”) recently issued updates to Circular A-130 covering the management of federal information resources. OMB revised Circular A-130 “to reflect changes in law and advances in technology, as well as to ensure consistency with Executive Orders, Presidential Directives, and other OMB policy.” The revised policies are intended to transform how privacy is addressed across the branches of the federal government.

In its press release announcing the revised document, OMB noted that “as government continues to digitize, we must ensure we manage data not only to keep it secure, but also [to] allow us to harness this information to provide the best possible service to our citizens.” Thus, according to OMB, the updated Circular A-130 combines in one document “a wide range of policy updates for federal agencies” on issues relating to “cybersecurity, information governance, privacy, records management, open data, and acquisitions.” It also covers issues relating to IT planning and budgeting.

Specifically, Circular A-130 focuses on the following three elements “to help spur innovation throughout the government”:

  • Real Time Knowledge of the Environment: Replacing periodic compliance-driven assessments with ongoing monitoring of federal information resources.
  • Proactive Risk Management: Focusing on modernizing the way in which the government identifies, categorizes and handles privacy and security risks.
  • Shared Responsibility: Focusing on shared responsibility and accountability for privacy and security among managers, employees and citizens.

According to OMB, the revised Circular A-130 “represents a shift from viewing security and privacy requirements as compliance exercises to understanding security and privacy as crucial components of a comprehensive, strategic, and continuous risk-based program.”

The fact sheet released with the press release indicates that the updated Circular A-130 “promotes innovation, enables information sharing, and fosters the wide-scale and rapid adoption of new technologies while protecting and enhancing security and privacy.”

Circular A-130 has two appendixes: Appendix I is titled Responsibilities for Protecting and Managing Federal Information Resources and Appendix II is titled Responsibilities for Managing Personally Identifiable Information (PII).

Appendix II, which is completely new, focuses on agency responsibilities for managing PII, applying the fair information practice principles, conducting privacy impact assessments, maintaining an inventory of PII, privacy training, privacy contracting and applying the NIST Risk Management Framework to manage privacy risks in the context of agency privacy programs.

China’s State Administration for Industry and Commerce Publishes Draft Regulations on the Protection of Consumer Rights

The State Administration for Industry and Commerce of the People’s Republic of China published a draft of its Implementing Regulations for the P.R.C. Law on the Protection of the Rights and Interests of Consumers (the “Draft”) for public comment. The Draft is open for comment until September 5, 2016.

The Draft reiterates the requirements under the law that business operators must follow the principles of legitimacy, rightfulness and necessity when they collect and use the personal information of consumers. They also must expressly state the purposes, methods and scope of their collection and use of the information, and obtain the consent of the consumers. It also provides that business operators may not collect information that is irrelevant to their operations, or collect information in an improper way. Under the Draft, a business operator is required to retain, for at least five years, supporting documentation that can demonstrate its performance of its obligation to expressly inform and obtain the consent of consumers.

Business operators are required to adopt information security systems to ensure the security of the personal information of consumers. Business operators may not provide consumers’ personal information to other parties without the consumers’ consent, except in cases where the personal information is anonymized in such a way that it cannot identify the specific individual and the anonymization cannot be reversed.

In the event that a business operator suffers an information security breach which results in the disclosure or loss of information, or anticipates that such a breach is likely, the business operator is required to adopt remedial measures and promptly inform the affected consumers of such breach.

Compared with the original definition of “consumers’ personal information” in the earlier Measures for the Punishment of Conduct Infringing the Rights and Interests of Consumers, the scope of the term “consumers’ personal information” under the Draft additionally includes biometric features.

According to the Draft, without consumers’ express consent or request, business operators may not send them commercial electronic messages or make commercial marketing calls. Business operators also may not cause consumers to bear the costs of sending commercial electronic messages or making commercial marketing calls, unless otherwise agreed by the parties.

Advocate General Finds Member States May Not Breach EU Laws Over Electronic Communications Retention

On July 19, 2016, Advocate General Saugmandsgaard Øe (the “Advocate General”) published his Opinion on two joined cases relating to data retention requirements in the EU, C-203/15 and C-698/15. These cases were brought following the Court of Justice of the European Union’s (“CJEU’s”) decision in the Digital Rights Ireland case, which invalidated Directive 2006/24/EC on data retention. The two cases, referred from courts in Sweden and the UK respectively, sought to establish whether a general obligation to retain data is compatible with the fundamental rights to privacy and data protection under EU law.

In his Opinion, the Advocate General stresses the need to balance a nation’s need to effectively fight serious crime, such as terrorism, against individuals’ fundamental rights. The Advocate General found that a general obligation to retain data may be compatible with EU law, although a Member State’s decision to impose such an obligation is subject to strict requirements. The national courts are responsible for determining whether those requirements are satisfied. The Advocate General set out the following interpretations of the requirements:

  • the general obligation to retain data and the accompanying guarantees must be laid down by legislation or regulatory measures;
  • the obligation must respect the essence of the right to respect for private life and the right to the protection of personal data laid down by the Charter of Fundamental Rights of the European Union;
  • any interference with the fundamental rights should be in pursuit of an objective in the general interest (which the Advocate General opined could be satisfied only by the fight against serious crime);
  • the general obligation to retain data must be strictly necessary to the fight against serious crime; and
  • the general obligation must be proportionate.

While the Advocate General’s Opinion is not binding on the CJEU, the court’s judgments have historically tended to follow the Advocate General’s stated views.

FTC Increases Maximum Fines for Violations of Certain Sections of the FTC Act

On June 29, 2016, the Federal Trade Commission announced that, to account for inflation, it is increasing the civil penalty maximums for certain violations of the FTC Act effective August 1, 2016. The FTC’s authority for issuing these adjustments comes from the Federal Civil Penalties Inflation Adjustment Act Improvements Act of 2015. The Federal Register Notice indicates which sections of the FTC Act the adjustments will apply to, and the corresponding increases. For example, the FTC has increased the maximum fine from $16,000 to $40,000 for certain violations of Section 5 of the FTC Act.

Draft Released in the Philippines Implementing Rules for the Data Privacy Act

This post has been updated. 

On June 17, 2016, the National Privacy Commission (the “Commission”) of the Philippines released draft guidelines entitled, Implementing Rules and Regulations of the Data Privacy Act of 2012 (“IRR”), for public consultation.

Under the IRR, the processing of personal data has to adhere to the principles of transparency, legitimate purpose and proportionality. The IRR defines personal data as personal information, sensitive information and privileged information. Sensitive information refers to personal information about an individual’s race, ethnicity, health, education or genetic or sexual life, proceedings related to an offense committed by the individual, health records and tax returns. According to the IRR, the personal information controller should take organizational, physical and technical security measures for data protection. Such security measures include the designation of a privacy officer, limitations on physical access and the adoption of technical and logical security measures.

The IRR stipulates general principles for data sharing. According to the IRR, in order to conduct lawful processing of personal data, the data subject must have given his or her consent prior to collection. Consent of the data subject has to be evidenced by written, electronic or recorded means. The IRR also specifies information that is not subject to the Data Privacy Act, such as information of a governmental officer that relates to his or her position or functions.

Under the IRR, the basic rights enjoyed by data subjects include the rights to be informed, to object, to access, to rectification, to erasure or blocking, and to damages. The Commission and affected data subjects should be notified within 24 hours of knowledge of, or reasonable belief that, a security breach has occurred. The IRR also includes registration and compliance requirements, including a requirement to register data processing systems operating in the country.

The IRR stipulates penalties for unauthorized processing of personal information, improper disposal of personal information, unauthorized disclosure and other violations. Violations of the Data Privacy Act, the IRR and other orders may be subject to cease and desist orders, temporary or permanent bans on the processing of personal data and the imposition of fines.

The Data Privacy Act established the Commission earlier this year as an independent body to monitor and ensure the compliance of personal information controllers with international standards for data protection. The IRR specifies the functions, organizational structure and other details of the Commission. For example, the functions of the Commission include (1) making rules, such as issuing guidelines for data protection and proposing legislation on privacy or data protection, (2) performing compliance and monitoring functions to ensure effective implementation of the Data Privacy Act, and (3) adjudicating on complaints and investigations of violations of the rights of data subjects.

The IRR also includes provisions on other issues such as data privacy and security in government, outsourcing and subcontracting of personal data.

The IRR contains some provisions that add new requirements going beyond those of the original text. These can vary from the original requirements, or have the potential to be more burdensome on enterprises than the original requirements. Some of these provisions are described below:

  • The IRR defines “personal data” and “personal information” as two separate terms. “Personal information” is defined as the abstract information itself, while “personal data” is personal information that has been inputted into an information and communication system (which presumably means a computer system), and therefore has presumably been digitally and electronically formatted.
  • In addition to personal information that has been electronically formatted, the term “personal data” also includes “sensitive information” and “privileged information.”
  • The IRR expounds upon a provision in the original requirements that says when personal information has been collected in a foreign jurisdiction for processing in the Philippines, the data privacy laws of the foreign jurisdiction will apply in relation to the collection of personal information, but the Data Privacy Act will apply to processing that takes place within the Philippines.
  • The IRR requires that sharing of personal data in the private sector proceeds according to a data sharing agreement which is subject to the review of the Commission.
  • The IRR imposes some potentially elaborate and time-consuming rules on internal organizational operations and structure, such as the requirements to (1) appoint a privacy officer, (2) implement capacity-building, orientation and training programs in relation to privacy and security, and (3) carry out system monitoring.
  • The IRR gives the data subject an additional right to object or withhold consent to further processing. This appears to be a new right that did not appear in the original requirements.
  • The IRR requires notification of a data breach within 24 hours under normal circumstances, though notification may be delayed in some circumstances.
  • The IRR materially expands the circumstances under which a breach notification must be made. In effect, the IRR now requires notification for any security breach that involves personal, sensitive or privileged information.
  • The IRR now requires registration of data processing operations and data processing systems, and requires that the Commission be notified of any wholly or partly automatic processing operations.
  • In relation to the accountability principle, the IRR expressly establishes the potential for joint liability, along with the personal information controller, on the part of personal information processors, privacy officers, employees and agents. It also establishes the possibility of criminal liability.

Comments on the IRR should be sent by July 15, 2016, to the Commission at info@privacy.gov.ph.

UK Votes to Leave the EU: Data Protection Standards Unlikely to Be Affected

On June 23, 2016, the UK held a referendum to decide upon its continued membership in the European Union. The outcome has resulted in the decision for the UK to withdraw its membership from the European Union. Despite the result, data protection standards are unlikely to be affected.

The full details of how and when the UK will negotiate its exit from the EU are still unclear. The process for withdrawal will be a long one and, unless there is an agreement to the contrary, will take a minimum of two years. The next step is for the UK to serve notice of its intention to exit the EU using the formal legal procedure set out in Article 50 of the Treaty on European Union. As yet, no notice has been served, and none is likely to be served until a new UK prime minister is in place, which is widely expected to happen in October 2016.

From a data protection perspective, any change will not be immediate. Regardless of the referendum result, the incoming EU General Data Protection Regulation (“GDPR”) will become law on May 25, 2018, meaning that the UK will almost certainly experience life under the GDPR. Businesses will therefore need to continue to prepare for, and begin to comply with, the GDPR despite the UK’s decision to withdraw from the EU. Other EU Member States must also comply with the GDPR beginning May 25, 2018.

Given that businesses will want to trade in the EU, it is highly likely that, once the UK formally leaves the EU, it will seek to put in place a legal framework that reflects the GDPR. In particular, the UK appears likely to seek recognition as an “adequate” jurisdiction in order to allow the free flow of data from the EU to the UK. This has been confirmed by the UK Information Commissioner’s Office (“ICO”) in a statement issued on June 24, 2016. The ICO highlighted that “the Data Protection Act remains the law of the land irrespective of the referendum result,” adding: “If the UK is not part of the EU, then upcoming EU reforms to data protection law would not directly apply to the UK. But if the UK wants to trade with the Single Market on equal terms we would have to prove ‘adequacy’ – in other words UK data protection standards would have to be equivalent to the EU’s General Data Protection Regulation framework starting in 2018.”

The GDPR (or a UK equivalent) will be the prevailing data protection standard in the UK, and companies should continue their GDPR preparation as before. In due course, and subject to the outcome of the UK’s exit negotiations, companies will need to review and make adjustments to their compliance programs, including relevant data transfer mechanisms, to reflect the fact that the UK will have a separate (albeit similar) data protection law to the EU.

The New Wave of Consumer Class Actions Targeting Retailers: What is the TCCWNA?

TCCWNA. The very acronym evokes head scratches and sighs of angst and frustration among many lawyers in the retail industry. You have probably heard about it. You may have even been warned about it. And you may be trying to figure out, at this very moment, how best to minimize your risk and exposure. But what is it, and why has virtually every retailer been hit with a TCCWNA class action demand letter or lawsuit in the past few months? And why are most retailers scrambling to update the terms and conditions of their websites?

The TCCWNA

As reported on the Hunton Retail Law Resource blog, the New Jersey Truth-in-Consumer Contract Warranty and Notice Act (“TCCWNA”) was passed in 1981 to protect consumers from allegedly deceptive practices in consumer contracts, warranties, notices and signs. In an often-quoted passage, the New Jersey legislature explained its rationale for the TCCWNA as follows:

Far too many consumer contracts, warranties, notices and signs contain provisions which clearly violate the rights of consumers. Even though these provisions are legally invalid or unenforceable, their very inclusion in a contract, warranty, notice or sign deceives a consumer into thinking that they are enforceable and for this reason the consumer often fails to enforce his rights.

To that end, Section 15 generally prohibits retailers from offering or displaying, in a consumer contract, warranty, notice or sign, any provision that violates a clearly established right of any actual or prospective consumer. Section 16 generally prohibits retailers from including language stating that any such provision is or may be void, unenforceable or inapplicable in “some jurisdictions” without specifically stating whether or not it is so in New Jersey.

For nearly thirty years, TCCWNA was used sparingly by plaintiffs, who often brought TCCWNA claims only as a tack-on to claims brought under New Jersey’s notorious Consumer Fraud Act (“CFA”). But, in the past five or six years, things have changed. Indeed, as the following graphic by the New Jersey Civil Justice Institute (“NJCJI”) illustrates, the frequency of TCCWNA cases spiked sharply in 2009 and has been on the rise ever since:

[NJCJI graphic: TCCWNA case filings by year]

While many attorneys have spent a great deal of time trying to figure out why TCCWNA cases have increased exponentially in recent years, the primary causes appear to be the following:

  • a series of judicial decisions that have led the plaintiffs’ bar to believe that TCCWNA claims are now even more plaintiff-friendly and more susceptible to class certification than CFA claims;
  • the TCCWNA’s potential applicability to “prospective” customers, not just actual customers, thereby potentially expanding the pool of interested plaintiffs and the scope of potential liability to include customers who did not make a purchase and who suffered no discernible injury; and
  • the availability of statutory damages of $100 per customer as well as attorney’s fees and costs under the TCCWNA, which could expose companies to enormous liability.

The original wave of TCCWNA cases focused on the terms and conditions in a variety of alleged consumer contracts, warranties, notices and signs, including restaurant menus, advertising materials, gift cards and all manner of contracts and written materials.

In the past six months, however, the NJCJI has noted that an “unprecedented” number of TCCWNA cases have targeted the terms and conditions of retailers’ websites, largely because the plaintiffs’ bar now views this type of class action as “a quick ticket to jackpot justice.” As a result, a wide variety of terms and conditions commonly found on many retailers’ websites, including exculpatory, indemnification, choice of law, severability, savings, privacy and limitation of liability provisions, have been coming under increasing scrutiny. Indeed, at least two dozen major retailers have already been targeted with such lawsuits, and many more have received pre-lawsuit demand letters. Since most retail websites use the same or similar language in their terms and conditions, there is an almost endless pool of potential targets for enterprising plaintiffs’ attorneys. In other words, if you are a retailer who has so far avoided being targeted with a TCCWNA demand letter or lawsuit, it may just be a matter of time.

For retailers grappling with such TCCWNA issues, there is currently a considerable amount of uncertainty. The courts have not yet had the opportunity to provide clear guidance, making it difficult to assess the potential viability of plaintiffs’ claims, and the potential liability is significant. As a result, retailers have reacted in different ways to the recent wave of TCCWNA demand letters and class action lawsuits targeting their websites. Some have settled early on an individual basis. Others have moved to compel arbitration or to strike class allegations. Still others have filed motions to dismiss raising a variety of arguments, including the applicability of the U.S. Supreme Court’s recent Spokeo decision and arguments that, by its terms, the TCCWNA does not apply to commercial websites or to the provisions contained in them.

We expect that some initial clarity will be forthcoming in the next six months as the initial wave of motions to dismiss in the commercial website cases is ruled upon. Ultimately, however, this is an issue that will likely need to be resolved by appellate courts down the road. In the interim, plaintiffs’ attorneys will continue to file more “copycat” TCCWNA class action lawsuits in the belief that the best way to achieve jackpot justice is to collect as many lottery tickets as possible.

In the meantime, what is a retailer to do to minimize its risk and exposure?

What Should I Do?

  • Compliance Review – Every retailer should, as soon as possible, undertake a thorough review of its website terms and conditions, as well as any other terms contained in written materials displayed to consumers. From a cost-benefit perspective, the effort and cost associated with ensuring that your website and other written materials comply with the TCCWNA is relatively minimal compared to the potential cost of facing a TCCWNA class action lawsuit in which every customer could potentially be entitled to $100 in statutory damages. A simple review and, if necessary, slight modification of website terms and conditions may allow retailers to avoid becoming the next target and save them from a lot of headaches down the road.
  • Support Lobbying Efforts – A lobbying effort has also recently been launched to help address the recent TCCWNA abuses. The Retail Industry Leaders Association and the NJCJI have led efforts to highlight these abuses and to support lobbying of the New Jersey legislature to address the underlying issues.
  • Take TCCWNA Demand Letters and Class Action Lawsuits Seriously – Some retailers may not initially take TCCWNA demand letters and class action lawsuits as seriously as they should, but the stakes are significant if a company is determined to be in violation. A few seemingly innocuous words or provisions on a commercial website, for example, could potentially translate into hundreds of millions of dollars in liability. Accordingly, retailers should treat a TCCWNA demand letter or class action lawsuit as seriously as any other major class action lawsuit presenting novel, complex issues with potentially significant liability, including hiring experienced, competent counsel and exploring insurance coverage issues early on.

DHS and DOJ Issue Final Guidance on the Cybersecurity Information Sharing Act of 2015

On June 15, 2016, the U.S. Department of Homeland Security (“DHS”) and U.S. Department of Justice (“DOJ”) jointly issued final guidance on the Cybersecurity Information Sharing Act of 2015 (“CISA”). Enacted in December 2015, CISA includes a variety of measures designed to strengthen private and public sector cybersecurity. In particular, CISA provides protections from civil liability, regulatory action and disclosure under the Freedom of Information Act (“FOIA”) and other open government laws for “cyber threat indicators” (“CTI”) and “defensive measures” (“DM”) that are shared: (1) among businesses or (2) between businesses and the government through a DHS web portal. Congress passed CISA in order to increase the sharing of cybersecurity information among businesses and between businesses and the government, and to improve the quality and quantity of timely, actionable cybersecurity intelligence in the hands of the private sector and government information security professionals.

The documents issued yesterday include final guidelines on privacy and civil liberties and final procedures governing the government’s receipt of CTI and DM:

  • Privacy and Civil Liberties Final Guidelines: Cybersecurity Information Sharing Act of 2015. This document was developed by DHS and DOJ pursuant to Section 105(b) of CISA. It establishes privacy and civil liberties guidelines governing the receipt, retention, use and dissemination of CTI and DM by a federal entity under CISA.
  • Final Procedures Related to the Receipt of Cyber Threat Indicators and Defensive Measures by the Federal Government. Developed by DHS and DOJ as directed in Section 105(a)(1)&(3) of CISA, this document establishes procedures on how the federal government receives CTI and DM. It also interprets statutory requirements for the processes by which federal entities receive and handle CTI and DM, and disseminate it to other appropriate federal entities.

Yesterday’s guidance builds on the four implementation guidance documents that the federal government issued in February of this year. Those documents included:

  • Guidance to Assist Non-Federal Entities to Share Cyber Threat Indicators and Defensive Measures with Federal Entities under the Cybersecurity Information Sharing Act of 2015. Developed by DHS and DOJ as directed in Section 105(a)(4) of CISA, this document provides information on how non-federal entities can share CTI and DM with the federal government under CISA, and describes the protections that non-federal entities can receive, including liability protection and other statutory protections.
  • Sharing of Cyber Threat Indicators and Defensive Measures by the Federal Government under the Cybersecurity Information Sharing Act of 2015. This document covers federal cybersecurity information sharing within the federal government and with non-federal entities. It was developed by DHS, DOJ, Director of National Intelligence and Department of Defense as directed by Section 103 of CISA. Much of the document outlines current programs through which federal entities share CTI and DM with non-federal entities. The document provides limited guidance on the roles of entities involved in cybersecurity information sharing.
  • Interim Procedures Related to the Receipt of Cyber Threat Indicators and Defensive Measures by the Federal Government. The final version of these procedures was issued on June 15, 2016, as required by CISA.
  • Privacy and Civil Liberties Interim Guidelines: Cybersecurity Information Sharing Act of 2015. The final version of these guidelines was issued on June 15, 2016, as required by CISA.

EU and U.S. Sign Umbrella Agreement

On June 2, 2016, the European Union and the U.S. signed an Umbrella Agreement, which will implement a comprehensive data protection framework for criminal law enforcement cooperation. The agreement is not yet in effect and additional procedural steps are needed to finalize the agreement. The European Council will adopt a decision on the Umbrella Agreement after obtaining consent from the European Parliament.

The Umbrella Agreement covers all personal data (e.g., names, addresses, criminal records, etc.) exchanged between police and criminal justice authorities of the EU Member States and the U.S. federal authorities for preventing, investigating, detecting and prosecuting criminal offenses, including terrorism.

The Umbrella Agreement will provide safeguards and guarantees of lawfulness for data transfers, including provisions on clear limitations on data use, the obligation to seek prior consent before any onward transfer of data, the obligation to define appropriate retention periods, and the right to access and rectification.

In addition, the Umbrella Agreement will grant to EU citizens equal treatment with U.S. citizens with respect to judicial redress rights before U.S. courts in case U.S. authorities deny access or rectification, or unlawfully disclose personal data. In this regard, the signature of the U.S. Judicial Redress Act, granting judicial redress rights to EU citizens, by President Obama in February 2016 paved the way for the signature of the Umbrella Agreement.

The Umbrella Agreement will complement existing EU-U.S. and Member State–U.S. agreements between law enforcement authorities.

Read the European Council’s press release.

European Data Protection Supervisor Publishes 2015 Annual Report

On May 24, 2016, the European Data Protection Supervisor (“EDPS”) presented its Annual Report for 2015. The annual report provides an overview of the EDPS’ primary activities in 2015 and sets forth key priorities and challenges for 2016.

During 2015, the EDPS focused on ensuring the adoption of a new and effective data protection framework, providing recommendations for the adoption of the General Data Protection Regulation (“GDPR”). The EDPS also advised the European institutions on newly proposed legislation, such as the EU Passenger Name Record Directive, and worked closely with the Article 29 Working Party to analyze the consequences of the Court of Justice of the European Union’s (the “CJEU”) ruling in the Schrems case. The EDPS also advised the European Commission on alternative solutions for international data transfers. In addition, the EDPS launched several new initiatives, including on data ethics, big data, mobile health and intrusive surveillance.

In line with the EDPS Strategy 2015-2019, the EDPS will largely focus on the implementation of the GDPR during the course of 2016. The provisions of the GDPR will be directly applicable in all EU Member States as of May 25, 2018. Among other tasks, the EDPS will focus on ensuring the accountability of data controllers, increasing cooperation with national data protection authorities, empowering their activities through the establishment of the European Data Protection Board and implementing sustainable rules on international data transfers, following the ruling of the CJEU in the Schrems decision.

Also, the EDPS will continue to provide advice to the European institutions on the coherent and consistent application of EU data protection principles when negotiating trade agreements or international agreements linked to law enforcement, such as the Transatlantic Trade and Investment Partnership, the Trade in Services Agreement or the Umbrella Agreement.

In addition, the EDPS announced the release of an Opinion on the EU-U.S. Privacy Shield on May 30, 2016.

Read the EDPS’s press release.

Amended Nebraska Data Breach Notification Law Adds Regulator Notification Requirement

On April 13, 2016, Nebraska Governor Pete Ricketts signed into law LB 835 (the “Bill”), which among other things, adds a regulator notification requirement and broadens the definition of “personal information” in the state’s data breach notification statute, Neb. Rev. Stat. §§ 87-802 to 87-804. The amendments take effect on July 20, 2016.

Specifically, the Bill:

  • requires entities to notify the Nebraska Attorney General in the event of a data breach, no later than the time notice is provided to Nebraska residents;
  • adds to the definition of “personal information” a user name or email address, in combination with a password or security question and answer, that would permit access to an online account; and
  • states that data is not considered “encrypted” for purposes of avoiding notification obligations if the confidential process or key was or is reasonably believed to have been acquired as a result of the breach.

Data Protection Law Passes Turkish Parliament

On March 24, 2016, the Grand National Assembly of Turkey approved the Law on Personal Data Protection, which is Turkey’s first comprehensive data protection legislation. The law will become effective once it is ratified by Turkey’s President and published in the Official Gazette of the Republic of Turkey.

Key provisions of the law include the following:

  • With limited exceptions, express consent is required to process personal data, defined as any information relating to an identified or identifiable living individual; or sensitive data, defined as personal data of a sensitive nature, including information relating to racial or ethnic origin, political opinions, religious beliefs, health, sexual life, criminal records, punitive measures and biometric data.
  • A legislative structure that includes a Data Protection Authority and a Data Controller Board.
  • Before actively processing data, data controllers must register with the Data Controller Registry (which will be established within six months of the law becoming effective).
  • Organizations and individuals that collect or store personal data must implement certain technical and administrative measures to protect data.
  • Data controllers are required to notify the newly-established Data Controller Board in the event of a data breach.
  • The Data Protection Authority will have the authority to impose fines of up to €300,000 and prison sentences of up to four years.

Once the law becomes effective, it will apply immediately to newly collected data, and data controllers will have two years to come into compliance with respect to information collected prior to the law’s adoption.

Amended Tennessee Breach Notification Law Tightens Timing Requirement

On March 24, 2016, Tennessee Governor Bill Haslam signed into law S.B. 2005, as amended by Amendment No. 1 to S.B. 2005 (the “Bill”), which makes a number of changes to the state’s data breach notification statute, Tenn. Code § 47-18-2107. The amendments take effect on July 1, 2016.

The Bill:

  • Requires businesses and state agencies to notify affected individuals “immediately, but no later than 45 days from the discovery or notification of the breach, unless a longer period of time is required due to the legitimate needs of law enforcement.” Before the amendment, the statute required notification “in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement.”
  • Eliminates a provision from the statute which triggered notification obligations only where there had been access to, or acquisition of, unencrypted personal information. Under the Bill, notification obligations may be triggered even where the accessed or acquired data elements are encrypted.
  • Defines “unauthorized person” for purposes of triggering notification obligations, to specifically include “an employee of the [business or agency] who is discovered by the [business or agency] to have obtained personal information and intentionally used it for an unlawful purpose.”

President Obama Signs Judicial Redress Act into Law

On February 24, 2016, President Obama signed the Judicial Redress Act (the “Act”) into law. The Act grants non-U.S. citizens certain rights, including a private right of action for alleged privacy violations that occur in the U.S. The Act was signed after Congress approved an amendment that limits the right to sue to only those citizens of countries that (1) permit the “transfer of personal data for commercial purposes” to the U.S., and (2) do not impose personal data transfer policies that “materially impede” U.S. national security interests.

As we previously reported, the passing of the Judicial Redress Act is an important step that may help facilitate the approval of the new EU-U.S. Privacy Shield by European regulators. It also impacts the 2015 draft agreement known as the Protection of Personal Information Relating to the Prevention, Investigation, Detection and Prosecution of Criminal Offenses (the “Umbrella Agreement”). The Umbrella Agreement was conditioned on the passing of the Judicial Redress Act.

In addition to the private right of action, non-U.S. citizens have other rights that are granted to U.S. citizens under the Privacy Act of 1974. These include the right to request access to records shared by their governments with a U.S. federal agency in the course of a criminal investigation and to amend any inaccuracies in their records.

Department of Homeland Security Issues Procedures Regarding Sharing Cybersecurity Information

On February 16, 2016, the Department of Homeland Security (“DHS”), in collaboration with other federal agencies, released a series of documents outlining procedures for both federal and non-federal entities to share and disseminate cybersecurity information. These documents were released as directed by the Cybersecurity Act of 2015 (the “Act”), signed into law on December 18, 2015. The Act outlines a means by which the private sector may enjoy protection from civil liability when sharing certain cybersecurity information with the federal government and private entities. These documents represent the first steps by the executive branch to implement the Act.

The Act directs additional actions to occur throughout the spring and early summer, and into the coming years. Notably, the Act directs DHS to certify to Congress a capability within DHS to receive cybersecurity information by March 2016, which will become the primary portal through which the federal government receives cybersecurity information under the Act. According to DHS, the Department’s Automated Information Sharing initiative will be that principal mechanism. However, the Act also states that at “any time after” DHS certifies this capability, the President can designate a non-DoD agency to also receive cybersecurity information under the Act.

Congress Passes Judicial Redress Act

On February 10, 2016, the U.S. House of Representatives passed the Judicial Redress Act, which had been approved by the Senate the night before and included a recent Senate amendment. The House of Representatives previously passed the original bill in October 2015, but the bill was sent back to the House due to the recent Senate amendment. The Judicial Redress Act grants non-U.S. citizens certain rights, including a private right of action for alleged privacy violations that occur in the U.S. The amendment limits the right to sue to only those citizens of countries that (1) permit the “transfer of personal data for commercial purposes” to the U.S., and (2) do not impose personal data transfer policies that “materially impede” U.S. national security interests. The bill now heads to President Obama to sign.

The passing of the Judicial Redress Act is an important step that may help facilitate the approval of the new EU-U.S. Privacy Shield by European regulators. It also impacts the 2015 draft agreement known as the Protection of Personal Information Relating to the Prevention, Investigation, Detection and Prosecution of Criminal Offenses (the “Umbrella Agreement”). The Umbrella Agreement was conditioned on the passing of the Judicial Redress Act.

In addition to the private right of action, non-U.S. citizens have other rights that are granted to U.S. citizens under the Privacy Act of 1974. These include the right to request access to records shared by their governments with a U.S. federal agency in the course of a criminal investigation and to amend any inaccuracies in their records.

The Judicial Redress Act has been endorsed by the U.S. Chamber of Commerce and numerous prominent technology companies, and had strong bipartisan support.

Update: On February 24, 2016, President Obama signed the Judicial Redress Act into law.

President Obama Signs Executive Order Establishing Federal Privacy Council

On February 9, 2016, President Obama signed an Executive Order establishing a permanent Federal Privacy Council (“Privacy Council”) that will serve as the principal interagency support structure to improve the privacy practices of government agencies and entities working on their behalf. The Privacy Council is charged with building on existing interagency efforts to protect privacy and provide expertise and assistance to government agencies, expand the skill and career development opportunities of agency privacy professionals, improve the management of agency privacy programs, and promote collaboration between and among agency privacy professionals.

Below is a summary of the key components of the Executive Order:

  • Within 120 days of the date of the Executive Order, the Director of the Office of Management and Budget (“OMB”) shall issue a revised policy that contains guidance on the roles and responsibilities, and appropriate expertise and resource levels, of Senior Agency Officials for Privacy.
  • The Privacy Council will be established as the principal interagency forum to help Senior Agency Officials for Privacy “better coordinate and collaborate, educate the federal workforce, and exchange best practices.” The Privacy Council will be chaired by the Deputy Director for Management of the OMB and will include Senior Agency Officials from numerous agencies such as the Departments of State, Treasury, Commerce, Defense and Justice.
  • The head of each agency will designate (or re-designate) a Senior Agency Official for Privacy “with the experience and skills necessary to manage an agency-wide privacy program” who will work with the Privacy Council.
  • The Privacy Council will (1) develop recommendations for OMB on federal government privacy policies and requirements; (2) coordinate and share best practices for protecting privacy and implementing appropriate safeguards; (3) help address hiring, training and professional development needs of the federal government from a privacy perspective; and (4) perform other privacy-related functions designated by the Chair.
  • The Chair and the Federal Privacy Council will coordinate with the Federal Chief Information Officers Council to promote consistency and efficiency across the executive branch in addressing data privacy and security issues.

In parallel with the Executive Order, the White House issued a fact sheet regarding the Administration’s Cybersecurity National Action Plan (“CNAP”) which “takes near-term actions and puts in place a long-term strategy to enhance cybersecurity awareness and protections, protect privacy, maintain public safety as well as economic and national security, and empower Americans to take better control of their digital security.” The CNAP includes an action to modernize government information technology and transform how the government manages cybersecurity through the proposal of a $3.1 billion Information Technology Modernization Fund, which will include the formation of a Federal Chief Information Security Officer position. The fact sheet also details the establishment of the Commission on Enhancing National Security, which will be comprised of top U.S. strategic, business and technical thinkers from outside the government who will be tasked with studying and reporting on how to better enhance cybersecurity awareness and protect privacy. To fund the implementation of the proposed actions, the 2017 budget allocates more than $19 billion for cybersecurity, which represents a 35% increase over the 2016 enacted level.

Read more about the Administration’s proposed cybersecurity and data privacy initiatives.

Senate Judiciary Committee Passes Amended Judicial Redress Act

On January 28, 2016, the Senate Judiciary Committee passed the Judicial Redress Act (the “Act”), which would give EU citizens the right to sue over certain data privacy issues in the U.S. The Act passed after an amendment was approved which would condition EU citizens’ right to sue on EU Member States (1) allowing companies to transfer personal data to the U.S. for commercial purposes and (2) having personal data transfer policies which do not materially impede the national security interests of the U.S. The vote was initially set to take place on January 21, 2016, but was delayed.

Passage of the Act may have an impact on ongoing post-Safe Harbor negotiations on trans-Atlantic data transfers from the EU to the U.S., as strengthened privacy rights for EU citizens are an important component of any new Safe Harbor agreement. As we previously reported, the Article 29 Working Party announced in October 2015 that if no Safe Harbor agreement is reached by the end of January 2016, national data protection authorities may decide to initiate enforcement actions against companies that continue to rely on the invalidated Safe Harbor agreement to transfer data to the U.S.

Russian Data Protection Authority Releases 2016 Audit Plan for Localization Law

On January 13, 2016, the Russian data protection authority (Roskomnadzor) released its plan for this year’s audits to assess compliance with Russia’s data localization law, which became effective on September 1, 2015. The localization law requires companies to store the personal data of Russian citizens in databases located in Russia. The audit plan indicates that Roskomnadzor will audit large, multinational companies doing business in numerous jurisdictions and processing the personal data of Russian citizens.

Senate Vote on Judicial Redress Act Delayed

On January 21, 2016, a Senate Judiciary Committee vote on the Judicial Redress Act, which would give EU citizens the right to sue over certain data privacy issues in the U.S., was reportedly postponed. As reported by Forbes, the vote may have been delayed due to amendments to the fifth paragraph of the bill, which deals with litigation pursuant to the Act. The vote had initially been scheduled for today.

The delay could have a negative impact on the post-Safe Harbor negotiations on trans-Atlantic data transfers from the EU to the U.S., because strengthened privacy rights for EU citizens are an important component of any new Safe Harbor framework between the EU and U.S. European Data Protection Authorities gave the relevant EU and U.S. government negotiators until January 31, 2016, to reach a new agreement for transferring personal data. Failure to reach an agreement could pose issues for companies that previously relied on Safe Harbor to legitimize their trans-Atlantic data flows.

Germany Adopts Law to Enable Class Actions for Data Protection Violations

On December 17, 2015, the German Federal Diet (Bundestag) adopted a draft law introducing class action-like claims that will enable consumer protection associations to sue companies for violations of German data protection law.

The law amends Germany’s Act on Actions for Injunctions to allow consumer protection associations to bring lawsuits against companies for improper use of consumer data in violation of German data protection law. At this time, only affected individuals, German criminal prosecutors and data protection authorities have legal standing to sue businesses for breaches of data protection law.

The law will enable consumer protection associations to allege claims for violations of the German rules governing the processing of consumers’ personal data for the purposes of (1) advertising, market and opinion research; (2) operation of a credit agency; (3) creation of personality or usage profiles; (4) address or other data trading; and (5) comparable commercial purposes. Such comparable commercial purposes, however, do not include the collection, processing or use of consumer personal data exclusively for the establishment, performance or termination of a business relationship with a consumer. As such, the law is designed to mainly address data processing by companies whose business models are based on the commercialization of personal data in both the offline and online contexts.

Importantly, the law bars consumer protection associations from bringing claims for violations of international data transfer rules against companies relying on the invalidated Safe Harbor agreement until the end of September 30, 2016, to the extent the data transfers were based on the Safe Harbor Framework prior to October 6, 2015.

The draft law still needs to be signed by the president and published in the Federal Law Gazette before becoming law.

U.S. Congress Releases Compromise Bill on Cybersecurity Information Sharing

On December 16, 2015, leaders in the U.S. House of Representatives and Senate released a $1.1 trillion omnibus spending bill that contained cybersecurity information sharing language that is based on a compromise between the Cybersecurity Information Sharing Act, which passed in the Senate in October, and two cybersecurity information sharing bills that passed in the House earlier this year. Specifically, the omnibus spending bill included Division N, the Cybersecurity Act of 2015 (the “Act”). 

Notably, the Act:

  • does not contain the Senate’s provision concerning critical infrastructure at greatest risk. That language would have required designated government agencies to report to Congress on the status of cyber incident reporting and to develop potential mitigation strategies for the critical infrastructure at greatest risk. Many industry advocates expressed concern that this language could be the precursor to cybersecurity regulations regarding certain critical infrastructure facilities;
  • adopts the “knows at the time of sharing” standard for removal of personal information from shared cybersecurity information, as opposed to the higher “reasonably believes at the time of sharing” or “removes to the extent possible” standards;
  • directs that cybersecurity information be shared with the Federal government through a Department of Homeland Security (“DHS”) Portal, but allows the President to designate other portals (including, potentially, the Federal Bureau of Investigation) to also receive shared cybersecurity information;
  • provides liability protection for private entities that share cybersecurity information through the DHS portal, as well as through the presidentially-designated portals;
  • exempts shared cybersecurity information from Freedom of Information Act (“FOIA”) disclosure under existing FOIA exemptions; and
  • adopts the Senate’s longer 10-year sunset.

The House is scheduled to vote on the omnibus spending bill on Friday, with the Senate to follow. The Obama Administration has already signaled that it supports the bill.

Update: On December 18, 2015, President Obama signed into law the omnibus spending bill, which includes the Cybersecurity Act of 2015.

Brazil Releases Revised Draft Privacy Bill

In late October, the Brazilian Ministry of Justice (the “Ministry”) issued its revised Draft Bill for the Protection of Personal Data (“Draft Bill”). The Ministry released its preliminary draft in January 2015, and the Centre for Information Policy Leadership at Hunton & Williams LLP (“CIPL”) filed public comments to the draft on May 5, 2015.

Key changes to the new Draft Bill include:

  • adding “legitimate interest” as a basis for processing non-sensitive personal information;
  • adding a risk-based approach by data controllers and processors in establishing “best practices standards”;
  • broadening the definition of “consent”;
  • adding consent as a basis for legitimizing cross-border transfers;
  • requiring the application of data processing principles to public data;
  • adding a chapter on personal data processing by public authorities;
  • clarifying the competence of the Competent Public Body (a privacy authority); and
  • creating a multistakeholder National Council for the Protection of Personal Data to assist the Competent Public Body.

A more detailed summary of the revised Draft Bill can be found in an article titled Main Innovations of the Newest Version of the Brazilian Draft Law on the Protection of Personal Data, written by Brazilian attorneys Renato Leite Monteiro, Cyber Law and International Law Professor at Mackenzie University School of Law, and Bruno Bioni, Researcher for The Public Policy for Access to Information Research Group at the University of São Paulo. The next steps for the Draft Bill include an evaluation by the Brazilian Office of the Presidential Chief of Staff, followed by an introduction to Congress.

In addition, there are two other privacy bills currently moving through the Brazilian Congress, one in the Chamber of Deputies and another in the Federal Senate. An updated version of the Senate bill (PLS 330) was released on October 13, 2015. The current rapporteur for this bill is Senator Aloysio Nunes Ferreira. An English translation is not yet available.

In order for the Draft Bill to move forward, it would have to be merged with, or supersede, these other two privacy bills.

Trans-Pacific Partnership Addresses Cross-Border Data Transfers and Protection of Personal Information

On November 5, 2015, the White House released the proposed text of the Trans-Pacific Partnership Agreement (the “TPP”) containing a chapter on cross-border data transfers in the context of electronic commerce. In the chapter on Electronic Commerce, Chapter 14, the TPP includes commitments from participating parties to adopt and maintain a legal framework to protect personal information, and encourages cross-border data transfers to help facilitate business and trade.

Article 14.8, entitled Personal Information Protection, would commit participating countries to “adopt or maintain a legal framework that provides for the protection of the personal information of the users of electronic commerce.” The TPP advises countries to do so by taking into account principles and guidelines of relevant international bodies and to “encourage the development of mechanisms to promote compatibility between [the countries’] different regimes.”

In addition, Article 14.11, entitled Cross-Border Transfer of Information by Electronic Means, would commit participating countries to “allow the cross-border transfer of information by electronic means, including personal information, when this activity is for the conduct of the business of a covered person.” For purposes of this section, “covered person” refers to any citizen or business of any participating country, but excludes financial institutions. The TPP, however, also recognizes that countries have different cross-border transfer regimes and laws, and therefore does not prevent participating countries from “adopting or maintaining measures inconsistent with [cross-border transfers] to achieve a legitimate public policy objective,” provided that the measure is not applied as a disguised restriction on trade, does not unjustifiably discriminate, and does not impose restrictions on transfers greater than are required to achieve the objective.

The TPP has not yet been ratified by the 12 participating countries, including the United States. The U.S. Congress will likely have at least 90 days to analyze and vote on the TPP before sending it to President Obama for final approval.

U.S. Chamber of Commerce Testifies about Safe Harbor at a Joint Hearing of the U.S. House of Representatives

On November 3, 2015, John Murphy, Senior Vice President for International Policy at the U.S. Chamber of Commerce, testified about the Court of Justice of the European Union’s (“CJEU’s”) EU-U.S. Safe Harbor Decision at a joint hearing of the House Commerce and Communications and Technology Subcommittees.

Murphy’s testimony emphasized the economic relationship between the U.S. and the EU and stressed that this relationship “relies on the seamless flow of data across borders.” He stated that the CJEU’s decision invalidating the Safe Harbor agreement threatens the transatlantic flow of data. Murphy noted that the CJEU’s decision focused on “process concerns” within the Safe Harbor agreement, such as restrictions placed on EU Member States’ enforcement authority, and did not address “the actual substantive commercial data protection rules.”

Accordingly, Murphy thanked the House for passing the Judicial Redress Act, which is intended to address one of the process concerns identified by the CJEU, and encouraged Congress to work with a group of European Parliamentarians who are in Washington D.C. this week to resolve outstanding issues regarding Safe Harbor. He concluded by urging U.S. and EU officials to promptly implement a revised Safe Harbor framework to allow European and American companies to transfer data across the Atlantic.

Senate Passes Cybersecurity Information Sharing Act

On October 27, 2015, the U.S. Senate passed S.754 – Cybersecurity Information Sharing Act of 2015 (“CISA”) by a vote of 74 to 21. CISA is intended to facilitate and encourage the sharing of Internet traffic information between and among companies and the federal government to prevent cyber attacks, by giving companies legal immunity from antitrust and privacy lawsuits. CISA comes in the wake of numerous recent, high-profile cyber attacks.

CISA is supported by the Department of Defense, the White House, the U.S. Chamber of Commerce and various financial industry groups. The Securities Industry and Financial Markets Association’s President and CEO Kenneth E. Bentsen Jr. stated, “The threat our economy faces from cyber attacks is real and information-sharing legislation will help the financial services industry to better protect our systems as well as the privacy of our customers.”

CISA, however, has come under attack by privacy and civil liberty organizations and technology companies who claim that CISA lacks appropriate privacy safeguards and does not do enough to limit the government’s use of users’ information. The Computer & Communications Industry Association, a technology industry trade group, stated, “[We] recognize the goal of seeking to develop a more robust system through which the government and private sector can readily share data about emerging threats. But such a system should not come at the expense of users’ privacy, need not be used for purposes unrelated to cybersecurity, and must not enable activities that might actively destabilize the infrastructure the bill aims to protect.”

CISA comes on the heels of two similar bills passed by the House of Representatives, H.R. 1731, the National Cybersecurity Protection Advancement Act of 2015 and H.R. 1560, the Protecting Cyber Networks Act. While CISA is similar to these pieces of legislation, a conference will be necessary to resolve differences between the House- and Senate-passed bills. Details on that conference, including who will be in attendance and whether the conference will be held on the House- or Senate-side, are not yet available but are expected to be resolved soon. One House aide reported that the conference would likely take place by the end of the year, saying, “We’re not expecting any fireworks or drama.” The Obama Administration has indicated support for the cybersecurity information sharing legislation, and President Obama is expected to sign the final bill.

CJEU Declares the Commission’s U.S. Safe Harbor Decision Invalid

On October 6, 2015, the Court of Justice of the European Union (the “CJEU”) issued its judgment in the Schrems case (Schrems v. Data Protection Commissioner), following the Opinion of the Advocate General published on September 23, 2015. In its judgment, the CJEU concluded that:

  • The national data protection authorities (“DPAs”) have the power to investigate and suspend international data transfers even where the European Commission (the “Commission”) has adopted a decision finding that a third country affords an adequate level of data protection, such as Decision 2000/520 on the adequacy of the protection provided by the Safe Harbor Privacy Principles (the “Safe Harbor Decision”).
  • The Safe Harbor Decision is invalid.

Powers of National Authorities

The CJEU concluded that a decision of the European Commission on the adequacy level of data protection provided by a non-EU country cannot eliminate or reduce the powers granted to DPAs under the EU Data Protection Directive 95/46/EC. DPAs therefore can suspend international data transfers made under the Safe Harbor Framework following an investigation. The Court, however, also stated that the CJEU alone has the ultimate jurisdiction to examine the validity of a Commission adequacy decision.

Validity of U.S.-EU Safe Harbor Framework

In its judgment, the CJEU also assessed the validity of the Safe Harbor Decision. The CJEU observed that the Safe Harbor Framework applies solely to the U.S. undertakings that adhere to it, leaving U.S. public authorities out of scope. In addition, national security, public interest and law enforcement requirements prevail over the Safe Harbor Framework; when a conflict arises with respect to these requirements, U.S. undertakings are obligated to disregard the Framework’s protective rules. The CJEU further concluded that U.S. legislation does not limit interference with individuals’ rights to what is strictly necessary. Notably, the CJEU indicated that U.S. legislation authorizes, on a general basis, storage of all personal data of all persons whose data is transferred from the EU to the U.S., without any differentiation, limitation or exception being made in light of the objectives pursued, and without providing an objective criterion for determining limits to the access and use of this data by public authorities.

The CJEU further observed that the Safe Harbor Framework does not provide sufficient legal remedies to allow individuals to access their personal data and to obtain rectification or erasure of such data. This compromises the fundamental right to effective judicial protection, according to the CJEU.

Finally, the CJEU stated that the Safe Harbor Decision improperly restricted the powers of the DPAs to investigate complaints concerning the adequacy of protection, a restriction the Commission lacked the competence to impose. For all of the reasons set forth above, the CJEU declared the Safe Harbor Decision invalid.

Next Steps

Following the judgment of the CJEU, the Irish DPA is required to examine, with all due diligence, whether the transfer of data of Facebook’s European users to the U.S. should be suspended given that the level of protection provided by the U.S. for data transferred under the U.S.-EU Safe Harbor Framework is no longer adequate.

The Article 29 Working Party, the UK Information Commissioner’s Office and the Spanish DPA have already published statements on the CJEU’s judgment explaining that they will work with other EU DPAs to issue further guidance for businesses and clarify the impact of the judgment on businesses.

View the full text of the CJEU’s judgment.

For a summary, please see the press release of the CJEU.

CIPL and Instituto Brasiliense de Direito Publico Host Global Data Privacy Dialogue in Brazil

On October 6 and 7, 2015, the Centre for Information Policy Leadership at Hunton & Williams LLP (“CIPL”), a global privacy policy think-tank based in Washington D.C. and London, and the Instituto Brasiliense de Direito Publico (“IDP”), a legal institute based in Brazil, will co-host a two-day Global Data Privacy Dialogue in Brazil at the IDP’s conference facilities.

The conference will bring together Brazilian and international privacy experts from government, industry and academia to discuss how to achieve effective privacy protection for individuals, while at the same time enabling technological innovation and the beneficial uses of personal data in the age of Big Data and the Internet of Things. The Global Data Privacy Dialogue is part of an initiative to facilitate and support international expert engagement with key Brazilian stakeholders during Brazil’s ongoing process to develop a comprehensive privacy law.

During the conference, participants from Brazil, Uruguay, Colombia, Europe, the United States and Canada will discuss:

  • the realities of modern information technology and information uses;
  • Brazil’s draft privacy legislation and other important global developments in data protection;
  • how to govern global data flows;
  • how to apply core privacy principles such as consent in the modern information age;
  • how to design effective organizational privacy compliance programs and best practices; and
  • the role of a national data protection authority.

“Achieving the dual goal of privacy and beneficial use of data is imperative, and we don’t need to sacrifice one for the other. Our hope is that we can bring to bear the tremendous wealth of experience that already exists around the world on the many important privacy policy issues currently being considered in Brazil,” said Bojana Bellamy, CIPL’s president. “Brazil is an important economy and whatever happens in Brazil on privacy legislation will have a global impact.”

Laura Schertel Mendes, IDP Researcher, and Sérgio Alves Jr., IDP Executive Secretary, welcomed the collaboration with CIPL. Schertel noted, “Brazil has achieved global attention as a leader in internet policymaking by liaising national and international communities of academics, governmental agencies, private companies and civil society.”

“We expect this Dialogue will contribute to the discussion on how to improve the Brazilian legal framework with effective, updated, and enforceable privacy protection tools and policies,” Alves added.

Speakers for the Dialogue include: Virgilio Almeida, Secretary for Information Technology from the Ministry of Science, Technology and Innovation; Peter Hustinx, former European Data Protection Supervisor; Juliana Pereira da Silva, National Secretary of the Consumer in Brazil’s Ministry of Justice; Maximiliano Martinhão, Secretary of Telecommunications, Ministry of Communications; and David Smith, Deputy Commissioner and Director of Data Protection, UK Information Commissioner’s Office.

View the agenda.

Google Granted Permission to Appeal to the UK Supreme Court

On July 28, 2015, the UK Supreme Court announced its decision to grant permission in part for Google Inc. (“Google”) to appeal the England and Wales Court of Appeal’s decision in Google Inc. v Vidal-Hall and Others.

As we reported previously, the claimants in this case were users of Apple’s Safari browser who argued that during certain months in 2011 and 2012, Google collected information about their browsing habits via cookies placed on their devices without their consent and in breach of Google’s privacy policy. The Court of Appeal ruled on two important issues.

The first issue was whether or not there was a tort of “misuse of private information” under English law. The Court of Appeal upheld the lower court’s decision and affirmed that there is such a tort under English law, but that this is not a new cause of action. Rather, the Court of Appeal stated that its approach “simply gives the correct legal label to one that already exists.”

The second issue was whether damages under Section 13(2) of the Data Protection Act 1998 (the “Act”) can be awarded in circumstances in which the claimant has not suffered any financial harm. The claimants argued that they had suffered anxiety and distress, but did not allege that they suffered financial harm. The Court of Appeal held that, on a literal interpretation, Section 13(2) of the Act does not permit any damages in the absence of financial harm, but noted that this interpretation does not appear to be compatible with EU Data Protection Directive 95/46/EC (the “Directive”), which permits claims for damages without financial harm. Building on the evolution of English case law in this area over the last decade, the Court of Appeal held that the claimants could recover damages from Google without showing financial harm, regardless of language to the contrary in Section 13(2) of the Act.

Subsequently, Google applied for permission to appeal to the Supreme Court on the following grounds:

  • whether the Court of Appeal was right to hold that the claimants’ claims for misuse of private information are tort claims for the purposes of the rules relating to service of the proceedings out of the jurisdiction;
  • whether the Court of Appeal was right to hold that Section 13(2) of the Act was incompatible with Article 23 of the Directive; and
  • whether the Court of Appeal was right to decline the application of Section 13(2) of the Act on the grounds that it conflicts with the rights guaranteed by Articles 7 and 8 of the EU Charter of Fundamental Rights.

The Supreme Court refused permission to appeal on the first ground on the basis that it does not raise an arguable point of law, but granted permission to appeal on all other grounds. It is anticipated that the Supreme Court will hear the appeal during 2016.

States Writing Biometric-Capture Laws May Look to Illinois

Recent class actions filed against Facebook and Shutterfly are the first cases to test an Illinois law that requires consent before biometric information may be captured for commercial purposes. Although the cases focus on biometric capture activities primarily in the social-media realm, these cases and the Illinois law at issue have ramifications for any business that employs biometric-capture technology, including those that use it for security or sales-and-marketing purposes. In a recent article published in Law360, Hunton & Williams partner Torsten M. Kracht and associate Rachel E. Mossman discuss how businesses already using these technologies need to keep abreast of new legislation that might affect the legality of their practices, and how businesses considering these technologies should consult local rules and statutes before implementing biometric imaging.

Read the full article now.

Connecticut Passes New Data Protection Measures into Law

On July 1, 2015, Connecticut’s governor signed into law Public Act No. 15-142, An Act Improving Data Security and Agency Effectiveness (the “Act”), which (1) amends the state’s data breach notification law to require notice to affected individuals and the Connecticut Attorney General within 90 days of a security breach, and expands the definition of personal information to include biometric data such as fingerprints, retina scans and voice prints; (2) affirmatively requires all businesses, including health insurers, that experience data breaches to offer affected individuals one year of identity theft prevention services at no cost; and (3) requires health insurers and contractors who receive personal information from state agencies to implement and maintain minimum data security safeguards. With the passage of the Act, Connecticut becomes the first state to affirmatively require businesses to provide these security services to consumers.

A brief summary of the data security requirements for health insurers and state contractors is set forth below:

Health Insurers

The new legislation requires health insurers and related entities (including pharmacy and third party benefits administrators) to:

  • Create a comprehensive information security program to safeguard individuals’ personal information.
  • Encrypt personal information being transmitted or while stored on a portable device.
  • Implement security measures to protect personal information stored on Internet-accessible devices.
  • Implement access controls and authentication measures to ensure that access to personal information is limited only to those who need it in connection with their job function.
  • Ensure that employees and third parties comply with data security requirements.

These requirements are effective October 1, 2015, but health insurers have until October 1, 2017, to come into full compliance.

State Contractors

Additionally, the Act requires that contracts between a state agency and a contractor authorizing the contractor to receive personal information include terms and conditions requiring the contractor to implement data security measures to protect the relevant personal information. The minimum data security requirements for contractors are substantially similar to the requirements for health insurers listed above, but also include additional requirements that the contractor:

  • Obtain approval from the contracting state agency to store data on removable storage media.
  • Report any suspected or actual breaches of the personal information to the state as soon as practical after discovery.

The section pertaining to state contractors is effective July 1, 2015.

Read the complete terms of the Act.

House of Representatives Passes Bill to Permit Broader Use and Disclosure of Protected Health Information for Research Purposes

On July 10, 2015, the United States House of Representatives passed the 21st Century Cures Act (the “Act”), which is intended to ease restrictions on the use and disclosure of protected health information (“PHI”) for research purposes.

Currently, the HIPAA Privacy Rule permits the use and disclosure of PHI for research purposes without requiring authorization from an individual but does require that any waiver of the authorization requirement be approved by an institutional review board or a privacy board.

The Act amends the Health Information Technology for Economic and Clinical Health (“HITECH”) Act to obligate the Secretary of the Department of Health and Human Services to revise or clarify the HIPAA Privacy Rule to:

  • Allow the use and disclosure of PHI by a covered entity for research purposes to be treated as that entity’s “health care operations.”
  • Enable research activities that are related to the quality, safety, or effectiveness of a product or activity that is regulated by the Food and Drug Administration (“FDA”) to be considered public health activities so that the activities can be disclosed to a person subject to the FDA’s jurisdiction for the purposes of collecting or reporting adverse events, tracking FDA-regulated products, enabling product recalls or repairs or conducting post-marketing surveillance.
  • Permit remote access to PHI so long as the covered entity and researcher maintain “appropriate security and privacy safeguards” and the PHI is “not copied or otherwise retained by the researcher.”
  • Specify that an authorization for the use or disclosure of PHI for future research purposes is deemed to sufficiently describe the purpose of the use or disclosure of PHI if the authorization (1) sufficiently describes the purposes such that it would be reasonable for the individual to expect that the PHI could be used or disclosed for such future research, and (2) states that the authorization will either expire on a particular date or at a particular event or will remain valid “unless and until it is revoked by the individual.”

The Act also requires the Office of the National Coordinator for Health Information Technology to publish guidance that clarifies the HIPAA Privacy and Security Rules with respect to information blocking, which includes any business or technical practices that “prevent or materially discourage the exchange of electronic health information” and “do not serve to protect patient safety, maintain the privacy and security of individuals’ health information or promote competition and consumer welfare.”

The Act, which garnered widespread bipartisan support, now moves to the Senate, which is expected to take up the legislation this fall.

Several groups, including the Pharmaceutical Research and Manufacturers of America and the Association of American Medical Colleges, support the 21st Century Cures Act.

New Hampshire and Oregon Student Privacy Legislation

Legislators in New Hampshire and Oregon recently passed bills designed to protect the online privacy of students in kindergarten through 12th grade.

On June 11, 2015, New Hampshire Governor Maggie Hassan (D-NH) signed H.B. 520, a bipartisan bill that requires operators of websites, online platforms and applications targeting students and their families (“Operators”) to create and maintain “reasonable” security procedures to protect certain covered information about students. H.B. 520 also prohibits Operators from using covered information for targeted advertising. H.B. 520 defines covered information broadly as “personally identifiable information or materials,” including name, address, date of birth, telephone number and educational records, provided to Operators by students, their schools, their parents or legal guardians, or otherwise gathered by the Operators.

Governor Hassan said that technology “is an essential component of the 21st century innovation economy” and plays an important and growing role in the classroom. She added that H.B. 520 protects New Hampshire students against threats to their privacy while enabling them to participate in that economy. H.B. 520 takes effect on January 1, 2016.

On June 10, 2015, the Oregon legislature passed S.B. 187, providing similar protections to K-12 students’ personal information and restricting the use of that information by Operators. The bill defines “covered information” in the same way as the New Hampshire student privacy bill and applies to the same types of Operators. S.B. 187 prohibits selling student information and presenting students with targeted advertisements. Operators also may not disclose student information to third parties, except in limited circumstances, but may use “de-identified student information” to improve or market the effectiveness of their products. Legislators rejected proposals backed by the technology industry that would have allowed students ages 12 and older to consent to the use and disclosure of covered information.

S.B. 187 grants the Oregon Attorney General enforcement power under the state’s consumer protection statute. Governor Kate Brown (D-OR) is expected to sign the bill, which would take effect on July 1, 2016.

Both New Hampshire and Oregon modeled their student privacy legislation on California’s Student Online Personal Information Protection Act, which was enacted in 2014.

Florida Passes Drone Surveillance Bill Requiring Individual Consent

On April 28, 2015, the Florida House of Representatives passed a bill (SB 766) that prohibits businesses and government agencies from using drones to conduct surveillance by capturing images of private real property or individuals on such property without valid written consent under circumstances where a reasonable expectation of privacy exists.

The bill expands Florida’s Freedom from Unwarranted Surveillance Act to prohibit the “use [of] a drone equipped with an imaging device to record an image of privately owned real property or of the owner, tenant, occupant, invitee, or licensee of such property with the intent to conduct surveillance on the individual or property captured in the image in violation of such person’s reasonable expectation of privacy without his or her written consent.” Under the bill, there is a presumption that a person has a “reasonable expectation of privacy on his or her privately owned real property if he or she is not observable by persons located at ground level in a place where they have a legal right to be, regardless of whether he or she is observable from the air with the use of a drone.” The term “surveillance” is broadly defined to cover surveillance activities that allow drone operators to observe individuals and real property with sufficient visual clarity to obtain information about an individual’s identity, habits, conduct, movements or whereabouts, or the unique identifying features or occupancy of the property.

Individuals will have a private right of action under the bill to seek compensatory damages, including punitive damages and attorney fees, and injunctive relief for violations of the surveillance prohibition. The bill, however, contains several exceptions to the surveillance prohibition, such as when drones are used for certain surveillance purposes by utilities, state-licensed entities, and businesses delivering cargo, conducting environmental monitoring or engaging in aerial mapping.

The Florida Senate passed the bill in a 37-2 vote on April 23, 2015, and as a result of the recent House vote, the bill will be sent to Florida’s governor for approval.

Update: On May 14, 2015, Florida Governor Rick Scott signed bill SB 766, named the Freedom from Unwarranted Surveillance Act, into law.

Data Security Act Introduced in New York State Assembly

On April 8, 2015, a New York Assemblyman introduced the Data Security Act in the New York State Assembly, a bill that would require New York businesses to implement and maintain information security safeguards. The requirements would apply to “private information,” which is defined as any of the following:

  • personal information consisting of any information in combination with one or more of the following data elements, when either the personal information or the data element is not encrypted: Social Security number; driver’s license number or non-driver identification card number; financial account or credit or debit card number in combination with any required security code or password; or biometric information;
  • a user name or email address in combination with a password or security question and answer that would permit access to an online account; or
  • unsecured protected health information (as that term is defined in the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) Privacy Rule).

The Data Security Act obligates entities to develop an information security program that includes:

  • administrative safeguards, such as conducting risk assessments, training employees and selecting service providers capable of maintaining appropriate safeguards;
  • technical safeguards, such as assessing risks in network and software design and regularly testing and monitoring the effectiveness of key controls; and
  • physical safeguards, such as disposing of electronic media so that the information cannot be read or reconstructed.

The Data Security Act deems certain entities to be in compliance with the law’s requirements, such as financial institutions that comply with the Gramm-Leach-Bliley Act, HIPAA-regulated entities, and entities that comply with NIST standards. Entities that comply with the latest version of NIST Special Publication 800-53 are also immune from civil liability under the Act.

The Data Security Act establishes a rebuttable presumption that an entity that obtains an independent third party certification complies with the requirements of the law. The New York Attorney General is empowered to enjoin any violations of the Data Security Act, and can obtain civil penalties of $250 for each person whose private information was compromised, up to a maximum of $10 million. For knowing and reckless violations, these amounts can increase to $1,000 for each affected person up to a total of the higher of $50 million or three times the aggregate amount of any actual costs and losses.
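
The penalty structure is a capped per-person calculation; a minimal sketch of the arithmetic described above (the function names are hypothetical, not drawn from the bill’s text):

```python
def standard_penalty(affected_persons: int) -> int:
    """Civil penalty for ordinary violations: $250 per affected
    person, capped at $10 million."""
    return min(250 * affected_persons, 10_000_000)

def knowing_reckless_penalty(affected_persons: int, actual_costs: int) -> int:
    """Penalty for knowing and reckless violations: $1,000 per
    affected person, capped at the higher of $50 million or three
    times the aggregate actual costs and losses."""
    cap = max(50_000_000, 3 * actual_costs)
    return min(1_000 * affected_persons, cap)

# A breach affecting 100,000 people hits the $10 million cap under
# the standard tier ($250 x 100,000 would otherwise be $25 million).
print(standard_penalty(100_000))  # 10000000
```

Note that for a knowing and reckless violation, the cap itself scales with the breached entity’s actual costs and losses once three times that amount exceeds $50 million.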

The Data Security Act also amends New York’s breach notification law by using the expanded definition of “private information” discussed above. Previously, New York’s law did not cover breaches involving biometric information, user names and passwords, or protected health information.

House of Representatives Passes Two Cybersecurity Bills

The House of Representatives passed two complementary bills related to cybersecurity, the “Protecting Cyber Networks Act” (H.R. 1560) and the “National Cybersecurity Protection Advancement Act of 2015” (H.R. 1731). These bills provide, among other things, liability protection for (1) the use of monitoring and defensive measures to protect information systems, and (2) the sharing of cybersecurity threat information among non-federal entities and with the federal government. With the Senate having recently overcome disagreement on sex trafficking legislation and the Attorney General nomination, that body is now expected to consider similar information sharing legislation, the “Cybersecurity Information Sharing Act” (S. 754), in the coming weeks. Assuming S. 754 is also passed by the Senate, the two Chambers of Congress will convene a Conference Committee to draft a single piece of legislation, which will then be voted on by the House and Senate before heading to the President’s desk. The White House has not committed to signing any resulting legislation, but has signaled some support.

H.R. 1560, passed by the House on April 22, provides liability protection for companies and other non-federal entities that share cybersecurity threat information with each other and with civilian agencies in the federal government – in other words, agencies other than the Department of Defense (“DoD”) and its National Security Agency (“NSA”). H.R. 1560, however, also authorizes a government agency receiving cybersecurity threat information to instantly share that information with any other appropriate federal agency, including the DoD and NSA. Sponsored by Representatives Devin Nunes (R-CA) and Adam Schiff (D-CA), the Chairman and Ranking Member of the House Intelligence Committee, H.R. 1560 enjoyed strong bipartisan support throughout the legislative process, reflected in the House’s 307-116 vote. Representatives Joe Barton (R-TX) and Diana DeGette (D-CO), the co-chairs of the Congressional Bipartisan Privacy Caucus, both voted against the bill, however, as did Congressman Jared Polis (D-CO), who said H.R. 1560 “falls short of its goals and likely does more harm than good . . . [by] raising enormous concerns about the inappropriate sharing of personal information and surveillance on Americans’ private lives.”

The next day, the House passed H.R. 1731, a bill sponsored by Representative Michael McCaul (R-TX), Chairman of the House Homeland Security Committee, which designates the U.S. Department of Homeland Security’s (“DHS”) National Cybersecurity and Communication Integration Center as the lead federal civilian interface for cybersecurity threat information sharing. As with H.R. 1560, H.R. 1731 also allows DHS to instantly share cybersecurity threat information it receives with any appropriate federal agency, including the NSA. Strong bipartisan support allowed the House to pass information sharing legislation with an overwhelming majority, 355 to 63. During consideration of the bill, a series of amendments were adopted, including one refining the definition of cyber “incident” to explicitly restrict information sharing to incidents that are directly related to protecting information systems.

House leadership’s decision to act on H.R. 1560 and H.R. 1731 as part of the House’s “Cyber Week” comes amid concern that upcoming Congressional action to reauthorize PATRIOT Act provisions set to expire on June 1 could bog information sharing legislation down in debate on NSA reform. Notably, information sharing legislation was effectively foreclosed in the last Congress when an NSA reform bill was voted down in the Senate late in 2014. As noted by H.R. 1560’s co-sponsor Rep. Schiff, however, himself a privacy advocate, “[t]he prospects for successful passage of cyber legislation has gone up dramatically.”

Washington State Senate Approves Amendment to Data Breach Notification Law

On April 13, 2015, the Washington State Senate unanimously passed legislation strengthening the state’s data breach law. The bill (HB 1078) passed the Senate by a 47-0 vote and, as we previously reported, passed the House by a 97-0 vote.

The bill includes the following amendments to Washington’s existing data breach notification law:

  • requires notification to the state attorney general in the event of a breach;
  • imposes a 45-day deadline for notification to affected residents and the state attorney general;
  • mandates content requirements for notices to affected residents, which must include (i) the name and contact information of the reporting business, (ii) a list of the types of personal information subject to the breach, and (iii) the toll-free telephone numbers and addresses of the consumer reporting agencies;
  • expands the current law to cover hard-copy data as well as “computerized” data;
  • introduces a safe harbor for personal information that is “secured,” which is defined to mean the data is encrypted in a manner that “meets or exceeds” the National Institute of Standards and Technology standard or is otherwise “modified so that the personal information is rendered unreadable, unusable, or undecipherable by an unauthorized person”; and
  • adds federal preemption language that would exempt certain covered entities from having to comply with Washington’s breach law.
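
The 45-day notification deadline is straightforward date arithmetic; a minimal sketch (the function name is hypothetical, and the precise event that starts the clock should be confirmed against the statutory text):

```python
from datetime import date, timedelta

# HB 1078's notification window for affected residents and
# the state attorney general.
NOTIFICATION_WINDOW = timedelta(days=45)

def notification_deadline(discovery_date: date) -> date:
    """Latest permissible notification date, assuming the 45-day
    clock runs from discovery of the breach."""
    return discovery_date + NOTIFICATION_WINDOW

print(notification_deadline(date(2015, 4, 23)))  # 2015-06-07
```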

The bill will now head to Governor Jay Inslee for consideration.

Update: On April 23, 2015, Governor Jay Inslee signed the bill into law.

UK Court Ruling Allows Claims Against Google for Misuse of Private Information

On March 27, 2015, the England and Wales Court of Appeal issued its judgment in Google Inc. v Vidal-Hall and Others. Google Inc. (“Google”) appealed an earlier decision by Tugendhat J. in the High Court in January 2014. The claimants were users of Apple’s Safari browser who argued that during certain months in 2011 and 2012, Google collected information about their browsing habits via cookies placed on their devices without their consent and in breach of Google’s privacy policy.

The Court of Appeal ruled on two important issues. The first issue was whether there is a tort of “misuse of private information” under English law. In order to serve proceedings in an English Court on Google (in California), the claimants’ arguments had to satisfy one of a limited number of “gateways.” The relevant gateway in this case required the claimants to show that their claims relate to an actionable tort. Because there is existing case law holding that there is no general tort of invasion of privacy, the claimants argued that the High Court should explicitly recognize a tort of misuse of private information. The High Court agreed and Google appealed the decision. The Court of Appeal upheld the High Court’s decision, and affirmed that there is a tort of misuse of private information under English law. The Court of Appeal stated that this was not a new cause of action, but that it “simply gives the correct legal label to one that already exists.”

The second issue was whether damages under Section 13(2) of the Data Protection Act 1998 (the “Act”) can be awarded in circumstances in which the claimant has not suffered any financial harm. The claimants argued that they had suffered anxiety and distress, but did not allege that they suffered financial harm. This case was unusual because the UK Information Commissioner’s Office (the “ICO”) made submissions to the Court of Appeal as an intervening party. In those submissions, the ICO argued that its previous guidance on Section 13 (which indicated that damages were not available except in cases of financial harm) was incorrect, and that damages should be available in this case. The Court of Appeal accepted the ICO’s submissions but held that, using a literal interpretation, Section 13(2) does not permit damages in the absence of financial harm. The Court of Appeal also noted, however, that Section 13(2) of the Act did not appear to be compatible with EU Data Protection Directive 95/46/EC, which appears to permit claims for damages without financial harm. Expanding upon the evolution of English case law in this area over the last decade, the Court of Appeal held that the claimants could recover damages from Google without showing financial harm, regardless of the contrary language in Section 13(2). It is not yet clear whether Google will appeal this decision.

The consequences of the case may be significant. For Google, it means that the claimants can bring their claims in English Courts directly. This may result in large numbers of such claims, although the Court of Appeal noted that any damages likely will be “relatively modest.”

In a wider context, where any company fails to fulfil its obligations under the Act (e.g., if it suffers a data breach or fails to comply with its own privacy policy) it may face claims for damages brought by the affected individuals (e.g., its customers or employees) if those individuals can demonstrate that they have suffered material anxiety or distress, even if they have not suffered any financial loss. In addition, in some circumstances, such claims may be brought in English Courts, regardless of whether the company is established outside of the UK.

Montana and Washington State Propose Amendments to Data Breach Legislation

On March 4, 2015, the House of Representatives of Washington passed a bill (HB 1078), which would amend the state’s breach notification law to require notification to the state Attorney General in the event of a breach and impose a 45-day timing requirement for notification provided to affected residents and the state regulator. The bill also mandates content requirements for notices to affected residents, including (1) the name and contact information of the reporting business; (2) a list of the types of personal information subject to the breach; and (3) the toll-free telephone numbers and addresses of the consumer reporting agencies. In addition, while Washington’s breach notification law currently applies only to “computerized” data, the amended law would cover hard-copy data as well.

The bill introduces a safe harbor for personal information that is “secured,” which is defined to mean the data is encrypted in a manner that “meets or exceeds” the National Institute of Standards and Technology (“NIST”) standard or is otherwise “modified so that the personal information is rendered unreadable, unusable, or undecipherable by an unauthorized person.” In addition, notice is not required if the breach is “not reasonably likely to subject consumers to a risk of harm.” The bill adds federal preemption language that would exempt certain covered entities from having to comply with the state breach law. With respect to enforcement, the bill would make an organization’s failure to comply with the state’s breach notification law a violation of the Consumer Protection Act.

The bill, which passed the House of Representatives by a 97-0 vote, now moves to the Washington State Senate. It has broad bipartisan support and, if enacted, would strengthen the state’s data breach laws.

The Washington legislation was introduced just over a week after Montana’s governor signed into law HB 74, which amends Montana’s existing data breach notification law to expand the definition of personal information to include medical record information and an “identity protection personal identification number” issued by the IRS. The amended law also requires entities to submit to the state Attorney General’s Consumer Protection Office an electronic copy of the notice to affected individuals, and to indicate the date and method of distribution of the individual notice and the number of residents impacted by the breach. The bill was enacted on February 27, 2015, and will take effect on October 1, 2015.

White House Releases Discussion Draft for a Consumer Privacy Bill of Rights

On February 27, 2015, the White House released a highly-anticipated draft of the Consumer Privacy Bill of Rights Act of 2015 (the “Act”) that seeks to establish baseline protections for individual privacy in the commercial context and to facilitate the implementation of these protections through enforceable codes of conduct. The Federal Trade Commission is tasked with the primary responsibility for promulgating regulations and enforcing the rights and obligations set forth in the Act.

The Act’s baseline of consumer protections would apply broadly (with certain stated exceptions) to the privacy practices of covered entities that collect, create, process, retain, use or disclose personal data in or affecting interstate commerce. “Personal data” is broadly defined under the Act as “any data … under the control of a covered entity, not otherwise generally available to the public through lawful means, and … linked, or as a practical matter linkable by the covered entity, to a specific individual, or linked to a device that is associated with or routinely used by an individual.” The Act carves out from the definition of personal data several types of information, including de-identified data, cybersecurity data and employee data that is collected or used by an employer in connection with an employee’s employment status.

The Act sets forth individual rights for consumers and corresponding obligations of covered entities in connection with personal data. Key examples of the proposed privacy protections and obligations include:

  • Transparency. Covered entities shall provide individuals with clear, timely, conspicuous and easily understandable notice about the entity’s privacy and security practices. The Act sets forth various content requirements for such notices.
  • Individual Control. Individuals must be provided with reasonable means to control the processing of their personal data that are proportionate to the privacy risk to the individual and are consistent with context, which is defined to mean the circumstances surrounding a covered entity’s processing of personal data.
  • Respect for Context. If a covered entity processes personal data in a manner that is not reasonable in light of context, the entity must conduct a privacy risk analysis, and take reasonable steps to mitigate any identified privacy risks. If the privacy risk analysis is conducted under the supervision of an FTC-approved Privacy Review Board, the covered entity may be excused from certain heightened requirements under this section.
  • Focused Collection and Responsible Use. Covered entities may collect, retain and use personal data only in a manner that is reasonable in light of context. This limitation requires businesses to consider ways to minimize privacy risk, as well as to delete, destroy or de-identify personal data within a reasonable time after fulfilling the purposes for which the personal data were first collected.
  • Security. Covered entities are expected to identify reasonably foreseeable internal and external risks to the privacy and security of personal data that could result in the unauthorized disclosure, misuse, alteration, destruction or other compromise of the information. Based on this analysis, covered entities must establish, implement and maintain safeguards reasonably designed to ensure the security of such personal data, including but not limited to protecting against unauthorized loss, misuse, alteration, destruction, access to, or use of the business’ information.
  • Access and Accuracy. Upon request, a covered entity must provide an individual with reasonable access to, or an accurate representation of, personal data that pertains to the individual and is under the control of the covered entity. This obligation entails providing the individual with a means to dispute and resolve the accuracy and completeness of his or her personal data.
  • Accountability. Covered entities must take measures appropriate to the privacy risks associated with their personal data practices, including training employees, conducting internal or independent evaluations, building appropriate consideration for privacy and data protections into the design of systems and business practices, and contractually binding third parties to comply with similar requirements prior to disclosing personal data to them.

Under the Act, a violation of the relevant requirements constitutes an unfair or deceptive act or practice in violation of Section 5 of the FTC Act. While the attorney general of any state may bring a federal enforcement action for injunctive relief based on an alleged violation causing harm to a substantial number of the state’s residents, the FTC has the right to intervene as a party and assume lead responsibility for the prosecution. In an action brought or prosecuted by the FTC, the covered entity also may be liable for a civil penalty of up to $25 million under certain circumstances. The Act offers covered entities a safe harbor against enforcement actions when they have complied with an FTC-approved code of conduct for data governance that provides equivalent or greater protections for personal data than that of the Act. In addition, the Act does not offer a private right of action to individuals.

Notably, the Act preempts state and local laws to the extent they impose requirements with respect to personal data processing, but it does not preempt states’ general consumer protection laws, health or financial information laws, or data breach notification laws. With respect to federal preemption, the Act does not modify, limit or supersede the privacy or security provisions of federal laws, including the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act of 1996.

As we reported on February 23, 2012, the White House released a report outlining a framework for U.S. data protection and privacy policy that included a Consumer Privacy Bill of Rights.

Read the Consumer Privacy Bill of Rights Act of 2015.

Two Wyoming Bills Amending the State’s Breach Notification Statute Are Headed to the Governor

On February 23, 2015, the Wyoming Senate approved a bill (S.F.36) that adds several data elements to the definition of “personal identifying information” in the state’s data breach notification statute. The amended definition will expand Wyoming’s breach notification law to cover certain online account access credentials, unique biometric data, health insurance information, medical information, birth and marriage certificates, certain shared secrets or security tokens used for authentication purposes, and individual taxpayer identification numbers. The Wyoming Senate also agreed with amendments proposed by the Wyoming House of Representatives to another bill (S.F.35) that adds content requirements to the notice that breached entities must send to affected Wyoming residents. Both bills are now headed to Wyoming Governor Matt Mead for signing.

Bill S.F.36 would broaden the definition of “personal identifying information” to include an individual’s first name or first initial and last name in combination with any one or more of the data elements below:

  • Social Security number;
  • driver’s license number;
  • account number, credit card number or debit card number in combination with any security code, access code or password that would allow access to a financial account of the individual;
  • tribal identification card;
  • federal or state government issued identification card;
  • shared secrets or security tokens that are known to be used for data-based authentication;
  • username or email address, in combination with a password or security question and answer that would permit access to an online account;
  • birth or marriage certificate;
  • medical information;
  • health insurance information;
  • unique biometric data; or
  • individual taxpayer identification number.

Bill S.F.35 would impose content requirements on the notice that breached entities must send affected Wyoming residents. Specifically, if enacted, the bill would require the notice to affected Wyoming residents to include (1) the types of personal identifying information subject to the breach, (2) a general description of the breach, (3) the approximate date of the breach, (4) the remedial actions taken by the entity, (5) advice directing the Wyoming resident to remain vigilant, and (6) whether notification was delayed pursuant to a request from law enforcement.


Update: On March 2, 2015, Wyoming Governor Matt Mead signed both bills into law. The bills will become effective on July 1, 2015.

Office of the Privacy Commissioner of Canada Releases Research Report on Privacy and Cybersecurity

On February 12, 2015, the Office of the Privacy Commissioner of Canada released a research report entitled Privacy and Cyber Security – Emphasizing privacy protection in cyber security activities (the “Report”). The Report explores the interconnected relationship among cybersecurity, privacy and data protection, including common interests and challenges.

The Report illustrates some of the current and growing challenges for data protection and cybersecurity including:

  • the growing complexity of managing and providing security for cyberspace;
  • the growing sophistication and “professionalization” of cybercrimes and hackings;
  • the future focus of cyber criminals on the mobile sphere;
  • the risks of “big data” and “big data” analytics to individual privacy;
  • the failures of companies and organizations to prioritize breach preparedness; and
  • the shortcomings of a “check the box” approach to compliance with data protection laws, and the need for effective risk management and dynamic implementation of security.

The second half of the Report addresses national cybersecurity policy and foreign policy developments. The Report cautions that as cybersecurity policies progress at the national level, security and public safety concerns may overshadow individual privacy protection. Ronald Deibert, Director of the Canada Centre for Global Security Studies and the Citizen Lab at the Munk School of Global Affairs, University of Toronto, describes this scenario as the “securitization” of cyberspace, where cyberspace becomes solely a matter of national security. To prevent this securitization, Deibert proposes a “stewardship” approach, stating that cyberspace does not belong to a particular person or group and everyone, including governments, law enforcement agencies and the private sector, has a role to play in shaping the foundation and evolution of cyberspace.

The Report states that cyberspace governance and security is a global issue, and thus requires a global collaborative response through international standards and cooperation. As cybersecurity policies continue to develop across the world, privacy and data protection authorities should ensure that they adequately protect individual privacy rights.

The Report concludes with recommendations in three key areas where an increased emphasis on privacy protection could help support and advance cybersecurity activities. The first recommendation is to build privacy values into cybersecurity policy directions. The second recommendation is to use legislation to incentivize cybersecurity preparedness and ensure accountability for personal information protection. The third recommendation is to increase cybersecurity dialogue among all stakeholders to protect individual privacy and promote responsible data stewardship.

President Obama Signs Executive Order on Cybersecurity Information Sharing

On February 13, 2015, at the White House’s Cybersecurity and Consumer Protection Summit at Stanford University, President Obama signed an executive order promoting private sector cybersecurity information sharing (“Executive Order”). Building on the current cybersecurity information sharing efforts of Information Sharing and Analysis Centers and groups such as the National Cyber-Forensics and Training Alliance, the new Executive Order emphasizes the need for private companies, non-profit organizations and government agencies to share information about cyber threats, vulnerabilities and incidents. Its purpose is to facilitate private-private and public-private cybersecurity information sharing while (1) protecting the privacy and civil liberties of individuals; (2) protecting business confidentiality; (3) safeguarding shared information; and (4) protecting the government’s ability to detect, investigate, prevent and respond to cyber threats.

The Executive Order directs the Department of Homeland Security (“DHS”), in consultation with other federal agencies, to “strongly encourage” the development and formation of voluntary Information Sharing and Analysis Organizations (“ISAOs”). An ISAO may be organized based on sector, sub-sector, region or other affinity, including in response to a particular threat or vulnerability. It may be a for-profit or non-profit entity, and may take on a variety of forms, including a community group, membership organization, or even an individual company that shares information among its customers or partners. DHS will fund a non-governmental organization to serve as a standards organization that identifies a common set of voluntary standards for the creation and functioning of ISAOs. The mission of the standards organization will be to make collaboration safer, faster and easier, and to ensure greater coordination within the private sector to respond to cyber threats.

The Executive Order streamlines the process through which DHS enters into information sharing arrangements with ISAOs. Specifically, it directs the National Cybersecurity and Communications Integration Center at DHS (“NCCIC”) to engage in continuous, collaborative and inclusive coordination with ISAOs with respect to sharing cybersecurity information, addressing cyber risks and incidents, and strengthening information security systems.

The Executive Order addresses privacy concerns by ensuring that ISAOs agree to abide by a common set of privacy standards, and that agencies collaborating with ISAOs coordinate their activities with senior privacy officials and ensure the incorporation of appropriate privacy protections.

In addition, the Executive Order makes it easier for ISAOs and individual companies to access classified cybersecurity information by amending Executive Order 12829 on the National Industrial Security Program. As amended, Executive Order 12829 now gives DHS the authority to approve classified information sharing arrangements and ensure that information sharing entities can appropriately access classified cybersecurity information.

Brazil Releases Draft Personal Data Protection Bill

On January 28, 2015, the Brazilian government issued the Preliminary Draft Bill for the Protection of Personal Data (Anteprojeto de Lei para a Proteção de Dados Pessoais) on a website specifically created for public debate on the draft bill. The text of the bill (in Portuguese) is available on that website (http://participacao.mj.gov.br/).

The draft bill applies to individuals and companies that process personal data via automated means, provided that the (1) processing occurs in Brazil or (2) personal data was collected in Brazil. The draft bill would impose data protection obligations and requirements on businesses processing personal data in Brazil, including a:

  • Requirement to obtain free, express, specific and informed consent to process personal data, with limited exceptions. For example, consent is not required if the personal data is processed to (1) comply with a legal obligation or (2) implement pre-contractual procedures or obligations related to an agreement in which the data subject is a party.
  • Prohibition on processing sensitive personal data, except in limited circumstances. For example, sensitive personal data may be processed with the specific consent of the data subject after the data subject has been informed of the risks associated with processing the sensitive personal data. Sensitive personal data includes, among other information, racial and ethnic origins, religious, philosophical or moral beliefs, political opinions, health and sexual orientation information, and genetic data.
  • Obligation to immediately report data breaches to the competent authority.
  • Requirement to allow data subjects access to their personal data and correct it if it is incomplete, inaccurate or out-of-date, with limited exceptions.
  • Restriction from transferring personal data to countries that do not provide similar levels of data protection.
  • Obligation to adopt information security measures that are proportional to the personal data processed and protect the information from unauthorized access, destruction, loss, alteration, communication or dissemination.

The draft bill contains penalties for violations, including fines and the suspension or prohibition of processing personal data for up to 10 years. Participation in the discussion is open to the public and comments on the draft bill may be submitted on the website. All comments must be in Portuguese and targeted to specific articles of the draft bill. The deadline for comment submission is February 27, 2015.

European Data Protection Supervisor Speaks on Data Protection Day

On January 28, 2015, in connection with Data Protection Day, newly appointed European Data Protection Supervisor (“EDPS”) Giovanni Buttarelli spoke about future challenges for data protection. Buttarelli encouraged the EU “to lead by example as a beacon of respect for digital rights,” and “to be at the forefront in shaping a global, digital standard for privacy and data protection which centers on the rights of the individual.” Buttarelli stressed that in the context of global technological changes, “the EU has to make existing data protection rights more effective in practice, and to allow citizens to more easily exercise their rights.”

Buttarelli also gave his opinion on the “security versus right-to-privacy” debate, which has been the subject of intense discussions in the EU after the deadly attacks on Charlie Hebdo in France. While EU policymakers are debating new measures to tackle terrorism that implicate privacy, Buttarelli encouraged “legislators not to act on the basis of emotions and to consider the long term effects.” He stated that privacy is a fundamental right and that measures that undermine privacy in the name of security are lawful only after a showing that the measures are necessary and proportional to the privacy and security interests at issue.

Buttarelli also stated that he intends to work closely with EU institutions and assist their progress on multiple pending initiatives, including the EU Data Protection Reform package. According to Buttarelli, the ongoing legal uncertainty and fragmented data protection framework is not sustainable and is impacting citizens and businesses. The EDPS aims to assist legislators in enacting the EU Data Protection Reform package in order to reinforce EU privacy and data protection standards and promote a culture of data protection.

Read the interview with Buttarelli.

View the EDPS press release.

FTC Releases Report on Internet of Things

On January 27, 2015, the Federal Trade Commission announced the release of a report on the Internet of Things: Privacy and Security in a Connected World (the “Report”). The Report describes the current state of the Internet of Things, analyzes the benefits and risks of its development, applies privacy principles to the Internet of Things and discusses whether legislation is needed to address this burgeoning area. The Report follows a workshop held by the FTC on this topic in November 2013.

The first part of the Report acknowledges the explosive growth of the Internet of Things, noting that there will be an estimated 25 billion Internet-connected devices by the end of 2015 and 50 billion such devices by 2020. These devices range from cameras to home automation systems to bracelets.

Next, the Report discusses the benefits and risks of the Internet of Things. The highlighted benefits include developments such as:

  • insulin pumps and blood pressure cuffs that can track an individual’s vital signs and submit the data to health care providers;
  • smart meters that help homeowners conserve energy; and
  • connected cars that can diagnose problems with the vehicle.

The risks that accompany such connected devices include:

  • an unauthorized person accessing and misusing personal information of the user of the connected device;
  • a hacker infiltrating the network to which the device is connected and wreaking havoc; and
  • safety risks to the individual user, such as a risk of a third party accessing a vehicle while it is being driven and altering the braking system.

Applying privacy principles to the Internet of Things, the Report offers the following recommendations in three critical areas:

  • data security – companies should incorporate “security by design” similar to the concept of “privacy by design” and take additional steps such as encrypting sensitive health information;
    • the concept of “security by design” was emphasized in the FTC’s settlement with TRENDnet, an Internet camera company;
  • data minimization – companies can accomplish this by “mindfully considering data collection and retention policies and engaging in a data minimization exercise;”
  • notice and choice – companies should only be required to notify consumers and offer them a choice for uses of their information that are inconsistent with consumer expectations;
    • companies can obviate notice and choice issues by de-identifying data because there is no need to offer consumers choices regarding data that cannot be traced to them.
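
The Report does not prescribe any particular de-identification technique. As a purely illustrative sketch (the field names and the salted-hash approach below are our own assumptions, not drawn from the Report), de-identification might involve dropping direct identifiers outright and replacing fields that must remain linkable with salted hashes:

```python
import hashlib
import secrets

# Per-dataset salt; in practice this would be stored separately from the data.
SALT = secrets.token_bytes(16)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a salted hash, so records can still be
    linked within the dataset without exposing the identifier itself."""
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

def de_identify(record: dict, direct_identifiers: set, linkable_fields: set) -> dict:
    """Drop direct identifiers entirely and pseudonymize linkable fields."""
    out = {}
    for key, value in record.items():
        if key in direct_identifiers:
            continue  # remove the field entirely
        if key in linkable_fields:
            out[key] = pseudonymize(str(value))
        else:
            out[key] = value
    return out

# Hypothetical example record for a connected health device.
record = {"name": "Jane Doe", "device_id": "ABC-123", "heart_rate": 72}
clean = de_identify(record, direct_identifiers={"name"}, linkable_fields={"device_id"})
```

Note that simple hashing does not eliminate re-identification risk from remaining quasi-identifiers, which is one reason the FTC pairs de-identification with commitments not to attempt re-identification.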

With respect to legislation, the FTC “does not believe that the privacy and security risks, though real, need to be addressed” by legislation or regulation at this time. Though it does not advocate legislation, the FTC intends to engage more vigorously in the Internet of Things arena by (1) using its enforcement authority, (2) developing consumer and business education materials, (3) convening multistakeholder groups to discuss important issues, and (4) advocating its recommendations with relevant federal and state government entities.

In announcing the report, FTC Chairwoman Edith Ramirez stated that “by adopting the best practices [the FTC] laid out, businesses will be better able to provide consumers the protections they want and allow the benefits of the Internet of Things to be fully realized.”

Read the FTC’s report.

ENISA Issues Report on Implementation of Privacy and Data Protection by Design

On January 12, 2015, the European Union Agency for Network and Information Security (“ENISA”) published a report on Privacy and Data Protection by Design – from policy to engineering (the “Report”). The “privacy by design” principle emphasizes the development of privacy protections at the early stages of the product or service development process, rather than at later stages. Although the principle has found its way into some proposed legislation (e.g., the proposed EU General Data Protection Regulation), its concrete implementation remains unclear. Hence, the Report aims to promote a discussion on how the principle can be implemented concretely and effectively with the help of engineering methods.

The Report provides an overview of the ways in which businesses have implemented the “privacy by design” principle into their products and services. To this end, the Report reviews existing approaches and strategies to implement privacy by design, and gives a structured overview of twelve important privacy techniques (such as authentication, attribute-based credentials, encrypted communications, and anonymity and pseudonymity). Further, the Report presents the challenges and limitations of “by-design” principles for privacy and data protection.

The Report concludes with a number of recommendations that address system developers, service providers, data protection authorities (“DPAs”) and policy makers on how to overcome and mitigate these limits. The main recommendations include:

  • Policymakers should support the development of new incentive mechanisms for privacy-friendly services and need to promote them (e.g., the establishment of audit schemes and seals to enable the customer to make informed choices and the establishment of penalties for those who disregard or obstruct privacy-friendly solutions);
  • The research community should further investigate privacy engineering, especially with a multidisciplinary approach;
  • Software developers and the research community should offer tools that enable the intuitive implementation of privacy properties. These tools should integrate freely available and maintained components with open interfaces and application programming interfaces;
  • DPAs should play an important role in providing independent guidance and assessing modules and tools for privacy engineering, such as in the promotion of privacy-enhancing technologies and the implementation of the transparency principle;
  • Legislators should promote privacy and data protection in their norms, building on the European legal framework for data protection; and
  • Standardization bodies should include privacy considerations in the standardization process as part of international standards, and should develop standards for the interoperability of privacy features in order to help users compare the privacy guarantees of different products and services and make compliance checks easier for DPAs.

View the full report.

Proposed Indiana Law Would Raise Bar for Security and Privacy Requirements

Indiana Attorney General Greg Zoeller has prepared a new bill that, although styled a “security breach” bill, would impose substantial new privacy obligations on companies holding the personal data of Indiana residents. Introduced by Indiana Senator James Merritt (R-Indianapolis) on January 12, 2015, SB413 would make a number of changes to existing Indiana law. For example, it would amend the existing Indiana breach notification law to apply to all data users, rather than owners of databases. The bill also would expand Indiana’s breach notification law to eliminate the requirement that the breached data be computerized for notices to be required.

Most significantly, SB413 would require data users to implement and maintain “reasonable procedures” that prohibit them from “retaining personal information beyond what is necessary for business purposes or compliance with applicable law” and “using personal information for purposes beyond those authorized by law or by the individual to whom the personal information relates.” These requirements are a substantial change from most existing U.S. privacy laws, and designing and implementing the necessary procedures could be a challenge for many companies.

Failure to comply with the bill’s requirements would constitute a deceptive act under state consumer protection law. While only the attorney general may bring an enforcement action, if a court determines that the violation was “done knowingly,” penalties include a fine of $50 for each affected Indiana resident, with a minimum fine of $5,000 and a maximum fine of $150,000 per deceptive act.

The cap likely will be challenged as being too low during hearings on the bill. In any event, the fines imposed under this new section are cumulative with those available under any other state or federal law, rule or regulation.

SB413 also would require data users to have online privacy policies, and it specifies that those policies must include information as to:

  • whether personal information is collected through the data user’s Internet website;
  • the categories of personal information collected through the data user’s Internet website, if applicable;
  • whether the data user sells, shares or transfers personal information to third parties; and
  • if applicable, whether the data user obtains the express consent of an individual to whom the personal information relates before selling, sharing or transferring the individual’s personal information to a third party.

The bill would explicitly prohibit data users from making a “misrepresentation to an Indiana resident concerning the data user’s collection, storage, use, sharing, or destruction of personal information,” or from requiring a vendor or contractor to do so.

While the bill may well be amended as it moves through the legislative process before the Indiana Senate adjourns on April 29, 2015, it is widely expected to pass. Assuming it does, it will reflect a further significant evolution in state laws regulating information privacy and security, and will add Indiana to the growing list of states moving ahead of federal law in these areas.

China’s State Administration for Industry and Commerce Publishes Measures Defining Consumer Personal Information

On January 5, 2015, the State Administration for Industry and Commerce of the People’s Republic of China published its Measures for the Punishment of Conduct Infringing the Rights and Interests of Consumers (the “Measures”). The Measures contain a number of provisions defining circumstances or actions under which enterprise operators may be deemed to have infringed the rights or interests of consumers. These provisions are consistent with the basic rules in the currently effective P.R.C. Law on the Protection of Consumer Rights and Interests (“Consumer Protection Law”). The Measures will take effect on March 15, 2015.

Article 11 of the Measures provides a list of actions that enterprise operators may not undertake because they infringe upon the personal information of consumers. In October 2013, we reported on the amendment to the Consumer Protection Law which extended the protections of personal information to consumer personal information. The list provided in Article 11 of the Measures is similar in concept to the amendment to the Consumer Protection Law.

Although the list itself does not contain any surprises, Article 11 is nevertheless potentially an important development because it provides a definition of “consumer personal information.” (The amendment to the Consumer Protection Law omitted a definition of this term.) According to Article 11, “consumer personal information” refers to “information collected by an enterprise operator during the sale of products or provision of services, that can, singly or in combination with other information, identify a consumer.” Article 11 also provides a list of specific examples of “consumer personal information,” including a consumer’s “name, gender, occupation, birth date, identification card number, residential address, contact information, income and financial status, health status, and consumer status.”

While this definition applies only in relation to consumer personal information, it is an instructive milestone in the continuing emergence of China’s sector-by-sector patchwork of rules and regulations governing the collection and use of personal information.

Deadline for Compliance with Russian Localization Law Set for September 1, 2015

On December 31, 2014, Russian President Vladimir Putin signed legislation moving the compliance deadline for Federal Law No. 242-FZ (the “Localization Law”), which requires companies to store the personal data of Russian citizens in databases located in Russia, to September 1, 2015. The bill that became the Localization Law was adopted by the lower chamber of the Russian Parliament in July 2014 with a compliance deadline of September 1, 2016. The compliance deadline was then moved to January 1, 2015, before being changed to September 1, 2015 in the legislation signed by President Putin.

The Russian law firm ALRUD reports that the Localization Law creates a new obligation to store personal data of Russian citizens in Russia, meaning that companies located outside Russia “will be forced to place their servers within Russia if they plan to continue making business in the market.” The exact purview of the Localization Law is somewhat ambiguous, but the law requires data operators to ensure that the recording, systemization, accumulation, storage, revision (updating and amending), and extraction of personal data of Russian citizens occur in databases located in Russia. As an example of the ambiguity regarding the scope of the Localization Law, it is unclear whether the law applies to companies that collect personal data from Russian customers but have no physical presence in Russia. In addition, it is unclear whether the law will affect the cross-border transfers of personal data from Russia to foreign jurisdictions.

In a Surprising Move, Congress Passes Four Cybersecurity Bills

In a flurry of activity on cybersecurity in the waning days of the 113th Congress, Congress unexpectedly approved, largely without debate and by voice vote, four cybersecurity bills that: (1) clarify the role of the Department of Homeland Security (“DHS”) in private-sector information sharing, (2) codify the National Institute of Standards and Technology’s (“NIST”) cybersecurity framework, (3) reform oversight of federal information systems, and (4) enhance the cybersecurity workforce. The President is expected to sign all four bills. The approved legislation is somewhat limited as it largely codifies agency activity already underway. With many observers expecting little legislative activity on cybersecurity before the end of the year, however, that Congress has passed and sent major cybersecurity legislation to the White House for the first time in 12 years may signal Congress’ intent to address systems protection issues more thoroughly in the next Congress.

On December 11, the House passed Senate legislation codifying DHS’s National Cybersecurity and Communications Integration Center (“NCCIC”), making it the central hub for public-private information sharing. That bill, the National Cybersecurity and Critical Infrastructure Protection Act of 2014 (“NCCIPA”), is the Senate version of similar legislation passed by the House this past summer. The NCCIPA now heading to the President is a pared-down version of the original House bill, leaving out a number of industry-desired provisions that would have eased cybersecurity information sharing with the NCCIC. Notably, industry has been calling for legal protections for companies engaged in sharing information with the government. Nevertheless, the version of the bill headed to the President lacks an extensive legal safe harbor for information sharing. This version of NCCIPA also lacks language from the original House bill that explicitly gave SAFETY Act protections to cybersecurity products. Thus, while passage of NCCIPA is an important and largely unexpected step forward on cybersecurity policy, liability concerns will continue to hamper cybersecurity information sharing.

Later in the evening on December 11, the House and Senate passed the Cybersecurity Enhancement Act of 2014, which authorizes NIST to facilitate and support the development of voluntary, industry-led cyber standards and best practices for critical infrastructure. The bill essentially codifies the ongoing process begun earlier this year through which the NIST Cybersecurity Framework was developed. That process remains voluntary under the bill, with no new regulatory authority added to the Framework. The bill also authorizes the federal government to support research, raise public awareness of cyber risks, and improve the nation’s cybersecurity workforce.

Earlier in the week, on December 8, the Senate passed by voice vote and without debate the Federal Information Security Modernization Act of 2014, which overhauls the 12-year-old Federal Information Security Management Act (“FISMA”). This legislation replaces FISMA’s current requirement that agencies file annual checklists showing the steps they have taken to secure their IT systems, and puts DHS in charge of “compiling and analyzing data on agency information security” and helping agencies install tools “to continuously diagnose and mitigate against cyber threats and vulnerabilities, with or without reimbursement.” DHS has been increasingly performing this role already, and similar legislation passed the House of Representatives in April 2013. That bill, however, was subject to jurisdictional disagreements between the House Homeland Security and Oversight and Government Reform Committees. Surprisingly, Oversight and Government Reform Chairman Rep. Darrell Issa (R-CA) dropped objections to the Senate’s FISMA reform bill and the House passed it on Wednesday evening by voice vote. The House also passed the Senate’s Homeland Security Cybersecurity Workforce Assessment Act as a rider to the Border Patrol Agent Pay Reform Act.

This spate of cybersecurity legislation is more limited in scope than the measures that have been sought by the private sector. Indeed, rather than provide new cybersecurity tools, the bills approved by Congress largely make pre-existing actions official. Still, with the 113th Congress effectively ending this week, passage of any cybersecurity bills is very surprising. This week’s legislative activity on cybersecurity indicates that policymakers are serious about confronting issues vital to information systems protection. In its waning days, the Senate may be attempting to set its mark on future cybersecurity policy. For its part, the House’s sudden action on Senate cybersecurity bills may point to a willingness by House committees to overcome internal jurisdictional disagreements that have hampered similar legislation in the past. The significance here is the recognition by Congress that legislative success now builds momentum for systems-protection policies in the next Congress, such as information-sharing liability protection or data breach legislation. How the 114th Congress confronts those issues is important to businesses seeking to enter public-private partnerships and information-sharing agreements.

Supreme Court of Canada Extends Deadline for Amending Alberta PIPA

On October 30, 2014, the Supreme Court of Canada extended the deadline for the province of Alberta to amend its Personal Information Protection Act (“PIPA”). In November 2013, the Supreme Court of Canada declared PIPA invalid because it interfered with the right to freedom of expression in the labor context under Section 2(b) of the Canadian Charter of Rights and Freedoms. The Supreme Court of Canada gave the Alberta legislature 12 months to determine how to make the legislation constitutionally compliant, which it apparently failed to do. The new deadline for amending PIPA is May 2015.

Alberta’s Information and Privacy Commissioner Jill Clayton applauded the extension of the deadline. Commissioner Clayton had sent a letter to Alberta’s Premier, Minister of Justice and Solicitor General, and Minister of Service in September 2014 expressing her concern that the Alberta legislature would “not be able to act to preserve PIPA before it lapses.” She also highlighted the “unique benefits” of PIPA in the letter, including breach notification to affected individuals, local enforcement without the involvement of courts, and the protection of employee privacy rights.

Brazilian Congressman Introduces Right to Be Forgotten Bill

Eduardo Cunha, a congressman from the Brazilian Democratic Movement Party in Rio de Janeiro, recently introduced a new bill in Brazil that provides Brazilians with a right to be forgotten (PL 7881/2014). Rep. Cunha is one of the most influential congressmen in Brazil and has been reported likely to be the next Speaker of the Brazilian House of Representatives (also translated as the “Chamber of Deputies”).

The bill has one substantive Article:

Art 1 – It is required, by request of any citizen or person involved, to remove links from Internet search engines that make reference to irrelevant or outdated data.

To help explain the right, the bill includes quotes from a newspaper article. Carolina Lessa from Reed Elsevier Inc. provided a translation from the Portuguese:

“Approved in May in Europe, the so-called “Right to be Forgotten” allows citizens from the continent to request the removal of links from internet search engines that make reference to information that is irrelevant or outdated. According to the site “The Observer,” Wikipedia had its first entry removed due to the new legislation.

This information was given by the founder of the digital encyclopedia Jimmy Wales, who is against the legislation. According to Wales, the page, which he did not identify, will continue online, but won’t show up in Google searches.

This legislation is controversial and has caused outrage from the European press, which after the decision by the European Court of Justice, started receiving notifications from Google about links that were removed from their search engines by request of people involved in the news.

According to the Internet search giant, the company received approximately 90,000 requests to remove links from its European results between May and last month. Due to the large number of requests, Google was only able to remove 50% of the requested pages.

The European countries that have the most requested removals include: (1) France with 17,500 requests for 58,000 links; (2) Germany with 16,500 requests for 57,000 links; (3) the UK with 12,000 requests and 44,000 links; (4) Spain with 8,000 requests and 27,000 links; (5) Italy with 7,500 requests and 28,000 links; and (6) the Netherlands with 5,500 requests and 21,000 links.

Recently, the page “Hidden from Google” announced that they have started to list the removed links from the search engine, and said they have received tips from several collaborators.

I consider this proposal an important social demand and that’s why I request your support for its approval.”

Because the bill does not have to be approved by the plenary, it is expected to move much faster than other bills. Instead, it needs only the approval of the House Committee on Science and Technology, Communication and Informatics and the House Committee on Constitution, Justice and Citizenship, after which it would move to the Senate.

Obama’s New Executive Order Focuses on Securing Consumer Payments

On October 17, 2014, the White House announced that the President signed a new executive order, entitled Improving the Security of Consumer Financial Transactions (the “Order”), which focuses on securing consumer transactions and sensitive personal data handled by the U.S. Federal Government.

Highlights of the Order include:

  • Securing Federal Payments. To help protect citizens doing business with the U.S. Federal Government, the Order sets forth a new policy that calls for Federal agencies to employ enhanced security features to protect payment processing and payment cards, including the deployment of upgraded payment card terminals at Federal agency facilities that accept chip and PIN-enabled cards.
  • Improving Identity Theft Remediation. The Order requires several Federal agencies to support the Federal Trade Commission in its development of IdentityTheft.gov, a new one-stop resource for victims that streamlines the identity fraud reporting and remediation process with consumer reporting agencies.
  • Safeguarding Federal Online Transactions. The Order calls for the development of a plan designed to protect personal data that Federal agencies make accessible to citizens through digital applications.  The plan will require the digital applications to use multi-factor authentication and an effective identity proofing process.
  • Information Sharing. The Order directs expanded information sharing to strengthen the ability of Federal investigators to regularly report evidence of stolen financial and other information to companies whose customers are directly affected.

In addition to signing the Order, the President announced that a cybersecurity and consumer protection summit will be held later this year. The summit will bring together key stakeholders in the consumer financial space to share best practices, promote adherence to stronger security standards and discuss next generation technologies. Several private sector initiatives focused on payment card security and identity theft prevention also were announced, including the rollout of chip and PIN-compatible card terminals at all stores of several large retailers by early 2015. The President also renewed his call for data breach and cybersecurity legislation. Last year, on the evening of the State of the Union Address, President Obama issued the Executive Order on Improving Critical Infrastructure Cybersecurity and a Presidential Policy Directive on Critical Infrastructure Security and Resilience, both of which focused on improving cybersecurity for critical infrastructure sectors.

UK ICO Launches Consultation on Criteria for Privacy Seal Schemes

On September 2, 2014, the UK Information Commissioner’s Office (“ICO”) published a consultation on the framework criteria for selecting scheme providers for its privacy seal scheme. The consultation gives organizations the opportunity to provide recommendations for the framework criteria that will be used to assess the relevant schemes. The consultation is open until October 3, 2014.

Under the draft framework criteria, the ICO’s proposals include the following:

  • The ICO endorses at least one scheme for a minimum of 3 years;
  • The ICO has the authority to revoke an endorsement of a scheme;
  • The scheme operator takes responsibility for the day-to-day operation of the scheme and retains ownership of the scheme (including the liabilities and indemnities that may be associated with the operation of the scheme); and
  • The scheme operator is the contact point for queries and complaints related to the scheme. Nevertheless, individuals may send complaints directly to the ICO if their concern relates to a breach of the Data Protection Act or the Privacy and Electronic Communications Regulations.

A scheme must first obtain accreditation from the United Kingdom Accreditation Service (“UKAS”), the national accreditation body for the UK, before it may gain the ICO’s endorsement. The ICO will participate in the UKAS accreditation process by offering technical expertise and advice to UKAS.

As detailed in its consultation document, the ICO is interested in receiving feedback on the roles and responsibilities of the ICO; the underlying principles; the scope, objectives and sustainability of the scheme; the certification process; and the quality criteria for organizations (i.e., relating to proficiency and knowledge).

The ICO hopes to select a proposal by early 2015 and aims to launch the first round of endorsed schemes in 2016.

Article 29 Working Party Releases Statement on ECJ Ruling Invalidating the EU Data Retention Directive

The Article 29 Working Party (the “Working Party”) recently released its August 1, 2014 statement providing recommendations on the actions that EU Member States should take in light of the European Court of Justice’s April 8, 2014 ruling invalidating the EU Data Retention Directive (the “Ruling”).

In particular, the Working Party’s statement provides recommendations on:

  • Ensuring that the relevant retained data are differentiated and limited to what is strictly necessary for the purpose of fighting “serious crime” (i.e., no automatic bulk retention of all categories of data);
  • Restricting government access to what is strictly necessary in terms of categories of data and data subjects, and also implementing substantive and procedural conditions for such access; and
  • Ensuring effective protection against unlawful access and abuse (e.g., by allowing an independent authority to assess compliance with EU data protection laws).

The Working Party’s statement also emphasizes that the Ruling does not directly affect the validity of existing national data retention measures. Accordingly, the Working Party urges the European Commission to provide guidance on how the Ruling should be interpreted at the European and Member State levels.

California Lawmakers Pass Bill to Amend State’s Breach Notification Law

On August 19, 2014, California state legislators made final amendments to a bill updating the state’s breach notification law. The amended bill, which passed the State Senate on August 21 and the Assembly on August 25, is now headed to California Governor Jerry Brown for signature. If signed, the bill would extend the scope of the existing law to entities that “maintain” personal information about California residents. Currently, only entities that “own” or “license” such personal information are required to implement and maintain reasonable security procedures and practices to protect the personal information from unauthorized access, destruction, modification or disclosure.

In addition, the bill would require notifying entities that are the source of a security breach to include in their notification an offer to provide “appropriate identity theft prevention and mitigation services” to affected individuals for not less than 12 months at no cost to the individual. The bill also would strengthen current restrictions on the use or disclosure of Social Security numbers by prohibiting selling, offering to sell, or advertising the sale of, Social Security numbers.

Update: On September 30, 2014, Governor Jerry Brown signed the amended bill AB 1710 into law.

Illinois Becomes the Latest State to “Ban the Box”

As reported in the Hunton Employment & Labor Perspectives Blog:

Illinois recently joined a growing number of states and municipalities that have passed “ban the box” laws regulating when employers can inquire into an applicant’s criminal history.

The Job Opportunities for Qualified Applicants Act was signed into law by Governor Pat Quinn on July 19, 2014. The law provides that private employers with 15 or more employees may not inquire into, consider, or require disclosure of an applicant’s criminal history until (1) the individual has been offered an interview or (2) if there is no interview, the individual has been given a conditional offer of employment.

The law does have several exceptions, most notably, for employers who, pursuant to state or federal law, are required to exclude applicants with certain criminal convictions.

The Illinois Department of Labor is responsible for investigating violations of the Act, which could result in civil fines of up to $1,500. The law goes into effect on January 1, 2015.

Illinois is following the lead of several other states – such as Hawaii, Massachusetts, Minnesota, and Rhode Island – that have passed “ban the box” legislation applying to private employers. Several municipalities have passed similar legislation, including the District of Columbia (which is still awaiting approval from the Mayor and Congress) and Newark, New Jersey.

Unless exempted from coverage, Illinois employers should remove from their application materials any inquiries into an applicant’s criminal history and refrain from making any such inquiries, whether directly with the applicant or through a criminal background check, until the applicant has been offered either an interview or the position.

Delaware Enacts New Data Destruction Law

On July 1, 2014, Delaware Governor Jack Markell signed into law a bill that creates new safe destruction requirements for the disposal of business records containing consumer personal information. The new law requires commercial entities conducting business in Delaware to take reasonable steps to destroy their consumers’ “personal identifying information” prior to the disposal of electronic or paper records. The law will take effect on January 1, 2015.

Under the new law, destruction requirements apply to a consumer’s “personal identifying information.” The term “consumer” is defined as an individual entering into a transaction “primarily for personal, family, or household purposes” and “personal identifying information” (“PII”) consists of the consumer’s first name or first initial and last name in combination with any of the following data elements:

  • a signature;
  • full date of birth;
  • Social Security number or passport number;
  • driver’s license or state identification card number;
  • insurance policy number;
  • financial services account number, bank account number, credit card number, or “any other financial information;” or
  • confidential health care information.

Notably, a consumer’s information qualifies as “personal identifying information” if either his or her name or the accompanying data element is unencrypted at the time of disposal.
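For teams building disposal-compliance tooling, the disposal-time test described above can be expressed as a short check. The sketch below is illustrative only, not legal advice; the record layout and field names (`name`, `ssn`, and so on) are assumptions of mine, not terms from the statute.

```python
# Illustrative sketch (not legal advice): a check modeled on Delaware's
# definition of "personal identifying information." Field names and the
# record shape ({"value": ..., "encrypted": bool}) are hypothetical.

DATA_ELEMENTS = {
    "signature",
    "date_of_birth",
    "ssn",
    "passport_number",
    "drivers_license",
    "state_id",
    "insurance_policy_number",
    "financial_account_number",
    "health_information",
}

def is_delaware_pii(record: dict) -> bool:
    """Return True if a record pairs a consumer's name with a listed
    data element, and either half is unencrypted at disposal time."""
    name = record.get("name")
    if name is None or not name["value"]:
        return False
    for field, entry in record.items():
        if field in DATA_ELEMENTS and entry["value"]:
            # Qualifies as PII if the name OR the element is unencrypted.
            if not name["encrypted"] or not entry["encrypted"]:
                return True
    return False
```

A record whose name and data elements are all encrypted at disposal would fall outside the definition under this reading.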

Under the new law, when records are “no longer to be retained,” commercial entities must “take all reasonable steps to destroy or arrange for the destruction of a consumer’s” PII within those records. The statute explicitly calls for “shredding, erasing, or otherwise destroying or modifying” the consumer PII in a manner that makes it “entirely unreadable or indecipherable.”

The new law comes equipped with a number of enforcement mechanisms, including a private right of action for consumers who incur actual damages as a result of a violation. Significantly, the statute enables aggrieved consumers to seek treble damages, which could quickly add up given that “each record unreasonably disposed of constitutes an individual violation” of the statute. Under certain circumstances, the Delaware Attorney General and Division of Consumer Protection of the Department of Justice also may bring enforcement actions for violations of the statute.

The statute does carve out several exemptions for regulated entities, including financial institutions subject to the privacy and security requirements of the Gramm-Leach-Bliley Act, consumer reporting agencies subject to the Fair Credit Reporting Act (“FCRA”), and certain covered entities subject to HIPAA’s privacy and security requirements.

New Cybersecurity Center to Be Established in Belgium

On July 17, 2014, the Belgian government announced that it has finalized its Royal Decree on the establishment of a Cybersecurity Center (Centrum Cyber Security België or Centre Cyber Security Belgique). The Cybersecurity Center’s tasks would be to monitor the country’s cybersecurity and manage cyber incidents. It also would oversee various cybersecurity projects, formulate legislative proposals relating to cybersecurity, and issue standards and guidelines for securing public sector IT systems. The Cybersecurity Center is expected to be operational by the end of 2014.

UK Government Announces Emergency Data Retention Law

On July 10, 2014, the UK government announced plans to introduce emergency data retention rules, publishing the Data Retention and Investigatory Powers Bill (the “Bill”) along with explanatory notes and draft regulations. The publication of the Bill follows the European Court of Justice’s April 2014 declaration that the EU Data Retention Directive (the “Directive”) is invalid. Under the Directive, EU Member States were able to require communications service providers (e.g., ISPs) to retain communications data relating to their subscribers for up to 12 months.

The Directive had been implemented in the UK by the Data Retention (EC Directive) Regulations 2009, but the legality of that legislation is likely to be challenged following the ECJ’s declaration that the Directive was void from its inception. In anticipation of such a challenge, the UK government announced that the Bill seeks to “ensure that UK law enforcement and intelligence agencies can maintain their ability to access the telecommunications data they need to investigate criminal activity and protect the public,” and to provide a clearer legal framework for organizations to cooperate with law enforcement and intelligence agencies.

Content of the Bill

The Bill authorizes the Secretary of State to provide notice to a public telecommunications operator requiring it to retain communications data where the Secretary of State considers it necessary and proportionate for one of several purposes. The purposes are listed in Section 22(2) of the Regulation of Investigatory Powers Act 2000, and include national security, the prevention or detection of crime and the protection of public safety. A notice may apply to a specific operator or describe relevant operators, require the retention of all data or a category of data, specify the periods for which the data must be retained, and “contain other requirements, or restrictions, in relation to the retention of data.”

The Bill allows the Secretary of State to issue regulations governing the retention of communications data. For example, the Secretary of State may publish a code of practice, or alter the maximum period for which data can be retained under a retention notice (up to 12 months). The Bill also contains a sunset clause that will automatically repeal the law on December 31, 2016.

The Bill reportedly has support from the main political parties, but has been criticized by civil liberties groups who disagree with how the legislation is being expedited. Although no firm legislative timetable has been set, the Bill is expected to be fast-tracked through Parliament this week.

Florida Amends Breach Notification Law to Cover Health Data, Tighten Notice Deadline and Require State Regulator Notification

On June 20, 2014, Florida Governor Rick Scott signed a bill into law that repeals and replaces the state’s existing breach notification statute with a similar law entitled the Florida Information Protection Act (Section 501.171 of the Florida Statutes) (the “Act”).

Below is a summary of several key changes the Act makes to the previous breach notification statute:

  • The Act revises the definition of “breach of security” to cover “unauthorized access” of electronic data containing personal information; the previous law defined breach more narrowly to mean “unlawful and unauthorized acquisition” of computerized data that materially compromises the security, confidentiality or integrity of personal information.
  • The Act expands the definition of “personal information” to include “[a]ny information regarding an individual’s medical history, mental or physical condition, or medical treatment or diagnosis by a health care professional; or an individual’s health insurance policy number or subscriber identification number and any unique identifier used by a health insurer to identify the individual.” In addition, the definition of “personal information” now includes a “username or e-mail address, in combination with a password or security question and answer that would permit access to an online account.”
  • The Act requires notice to affected individuals no later than “30 days after determination of the breach or reason to believe a breach occurred.” If good cause is presented in writing to the Department of Legal Affairs (the “Department”) within the 30-day window, the covered entity may receive an additional 15 days to provide notice. The previous law required notification within 45 days.
  • The Act requires notice to the Department for a breach affecting 500 or more Florida residents in accordance with the 30-day timing requirement and 15-day extension period described above. The notification to the Department must include:
    • a synopsis of the events surrounding the breach at the time notice is provided;
    • the number of individuals in Florida who were or potentially have been affected by the breach;
    • any services related to the breach being offered or scheduled to be offered, without charge, by the covered entity to individuals, and instructions as to how to use such services;
    • a copy of the notice to affected individuals or an explanation of the other actions taken pursuant to the notification provision; and
    • the name, address, telephone number and email address of the employee or agent of the covered entity from whom additional information may be obtained about the breach.
  • Covered entities also may be required to provide the following information to the Department upon request:
    • a police report, incident report or computer forensics report;
    • a copy of the policies in place regarding breaches; and
    • steps that have been taken to rectify the breach.
  • The Act provides a harm threshold similar to the one contained in the previous law. Pursuant to the Act, however, a covered entity may rely on the harm threshold only “after an appropriate investigation and consultation with relevant federal, state, or local law enforcement agencies” (emphasis added). In addition, the covered entity must provide the written determination to the Department within 30 days after the determination.
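The timing and threshold rules above lend themselves to a simple deadline calculation. The sketch below is illustrative only, not legal advice; the function and parameter names are mine, not the statute’s.

```python
# Illustrative sketch of the Act's notice timing described above:
# 30 days from determination of the breach, plus 15 more days if the
# Department of Legal Affairs grants a good-cause extension, and
# Department notice at 500 or more affected Florida residents.
# Not legal advice; names are hypothetical.
from datetime import date, timedelta

def florida_notice_deadline(determination: date,
                            extension_granted: bool = False) -> date:
    """Latest date for notifying affected individuals."""
    days = 30 + (15 if extension_granted else 0)
    return determination + timedelta(days=days)

def department_notice_required(affected_florida_residents: int) -> bool:
    """Notice to the Department is triggered at 500 or more residents."""
    return affected_florida_residents >= 500
```

For example, a breach determined on July 1 would ordinarily require individual notice by July 31, or by August 15 with an approved extension.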

The Act took effect on July 1, 2014. View the amended breach law.

Russian Parliament Adopts Internet Privacy Bill Requiring Data Localization

Last week, the Russian Parliament adopted a bill amending portions of Russia’s existing legislation on privacy, information technology and data protection. Among other provisions, the law would create a “data localization” obligation for companies engaged in the transmission or recording of electronic communications over the Internet. Such companies would be required to store copies of the data for a minimum of six months in databases that must be located within the Russian Federation. The new bill also would empower the Russian data protection authority to block public Internet access to any service that does not comply with this requirement.

It appears the amendments are aimed at preventing foreign intelligence services from accessing Russian citizens’ data, as well as facilitating such access by Russia’s own law enforcement agencies. Some commentators have suggested that the new bill also is intended to encourage the development of home-grown online services in Russia.

Earlier this year, the European Union’s highest court struck down a broadly comparable data retention requirement, and Brazilian lawmakers withdrew the data localization provision from a legislative proposal in the face of opposition from Internet companies.

Reports indicate that, subject to the approval of the upper house of Russia’s Parliament and signature by President Vladimir Putin, the law will become effective in the second half of 2016.

Proposed Legislation Would Give EU Citizens Right to Sue in U.S. for Wrongful Disclosure of Personal Information

On June 25, 2014, U.S. Attorney General Eric Holder announced that the Obama Administration is looking to pass legislation that would provide EU citizens with a right to judicial redress in U.S. courts if their personal information that was shared for law enforcement purposes is later intentionally or willfully disclosed. The announcement was made during the EU-U.S. Ministerial Meeting on Justice and Home Affairs in Athens, Greece, which was co-chaired by the Attorney General and aimed to advance EU-U.S. cooperation in efforts to stop transnational crime and terrorism. The announcement also relates to the ongoing negotiations of the new “umbrella” EU-U.S. Data Protection and Privacy Agreement (“DPPA”).

During the Ministerial Meeting, AG Holder stated that “the Obama Administration is committed to seeking legislation that would ensure that, with regard to personal information transferred within the scope of the proposed DPPA […], EU citizens would have the same right to seek judicial redress for intentional or willful disclosures of protected information, and for refusal to grant access or to rectify any errors in that information, as would a U.S. citizen under the Privacy Act.” No draft legislation has been introduced to date.

GAO Testimony Highlights Risks and Inconsistent Privacy Practices of Companies That Obtain Geolocation Data

On June 4, 2014, the U.S. Government Accountability Office (“GAO”) testified before the U.S. Senate Judiciary Subcommittee on Privacy, Technology and the Law on GAO’s findings regarding (1) companies’ use and sharing of consumer location data, (2) privacy risks associated with the collection of location data, and (3) actions taken by certain companies and federal agencies to protect the privacy of location data. GAO’s testimony relates to its 2012 and 2013 reports that examined the collection of location data by certain mobile industry companies and in-car navigation providers.

In its Testimony, GAO noted that the companies surveyed for the 2012 and 2013 reports “did not consistently or clearly disclose to consumers what the companies do with [location data and other information] or the third parties with which they might share the data, leaving consumers unable to effectively judge whether such uses of their location data might violate their privacy.” GAO noted that the surveyed companies “primarily collect and share location data to provide location services and to improve those services.” GAO also noted that “[l]ocation data can also be used to enhance the functionality of other services that do not need to know the consumer’s location to operate.” In addition, GAO found that all the surveyed companies had privacy policies or other practices, including onscreen notifications, “to notify consumers that they collect location data and other personal information. However, some companies have not consistently or clearly disclosed to consumers what they are doing with these data or which third parties they may share them with.” Accordingly, consumers are not always aware that companies share their location data for purposes other than providing services, and they may be unable to judge the risks associated with the sharing of their data.

GAO also noted that “most” of the privacy policies examined did not provide the companies’ retention period for the location data, which could mean that companies retain the data indefinitely. GAO noted that consumers could be at “higher risk of identity theft or threats to personal safety” as a result of such long retention periods and vast amounts of data collected. Finally, GAO highlighted the Federal Trade Commission’s guidance on mobile privacy disclosures and efforts to pass federal data privacy legislation that would provide a minimum level of protection for consumer data, including location data.

Jessica Rich, the Director of the FTC’s Bureau of Consumer Protection, also testified at the Senate’s hearing, discussing the FTC’s ongoing efforts related to the protection of consumers’ location data. In addition, the Director expressed the FTC’s support for the draft Location Privacy Protection Act of 2014 (the “Location Act”), which was introduced in March 2014 by Senator Al Franken (D-MN), and would require companies to obtain individuals’ permission before collecting location data through smartphones, tablets or in-car navigation devices, and before sharing it with others. The Director recommended, however, that the FTC, as the “federal government’s leading privacy enforcement agency,” should be granted rulemaking and enforcement authority with regard to the civil provisions of the Location Act, as such provisions would make enforcement easier than under Section 5 of the FTC Act. The Location Act currently gives the Department of Justice sole enforcement and rulemaking authority, in consultation with the FTC.

FTC Issues Report on Data Broker Industry, Recommends Legislation

On May 27, 2014, the Federal Trade Commission announced the release of a new report entitled Data Brokers: A Call for Transparency and Accountability, detailing the findings of an FTC study of nine data brokers, representing a cross-section of the industry. The Report concludes that the data broker industry needs greater transparency and recommends that Congress consider enacting legislation that would make data brokers’ practices more visible and give consumers more control over the collection and sharing of their personal information.

The Report finds that data brokers collect consumer data from both online and offline sources, storing billions of data elements pertaining to almost every U.S. consumer. In addition, the Report indicates that data brokers share data with each other, and they combine and analyze consumer data to make inferences, including potentially sensitive inferences, about consumers. The Report also notes that, to the extent data brokers currently offer consumers choices about their personal information, consumers may not be aware of those choices.

The FTC recommends that Congress enact legislation to address the lack of visibility into data broker practices, and to provide consumers with increased access and control. In recent years, several bills have been introduced to address these issues, but no federal legislation on the topic has been enacted to date.

The FTC Report takes a different approach from the recent White House data report, “Big Data: Seizing Opportunities, Preserving Values,” which was issued earlier in May. Whereas the White House report discusses both the benefits of data collection as well as its privacy implications, the FTC Report focuses more on potential harms to consumers. The FTC calls for writing into law concepts that have been part of industry voluntary codes of conduct for years.

As we previously reported, in September 2013, Senator Jay Rockefeller (D-WV), Chair of the Senate Committee on Commerce, Science and Transportation, sent letters to twelve popular health and personal finance websites as part of his investigation of the data broker industry. The letters asked the companies to answer questions about their data collection and sharing practices. As reported in Bloomberg BNA, Senator Rockefeller “concluded that the FTC report ‘echoes findings’ of his committee’s recent probe of the data broker industry.”

The FTC voted to approve the issuance of the report 4-0, with Commissioner Terrell McSweeny not participating. Commissioner Julie Brill issued a concurring statement.

White House Releases Report on Big Data

On May 1, 2014, the White House released a report examining how Big Data is affecting government, society and commerce. In addition to questioning longstanding tenets of privacy legislation, such as notice and consent, the report recommends (1) passing national data breach legislation, (2) revising the Electronic Communications Privacy Act (“ECPA”), and (3) advancing the Consumer Privacy Bill of Rights.

The report states that consumers have a “right to know if [their] information has been stolen or otherwise improperly exposed” and continues that data breaches are currently regulated by a “patchwork” of 47 state laws. The report recommends that Congress pass legislation providing a single data breach standard, similar to the Obama administration’s May 2011 proposal. The data breach legislation should include “reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.”

The report also recommends revising ECPA to confirm that online, digital content is protected in the same manner as hard copy materials. For example, the report recommends removing distinctions in ECPA that focus on how long an email has been left unread.

The White House’s Big Data report also recommends advancing the Consumer Privacy Bill of Rights released by the Obama administration in February 2012. Specifically, the report calls on the Department of Commerce to seek public comment on the Consumer Privacy Bill of Rights, and then draft legislation for review by the President and Congress.

Read the White House’s Fact Sheet on the Big Data and Privacy Working Group Review.

Brazilian President Signs Internet Bill

On April 23, 2014, Brazilian President Dilma Rousseff enacted the Marco Civil da Internet (“Marco Civil”), Brazil’s first set of Internet regulations. The Marco Civil was approved by the Brazilian Senate on April 22, 2014. President Rousseff signed the law at the NETMundial Internet Governance conference in São Paulo, a global multistakeholder event on the future of Internet governance.

As we previously reported, the Marco Civil includes requirements regarding personal data protection and net neutrality. The law also contains data protection and privacy provisions that would apply extraterritorially to foreign online businesses that process data of Brazilian citizens. A controversial provision that would have required data to be stored locally in Brazil was omitted from the final version of the law.

Kentucky Enacts Data Breach Notification Law

On April 10, 2014, Kentucky Governor Steve Beshear signed into law a data breach notification statute requiring persons and entities conducting business in Kentucky to notify individuals whose personally identifiable information was compromised in certain circumstances. The law will take effect on July 14, 2014.

Kentucky’s data breach notification law covers “personally identifiable information,” which is defined as an individual’s first name or first initial and last name in combination with any of the following:

  • Social Security number;
  • Driver’s license number; or
  • Account number, credit or debit card number, in combination with any required security code, access code or password that would permit access to an individual’s financial account.

The breach notification law contains a harm threshold: entities are not required to notify affected Kentucky residents unless the breach “actually causes, or leads the [entity] to reasonably believe has caused or will cause identity theft or fraud.”

The law does not require entities to notify the state Attorney General or any other government agencies, but it does require notice to all consumer reporting agencies and credit bureaus if more than 1,000 residents are to be notified at one time.
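Taken together, the definition, harm threshold, and credit-bureau trigger above can be sketched as a pair of illustrative checks. This is not legal advice; the field names and simplifications are mine (e.g., the account-number element, which in the statute requires an accompanying security code, access code or password, is collapsed into a single label here).

```python
# Illustrative sketch (not legal advice) of the Kentucky notification
# logic described above. Labels are hypothetical and simplified.

KY_DATA_ELEMENTS = {"ssn", "drivers_license", "financial_account"}

def individual_notice_required(has_name: bool, elements: set,
                               harm_threshold_met: bool) -> bool:
    """Resident notice requires a name paired with a listed data
    element AND the harm threshold being met."""
    return (has_name
            and bool(elements & KY_DATA_ELEMENTS)
            and harm_threshold_met)

def credit_bureau_notice_required(residents_notified: int) -> bool:
    """Consumer reporting agencies and credit bureaus must be notified
    when more than 1,000 residents are notified at one time."""
    return residents_notified > 1000
```

Note how the harm threshold gates individual notice entirely: a breach of listed elements with no reasonable belief of identity theft or fraud would not trigger notification under this reading.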

Alabama, New Mexico and South Dakota are now the only U.S. states that have not yet enacted a data breach notification law.

View an unofficial copy of the statute.

FTC and DOJ Issue Antitrust Policy Statement on Cybersecurity

On April 10, 2014, U.S. Department of Justice Deputy Attorney General James Cole and Federal Trade Commission Chair Edith Ramirez announced a joint DOJ and FTC antitrust policy statement on the sharing of cybersecurity information (“Policy Statement”). The Policy Statement, as well as their remarks, emphasize the seriousness of the cybersecurity challenge and the need to improve cybersecurity information sharing. It is another example of the Obama Administration’s efforts to encourage the sharing of information about cybersecurity threats and vulnerabilities.

The Administration’s 2011 omnibus cybersecurity legislative proposal included robust provisions designed to encourage information sharing between private entities and between private entities and the government. The Obama Administration’s 2013 Executive Order on Improving Critical Infrastructure Cybersecurity required certain agencies to share classified and unclassified cyber threat information with targeted companies. And, the Department of Homeland Security and the Federal Bureau of Investigation are rapidly expanding programs designed to facilitate the bi-directional sharing of technical cybersecurity information between the government and the private sector. With this Policy Statement, the Administration is attempting to remove an issue that has hindered private-private cybersecurity information sharing.

The Policy Statement points to guidance that the DOJ issued in 2000 to the Electric Power Research Institute (“EPRI”) stating that it had no intention of initiating an enforcement action against EPRI regarding its program to exchange cyber threat and attack information. Although that guidance is over ten years old, it remains the agencies’ current analysis. The Policy Statement highlights three main points:

  • the sharing of cyber threat information can improve efficiency and network security, thereby serving a valuable purpose;
  • the information shared is typically technical in nature. It generally does not involve competitively sensitive information, such as current or future prices; and
  • the exchange of cyber threat information is limited in scope and unlikely to harm competition.

Accordingly, the two agencies conclude that the “properly designed sharing of cyber threat information should not raise antitrust concerns.”

Additional information is available in a Law360 article authored by Hunton & Williams partners Jamillia Padua Ferris and Paul M. Tiao. The article provides analysis of the Policy Statement and further discusses the importance of operational information sharing as a critical element in the fight against cyber threats.

EU Data Retention Directive Invalidated

On April 8, 2014, the European Court of Justice ruled that the EU Data Retention Directive is invalid because it disproportionately interferes with European citizens’ rights to private life and protection of personal data. The Court’s ruling applies retroactively to the day the Directive entered into force.

The Court faulted the Directive on the grounds that it:

  • applies to all individuals, electronic communications and traffic data without differentiation, limitation or exception;
  • does not contain objective criteria for when data access by national authorities is justified;
  • does not contain objective criteria to determine how long data should be retained – the general minimum and maximum retention periods set out in the Directive do not distinguish between categories of data, persons concerned or the data’s usefulness;
  • does not contain sufficient safeguards against potential abuse and does not ensure irreversible destruction of the data upon expiry of the retention period; and
  • does not explicitly require that the data be retained within the EU, therefore violating the requirement in the EU Charter of Fundamental Rights that compliance control be exercised by independent authorities.

The case was referred to the European Court of Justice by senior Austrian and Irish courts for a preliminary ruling. On December 12, 2013, the Court’s Advocate General delivered his opinion that the Directive is incompatible with the European Charter of Fundamental Rights.

View the full text of the judgment.

Banning the Criminal Background Check Box in San Francisco

As reported in the Hunton Employment & Labor Perspectives Blog:

On February 14, 2014, San Francisco passed the San Francisco Fair Chance Ordinance and became the latest U.S. municipality to “ban the box” and limit the use of criminal background checks in employment hiring decisions. The deadline for San Francisco employers to comply with the San Francisco Fair Chance Ordinance is August 13, 2014. The “ban the box” campaign continues to gain momentum – San Francisco joins other cities (Buffalo, Newark, Philadelphia, and Seattle) and states (Hawaii, Massachusetts, Minnesota, and Rhode Island) that do not allow employers to ask about prior criminal convictions on initial job applications, and similar legislation is currently pending at state and local levels around the United States. We present an overview of the San Francisco Fair Chance Ordinance and recommended best practices for compliance here.

Australian Data Breach Notification Bill Re-Introduced

On March 20, 2014, Australia’s Privacy Amendment (Privacy Alerts) Bill 2014 was re-introduced in the Senate for a first read. The bill, which was subject to a second reading debate on March 27, 2014, originally was introduced on May 29, 2013, but it lapsed on November 12, 2013 at the end of the session.

As we previously reported, if passed, the bill would amend the Privacy Act 1988 by introducing a mandatory breach notification requirement for “serious data breaches.” The proposed definition of “serious data breach” includes a harm threshold: pursuant to the bill, the breach notification obligation would be triggered if unauthorized access to, or disclosure of, personal information would result in a “real risk of serious harm” to the individual to whom the information relates. In the event an organization “believes on reasonable grounds” that there has been a “serious data breach,” the organization would be required, as soon as practicable, to notify affected individuals and submit a copy of the notification to the Australian Privacy Commissioner. The bill also contemplates notification methods, and would allow the Privacy Commissioner to exempt organizations from the notification requirement under certain circumstances.

Brazil Removes Local Data Storage Requirement from Internet Bill

On March 18, 2014, Brazilian lawmakers announced the withdrawal of a provision in pending legislation that would have required Internet companies to store Brazilian users’ data within the country.

The Marco Civil da Internet (“Marco Civil”), a draft bill introduced in the Brazilian Congress in 2011, proposes Brazil’s first set of Internet regulations, including requirements regarding personal data protection and net neutrality. As we previously reported, the Marco Civil received renewed attention last year in the wake of revelations that the U.S. National Security Agency’s PRISM surveillance program may have monitored digital communications in Brazil. In response, the Marco Civil was amended to add a local data storage requirement for Brazilian data. The provision generated controversy and opposition from Internet companies that claimed complying with the requirement would be expensive and burdensome.

According to reports, the legislation now states that global Internet companies “are subject to Brazilian laws in cases involving information on Brazilians even if the data is stored abroad.”

NIST Releases Final Cybersecurity Framework

On February 12, 2014, the National Institute of Standards and Technology (“NIST”) issued the final Cybersecurity Framework, as required under Section 7 of the Obama Administration’s February 2013 executive order, Improving Critical Infrastructure Cybersecurity (the “Executive Order”). The Framework, which includes standards, procedures and processes for reducing cyber risks to critical infrastructure, reflects changes based on input received during a widely attended public workshop held last November in North Carolina and comments submitted with respect to a preliminary version of the Framework that was issued in October 2013.

Differences between the Framework and its preliminary version are generally editorial, and the Framework’s basic structure has remained substantially the same. However, in one notable change, the Framework no longer includes Appendix B, the “Methodology to Protect Privacy and Civil Liberties for a Cybersecurity Program.” Appendix B of the Preliminary Framework attracted significant opposition from industry because of, among other things, its breadth, its prescriptive nature, and its failure to reflect the standards contained in a wide range of successful privacy and data protection programs implemented by industry in partnership with various government agencies. The final Framework removes Appendix B and replaces it with a general description of privacy issues that entities should consider in the section on “How to Use the Framework.”

Like the preliminary version, the Framework is broadly broken down into three components: (1) Framework Core, (2) Framework Implementation Tiers and (3) Framework Profile.

The Framework Core is organized into five overarching cybersecurity functions: (1) identify, (2) protect, (3) detect, (4) respond and (5) recover. Each function has multiple categories, which are more closely tied to programmatic activities. They include activities such as “Asset Management,” “Access Control” and “Detection Processes.” The categories, in turn, have subcategories, which are tactical activities that support technical implementation. Examples of subcategories include “[a]sset vulnerabilities are identified and documented” and “[o]rganizational information security policy is established.” The Framework Core includes informative references, which are specific sections of existing standards and practices that are common among various critical infrastructure sectors and illustrate methods to accomplish the activities described in each subcategory.
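The function-category-subcategory hierarchy described above can be sketched as a simple data model. This is an illustrative sketch only, not an official NIST artifact: the class and field names are our own, and the sample entries are paraphrased from the Framework rather than quoted from it.

```python
# Minimal data model for the Framework Core hierarchy (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Subcategory:
    outcome: str                                          # tactical outcome statement
    informative_refs: list = field(default_factory=list)  # e.g., sections of existing standards

@dataclass
class Category:
    name: str                                             # programmatic activity
    subcategories: list = field(default_factory=list)

@dataclass
class Function:
    name: str                                             # one of the five overarching functions
    categories: list = field(default_factory=list)

core = [
    Function("Identify", [
        Category("Asset Management", [
            Subcategory("Asset vulnerabilities are identified and documented"),
        ]),
    ]),
    Function("Protect", [Category("Access Control")]),
    Function("Detect", [Category("Detection Processes")]),
    Function("Respond"),
    Function("Recover"),
]
```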

The Framework Implementation Tiers describe how an organization views cybersecurity risk and the processes in place to manage that risk. The tiers range from Partial (Tier 1) to Adaptive (Tier 4) and describe an increasing degree of rigor and sophistication in cybersecurity risk management practice. Progression to higher tiers is encouraged when such a change would reduce cybersecurity risk and be cost effective.

The Framework Profile is the alignment of the functions, categories and subcategories with the organization’s business requirements, risk tolerance and resources. An organization may develop a current profile based on existing practices and a target profile that reflects a desired set of cybersecurity activities. A comparison of the two profiles may reveal gaps that establish a roadmap for reducing cybersecurity risk that is aligned with organizational and sector goals, considers legal and regulatory requirements and industry best practices, and reflects risk management priorities.
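The profile comparison described above is, in effect, a set difference: the activities present in the target profile but absent from the current profile form the roadmap. The sketch below illustrates this under our own assumptions; the subcategory labels are hypothetical, not drawn from the Framework verbatim.

```python
# Hypothetical profile-comparison sketch: profiles as sets of subcategory
# outcomes; the set difference surfaces the gaps to be prioritized.
current_profile = {
    "asset-inventory-maintained",
    "access-control-policy-defined",
}
target_profile = {
    "asset-inventory-maintained",
    "access-control-policy-defined",
    "detection-processes-tested",
    "response-plan-exercised",
}

gaps = sorted(target_profile - current_profile)
# Each gap becomes a roadmap item, weighed against organizational goals,
# legal and regulatory requirements, and risk management priorities.
```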

The Framework is a flexible document that gives users the discretion to decide which aspects of network security to prioritize, what level of security to adopt, and which standards, if any, to apply. This flexibility reflects vocal opposition by critical infrastructure owners and operators to new cybersecurity regulations.

The White House has emphasized repeatedly that the Framework itself does not include any mandates to adopt a particular standard or practice. However, Section 10 of the Executive Order directs sector-specific agencies to engage in a consultative process with the Department of Homeland Security, the Office of Management and Budget, and the National Security Staff to review the Framework and determine if current cybersecurity regulatory requirements are sufficient given current and projected risks. If such agencies deem the current regulatory requirements to be insufficient, then they “shall propose prioritized, risk-based, efficient, and coordinated actions…” This process could lead to new cybersecurity regulations in various sectors.

This regulatory review, in conjunction with the Framework being used by insurance underwriters and incentives the Administration is developing to encourage adoption of the Framework, likely will result in the Framework affecting standards of reasonableness in litigation relating to cybersecurity incidents.

German Ministry Moves on Privacy Litigation

On February 11, 2014, Germany’s Federal Minister of Justice and Consumer Protection announced that consumer rights organizations will soon be able to sue businesses directly for breaches of German data protection law. Such additional powers already had been contemplated by the German governing coalition’s agreement, and the Minister now expects to present a draft law in April of this year to implement them.

If passed, the new law would bring about a fundamental change in how German data protection law is enforced. Currently, only the affected individuals as well as Germany’s criminal prosecutors and data protection authorities have legal standing to sue businesses for breaches of data protection law. Such proceedings are still relatively infrequent, in part due to the complexities and costs involved.

Consumer rights organizations, however, are sophisticated and well-funded. In the past, they have been very active in pursuing businesses for breaches of consumer protection legislation and unfair competition laws. Alleged data protection breaches often are featured in these proceedings, but consumer rights organizations have had to rely on particular legal fact patterns to successfully argue their cases. The new law likely would change this, and legal proceedings against businesses for data protection breaches would become more common in Germany.

Therefore, businesses subject to German data protection laws should take note of this development and consider whether their data processing practices meet the required standards.

European Member States and ENISA Issue SOPs to Manage Multinational Cyber Crises

On February 5, 2014, the Member States of the EU and European Free Trade Association (“EFTA”) as well as the European Network and Information Security Agency (“ENISA”) issued Standard Operational Procedures (“SOPs”) to provide guidance on how to manage cyber incidents that could escalate to a cyber crisis.

Background
In 2009, the European Commission’s Communication on Critical Information Infrastructure Protection invited EU Member States to develop national contingency plans and organize regular exercises, working toward a closer pan-European Network and Information Security (“NIS”) cooperation plan.

In February 2013, the European Commission, together with the High Representative of the Union for Foreign Affairs and Security Policy, launched their cybersecurity strategy (“Strategy”) for the European Union. As part of this Strategy, the European Commission also proposed a draft directive on measures to ensure a common level of NIS across the EU (the “Directive”). The Directive introduces a number of measures, including the creation of a network to enable the national NIS authorities, the European Commission and, in certain cases, ENISA and the Europol Cybercrime Center, to share early warnings on risks and incidents, and to cooperate on further steps and organize exercises at the European level.

In this context, the EU/EFTA Member States developed the SOPs in collaboration with ENISA. The draft SOPs were tested during the pan-European cyber exercises organized by ENISA.

The SOPs
The SOPs include a list of contact points, guidelines, templates, workflows, tools and best practices to help European public authorities better understand the causes and impacts of multinational cyber crises and identify effective action plans. In particular, the SOPs emphasize the need to establish direct links to the decision makers at the strategic and political level in order to successfully manage multinational cyber crises.

ENISA continues to work with EU Member States to develop information security best practices and assist the Member States with the implementation of relevant EU legislation.

Commissioner Reding Calls for New European Data Protection Compact

On January 28, 2014, Data Protection Day, Vice-President of the European Commission and Commissioner for Justice, Fundamental Rights and Citizenship Viviane Reding gave a speech in Brussels proposing a new data protection compact for Europe. She focused on three key themes: (1) the need to rebuild trust in data processing, (2) the current state of data protection in the EU, and (3) a new data protection compact for Europe.

The Need to Rebuild Trust

Following the recent National Security Agency (“NSA”) surveillance revelations, Commissioner Reding stated that the most important goal for 2014 is to restore the trust of citizens in how their data are safeguarded. To achieve this goal, she recommended that:

  • Safe Harbor be strengthened by enforcing the 13 recommendations proposed by the European Commission in October 2013; and
  • The EU and U.S. agree and finalize the “umbrella” agreement on the transfer and processing of personal information in the context of police and judicial cooperation in criminal matters, which is currently being negotiated, and would afford EU citizens the same rights as U.S. citizens when their data are exchanged with the United States.

Commissioner Reding also referred to the new rights of data subjects that would be introduced by the proposed EU General Data Protection Regulation (the “Proposed Regulation”), which include the right to be forgotten, the right to data portability, and the right to be informed of personal data breaches. Commissioner Reding called for more meaningful enforcement, citing as examples recent fines levied against Google Inc. of €900,000 (in Spain) and €150,000 (in France), which she described as “more like pocket money than a fine” to Google.

The State of Data Protection Reforms in the EU

Commissioner Reding emphasized the European Parliament’s “overwhelming” support for the Proposed Regulation in its compromise text adopted in October 2013. However, she criticized many European leaders and major companies for failing to uphold data protection as a fundamental goal, stating that “some companies and a few governments continue to see data protection as an obstacle rather than as a solution; privacy rights as compliance costs, and not as an asset.” She noted that, two years after the legislative proposals were first released, “Discussions are mature. The text is ready. It is just a matter of political will.”

A Data Protection Compact for Europe

Commissioner Reding concluded her speech by proposing eight principles that should govern the way personal data are processed in the public and the private sector:

  • Europe should finalize the Proposed Regulation in 2014, as “[o]therwise others will move first and impose their standards on [Europe].”
  • The Proposed Regulation should not distinguish between the private and the public sector, and should apply the same principles and standards to both.
  • Laws affecting individuals’ privacy must be subject to public consultation.
  • In relation to surveillance activities, data collection must be targeted, limited and in proportion to the surveillance objectives.
  • Laws need to be clear and kept up-to-date, otherwise they risk being applied “in ways that had not been imagined at the time [they were] written,” due to technological advancements.
  • National security exemptions should be invoked sparingly, since “not everything that relates to foreign relations is a matter of national security.”
  • Judicial authorities have an important role to play in deciding where the balance lies between protecting individuals’ privacy and maintaining nations’ security.
  • Data protection rules should apply irrespective of the nationality and place of residence of the data subject.

Commissioner Reding emphasized that bolstering trust in the way companies and governments process personal data would benefit the digital economy, national security, the Internet and Europe as a whole.

State “Ban the Box” Legislation Gains Momentum

As reported in the Hunton Employment & Labor Perspectives Blog, the “ban the box” movement continues to sweep through state legislatures. “Ban the box” laws, which vary in terms of scope and detail, generally prohibit employers from requesting information about job applicants’ criminal histories. Recent legislation in two states applies “ban the box” prohibitions to private employers in those states:

  • On December 1, 2013, a new North Carolina law went into effect that prohibits employers from inquiring about job applicants’ arrests, charges or convictions that have been expunged. This prohibition applies to requests for information on applications and during interviews with applicants.
  • On January 1, 2014, a new Minnesota law goes into effect that prohibits employers from inquiring into, requiring disclosure of or considering the criminal record or criminal history of an applicant until the applicant has been selected for an interview or, if there is no interview, until after a conditional offer of employment has been made.

Employers should review their applications and hiring practices to ensure compliance with the new laws, and verify that managers involved in the hiring process understand when, and to what extent, they are permitted to inquire about applicants’ criminal histories.

Read the full post on the Hunton Employment & Labor Blog.

EU Court of Justice Advocate-General Finds Data Retention Directive Incompatible with Charter of Fundamental Rights

On December 12, 2013, Advocate-General Cruz Villalón of the European Court of Justice (“ECJ”) issued his Opinion on the compatibility of the EU Data Retention Directive 2006/24/EC (the “Data Retention Directive”) with the Charter of Fundamental Rights of the European Union (the “EU Charter”).

Background
The Data Retention Directive requires EU Member States to ensure that telecommunications service providers collect and retain traffic and location data (but not the substantive content of those communications) for purposes of investigating, detecting and prosecuting serious crimes as defined by national law. The data must be retained for a minimum of six months and a maximum of two years.

The Advocate-General delivered his Opinion in connection with four national cases, one brought by Digital Rights Ireland against the Irish authorities and three cases pending before Austria’s constitutional court.

Opinion of the Advocate-General
In his Opinion, the Advocate-General considered that the collection and the retention, in large databases, of these data constitute a serious interference with the right to privacy contained in the EU Charter. The Advocate-General emphasized that the data could be used to reconstruct a large portion of a person’s conduct, or even a complete and accurate picture of his or her private identity. According to the Advocate-General, the risk that the data might be used for unlawful purposes is increased by the following factors:

  • the data are not retained by national public authorities, or even under their direct control, but by the telecommunications service providers; and
  • the data may be stored at indeterminate locations in cyberspace, since the Data Retention Directive does not require the data to be stored in the territory of an EU Member State.

In light of this serious interference with the right to privacy, the Advocate-General found that the Data Retention Directive should have defined the principles governing the guarantees needed to regulate access to the data and their use, instead of assigning the task of defining and establishing those guarantees to the EU Member States. The Advocate-General concluded that the Data Retention Directive does not comply with the requirement, laid down by the EU Charter, that any limitation on the exercise of a fundamental right must be provided for by law.

Further, the Advocate-General found no reason why the Data Retention Directive requires EU Member States to ensure that the data are retained for a maximum of two years instead of limiting the retention period to less than one year.

In the Opinion, the Advocate-General proposes to suspend the effects of a finding that the Data Retention Directive is invalid in order to enable the EU legislature to adopt, within a reasonable time period, the measures necessary to remedy the invalidity.

Hunton Publishes Final Paper in its Series of Executive Briefings on the Proposed EU Data Protection Regulation

As we previously reported, on October 21, 2013, the European Parliament approved its Compromise Text of the proposed EU General Data Protection Regulation (the “Proposed Regulation”). Hunton & Williams has now published an analysis of these proposals.

This latest analysis is the last installment in our series of Executive Briefings on the Proposed Regulation. Since the publication of the Proposed Regulation in January 2012, Hunton & Williams has been tracking developments and analyzing each stage of the legislative process:

  • Our initial Executive Briefing Paper examines the European Commission’s proposals, how the proposals would revise the existing EU data protection framework, and how those changes would likely impact organizations in practice.
  • In January 2013, following the publication of the draft report on the Proposed Regulation of the European Parliament’s lead rapporteur, we published an update to the Executive Briefing Paper, analyzing the rapporteur’s draft amendments to the European Commission’s proposals.
  • In June 2013, we published a second update to the Executive Briefing Paper examining in detail the Irish Presidency’s proposed amendments to the Proposed Regulation; specifically, the Presidency’s proposals regarding consent, legitimate grounds for processing, pseudonymization, data minimization, profiling and the right to be forgotten.
  • Our latest analysis, the final update in the series, examines the European Parliament’s Final Compromise Text, adopted on October 21, 2013.

Next up, the Council of Ministers must reach an agreement on the Proposed Regulation, after which a “trilogue” between the Parliament, the Council and the Commission will be established to work on the final text. A vote is expected before the parliamentary elections in May 2014. The coming months are likely to involve a period of intense negotiations, and businesses should remain engaged in the process.

Hunton & Williams is developing additional materials regarding the next steps in the legislative process to offer practical insights to our clients.

China’s Supreme People’s Court Releases Provisions on the Online Issuance of Judgment Documents by People’s Courts

On November 21, 2013, the Supreme People’s Court of China passed the Provisions on the Online Issuance of Judgment Documents by People’s Courts (the “Provisions”), which will take effect on January 1, 2014. The Provisions replace earlier rules (of the same title) enacted by the Supreme People’s Court on November 8, 2010, and generally focus on improved implementation of the principles of standardizing the online issuance of judgment documents, promoting judicial justice and enhancing the public credibility of the judiciary.

The Provisions also contain a number of requirements concerning the protection of personal information. These provide that:

  • Judgment documents involving state secrets, personal private matters or cases involving juvenile delinquency shall not be published on the Internet.
  • When issuing online judgment documents, a People’s Court shall delete the following information: (1) the home address, contact information, ID number, bank account number and any other personal information of a natural person; (2) relevant information of a juvenile; (3) the bank account number of an entity or other organizations; (4) business secrets; and (5) other content inappropriate for release on the Internet.
  • A People’s Court shall retain the real information of the name or title of the party concerned upon issuing online judgment documents, but the names of the following parties or litigants shall be processed anonymously through the use of alternate symbols: (1) the parties and their statutory agents in marriage and family cases or inheritance disputes; (2) victims and their statutory agents, witnesses and expert witnesses in criminal cases; (3) any defendant who is sentenced to fixed-term imprisonment of not more than three years and is exempted from criminal punishment (and who is not a recidivist or habitual offender).
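The deletion and anonymization rules listed above amount to a concrete data-processing step before online publication. The sketch below illustrates one way such redaction might look; the field names, the document representation, and the “Party A”/“Party B” symbol scheme are entirely our assumptions, not taken from the Provisions.

```python
# Hedged sketch of the redaction rules above: delete enumerated personal
# fields and, where required, replace party names with alternate symbols.
# All field names and the symbol scheme are hypothetical.
FIELDS_TO_DELETE = {
    "home_address", "contact_information", "id_number", "bank_account_number",
}

def redact_judgment(doc: dict, anonymize_parties: bool) -> dict:
    # Drop the fields the Provisions require to be deleted before publication.
    cleaned = {k: v for k, v in doc.items() if k not in FIELDS_TO_DELETE}
    if anonymize_parties:
        # Replace each named party with a placeholder symbol ("Party A", ...).
        cleaned["parties"] = [
            f"Party {chr(ord('A') + i)}" for i in range(len(doc.get("parties", [])))
        ]
    return cleaned

doc = {
    "parties": ["Zhang San", "Li Si"],   # hypothetical party names
    "id_number": "1101011990XXXX",       # hypothetical ID, to be deleted
    "verdict": "Judgment for plaintiff",
}
published = redact_judgment(doc, anonymize_parties=True)
```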

The Provisions are intended to make the judicial system more independent and more transparent. At the same time, it remains to be seen how easily searchable the judgment opinion network will be after the Provisions are implemented. The Provisions represent the latest step in the ever-growing array of sector-specific regulations governing personal information in China, and may suggest that legislative and regulatory activities for the protection of personal information will continue.