Monthly Archives: November 2015

Inside Jahoo (Otlard.A?) – A Spam Botnet

Trash and Mailbox by Bethesda Softworks



Otlard.A (or, let's say, at least the malware triggering signature 2806902 || ETPRO TROJAN Win32.Otlard.A C&C Checkin response) is a spam botnet.

I saw it loaded as a plugin in an instance of Andromeda.

That Andromeda is being spread via:


  • Bedep build id 6005 (and, in this case, 6007) from an Angler EK instance fed by malvertising:


VirtualDonna group redirecting traffic to an Angler instance loading Bedep buildid 6007 in memory
Bedep 6007 loading Andromeda 55ead0e4010c7c1a601511286f879e33 before update task.
2015-09-28


Note: Bedep 6007 was sometimes loading it alongside other payloads:
- 2015-09-16: ec5d314fc392765d065ff16f21722008, with Trapwot (FakeAV) e600985d6797dec2f7388e86ae3e82ba and Pony a4f08c845cc8e2beae0d157a3624b686
- 2015-09-29: 37898c10a350651add962831daa4fffa, with Kovter (24143f110e7492c3d040b2ec0cdfa3d0)

That Andromeda, beaconing to dnswow.com, enslaved more than 10,000 bots in a week:
Andromeda dnswow 2015-11-22

Andromeda dnswow 2015-11-27
Here is the Otlard.A task in that Andromeda instance:
Task installing Otlard.A as a plugin to Andromeda

  • a task in a Smokebot instance dropped by Nuclear Pack, also fed by malvertising:
Malvertising > Nuclear Pack > Smokebot > Stealer, Ramnit, Htbot and Andromeda > Otlard.A
2015-11-28
Smokebot : cde587187622d5f23e50b1f5b6c86969
Andromeda : b75f4834770fe64da63e42b8c90c6fcd
(Off topic: Ramnit 28ceafaef592986e4914bfa3f4c7f5c0 is being massively spread these days via many infection paths. Edit 2015-12-29: Htbot.B d0a14abe51a61c727420765f72de843a, named ProxyBack by Palo Alto.)

Now here is what the control panel of that plugin looks like:

Otlard.A panel:


Otlard.A - JahooManager - Main - 2015-09-27
Otlard.A - JahooManager - Servers - 2015-09-27
Otlard.A - JahooManager - Settings - 2015-09-27
Otlard.A - JahooManager - Campaigns - 2015-09-27
Otlard.A - JahooManager - Bot - 2015-09-27
That exe is 2387fb927e6d9d6c027b4ba23d8c3073 and appears to be Andromeda.





Otlard.A - JahooSender - Tasks - 2015-09-27

Otlard.A - JahooSender - Tasks - 2015-11-28



Otlard.A - JahooSender - Tasks - Done Task - 2015-09-27
Otlard.A - JahooSender - Domains - 2015-09-27
Otlard.A - JahooSender - Domains - 2015-11-28

Otlard.A - JahooSender - Messages - 2015-09-27
Otlard.A - JahooSender - Messages - 2015-11-28
Otlard.A - JahooSender - Messages - Edit a Message - 2015-11-28
Otlard.A - JahooSender - Messages - Edit a Message - 2015-11-28
Otlard.A - JahooSender - Messages - Edit a Message - 2015-11-28
Otlard.A - JahooSender - Headers - 2015-11-28
Otlard.A - JahooSender - Headers - Editing Header - 2015-11-28
Otlard.A - JahooSender - Headers - Editing Header - 2015-11-28
Otlard.A - JahooSender - Macross - 2015-11-28

Otlard.A - JahooSender - Macross - 2015-11-28


Otlard.A - JahooSender - Macross - Editing macross - 2015-11-28
Otlard.A - JahooSender  - Macross - Editing macross - 2015-11-28
Otlard.A - JahooSender - Macross - Editing macross - 2015-11-28
Otlard.A - JahooSender - Attach - 2015-11-28
Otlard.A - JahooSender - Attach - Attached image - 2015-11-28
Otlard.A - JahooSender - Rules - 2015-11-28
Otlard.A - JahooSender - Rules > Spam - 2015-11-28
Otlard.A - JahooSender - Rules > User - 2015-11-28
Otlard.A - Bases - Emails - 2015-11-28
Otlard.A - Bases - Blacklist - 2015-11-28
Otlard.A - Bases - Blacklist - Edit - 2015-11-28
Otlard.A - Botnet - Main - 2015-09-27
Otlard.A - Botnet - Main - 2015-11-28
Otlard.A - Botnet - Modules - 2015-11-28
Otlard.A - Botnet - Modules - Edit - 2015-11-28
Otlard.A - Incubator - Accounts - 2015-11-28
Otlard.A - Incubator - Settings - 2015-11-28
Note: the registrator menu has disappeared in the latest version.


--
Andromeda C&C 2015-11-28:
5.8.35.241
202023 | 5.8.35.0/24 | LLHOST | EU | llhost-inc.com | LLHost Inc

Spam Module C&C 2015-11-28:

5.8.32.10 
5.8.32.8
5.8.32.52
5.8.34.20
5.8.32.53
5.8.32.56
202023 | 5.8.32.0/24 | LLHOST | EU | zanufact.com | LLHost Inc

Thanks: Brett Stone-Gross, for helping me with decoding and understanding the network communications.

Files:
All samples whose hashes are discussed here are in this zip.
Jahoo - socker.dll: 7d14c9edfd71d2b76dd18e3681fec798
(If you want to look into this, I can provide the associated network traffic.)

Read More:

Inside Andromeda Bot v2.06 Webpanel / AKA Gamarue - Botnet Control Panel 2012-07-02
Inside Pony 1.7 / Fareit C&C - Botnet Control Panel - 2012-06-27
Inside Smoke Bot - Botnet Control Panel - 2012-04-28

Post publication Reading :
ProxyBack Malware Turns User Systems Into Proxies Without Consent - 2015-12-23 - JeffWhite - PaloAlto

EDPS Issues Opinion on the Challenges of Big Data

On November 19, 2015, the European Data Protection Supervisor (the “EDPS”) published an Opinion entitled Meeting the Challenges of Big Data (the “Opinion”). The Opinion outlines the main challenges, opportunities and risks of big data, and the importance, for companies processing large volumes of personal data, of implementing innovative methods to comply with data protection laws.

Main Objectives to Achieve a Big Data Protection Ecosystem

In particular, the Opinion emphasizes the following objectives:

  • Transparency. Companies that process large volumes of personal data should be transparent by providing data subjects with appropriate information on what data is processed and the logic behind big data analytics. Further, companies should adopt policies to safeguard individuals’ rights and implement effective notices regarding their information processing practices.
  • User Control. Individuals should benefit from a high degree of control over their personal data, allowing them to make meaningful choices with respect to the use of their data.
  • Protection by Design. Data protection should be embedded into products and services. Companies should use privacy-friendly engineering methods and find innovative ways to inform data subjects about, and grant individuals control over, the processing of their personal data.
  • Accountability. Companies should implement internal control mechanisms to ensure and demonstrate compliance with data protection law and be held accountable to data subjects and supervisory authorities.

Next Steps

The Opinion recognized the need to adopt a data protection reform package that strengthens and modernizes the regulatory framework so that it remains effective in a big data ecosystem.

The EDPS will organize a Big Data Protection workshop for policymakers and experts to encourage dialogue and draw attention to data protection and the privacy challenges associated with big data.

Read the full Opinion of the EDPS.

APEC Leaders Reinforce Commitment to CBPR System

On November 19, 2015, the White House released a fact sheet from the 23rd Annual APEC Economic Leaders’ Meeting in the Philippines. Under the section on Enhancing Regional Economic Integration, representatives from the U.S. and other APEC economies reinforced their commitment to the ongoing implementation of the APEC Cross-Border Privacy Rules (“CBPR”) system for information controllers.

At the meeting, the U.S. and other APEC economies worked to expand trade and investment by “facilitating cross border business activity through the expansion of globally interoperable privacy frameworks through economy and industry participation in the APEC Cross Border Privacy Rules (CBPR) System, as well as working to develop bridges between APEC CBPR’s and similar systems in Europe.”

In addition, the APEC 2015 Leaders’ Declaration and the 2015 APEC Joint Ministerial Meeting Statement reference the importance of cross-border privacy and the CBPR system, as well as increased participation among member economies.

Class Action Filed Against Georgia’s Secretary of State

On November 17, 2015, two plaintiffs filed a putative class action alleging that Georgia’s Secretary of State, Brian Kemp, improperly disclosed the Social Security numbers, driver’s license numbers and birth dates of more than 6.1 million Georgia voters. The lawsuit alleges that the Secretary violated Georgia’s Personal Identity Protection Act by disclosing the voters’ personally identifiable information, failing to provide voters notice of the breach and failing to notify consumer reporting agencies.

The plaintiffs allege that a “Voter File” typically is distributed monthly to political parties and members of the media, but includes only certain data elements such as voter names, addresses, race, gender, registration date and last voting date. In October 2015, a different version of the Voter File was allegedly mailed that also disclosed voters’ Social Security numbers, driver’s license numbers and birth dates.

In public statements, Secretary Kemp has taken responsibility for the mailings. The Secretary has terminated the employee responsible for what is being called a “clerical error,” and claims that all of the discs containing the files have been retrieved or destroyed. The Secretary’s office also has “verified with the media outlets and political parties that they have not copied or otherwise disseminated confidential data.”

Security Weekly #442 – Interview with Ferruh Mavituna

Interview with Ferruh Mavituna

Security Weekly brings back Ferruh Mavituna to discuss the SDLC and writing vulnerable command injection code in PHP. For a full list of topics discussed, visit our wiki: http://wiki.securityweekly.com/wiki/index.php/Episode442#Guest_Interview:_Ferruh_Mavituna_-_6:05PM-6:45PM

 

Failed Windows 3.1 and Hacking Back

In security news this week, we talk about the latest iThing; this one brews your coffee. Find out why it's a bad idea to run Windows 3.1 in your environment, or Windows NT. Paul goes back in time, talking about OpenVMS.

http://wiki.securityweekly.com/wiki/index.php/Episode442#Stories_of_the_Week_-_7:00PM-8:00PM

 

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow us on Twitter: @securityweekly

CIPL Points to Transparency as Key Catalyst for Innovative Information Economy

On November 20, 2015, Markus Heyder, Vice President of the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP, discussed how “transparency is increasingly understood as a core component of addressing the challenges of the modern information economy” and a key catalyst for a productive and innovative information economy in an article entitled Transparency and the Future of Driverless Privacy published by the International Association of Privacy Professionals.

According to Heyder, the complexities of information practices in the digital economy can lead to a sense of suspicion and lack of trust in society towards the organizations that collect and use personal data, potentially causing overreactions to otherwise perfectly legitimate and beneficial uses of personal data. Reducing this lack of trust begins with transparency, which essentially has three distinct goals, depending on context:

  • Provide the appropriate amount of information to enable informed user engagement, choice or consent with respect to specific uses of personal data.
  • Create general awareness of information practices in a way that explains the “value exchange” between individuals and businesses and creates consumer trust and “buy-in,” even in the absence of choice and consent.
  • Educate policymakers, legislators and privacy enforcement authorities about the value propositions and benefits associated with information uses as well as the associated risks (or lack thereof) to enable informed and effective policies, laws and enforcement.

According to Heyder, while many people consider providing the appropriate amount of information as the principal goal, seeing “transparency as a new and improved way to devise actionable privacy notices,” creating general awareness and education must increasingly become more important.

In the age of big data, the Internet of Things, ubiquitous information collection and the inferring and sharing of data, there will be an increasing number of situations where individual engagement, choice or consent are no longer practicable, possible, or even wanted by individuals, as CIPL discussed in a previous article. In these situations, the primary role of transparency is to create general awareness of the “value exchange” and how organizations are using data for beneficial purposes and how they protect data, as well as to explain and demonstrate responsible and beneficial information uses to policymakers, legislators and regulators in a way that enables sensible privacy laws, regulations and enforcement. Instead of relying on individual choice and consent, organizations will have to employ alternative mechanisms to protect individuals in this environment that are based on organizational accountability, and enabled and supported by the prevailing privacy frameworks.

To illustrate the importance of the ongoing debate about transparency, Heyder points to initiatives such as the Data Transparency Lab, a community effort founded by MIT, Mozilla Foundation, the Open Data Institute and Telefónica to advance online personal data transparency through scientific research and design, and follow-up work by CIPL and others on the recent Privacy Bridges Report.

Read the full article.

Hack Naked TV – November 20, 2015

Welcome to another episode of Hack Naked TV recorded November 20th 2015. Today Beau talks Bitlocker bypass, Gmail address spoofing and more. For a full list of stories covered, visit the wiki here: http://wiki.securityweekly.com/wiki/index.php/Hack_Naked_TV_November_20_2015#Beau.27s_Stories

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow us on Twitter: @securityweekly

Hack Naked TV – November 19, 2015

Welcome to another episode of Hack Naked TV recorded November 19th 2015. Today Aaron talks about encrypted communications in the Paris terrorist attacks, Google security news, Comcast password resets, and the Wells Fargo Cybersecurity Survey.

For a full list of stories, visit our wiki here: http://wiki.securityweekly.com/wiki/index.php/Hack_Naked_TV_November_19_2015#Aaron.27s_Stories

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow us on Twitter: @securityweekly

French Data Protection Authority Issues Guidance and FAQs on Safe Harbor

On November 19, 2015, the French Data Protection Authority (“CNIL”) published guidance, including a set of frequently asked questions, to assist companies that are transferring personal data to the U.S. pursuant to the Safe Harbor framework.

In the guidance, the CNIL stated that the October 6, 2015 decision of the Court of Justice of the European Union (“CJEU”) invalidated the European Commission’s decision on the adequacy of the protection provided by Safe Harbor. Consequently, companies can no longer rely on Safe Harbor to transfer personal data to the U.S. The CNIL then stated that, on October 15, 2015, it met with other European data protection authorities (“DPAs”) within the Article 29 Working Party (the “Working Party”) to draw up a joint action plan that would allow stakeholders to adapt to the new legal circumstances. During that meeting, the Working Party called upon the EU institutions and Member States to adopt a new legal framework allowing the transfer of personal data from the EU to the U.S. in accordance with the requirements set out by the CJEU by January 31, 2016. Until January 31, 2016, the Working Party confirmed that companies may use Binding Corporate Rules (“BCRs”) or EU Model Clauses to legitimize their data transfers to Safe Harbor certified companies. The CNIL explained that the DPAs are still analyzing the impact of the CJEU ruling on BCRs and EU Model Clauses, but have decided to allow companies to rely on them temporarily. The CNIL also pointed out that EU Model Clauses are the most suitable mechanism, since the implementation of BCRs takes several months. Therefore, the CNIL has called upon companies to implement EU Model Clauses if they wish to continue transferring personal data to U.S. Safe Harbor certified companies. The guidance makes no reference to other data transfer mechanisms, or in particular, to derogations (such as data subject consent). Such derogations have always been narrowly interpreted by the CNIL and may not legitimize repeated, mass or structural data transfers to the U.S.

In terms of registration formalities, the CNIL made it clear that companies must amend their existing notifications by the end of January 2016 to either declare that their data transfers to the U.S. have ceased, or to indicate that the data transfers will be based on another data transfer mechanism (in practice, EU Model Clauses). Data transfers based on EU Model Clauses require the CNIL’s prior ad hoc authorization. To speed up the registration process, the CNIL recommends filing new and simplified notifications in which companies commit to complying with the requirements laid down by the CNIL in its “Simplified Norm No. 46” and/or “Simplified Norm No. 48,” relating respectively to the processing of employees’ personal data and the processing of customers’ personal data. These Simplified Norms authorize data transfers outside of the EU. This assumes, however, that the data processing activities or transfers fall within the scope of the CNIL’s Simplified Norms. If not, companies must amend their existing notifications and obtain the CNIL’s ad hoc authorization for their transfers.

Finally, the CNIL stated that, beyond January 31, 2016, and in the absence of a Safe Harbor 2.0, the European DPAs will examine the possibility of using their enforcement powers to suspend or forbid data transfers to the U.S.

Retailer Sued over Allegations that Background Check Consent Form Includes Extraneous Information

As reported in the Hunton Employment & Labor Perspectives Blog:

On November 2, 2015, a putative class action was filed against retailer Big Lots Stores, Inc. in Philadelphia, stemming from allegations that the company “systematically” violated the Fair Credit Reporting Act’s (“FCRA’s”) “standalone disclosure requirement” by making prospective employees sign a document used as a background check consent form that contained extraneous information. Among other things, the plaintiff alleges that Big Lots’ form violates the FCRA because it includes the following three categories of extraneous information: (1) an “implied liability waiver” (specifically, a statement that the applicant “fully understand[s] that all employment decisions are based on legitimate nondiscriminatory reasons”); (2) state-specific notices; and (3) information on how background information will be gathered and from which sources, statements pertaining to disputing any information, and the name and contact information of the consumer reporting agency.

The case filed against Big Lots is in line with the recent increase of cases brought against employers for hyper-technical violations of the FCRA’s “standalone disclosure requirement.” In fact, Big Lots is fighting another, similar putative class action that was filed against it earlier this year in Illinois. In that case, which is currently pending in the Northern District of Illinois, Big Lots filed a motion to dismiss, which also is still pending with the court, and argued, inter alia, that “the inclusion of additional language in [its] form was not so great a distraction to negate the effectiveness of the disclosure and consent,” and “even if [its] form[s] were technically in violation of the FCRA, [the p]laintiff has not and cannot properly allege or prove a willful violation.”

Section 604(b)(2) of the FCRA specifies that an employer may not obtain a background check report unless:

  • a clear and conspicuous disclosure has been made in writing to the consumer at any time before the report is procured or caused to be procured, in a document that consists solely of the disclosure, that a consumer report may be obtained for employment purposes; and
  • the consumer has authorized in writing the procurement of the report by that person.

The disclosure needs to be in a completely separate standalone document (and not part of an employment handbook package or application for employment), although the written authorization can be part of the disclosure form.

The Federal Trade Commission (the agency charged with enforcing the FCRA) has advised that employers should not include any extraneous information on the disclosure form to avoid an applicant or employee from being distracted by other information that is presented side-by-side with the important disclosure language. The concern is that the disclosure should not be diminished in importance by including unrelated information and that the disclosure should not be buried in small text at the end of other documentation where it can be missed.

Failure to comply with the FCRA can result in state or federal government enforcement actions, as well as private lawsuits. In the case of willful noncompliance, an employer can be subject to any actual damages sustained by the consumer as a result of the failure or damages of not less than $100 and not more than $1,000, punitive damages, and attorneys’ fees. In addition, any person who knowingly and willfully obtains a consumer report under false pretenses may face criminal prosecution.

Currently, the courts are split as to what types of additional information will result in a violation of the FCRA’s “standalone disclosure requirement.”

However, in light of the recent attacks against employers’ disclosure forms, employers should review their disclosure and/or authorization forms to make sure that they provide clear notice that a background check report may be obtained for employment purposes and that the authorization sought is an authorization for the procurement of such a report. Also, to avoid litigation, employers should take a critical look at their forms to determine whether they include extraneous information that can easily be deleted.

China Publishes New Regulation for Personal Data Security in the Courier Industry

On November 16, 2015, the Legislative Affairs Office of the State Council of the People’s Republic of China published a draft Regulation for Couriers (the “Regulation”) and requested public comment on the Regulation. Interested parties have until mid-December 2015 to submit comments on the Regulation. The Regulation comes at a time when courier services and online shopping are growing steadily in China. Under the Regulation, the sender of a parcel will be required to fill in his or her real name and address, the telephone numbers of both the sender and the recipient, as well as the name, quantity and nature of the object being couriered.

The courier company would then be required to verify that information. The courier company also would be required to refuse orders with false information on the waybill. If a courier company fails to check the waybill or accepts parcels with false information on the waybill, it may be subject to a maximum fine of RMB 10,000 (approximately $1,586).

The Regulation also would require courier companies to regularly destroy waybills to protect client personal information. In addition, courier companies will be prohibited from illegally selling or leaking client personal information they collect in the course of providing services. In cases of actual or potential leakage, destruction or loss of client personal information, courier companies would be required to take immediate remedial measures and report the incident to the local postal administration authority.

A courier company that breaches the foregoing obligations may face administrative penalties including the issuance of a warning, confiscation of illegal income, a maximum fine of RMB 50,000 (approximately $7,840) and possibly even revocation of its operating license.

In addition, if a courier company opens parcels without permission, hides, destroys, sells or illegally inspects parcels, it will be subject to a maximum fine of RMB 200,000 (approximately $31,360). In exceptional circumstances, its operating license also may be revoked.

dnscat2: now with crypto!

Hey everybody,

Live from the SANS Pentest Summit, I'm excited to announce the latest beta release of dnscat2: 0.04! Besides some minor cleanups and UI improvements, there is one serious improvement: all dnscat2 sessions are now encrypted by default!

Read on for some user information, then some implementation details for those who are interested! For all the REALLY gory information, check out the protocol doc!

Tell me what's new!

By default, when you start a dnscat2 client, it now performs a key exchange with the server, and uses a derived session key to encrypt all traffic. This has the huge advantage that passive surveillance and IDS and such will no longer be able to see your traffic. But the disadvantage is that it's vulnerable to a man-in-the-middle attack - assuming somebody takes the time and effort to perform a man-in-the-middle attack against dnscat2, which would be awesome but seems unlikely. :)

By default, all connections are encrypted, and the server will refuse to allow cleartext connections. If you start the server with --security=open (or run set security=open), then the client decides the security level - including cleartext.

If you pass the server a --secret string (see below), then the server will require clients to authenticate using the same --secret value. That can be turned off by using --security=open or --security=encrypted (or the equivalent set commands).

Let's look at the man-in-the-middle protection...

Short authentication strings

First, by default, a short authentication string is displayed on both the client and the server. Short authentication strings, inspired by ZRTP and Silent Circle, are a visual way to tell if you're the victim of a man-in-the-middle attack.

Essentially, when a new connection is created, the user has to manually match the short authentication strings on the client and the server. If they're the same, then it's a legit connection. Here's what it looks like on the client:

Encrypted session established! For added security, please verify the server also displays this string:

Tort Hither Harold Motive Nuns Unwrap

And the server:

New window created: 1
Session 1 security: ENCRYPTED BUT *NOT* VALIDATED
For added security, please ensure the client displays the same string:

>> Tort Hither Harold Motive Nuns Unwrap

There are 256 different possible words, so six words gives 48 bits of protection. While a 48-bit key can eventually be bruteforced, in this case it has to be done in real time, which is exceedingly unlikely.

Authentication

Alternatively, a pre-shared secret can be used instead of a short authentication string. When you start the server, you pass in a --secret value, such as --secret=pineapple. Clients with the same secret will create an authenticator string based on the password and the cryptographic keys, and send it to the server, encrypted, after the key exchange. Clients that use the wrong key will be summarily rejected.

Details on how this is implemented are below.

How stealthy is it?

To be perfectly honest: not completely.

The key exchange is pretty obvious. A 512-bit value has to be sent via DNS, and a 512-bit response has to come back. That's pretty big, and stands out.

After that, every packet has an unencrypted 40-bit (5-byte) header and an unencrypted 16-bit (2-byte) nonce. The header contains three bytes that don't really change, and the nonce is incremental. Any system that knows to look for dnscat2 will be able to find that.

It's conceivable that I could make this more stealthy, but anybody who's already trying to detect dnscat2 traffic will be able to update the signatures that they would have had to write anyway, so it becomes a cat-and-mouse game.

Of course, that doesn't stop people from patching things. :)

The plus side, however, is that none of your data leaks! And somebody would have to be specifically looking for dnscat2 traffic to recognize it.

What are the hidden costs?

Encrypted packets have 64 bits (8 bytes) of extra overhead: a 16-bit (two-byte) nonce and a 48-bit (six-byte) signature on each packet. Since DNS packets have between 200 and 250 bytes of payload space, that means we lose ~4% of our potential bandwidth.

Additionally, there's a key exchange packet and potentially an authentication packet. That's two extra roundtrips over a fairly slow protocol.

Other than that, not much changes, really. The encryption/decryption/signing/validation are super fast, and it uses a stream cipher, so the length of the messages doesn't change.

How do I turn it off?

The server always supports crypto; if you don't WANT crypto, you'll have to manually hack the server or use a version of dnscat2 server <=0.03. But you'll have to manually turn off encryption in the client; otherwise, the connection will fail.

Speaking of turning off encryption in the client: you can compile without encryption by using make nocrypto. You can also disable encryption at runtime with dnscat2 --no-encryption. On Visual Studio, you'll have to define "NO_ENCRYPTION". Note that the server, by default, won't allow either of those to connect unless you start it with --security=open.

Give me some technical details!

Your best bet if you're REALLY curious is to check out the protocol doc, where I document the protocol in full.

But I'll summarize it here. :)

The client starts a session by initiating a key exchange with the server. Both sides generate a random, 256-bit private key, then derive a public key using Elliptic Curve Diffie Hellman (ECDH). The client sends the public key to the server, the server sends a public key to the client, and they both agree on a shared secret.

That shared secret is hashed with a number of different values to derive purpose-specific keys - the client encryption key, the server encryption key, the client signing key, the server signing key, etc.
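To make that concrete, here's a rough Python sketch of the exchange and derivation. It is not dnscat2's actual code: NIST P-256 (via the 'cryptography' package) stands in for the curve, and the purpose strings are made-up labels; the real ones are in the protocol doc.

# Hypothetical sketch of the ECDH exchange plus purpose-specific key derivation.
# NIST P-256 stands in for dnscat2's curve; the purpose labels are invented.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ec

client_priv = ec.generate_private_key(ec.SECP256R1())  # each side picks a random private key
server_priv = ec.generate_private_key(ec.SECP256R1())

# Public keys are swapped over DNS; both sides end up with the same shared secret.
shared = client_priv.exchange(ec.ECDH(), server_priv.public_key())
assert shared == server_priv.exchange(ec.ECDH(), client_priv.public_key())

def derive_key(shared_secret, purpose):
    # Hash the shared secret with a purpose label to get one key per direction/use.
    return hashlib.sha3_256(shared_secret + purpose).digest()

client_write_key = derive_key(shared, b"client_write_key")
server_write_key = derive_key(shared, b"server_write_key")
client_mac_key = derive_key(shared, b"client_mac_key")
server_mac_key = derive_key(shared, b"server_mac_key")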

Once the keys are agreed upon, all packets are encrypted and signed. The encryption is Salsa20 and uses one of the derived keys as well as an incremental nonce. After being encrypted, the encrypted data, the nonce, and the packet header are signed using SHA3, but truncated to 48 bits (6 bytes). 48 bits isn't very long for a signature, but space is at an extreme premium and for most attacks it would have to be broken in real time.
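As a rough illustration of the per-packet work (again a sketch, not the real packet format: the field layout below is simplified, and PyCryptodome's Salsa20 stands in for the client's own implementation):

# Hedged sketch of encrypt-then-sign: Salsa20 on the payload, SHA3 truncated to
# 48 bits as the signature. Field layout is simplified, not dnscat2's exact format.
import hashlib
from Crypto.Cipher import Salsa20

def protect_packet(header, nonce16, payload, write_key, mac_key):
    nonce = nonce16.to_bytes(2, "big")  # the incremental 16-bit nonce
    cipher = Salsa20.new(key=write_key, nonce=nonce.rjust(8, b"\x00"))
    ciphertext = cipher.encrypt(payload)
    # Sign header + nonce + ciphertext, then keep only 6 bytes (48 bits).
    tag = hashlib.sha3_256(mac_key + header + nonce + ciphertext).digest()[:6]
    # The header and nonce stay in the clear, as described above.
    return header + nonce + tag + ciphertext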

As an aside: I really wanted to encrypt the header instead of just signing it, but because of protocol limitations, that's simply not possible (because I have no way of knowing which packets belong to which session, the session_id has to be plaintext).

Immediately after the key exchange, the client optionally sends an authenticator over the encrypted session. The authenticator is based on a pre-shared secret (passed on the commandline) that the client and server pre-arrange in some way. That secret is hashed with both public keys and the secret (derived) key, as well as a different static string on the client and server. The client sends their authenticator to the server, and the server sends their authenticator to the client. In that way, both sides verify each other without revealing anything.
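A quick sketch of what such a mutual authenticator could look like (the direction strings here are placeholders, not the actual constants):

# Sketch of the mutual authenticator; "client"/"server" are placeholder direction strings.
import hashlib

def authenticator(preshared_secret, client_pub, server_pub, shared_key, direction):
    # Each side sends the value for its own direction and checks the peer's,
    # so the pre-shared secret itself never crosses the wire.
    return hashlib.sha3_256(preshared_secret + client_pub + server_pub + shared_key + direction).digest()

# The client sends authenticator(secret, cpub, spub, key, b"client"); the server checks it
# and answers with authenticator(secret, cpub, spub, key, b"server").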

If the client doesn't send the authenticator, then a short authentication string is generated. It's based on a very similar hash to the authenticator, except without the pre-shared secret. The first 6 bytes are converted into words using a list of 256 English words, and are displayed on the screen. It's up to the user to verify them.
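And the short authentication string is basically the same hash minus the pre-shared secret, mapped onto a 256-word list (the word list below is a placeholder for the real English one):

# Sketch of short-authentication-string generation: 6 bytes of a hash, one word per byte.
import hashlib
WORDLIST = ["word%03d" % i for i in range(256)]  # stand-in for the real 256-word English list

def short_auth_string(client_pub, server_pub, shared_key):
    digest = hashlib.sha3_256(client_pub + server_pub + shared_key).digest()
    return " ".join(WORDLIST[b] for b in digest[:6])  # 6 words = 48 bits to compare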

Because the nonce is only 16 bits, only 65536 roundtrips can be performed before running out. As such, the client may, at its own discretion (but before running out), initiate a new key exchange. It's identical to the original key exchange, except that it happens in a signed and encrypted packet. After the renegotiation is finished, both the client and server switch their nonce values back to 0 and stop accepting packets with the old keys.
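Conceptually, the nonce bookkeeping on each side is as simple as this (the renegotiation margin is arbitrary; the real client decides for itself when to rekey):

# Sketch of nonce bookkeeping: renegotiate before the 16-bit counter can wrap.
class SessionNonces:
    def __init__(self):
        self.nonce = 0
    def next_nonce(self):
        if self.nonce >= 0xFFFF - 256:  # arbitrary safety margin before 65536
            self.renegotiate()
        value = self.nonce
        self.nonce += 1
        return value
    def renegotiate(self):
        # In dnscat2 this re-runs the key exchange inside a signed, encrypted packet;
        # afterwards both sides reset to nonce 0 and stop accepting the old keys.
        self.nonce = 0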

And... that's about it! Keys are exchanged, an authenticator is sent or a short authentication string is displayed, all messages are signed and encrypted, and that's that!

Challenges

A few of the challenges I had to work through...

  • Because DNS has no concept of connections/sessions, I had to expose more information than I wanted in the packets (and because it's extremely length-limited, I had to truncate signatures)
  • I had originally planned to use Curve25519 for the key exchange, but there's no Ruby implementation
  • Finding a C implementation of ECC that doesn't require libcrypto or libssl was really hard
  • Finding a working SHA3 implementation in Ruby was impossible! I filed bugs against the three most popular implementations and one of them actually took the time to fix it!
  • Dealing with DNS's gratuitous retransmissions and accidental drops was super painful and required some hackier code than I like to see in crypto (for example, an old key can still be used, even after a key exchange, until the new one is used successfully; the more secure alternative can't handle a dropped response packet, otherwise both peers would have different keys)

Shouts out

I just wanted to do a quick shout out to a few friends who really made this happen by giving me advice, encouragement, or just listening to me complaining.

So, in alphabetical order so nobody can claim I play favourites, I want to give mad propz to:

  • Alex Weber, who notably convinced me to use a proper key exchange protocol instead of just a static key (and who also wrote the Salsa20 implementation I used)
  • Brandon Enright, who gave me a ton of handy crypto advice
  • Eric Gershman, who convinced me to work on encryption in the first place, and who listened to my constant complaining about how much I hate implementing crypto

FCC Reaches Settlement with Cable Operator over Customer Data Breach

On November 5, 2015, the Enforcement Bureau of the Federal Communications Commission (“FCC”) entered into a Consent Decree with cable operator Cox Communications to settle allegations that the company failed to properly protect customer information when the company’s electronic data systems were breached in August 2014 by a hacker. The FCC alleged that Cox failed to properly protect the confidentiality of its customers’ proprietary network information (“CPNI”) and personally identifiable information, and failed to promptly notify law enforcement authorities of security breaches involving CPNI in violation of the Communications Act of 1934 and FCC’s rules.

The data breach suffered by Cox in August 2014 occurred when a third party gained access to Cox’s systems by perpetrating a social engineering “phishing” attack on the company’s personnel. According to the Consent Decree, the relevant systems allegedly did not have technical safeguards (e.g., multi-factor authentication) to prevent the compromised credentials from being used to access customer information. As a result, the attackers allegedly acquired sensitive personal information of Cox customers, including their contact information, partial Social Security numbers, partial driver’s license numbers and telephone account-related data. The FCC indicated that the hacker later posted personal information of at least eight affected customers on social media sites, changed the passwords of at least 28 affected customers and further shared customer personal information.

In the Consent Decree, the FCC claimed that telecommunications carriers such as Cox are obligated under the Communications Act of 1934 to take “every reasonable precaution” to protect their customers’ data and must promptly disclose CPNI breaches via the FCC’s reporting portal within seven business days after reasonable determination of a breach. Based on these allegations, the FCC claimed Cox violated the Communications Act of 1934 and FCC rules by: (1) failing to properly protect the confidentiality of customers’ personally identifiable information; (2) failing to take reasonable measures to discover and protect against attempts to gain unauthorized access to CPNI; (3) failing to provide timely notification to law enforcement of a CPNI breach; and (4) engaging in unjust and unreasonable practices as a result of its failure to employ reasonable data security practices to protect proprietary information and CPNI, to monitor for customers’ breached data online and to notify all potentially affected customers of the breaches.

As part of the settlement, Cox agreed to pay a civil penalty of $595,000 and to develop and implement a compliance plan to help protect customer information against similar data breaches. The compliance plan requires Cox, for example, to improve its privacy and data security practices by: (1) designating a senior corporate manager who is a certified privacy professional; (2) conducting privacy risk assessments; (3) implementing a written information security program; (4) maintaining reasonable oversight of third party vendors; (5) implementing a more robust data breach response plan; and (6) filing regular compliance reports with the FCC. Pursuant to the Consent Decree, Cox also must identify all affected consumers, notify them of the breach and offer them free credit monitoring.

Administrative Law Judge Dismisses Complaint for Failure to Show Current or Future Substantial Consumer Injury

On November 13, 2015, Chief Administrative Law Judge D. Michael Chappell dismissed the FTC’s complaint against LabMD Inc. (“LabMD”) for failing to show that LabMD’s allegedly unreasonable data security practices caused, or were likely to cause, substantial consumer injury. The law judge did not address LabMD’s claim that the FTC does not have jurisdiction to enforce data security standards under the unfairness prong of Section 5 of the FTC Act, and LabMD has reserved its jurisdictional challenge for an anticipated appeal to the federal court. The action is In the Matter of LabMD Inc., Docket No. 9357.

The initial FTC complaint alleged that LabMD, a clinical testing laboratory, violated Section 5(a) of the FTC Act by failing to provide reasonable and appropriate security for personal information maintained on LabMD’s computer networks, thereby causing or likely causing substantial consumer injury. The complaint cited two specific security incidents which were allegedly caused by LabMD’s unreasonable data security. The first incident occurred when a third party informed LabMD that an insurance aging report, which contained personal information of approximately 9,300 LabMD clients (including names, dates of birth and Social Security numbers), was available on a peer-to-peer file-sharing network. The second incident occurred when it was reported that day sheets and copied checks, which contained personal information of approximately 600 LabMD clients (including names and Social Security numbers), were found in the possession of individuals who pleaded no contest to identity theft charges.

Regarding the first incident, the administrative law judge found no proof of identity theft-related or emotional harm, or likely future harm. Regarding the second incident, the administrative law judge found no proof that the exposure of the documents was causally connected to any failure of LabMD to reasonably protect data on its computer network, because there was insufficient evidence showing that the documents were maintained or taken from LabMD’s computer network.

The administrative law judge ultimately dismissed the entire complaint, finding that the “preponderance of the evidence…fails to show that [LabMD’s] alleged unreasonable data security caused, or is likely to cause, substantial consumer injury.” The law judge also stated, “At best, Complaint Counsel has proven the ‘possibility’ of harm, but not any ‘probability’ or likelihood of harm. Fundamental fairness dictates that demonstrating actual or likely substantial consumer injury…requires proof of more than the hypothetical or theoretical harm that has been submitted by the government in this case.”

French Data Protection Authority Imposes Fine for Inadequate Security Measures

On November 13, 2015, the French Data Protection Authority (“CNIL”) announced its decision in a case against Optical Center, imposing a fine of €50,000 on the company for violations related to the security and confidentiality of its customers’ personal data.

Optical Center distributes optical products via its store network and website, which contains 170,000 customer accounts in France. In July 2014, following a complaint, the CNIL audited the company’s data processing activities. On December 9, 2014, the CNIL served a formal notice on Optical Center, ordering it to cease its non-compliant activities within one month. Optical Center made representations indicating that it would partially comply. Subsequently, the CNIL conducted another inspection, and confirmed that Optical Center still was not complying with its data security obligations. As a result, the CNIL imposed a significant fine on Optical Center and decided to make its decision public.

In its decision, the CNIL noted that Optical Center did not secure (1) the homepage on which web users log into their online accounts or (2) the web page on which users change their passwords. The CNIL also stated that (1) customer and employee passwords were not robust enough; (2) Optical Center did not implement a password management policy for accessing employee computer workstations; (3) employee workstations were not automatically locked in the event of prolonged inactivity; and (4) access from the Internet to the company’s back office was not secure. The CNIL concluded that, as a data controller, Optical Center failed to implement appropriate data security measures. In addition, the CNIL determined that Optical Center did not implement a proper data processor agreement with a service provider. In particular, the agreement with the service provider did not (1) specify that the service provider must act only on instructions from Optical Center, and (2) impose specific data security obligations on the service provider.

Security Weekly #441 – Interview with Miron Livny & Barton Miller from SWAMP

Interview with Miron Livny and Barton Miller

This week, we interview Miron Livny and Barton Miller of SWAMP. SWAMP simultaneously alleviates the costs, maintenance and licensing burdens of tools, while also eliminating the need to learn numerous tool interfaces. You can read more about SWAMP here: https://continuousassurance.org/

 

IoT Security In Alarm Clocks

Security news this week features the unmasking of TOR users, an alarm clock that slaps you around and more. For a full list of stories, visit our wiki: http://wiki.securityweekly.com/wiki/index.php/Episode441#Stories_of_the_Week_-_7:00PM-8:00PM

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow us on Twitter: @securityweekly

Hunton’s Data Protection Practice Recognized by Chambers UK and Legal 500 UK

Hunton & Williams LLP announces the firm’s Global Privacy and Cybersecurity practice was again recognized by Chambers UK 2016 and The Legal 500 UK 2015 guides, earning Tier 1 rankings. Chambers UK noted that the practice has a “superbly strong bench of highly experienced counsel adept at advising on the most complex of information law concerns. Particularly accomplished in guiding clients through the privacy implications of new technologies, international data transfers, and BCR applications and implementations.” Additionally, the firm’s European data protection practice leaders are recognized in the “Star” and “Senior Statesman” categories by Chambers UK, the highest categories of rankings. Bridget Treacy, head of the firm’s UK Privacy and Cybersecurity practice, and senior attorney consultant Rosemary Jay, received the top honors of “Star” individuals for data protection. Richard Thomas, formerly the UK Information Commissioner and the firm’s global strategy advisor, was again recognized as a “Senior Statesman.”

The practice focuses on all aspects of privacy, data protection, information governance and e-commerce issues for multinational companies across a broad range of industry sectors. Over the last eight years, the practice has been recognized by Chambers Global, Chambers UK and Chambers USA as a Tier 1 firm. Computerworld magazine recognized Hunton & Williams as the best global privacy advisor in each of its four surveys.

Read the full press release.

Ninth Circuit Holds that the EEOC Has Broad Access to Personal Information, Including Social Security Numbers

As reported in the Hunton Employment & Labor Law Perspectives Blog:

On October 27, 2015, the Ninth Circuit held in EEOC v. McLane Co., Inc. that the EEOC has broad subpoena powers to obtain nationwide private personnel information, including Social Security numbers (“SSNs”), in connection with its investigation of a sex discrimination charge.

Damiana Ochoa, a former employee of a McLane subsidiary in Arizona, filed a charge with the EEOC alleging sex discrimination (based on pregnancy), claiming that when she tried to return to work after taking maternity leave, the company informed her that she could not return to work until she passed a physical capability strength test. Ochoa alleged that the company requires all new employees and all employees returning from medical leave to take the test and acknowledged that she failed this test three times. Based on her failure to pass the test, the company terminated Ochoa’s employment.

The EEOC broadened its investigation beyond Ochoa’s claims to all company facilities nationwide. The company provided certain information to the EEOC about the test and the individuals who had been required to take it, but refused to comply with an administrative subpoena that asked for “pedigree information” (such as each test taker’s name, SSN, last known address and phone number), and for the test takers who were ultimately terminated, the reasons for termination. In lieu of providing SSNs, the company provided an “employee ID number” created solely for purposes of responding to the EEOC’s investigation. In response, the EEOC filed a subpoena enforcement action. The district court sided with the company and did not require the company to turn over the pedigree information or the reason for termination information.

In a unanimous ruling, a three-judge panel of the Ninth Circuit reversed the district court’s ruling and held that the company had to provide all of the pedigree information requested by the EEOC. The Court reasoned that the EEOC has broad investigatory powers which are not constrained by strict relevancy requirements. Citing EEOC v. Shell Oil Co., 466 U.S. 54, 68-69 (1984), the Court found that the relevance standard that applies in this context “encompasses ‘virtually any material that might cast light on the allegations against the employer’” and, accordingly, the company should produce the pedigree information because it is relevant to the EEOC’s investigation, particularly since the EEOC should be able to contact and speak to other employees and applicants to learn more about their experiences.

Although the company argued that it is trying to protect its employee’s privacy interests, the Court found that SSNs are protected from public disclosure by 42 U.S.C. § 2000e-8(e). As for the reason for termination, the Court remanded the issue back to the district court to consider the company’s undue burden arguments.

The Ninth Circuit’s ruling demonstrates how much leeway certain courts are willing to provide the EEOC with respect to its broad investigatory powers, including the production of highly confidential personnel information such as SSNs. Here, the Court’s stated reasons for why the pedigree information is relevant – so that the EEOC could contact other test takers – does not support the production of SSNs. While Judge Smith’s concurrence highlighted data privacy concerns and the government’s attempts in protecting such information, his concurrence notes that “we, as a court, are not in a position in this case to weigh the concerns present in any particular data gathering and storage protocol.”

Security Weekly #440 – Interview with Michael Bazzell, Stories of the Week

Interview with Michael Bazzell

This week we interview Michael Bazzell author of "Open Source Intelligence Techniques", "Hiding from the Internet" and the technical advisor for TV hacker drama "Mr. Robot" on the USA network.

For a list of relevant links, visit our wiki: http://wiki.securityweekly.com/wiki/index.php/Episode440#Interview:_Michael_Bazzell

 

Security News - Canadian Encryption

This week, Paul and the crew discuss the million dollar bug bounty for iPhones and why it may be legal to hack your car. For a full list of stories talked about during the show, visit our wiki: http://wiki.securityweekly.com/wiki/index.php/Episode440#Stories_of_the_Week_-_7:00PM-8:00PM

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow us on Twitter: @securityweekly

Brazil Releases Revised Draft Privacy Bill

In late October, the Brazilian Ministry of Justice (the “Ministry”) issued its revised Draft Bill for the Protection of Personal Data (“Draft Bill”). The Ministry released its preliminary draft in January 2015, and the Centre for Information Policy Leadership at Hunton & Williams LLP (“CIPL”) filed public comments to the draft on May 5, 2015.

Key changes to the new Draft Bill include:

  • adding “legitimate interest” as a basis for processing non-sensitive personal information;
  • adding a risk-based approach by data controllers and processors in establishing “best practices standards”;
  • broadening the definition of “consent”;
  • adding consent as a basis for legitimizing cross-border transfers;
  • requiring the application of data processing principles to public data;
  • adding a chapter on personal data processing by public authorities;
  • clarifying the competence of the Competent Public Body (a privacy authority); and
  • creating a multistakeholder National Council for the Protection of Personal Data to assist the Competent Public Body.

A more detailed summary of the revised Draft Bill can be found in an article titled Main Innovations of the Newest Version of the Brazilian Draft Law on the Protection of Personal Data, written by Brazilian attorneys Renato Leite Monteiro, Cyber Law and International Law Professor at Mackenzie University School of Law, and Bruno Bioni, Researcher for The Public Policy for Access to Information Research Group at the University of São Paulo. The next steps for the Draft Bill include an evaluation by the Brazilian Office of the Presidential Chief of Staff, followed by an introduction to Congress.

In addition, there are two other privacy bills currently moving through the Brazilian Congress, one in the Chamber of Deputies and another in the Federal Senate. An updated version of the Senate bill (PLS 330) was released on October 13, 2015. The current rapporteur for this bill is Senator Aloysio Nunes Ferreira. An English translation is not yet available.

In order for the Draft Bill to move forward, it would have to be merged with, or supersede, these other two privacy bills.

Trans-Pacific Partnership Addresses Cross-Border Data Transfers and Protection of Personal Information

On November 5, 2015, the White House released the proposed text of the Trans-Pacific Partnership Agreement (the “TPP”) containing a chapter on cross-border data transfers in the context of electronic commerce. In the chapter on Electronic Commerce, Chapter 14, the TPP includes commitments from participating parties to adopt and maintain a legal framework to protect personal information, and encourages cross-border data transfers to help facilitate business and trade.

Article 14.8, entitled Personal Information Protection, would commit participating countries to “adopt or maintain a legal framework that provides for the protection of the personal information of the users of electronic commerce.” The TPP advises countries to do so by taking into account principles and guidelines of relevant international bodies and to “encourage the development of mechanisms to promote compatibility between [the countries’] different regimes.”

In addition, Article 14.11, entitled Cross-Border Transfer of Information by Electronic Means, would commit participating countries to “allow the cross-border transfer of information by electronic means, including personal information, when this activity is for the conduct of the business of a covered person.” For purposes of this section, “covered person” refers to any citizen or business of any participating country, but excludes financial institutions. The TPP, however, also recognizes that countries have different cross-border transfer regimes and laws, and therefore does not prevent participating countries from “adopting or maintaining measures inconsistent with [cross-border transfers] to achieve a legitimate public policy objective,” provided that the measure does not apply a restriction on trade, unjustly discriminate or impose restrictions larger than those required to complete the transfer.

The TPP has not yet been ratified by the 12 participating countries, including the U.S. The U.S. Congress will likely have at least 90 days to analyze and vote on the TPP before sending it to President Obama for final approval.

FCC to Tackle Issue of Broadband Privacy

On November 2, 2015, Federal Communications Commission (“FCC”) Chairman, Tom Wheeler, indicated in an interview that the agency would take on the issue of broadband privacy within the next several months, most likely in the form of a notice of proposed rulemaking. Chairman Wheeler said that the FCC’s inquiry would look at the privacy practices of “those who provide the networks” (i.e., Internet service providers (“ISPs”)) and how such businesses are protecting their customers’ information.

Wheeler put himself in consumers’ stead, asking, “Do I know what information is being collected?” He also asked if consumers have a voice in determining how their data is used. Wheeler indicated that scope and choice of collection are “two very important baseline rights that individuals ought to have.”

In May 2015, the FCC issued an advisory indicating that, until the agency implements new rules to guide ISPs on how to comply with Section 222 of the Communications Act of 1934, it would hold ISPs to a “reasonable, good faith steps” standard of compliance in connection with the customer privacy protections of communications law. The FCC did not offer additional details at the time, stating only that broadband providers should employ effective privacy protections in line with their privacy policies and core tenets of basic privacy protections.

Chairman Wheeler did not provide a more specific timeframe during the interview for issuance of the FCC’s privacy announcement.

NSA Ordered to Stop Bulk Telephony Metadata Program

On November 9, 2015, U.S. District Judge Richard J. Leon issued a preliminary injunction ordering the National Security Agency to stop its bulk telephony metadata program. The preliminary injunction was issued in favor of subscribers of Verizon Wireless Business Network and comes 20 days before the program was set to expire under the USA Freedom Act. The case is Klayman v. Obama et al. (1:13-cv-00851) in the U.S. District Court for the District of Columbia.

Hack Naked TV – November 9, 2015

Today Beau talks about vBulletin RCE, PageFair serving malware, and a million dollar bug bounty for iOS 9. For a full list of stories visit http://wiki.securityweekly.com/wiki/index.php/Hack_Naked_TV_November_9_2015#Beau.27s_Stories.

For a directory of all Hack Naked TV shows visit http://wiki.securityweekly.com/wiki/index.php/Hack_Naked_Show_Notes.

Security Weekly Web Site: http://securityweekly.com

Hack Naked Gear: http://shop.securityweekly.com

Follow Security Weekly on Twitter: @securityweekly

Follow Beau on Twitter: @dafthack

Hunton Authors Article in Pratt’s Privacy & Cybersecurity Law Report

Hunton & Williams LLP’s Aaron Simpson, partner in the firm’s Global Privacy and Cybersecurity practice, and Adam Solomon, associate, co-authored an article in Pratt’s Privacy & Cybersecurity Law Report entitled Dealmakers Ignore Cyber Risks At Their Own Peril.

The article addresses the cybersecurity and privacy risks associated with corporate transactions such as mergers and acquisitions and private equity deals. In the article, Simpson and Solomon speak in detail about how businesses can mitigate risks associated with data breaches and privacy issues by taking a proactive approach when performing due diligence.

Pratt’s Privacy & Cybersecurity Law Report is a subscription journal, written by industry leading privacy professionals and published nine times a year. The journal covers legal issues relating to privacy and cybersecurity law. Aaron Simpson currently serves on the Board of Editors.

EU Commission Publishes Communication on Transatlantic Data Transfers and Confirms Objective to Establish a New Safe Harbor Framework

On November 6, 2015, the European Commission published a communication and a Q&A document addressed to the European Parliament and European Council on the transfer of personal data from the EU to the U.S. under EU Data Protection Directive 95/46/EC (the “Directive”), following the decision by the Court of Justice of the European Union invalidating the European Commission’s Safe Harbor Decision.

In the communication, the European Commission stated that it has intensified its discussions with the U.S. Government and confirmed its objective of concluding discussions on an updated framework for transatlantic transfers of personal data “in 3 months.” According to the Commission, the updated framework must provide sufficient limitations and safeguards to ensure the continued protection of EU citizens’ personal data, including with respect to access by public authorities for law enforcement and national security purposes.

The European Commission also stated that alternative data transfer mechanisms can still be used by companies for lawful data transfers to countries outside of the EU, such as the United States. The European Commission provided guidance with respect to these alternative data transfer mechanisms. This guidance, however, does not restrict the powers and duties of data protection authorities (“DPAs”) to examine the lawfulness of cross-border transfers. The guidance states that transfers may be carried out using alternative data transfer mechanisms, including:

Contractual Solutions

Standard Contractual Clauses (“SCCs”), using the sets of model clauses provided by the European Commission. SCCs include obligations for the protection of personal data and allow data subjects to invoke the rights provided to them under the contractual clauses before a DPA or a court of the EU Member State in which the data exporter is established. Model clauses are, in principle, automatically accepted by national authorities, with the exception of certain Member States that maintain a system of notification or pre-authorization of the clauses. In addition, companies can rely on ad hoc contractual agreements, subject to approval by DPAs on a case-by-case basis.

Intra-Group Transfers

Binding Corporate Rules (“BCRs”) provide a basis for intra-group transfers of personal data and ensure sufficient protection of personal data throughout a group of affiliated entities. BCRs are subject to an authorization procedure in each Member State from which the group of related entities intends to transfer data. These rules are binding for the group of related entities, but also enforceable in the EU, where third party beneficiaries can lodge complaints before a DPA or a national court to enforce compliance with the rules.

Derogations

Derogations are expressly listed in Article 26(1)(a) to (f) of the Directive, under which data can be transferred when one of the following applies: (1) the data subject has given unambiguous consent; (2) the transfer is necessary for the conclusion or performance of a contract, including in pre-contractual situations; or (3) the transfer is necessary for the establishment, exercise or defense of legal claims.

The communication stated that these derogations are strictly interpreted, and that for repeated, mass or structural transfers, companies should use a specific legal framework such as SCCs or BCRs. In addition, according to the communication, the derogation based on the data subject’s free and informed consent is an option of last resort.

The European Commission also identified two important conditions for the use of alternative data transfer mechanisms. First, the Commission stated that data must be originally collected and further processed by the data controller in accordance with national laws implementing the Directive, regardless of the legal basis for cross-border transfer on which the controller relies. Second, the Commission reaffirmed that when using alternative data transfer mechanisms, data exporters and importers still bear the responsibility to ensure that transfers comply with the safeguards set out in the Directive, which should be considered in the context of the transfer.

Although the scope of the judgment is limited to the Safe Harbor Decision, the European Commission will now prepare a decision to replace the provision included in all the other adequacy decisions that limits the powers of DPAs. Further, the Commission will regularly assess, together with the DPAs, existing and future adequacy decisions.

The European Commission also invited controllers to cooperate with the DPAs and stated that it will work closely with the Article 29 Working Party to ensure the harmonized application of EU data protection law.

Read the press release of the European Commission.

CIPL Advisor Fred Cate Moderates Discussion Panel on Transatlantic Data Protection Issues

On Monday, November 2, 2015, Hunton & Williams LLP’s Centre for Information Policy Leadership (“CIPL”) Senior Policy Advisor, Fred H. Cate, moderated an academic panel on The Data Dilemma: A Transatlantic Discussion on Privacy, Security, Innovation, Trade, and the Protection of Personal Data in the 21st Century. The event was sponsored by Indiana University and took place at the CIEE Global Institute in Berlin, Germany.

The program featured Commissioner Julie Brill of the U.S. Federal Trade Commission and Dr. Alexander Dix, Berlin Commissioner for Data Protection and Freedom of Information, in an informal dialogue on a wide range of pressing transatlantic data protection issues arising from the Court of Justice of the European Union’s Safe Harbor Decision. Topics included the Schleswig-Holstein Commissioner’s view that EU-U.S. data transfers based on model clauses fail to comply with EU law, EU-U.S. efforts to develop common ground and appropriate oversight of government surveillance, and the outcome of the 37th International Privacy Conference in Amsterdam.

U.S. Chamber of Commerce Testifies about Safe Harbor at a Joint Hearing of the U.S. House of Representatives

On November 3, 2015, John Murphy, Senior Vice President for International Policy at the U.S. Chamber of Commerce, testified about the Court of Justice of the European Union’s (“CJEU’s”) EU-U.S. Safe Harbor Decision at a joint hearing of the House Commerce and Communications and Technology Subcommittees.

Murphy’s testimony emphasized the economic relationship between the U.S. and the EU and stressed that this relationship “relies on the seamless flow of data across borders.” He stated that the CJEU’s decision invalidating the Safe Harbor agreement threatens the transatlantic flow of data. Murphy noted that the CJEU’s decision focused on “process concerns” within the Safe Harbor agreement, such as restrictions placed on EU Member States’ enforcement authority, and did not address “the actual substantive commercial data protection rules.”

Accordingly, Murphy thanked the House for passing the Judicial Redress Act, which is intended to address one of the process concerns identified by the CJEU, and encouraged Congress to work with a group of European Parliamentarians who are in Washington, D.C. this week to resolve outstanding issues regarding Safe Harbor. He concluded by urging U.S. and EU officials to promptly implement a revised Safe Harbor framework to allow European and American companies to transfer data across the Atlantic.