So I went ahead and did a podcast with Stewart Baker, former general counsel for the NSA and actually somebody I have a decent amount of respect for (Google set me up with him during the SOPA debate, he understood everything I had to say, and he really applied some critical pressure publicly and behind the scenes to shut that mess down). Doesn’t mean I agree with the guy on everything. I told him in no uncertain terms we had some disagreements regarding backdoors, and if he asked me about them I’d say so. He was completely OK with this, and in today’s echo-chamber-loving society that’s a real outlier. The debate is a ways in, and starts around here.
You can get the audio (and a summary) here but as usual I’ve had the event transcribed. Enjoy!
Steptoe Cyberlaw Podcast-070
Stewart: Welcome to episode 70 of the Steptoe Cyberlaw Podcast brought to you by Steptoe & Johnson; thank you for joining us. We’re lawyers talking about technology, security, privacy in government and I’m joined today by our guest commentator, Dan Kaminsky, who is the Chief Scientist at WhiteOps, the man who found and fixed a major and very troubling flaw in the DNS system and my unlikely ally in the fight against SOPA because of its impact on DNS security. Welcome, Dan.
Dan: It’s good to be here.
Stewart: All right; and Michael Vatis, formerly with the FBI and the Justice Department, now a partner in Steptoe’s New York office. Michael, I’m glad to have you back, and I guess to be back with you on the podcast.
Michael: It’s good to have a voice that isn’t as hoarse as mine was last week.
Stewart: Yeah, that’s right, but you know, you can usually count on Michael to know the law – this is a valuable thing in a legal podcast – and Jason Weinstein, who took over last week in a coup in the Cyberlaw podcast, ran it, and interviewed our guest, Jason Brown from the Secret Service. Jason is formerly with the Justice Department where he oversaw criminal computer crime prosecutions, among other things, and is now doing criminal and civil litigation at Steptoe.
I’m Stewart Baker, formerly with NSA and DHS, the record holder for returning to Steptoe to practice law more times than any other lawyer, so let’s get started. For old time’s sake we ought to do one more, one last hopefully, this week in NSA. The USA Freedom Bill was passed, was debated, not amended after efforts; passed, signed and is in effect, and the government is busy cleaning up the mess from the 48/72 hours of expiration of the original 215 and other sunsetted provisions.
So USA Freedom; now that it’s taken effect I guess it’s worth asking what does it do. It gets rid of bulk collection across the board really. It says, “No, you will not go get stuff just because you might need it later; if you can’t get it from the guy who holds it, you’re not going to get it.” It does that for a pen trap, it does that for Section 215, the subpoena program, and it most famously gets rid of the bulk collection program that NSA was running and that Snowden leaked in his first and apparently only successful effort to influence US policy.
[Helping] who are supposed to be basically Al Qaeda’s lawyers – that’s editorializing; just a bit – they’re supposed to stand for freedom and against actually gathering intelligence on Al Qaeda, so it’s pretty close. And we’ve never given the Mafia its own lawyers in wiretap cases before the wiretap is carried out, but we’re going to do that for –
Dan: To be fair you were [just] wiretapping the Mafia at the time.
Stewart: Oh, absolutely. Well, the NSA never really had much interest in the Mafia but with Title 3, yeah; you went in and you said, “I want a Title 3 order” and you got it if you met the standard, in the view of the judge, and there were no additional lawyers appointed to argue against giving you access to the Mafia’s communications. And Michael, you looked at it as well – I’d say those were the two big changes – there are some transparency issues and other things – anything that strikes you as significant out of this?
Michael: I think the only other thing I would mention is the restrictions on NSLs where you now need to have specific selection terms for NSLs as well, not just for 215 orders.
Stewart: Yeah, really the House just went through and said, “Tell us what capabilities could be used to gather national security information and we will impose this specific selection term requirement on it.” That is really the main change, probably, for ordinary uses of 215 as though it were a criminal subpoena. Not that much change. I think the notion of relevance has probably always carried some notion that there is a point at which it gathered too much and the courts would have said, “That’s too much.”
Michael: I think the assumption going in was that, okay, telecoms already retain all this stuff for 18 months for billing purposes, and they’re required to by FCC regulation. But as we’ve discussed before, they’re not really required to retain all the stuff that NSA has been getting under the bulk collection program, and especially now that people have unlimited calling plans, telecoms don’t need to retain information about every number called because it doesn’t matter for billing purposes.
So I think, going forward, we’ll probably hear from NSA that they’re not getting all the information they need, so I don’t think this issue is going to go away forever now. I think we’ll be hearing complaints and seeing some desire by the Administration to impose some sort of data retention requirements on telecoms, and then there’ll be a real fight.
Stewart: That will be a fight. Yeah, I have said recently that, sure, this new approach can be as effective as the old approach if you think that going to the library is an adequate substitute for using Google. They won’t be able to do a lot of the searching that they could do and they won’t have as much data. But on the upside there are widespread rumors that the database never included many smaller carriers, never included mobile data probably because of difficulties separating out location data from the things that they wanted to look at.
So privacy concerns have already sort of half crippled the program, and it also seems to me you have to be a remarkably stupid terrorist to think that it’s a good idea to call home using a phone that operates in the United States. People will use Call of Duty or something to communicate.
All right, the New York Times has one of its dumber efforts to create a scandal where there is none. It was written by Charlie Savage and criticized on “Lawfare” by Ben Wittes, and Charlie, who probably values his reputation in national security circles somewhat, wrote a really slashing response to Ben Wittes, but I think, frankly, Ben has the better of the argument.
The story says, “Without public notice or debate the Obama Administration has expanded the NSA’s warrantless surveillance of Americans’ international internet traffic to search for evidence of malicious computer hacking,” according to some documents obtained from Snowden. It turns out, if I understand this right, that what NSA was looking for in that surveillance, which is a 702 surveillance, was malware signatures and other indicia that somebody was hacking Americans, so they collected or proposed to collect the incoming communications from the hackers, and then to see what was exfiltrated by the hackers.
In what universe would you describe that as Americans’ international internet traffic? I don’t think when somebody’s hacking me or stealing my stuff that that’s my traffic. That’s his traffic, and to lead off with that framing of the issue is clearly baiting somebody for an attempted scandal, and a complete misrepresentation of what was being done.
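To make the mechanics concrete – and this is purely an illustrative sketch, not a description of NSA or FBI tooling – matching “malware signatures and other indicia” against traffic boils down to looking for known byte patterns in payloads. The signatures and names below are invented for the example:

```python
# Hypothetical indicator list; real signature sets are far larger and are
# matched with multi-pattern algorithms (e.g. Aho-Corasick), not a Python loop.
SIGNATURES = {
    b"\x4d\x5a\x90\x00": "suspicious-pe-stub",       # hypothetical indicator
    b"cmd.exe /c powershell": "lolbin-launcher",     # hypothetical indicator
}

def scan_payload(payload: bytes) -> list[str]:
    """Return the names of every known signature found in the payload."""
    return [name for sig, name in SIGNATURES.items() if sig in payload]

hits = scan_payload(b"GET /a HTTP/1.1\r\n\r\ncmd.exe /c powershell -enc AAA")
print(hits)  # ['lolbin-launcher']
```

The point of collecting “incoming communications from the hackers” is simply to have payloads to run matching like this against.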
Dan: I think one of the issues is there’s a real feeling, “What are you going to do with that data?” Are you going to report it? Are you going to stop malware? Are you going to hunt someone down?
Stewart: All of it.
Dan: Where is the – really?
Dan: Because there’s a lot of doubt.
Stewart: Yeah; I actually think that the FBI regularly – this was a program really to support the FBI in its mission – and the FBI has a program that’s remarkably successful, in the sense that people are quite surprised when they show up, of going to folks who have been compromised to say, “By the way, you’re pwned,” and most of the time when they do that people say, “What? Huh?” This is where some of that information almost certainly comes from.
Dan: The reality is, everyone always says, “I can’t believe Sony got hacked,” and many of us actually in the field go, “Of course we can believe it.” Sony got hacked because everybody’s hacked somewhere.
Stewart: Yes, absolutely.
Dan: There’s a real need to do something about this on a larger scale. There is just such a lack of trust going on out there.
Stewart: Oh yeah.
Dan: And it’s not without reason.
Stewart: Yeah; Jason, any thoughts about the FBI’s role in this?
Jason: Yeah. I think that, as you said, the FBI does a very effective job, either pushing out information generally through alerts about new malware signatures or knocking on doors to tell particular victims they’ve been hacked. They don’t have to tell them how they know or what the source of the information is, but the information is still valuable.
I think this is one of those things under 702 where a reasonable person will look at it and be appreciative of the fact that the government was doing this, not critical. And as you said, the notion that stolen internet traffic from Americans is characterized as surveillance of Americans’ traffic is a little bit nonsensical.
Stewart: So without beating up Charlie Savage – I like him, he deserves it on this one – but he’s actually usually reasonably careful. The MasterCard settlement or the failed MasterCard settlement in the Target case, Jason, can you bring us up to date on that and tell us what lessons we should learn from it?
Jason: There have been so many high-profile breaches in the last 18 months that people may not remember Target, which of course was breached in the holiday season of 2013. MasterCard, as credit card companies often do, tried to negotiate a settlement on behalf of all of its issuing banks with Target, to pay damages for losses suffered as a result of the breach. In April MasterCard negotiated a proposed settlement with Target that would require Target to pay about $19 million to the various financial institutions that had to replace cards, cover losses and things of that nature.
But three of the largest banks, in fact I think the three largest MasterCard issuing banks, Citi Group, Capital One and JP Morgan Chase, all said no, and indicated they would not support the settlement and scuttled it because they thought $19 million was too small to cover the losses. There are trade groups for the banks and credit unions that say that between the Target and Home Depot breaches combined there were about $350 million in costs incurred by the financial institutions to reissue cards and cover losses, and so even if you factor out the Home Depot portion of that $19 million, it’s a pretty small number.
So Target has to go back to the drawing board, as does MasterCard, to figure out if there’s a settlement or if the litigation is going to continue. And there’s also a proposed class action ongoing in Minnesota involving some smaller banks and credit unions as well. It would only cost them $10 million to settle the consumer class action, but the bigger exposure here is with the financial institutions – Michael made reference last week to some press in which some commentator suggested the class actions from data breaches were on the wane – and we both are of the view that that’s just wrong.
There may be some decrease in privacy-related class actions related to misuse of private information by providers, but when it comes to data breaches involving retailers and credit card information, I think not only are the consumer class actions not going anywhere, but the class actions involving the financial institutions are definitely not going anywhere. Standing is not an issue at all. It’s pretty easy for these plaintiffs to demonstrate that they suffered some kind of injury; they’re the ones covering the losses and reissuing the cards, and depending on the size of the breach the damages can be quite extensive. I think it’s a sign of the times that in these big breaches you’ll find banks that are insisting on a much bigger pound of flesh from the victims.
Stewart: Yeah, I think you’re right about that. The settlements, as I saw when I did a quick study of settlements for consumers, are running between 50 cents and two bucks per exposure, which is not a lot, and the banks’ expenses for reissuing cards are more like 50 bucks per victim. But it’s also true that many of these cards are never going to be used; many of these numbers are never going to be used, and so spending 50 bucks for every one of them to reissue the cards, at considerable cost to the consumers as well, might be an overreaction, and I wouldn’t be surprised if that were an argument.
Dan: So my way of looking at this is from the perspective of deterrence. Is $19 million enough of a cost to Target to cause them to change their behavior and really invest? It’s going to be extraordinarily expensive to migrate our payment system to the reality, which is that we have online verification. We can use better technologies. They exist. There’s a dozen ways of doing it that don’t leave a password to your money all over the world. This is ridiculous.
Stewart: It is.
Dan: I’m just going to say the big banks have a point; $19 million is –
Stewart: Doesn’t seem like a lot.
Dan: to say, “We really need to invest in this; this never needs to happen again,” and I’m not saying 350 is the right number but I’ve got to agree, 19 is not.
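For readers wondering what “online verification” buys you, here is a minimal sketch of the kind of alternative Dan is gesturing at: instead of a static card number that works everywhere (effectively a password to your money), the card carries a secret key and emits a one-time cryptogram per transaction, which the issuer verifies online. This is loosely EMV-flavored but entirely illustrative; the function names, key, and tag format are invented for the example.

```python
import hashlib
import hmac

def make_cryptogram(card_key: bytes, counter: int, amount_cents: int) -> str:
    """Card side: derive a one-time tag bound to this counter and amount."""
    msg = f"{counter}:{amount_cents}".encode()
    return hmac.new(card_key, msg, hashlib.sha256).hexdigest()[:16]

def issuer_verify(card_key: bytes, counter: int, amount_cents: int, tag: str) -> bool:
    """Issuer side: recompute and compare in constant time. A real issuer
    also tracks counters so a captured tag can never be replayed."""
    return hmac.compare_digest(make_cryptogram(card_key, counter, amount_cents), tag)

key = b"per-card-secret"  # provisioned once, never leaves the chip
tag = make_cryptogram(key, counter=7, amount_cents=1999)
print(issuer_verify(key, 7, 1999, tag))   # True
print(issuer_verify(key, 8, 1999, tag))   # False: a stolen tag is useless next time
```

The design point is that a breach of a merchant database yields only stale cryptograms, not reusable credentials, which is why a one-time settlement number matters less than migrating the system itself.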
Stewart: All right then. Okay, speaking of everybody being hacked, everybody includes the Office of Personnel Management.
Stewart: I got a look at my first background investigation, and it was quite amusing because the government, in order to protect privacy, blacked out the names of all the investigators, whom I wouldn’t have known from Adam, but left in all my friends’ names as they’re talking about my drug use, or not.
Stewart: Exactly; no, they were all stand up guys for me, but there is a lot of stuff in there that could be used for improper purposes and it’s perfectly clear that if the Chinese stole this, stole the Anthem records, the health records, they are living the civil libertarian’s nightmare about what NSA is doing. They’re actually building a database about every American in the country.
Dan: Yeah, a little awkward, isn’t it?
Stewart: Well, annoying at least; yes. Jason, I don’t know if you’ve got any thoughts about how OPM responds to this? They apparently didn’t exactly cover themselves with glory in responding to an IG report from last year saying, “Your systems suck so bad you ought to turn them off.”
Jason: Well, first of all as your lawyer I should say that your alleged drug use was outside the limitations period of any federal or state government that I’m aware of, so no one should come after you. I thought it was interesting that they were offering credit monitoring, given that the hack has been attributed to China, which I don’t think is having any money issues and is going to steal my credit card information.
I’m pretty sure that the victims include the three of us, so I’m looking forward to getting that free 18 months of credit monitoring. I guess they’ve held out the possibility that the theft was for profit as opposed to for espionage purposes, and the possibility that the Chinese actors are not state-sponsored actors, but that seems kind of nonsensical to me. And I think that, as you both said, the Chinese are building the very database on us that Americans feared the United States was building.
Stewart: Yeah, and I agree with you that credit monitoring is a sort of lame and bureaucratic response to this. Instead, they really ought to have the FBI and the counterintelligence experts ask, “What would I do with this data if I were the Chinese?” and then ask people whose data has been exploited to look for that kind of behavior. Knowing how the Chinese do their recruiting I’m guessing they’re looking for people who have family still in China – grandmothers, mothers and the like – and who also work for the US government, and they will recruit them on the basis of ethnic and patriotic duty. And so folks who are in that situation could have their relatives visited for a little chat; there’s a lot of stuff that is unique to Chinese use of this data that we ought to be watching for a little more aggressively than credit theft.
Stewart: Yeah; well, that’s all we’ve got when it’s hackers. We should think of a new response to this.
Dan: We should, but like all hacks [attribution] is a pain in the butt because here’s the secret – hacking is not hard; teenagers can do it.
Stewart: Yes, that’s true.
Dan: [Something like this can take just] a few months.
Stewart: But why would they invest?
Dan: Why not? Data has value; they’ll sell it.
Stewart: Maybe; so that’s right. On the other hand the Anthem data never showed up in the markets. We have better intelligence than we used to. We’ll know if this stuff gets sold and it hasn’t been sold because – I don’t want to give the Chinese ideas but –
Dan: I don’t think they need you to give them ideas; sorry.
Stewart: One more story just to show that I was well ahead of the Chinese on this – my first security clearance they asked me for people with whom I had obligations of affection or loyalty, who were foreigners. And I said I’m an international lawyer – this was before you could just print out your Outlook contacts – I Xeroxed all those sheets of business cards that I’d collected, and I sent it to the guys and said, “These are all the clients or people I’ve pitched,” and he said, “There are like 1,000 names here.” I said, “Yeah, these are people that I either work for or want to work for.” And he said, “But I just want people to whom you have ties of obligation or loyalty or affection.” I said, “Well, they’re all clients and I like them and I have obligations to clients or I want them to be. I’ve pitched them.” And he finally stopped me and said, “No, no, I mean are you sleeping with any of them?” So good luck China, figuring out which of them, if any, I was actually sleeping with.
Dan: You see, you gave up all those names to China.
Stewart: They’re all given up.
Dan: Look what you did!
Stewart: Exactly; exactly. Okay, last topic – Putin’s trolls – I thought this was fascinating. This is where the New York Times really distinguished itself with this article, because it told us something we didn’t know and it shed light on something kind of astonishing. This is the Internet Research Agency, I think – their army of trolls, and the Chinese have an even larger army of trolls – and essentially Putin’s FSB has figured out that if you don’t want to have a Facebook revolution or a Twitter revolution you need to have people on Twitter, on Facebook 24 hours a day, posting comments and turning what would otherwise be evidence of dissent into a toxic waste dump, with people trashing each other, going off in weird directions, saying stupid things to the point where no one wants to read the comments anymore.
It’s now a policy. They’ve got a whole bunch of people doing it, and on top of it they’ve decided, “Hell, if the US is going to export Twitter and Twitter revolutions then we’ll export trolling,” and to the point where they’ve started making up chemical spills and tweeting them with realistic video and people weighing in to say, “Oh yeah, I can see it from my house, look at those flames.” All completely made up and doing it as though it were happening in Louisiana.
Dan: The reality is that for a long time the culture was managed. We had broadcasts, broadcasters had direct government links, everything was filtered, and the big experiment of the internet was: what if we just remove those filters? What if we just let the people manage it themselves? And astroturfing did not start with Russia; there’s been astroturfing for years – people making fake events and controlling the message. What is changing is the scale of it. What is changing is who is doing it. What is changing is the organization and the amount of investment. You have people who are professionally operating to reduce the credibility of Twitter, of Facebook, so that, quote/unquote, the only thing you can trust is the broadcast.
Stewart: I think that’s exactly right. I think they call the Chinese version of this the 50 Cent Army because they get 50 cents a post. But I guess I am surprised that the Russians would do that to us in what is plainly an effort to test to see whether they could totally disrupt our emergency response, and it didn’t do much in Louisiana but it wouldn’t be hard in a more serious crisis, for them to create panic, doubt and certainly uncertainty about the reliability of a whole bunch of media in the United States.
This was clearly a dry run and our response to it was pretty much that. I would have thought that the US government would say, “No, you don’t create fake emergencies inside the United States by pretending to be US news media.”
Jason: I was going to say, do you think all those alien sightings in Roswell over the last 50 years were Russia or China?
Stewart: Well, they were pre-Twitter; I’m guessing not, but from now on I think we can assume they are.
Dan: What it all comes back to is the crisis of legitimacy. People do not trust the institutions that are around them. If you look, there’s too much manipulation, too much spin, too many lies. And as it happens, institutions are not all bad – like, you know what? Vaccines are awesome. But because we have this lack of legitimacy, people are looking for the thing they’re supposed to be paying attention to instead, because the normal stuff keeps coming out as a lie. And really, what Russia’s doing here is just saying, “We’re going to find the things that you go to instead, the things you think aren’t lying, and we’re going to lie there too, because what we really want is for America to stop airing our dirty laundry through this Twitter thing, and if America is not going to regulate Twitter we’re just going to go ahead and make a mess of it too.”
Stewart: Yeah. I think their view is, “Well, Twitter undermines our legitimacy; we can use it to undermine yours.”
Dan: Yeah, Russians screwing with Americans; more likely than you think.
Michael: I’m surprised you guys see it as an effort to undermine Twitter; this strikes me as classic KGB disinformation tactics, and it seems to me they’re using a new medium and, as you said before, they’re doing dry runs so that when they actually have a need to engage in information operations against the US or against Ukraine or against some other country, they’ll know how to do it. They’ll have practiced corps of trolls who know how to do this stuff in today’s media. I don’t think they’re trying to undermine Twitter.
Stewart: One of the things that’s interesting is that the authoritarians have figured out how to manage their people using electronic tools. They were scared to death by all of this stuff ten years ago, and they’ve responded very creatively, very effectively, to the point where I think they can maintain an authoritarian regime for a long time – without totalitarianism, but still very effectively. And now they’re in the process of saying, “Well, how can we use these tools as a weapon, the way they perceive the US to have used them as a weapon in the first ten years of social media?” We need a response, because they’re not going to stop doing it until we have a response.
Michael: I’d start with the violation of the missile treaty before worrying about this so much.
Stewart: Okay, so maybe this is of a piece with the Administration’s strategy for negotiating with Russia, which is to hope that the Russians will come around. The Supreme Court had a ruling in the case we talked about a while ago; this is the guy who wrote really vile and threatening and scary things about his ex-wife and the FBI agent who came to interview him, and who said afterwards, after he’d posted on Facebook and was arrested for it, “Well, come on, I was just doing what everybody in hip hop does; you shouldn’t take it seriously. I didn’t.” The Supreme Court was asked to decide whether the test for a threat is the understanding of the writer or the understanding of the reader. At least that’s how I read it, and they sided with the writer, with the guy who wrote all those vile things. Michael, did you look more closely at that than I did?
Michael: The court read into it a requirement that the government has to show at least that the defendant sent the communication with the purpose of issuing a threat or with the knowledge that it would be viewed as a threat, and it wasn’t enough for the government to argue and a jury to find that a reasonable person would perceive it as a threat.
So you have to show at least knowledge or purpose or intent, and it left open the question whether recklessness as to how it would be perceived, was enough.
Stewart: All right; well, I’m not sure I’m completely persuaded, but it probably also doesn’t have enough to do with cyberlaw in the end to pursue. Let’s close up with one last topic, which is the FBI asking for, or talking about, expanding CALEA to cover social media – to cover communications that go out through direct messaging and the like – saying it’s not that we haven’t gotten cooperation from social media when we wanted a wiretap; it’s just that in many cases they haven’t been able to do it quickly enough, and we need to set some rules in advance for their ability to do wiretaps.
This is different from the claim that they’re Going Dark and that they need access to encrypted communications; it really is an effort to actually change CALEA – the Communications Assistance for Law Enforcement Act of 1994, which imposed that obligation on cellphone companies and then later on voice-over-IP providers. Jason, what are the prospects for this? How serious a push is this?
Jason: Well, the prospects are – it’s DOA – but just to put it in a little bit of historical perspective: Going Dark has of late been the name for the FBI’s effort to deal with encryption, but the original use of that term, at least in 2008/2009 when the FBI started a legislative push to amend CALEA and extend it to internet-based communications, was for that effort. They would routinely cite the fact that there was a very significant number of wiretaps, in both criminal and national security cases, that providers not covered by CALEA didn’t have the technical capability to implement.
So it wasn’t about law enforcement having the authority to conduct a wiretap; they by definition had already developed enough evidence to satisfy a court that they could meet the legal standard. It was about the provider’s ability to help them execute the authority they already had. As you suggested, either the wiretap couldn’t be done at all, or the provider and the government would have to work together to develop a technical solution, which could take months and months, by which time the target wasn’t using that method of communication anymore and had moved on to something else.
So for the better part of four years, my last four years at the department, the FBI was pushing, along with DEA and some other agencies, for a massive CALEA reform effort to expand it to internet-based communications. At that time – this is pre-Snowden; it’s certainly truer now – but at that time it was viewed as a political non-starter to try to convince providers that CALEA should be expanded.
So they downshifted to a Plan B: trying to amend Title 18 – and I think there were some parallel amendments to Title 50 – but the Title 18 amendments would have dramatically increased the penalties for providers who didn’t have the capability to implement a valid wiretap order that law enforcement served.
There would be this graduated series of penalties that would essentially create a significant financial disincentive for a provider not to have an intercept capability in place in advance, or at least to be able to develop one quite quickly. So the FBI, although it wanted CALEA to be expanded, was willing to settle for this sort of indirect way to achieve the same thing: to incentivize providers to develop intercept solutions.
That was an unlikely Bill to make it to the Hill, and through the Hill, before Snowden; after Snowden I think it became political plutonium. It was very hard even before Snowden to explain to people that this was not an effort to expand authorities; it was about executing those authorities. That argument became almost impossible to make in the post-Snowden world.
What struck me about this story though is that they appear to be going back to Plan A, which is trying to go in the front door and expand CALEA, and the only thing I can interpret is either that the people running this effort now are unaware of the previous history that they went through, or they’ve just decided what the hell; they have nothing to lose. They’re unlikely to get it through anyway so they might as well ask for what they want.
Stewart: That’s my impression. There isn’t any likelihood in the next two years that encryption is going to get regulated, but the Justice Department and the FBI are raising this issue I think partly on what the hell, this is what we want, this is what we need, we might as well say so, and partly I think preparation of the battle space for the time when they actually have a really serious crime that everybody wishes had been solved and can’t be solved because of some of these technical gaps.
Dan: You know what drives me nuts is we’re getting hacked left and right, we’re leaking data left and right, and all these guys can talk about is how they want to leak more data. Like what we’re hearing here is about encryption: we’re not saying we’re banning encryption, but if there’s encryption and we can’t get through it, we’re going to have a graduated series of costs, or we’re going to pull CALEA into this. There are entire classes of software we need to protect American business that are very difficult to invest in right now. It’s very difficult to know, in the long term, that you’re going to get to run it.
Stewart: Well, actually my impression is that VCs are falling all over themselves to fund people who say, “Yeah, we’re going to stick it to the NSA.”
Dan: Yeah, but those of us who actually know what we’re doing know that whatever would actually work is under threat. There are lots of scammers out there; oh my goodness, there is some great, amazing, 1990s-era snake oil going on, but the smart money is not too sure we’re going to get away with securing anything.
Stewart: I think that’s probably right; why don’t we just move right in, because I had promised I was going to carry this question over from the news roundup – Julian Sanchez raised it; I raised it with Julian at a previous podcast. We were talking about the effort to get access to encrypted communications, and I mocked the people who said, “Oh, you can never provide access; that’s always a bad idea.” And I said, “No, come on.” Yes, it creates a security risk and you have to manage it, but sometimes the security risk and the cost of managing it are worth it because of the social values.
Dan: Sometimes you lose 30 years of background check data.
Stewart: Yeah, although I’m not sure they would have. I’m not sure how encryption, especially encryption of data in motion, would have changed that.
Dan: It’s a question of can you protect the big magic key that gives you access to everything on the Internet, and the answer is no.
Stewart: So let me point to the topic that Julian didn’t want to get into because it seemed to be more technical than he was comfortable with which is –
Dan: Bring it on.
Stewart: Exactly. I said, “Are you kidding me? End to end encryption?” The only end to end encryption that has been adopted universally on the internet since encryption became widely exportable is SSL/TLS. That’s everywhere; it’s default.
Okay, but SSL/TLS is broken every single day by the thousands, if not the millions, and it’s broken by respectable companies. In fact, probably every Fortune 500 company insists that SSL has to be broken at their firewall.
And they do it; they do it so that they can inspect the traffic to see whether some hacker is exfiltrating the –
Dan: Yeah, but they’re inspecting their own traffic. Organizations can go ahead and balance their benefits and balance their risks. When it’s an external actor it’s someone else’s risk. It’s all about externality.
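[EDIT: To make the mechanism Stewart is describing concrete, here is a toy model of certificate-chain trust showing why adding a corporate root CA to managed desktops lets a firewall intercept TLS. Every name in it is hypothetical, and real chain validation involves far more (signatures, expiry, name checks); this is only a sketch of the trust-anchor idea.]

```python
def chain_is_trusted(cert, issued_by, trust_anchors):
    """Walk issuer links until we reach a trusted root, or give up."""
    seen = set()
    while cert not in trust_anchors:
        if cert in seen or cert not in issued_by:
            return False  # no path to any trust anchor
        seen.add(cert)
        cert = issued_by[cert]
    return True

# Hypothetical chains: the real site cert chains to a public CA, while the
# corporate firewall re-signs traffic on the fly with its own CA.
issued_by = {
    "bank.example": "PublicCA",
    "bank.example (proxy-forged)": "CorpProxyCA",
}

public_roots = {"PublicCA"}
corp_roots = {"PublicCA", "CorpProxyCA"}  # managed desktops add the proxy CA

# An unmanaged laptop rejects the forged certificate...
assert not chain_is_trusted("bank.example (proxy-forged)", issued_by, public_roots)
# ...but a corporate desktop accepts it, so the middlebox can inspect traffic.
assert chain_is_trusted("bank.example (proxy-forged)", issued_by, corp_roots)
```

This is exactly why Dan’s point about externalities matters: the interception only validates on machines whose trust store the organization itself controls.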
Stewart: Well, yes, okay; I grant you that. The point is the idea that building in access is always a stupid idea, never worth it. It’s just wrong, or at least it’s inconsistent with the security practices that we have today. And probably, if anything, some of the things that companies like Google and Facebook are doing to promote SSL are going to result in more exfiltration of data. People are already exfiltrating data through Google properties because Google insists that they be whitelisted from these intercepts.
Dan: What’s increasingly happening is that corporations are moving the intercept and DLP and analytics role to the endpoint because operating it as a midpoint just gets slower and more fragile day after day, month after month, year after year. If you want security, look, it’s your property, you’re a large company, you own 30,000 desktops, they’re your desktops, and you can put stuff on them.
Stewart: But the problem that the companies have, which is weighing the importance of end to end encryption for security versus the importance of being able to monitor activity for security, they have come down and said, “We have to be able to monitor it; we can’t just assume that every one of our users is operating safely.” That’s a judgment that society can make just as easily. Once you’ve had the debate society can say, “You know, on the whole, ensuring the privacy of everybody in our country versus the risks of criminals misusing that data, we’re prepared to say we can take some risk on the security side to have less effective end to end encryption in order to make sure that people cannot get away with breaking the law with impunity.”
Dan: Here’s a thing though – society has straight out said, “We don’t want bulk surveillance.” If you want to go ahead and monitor individuals, you have a reason to monitor, that’s one thing but –
Stewart: But you can’t monitor all of them. If they’ve been given end to end [encryption] – I agree with you – there’s a debate; I’m happy to continue debating it but I’ve lost so far. But you say, no, it’s this guy; this guy, we want to listen to his communications, we want to see what he is saying on that encrypted tunnel, you can’t break that just stepping into the middle of it unless you already own his machine.
Dan: Yeah, and it’s unfortunately the expensive road.
Stewart: Because they don’t do any good.
Dan: It isn’t there. It isn’t the actual thing.
Stewart: It isn’t here – I’m over at Stanford and we’re at the epicenter of contempt for government, but everybody gets a vote. You get a vote if you live in Akron, Ohio too, but nobody in Akron gets a vote about where their end to end encryption is going to be deployed.
Dan: You know, look, average people, normal people have like eight secure messengers on their phone. Text messaging has fallen off a cliff; why? At the end of the day it’s because people want to be able to talk to each other and not have everyone spying on them. There’s a cost, there’s an actual cost to spying on the wrong people.
Stewart: There is?
Dan: If you go ahead and you make everyone your enemy you find yourself with very few friends. That’s how the world actually works.
Stewart: All right; I think we’ve at least agreed that there’s routine breakage of the one end to end encryption methodology that has been widely deployed. I agree with you, people are moving away from man-in-the-middle and are looking for ways to break into systems at the endpoint or close to the endpoint. Okay; let’s talk a little bit, if we can, about DNSSEC because we had a great fight over SOPA and DNSSEC, and I guess the question for me is what – well, maybe you can give us two seconds or two minutes on what DNSSEC is and how it’s doing in terms of deployment.
Dan: DNSSEC, at the end of the day makes it as easy to get encryption keys as it is to get the address for a server. Crypto should not be drama. You’re a developer, you need to figure out how to encrypt something, hit the encrypt button, you move on with your life. You write your app. That’s how it needs to work.
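[EDIT: Dan’s point is that DNSSEC/DANE-style keying makes fetching a key the same one-line operation as fetching an address. Here is a deliberately tiny sketch of that idea; the zone contents are invented, and a real client would use a validating resolver and real TLSA records (RFC 6698) rather than a dictionary.]

```python
# A hypothetical zone: addressing and keying live in the same database,
# answered by the same lookup mechanism.
ZONE = {
    ("www.example.com", "A"): "192.0.2.10",              # addressing
    ("_443._tcp.www.example.com", "TLSA"): "3 1 1 ab12",  # keying (fake record)
}

def lookup(name, rrtype):
    """One lookup primitive serves both addresses and keys."""
    return ZONE.get((name, rrtype))

addr = lookup("www.example.com", "A")
key = lookup("_443._tcp.www.example.com", "TLSA")
assert addr and key  # same mechanism, no extra ceremony for the crypto
```

That is the “hit the encrypt button and move on” property: the developer asks one question of one system, whether the answer is an address or a key.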
DNS has been a fantastic success at providing addressing to the internet. It would be nice if keying was just as easy, but let me tell you, how do you go ahead and go out and talk to all these internet people about how great DNSSEC is when really it’s very clear DNS itself – it’s not like SOPA fights, it’s not going to come back –
Stewart: Yeah; well, maybe.
Dan: – and it’s not like the security establishment, which should be trying to make America safer, it’s like, “Man, we really want to make sure we get our keys in there.” When that happens [it doesn’t work]. It’s not that DNSSEC isn’t a great technology, but it really depends on politically [the DNS and its contents] being sacrosanct.
Stewart: Obviously, DHS, the OMB committed to getting DNSSEC deployed at the federal level, and so their enthusiasm for DNSSEC has been substantial. Are you saying that they have undermined that in some way that –
Dan: The federal government is not monolithic; two million employees, maybe more, and what I’m telling you is that the security establishment keeps on saying, “Hey, we’ve got to be able to get our keys in there too,” and so we’ve got this dual mission problem going on here. Any system with a dual mission – no one actually believes there’s a dual mission, okay.
If the Department of Transportation was like, “Maybe cars should crash from time to time,” or if Health and Human Services was like, “Hey, you know, polio is kind of cool for killing some bad guys,” no one would take those vaccines, because maybe it’s the other mission. That’s kind of the situation that we have right here. Yeah, DNSSEC is a fantastic technology for key distribution, but we have no idea five years from now what you’re going to do with it, and so instead it’s being replaced with garbage. [EDIT: This is rude, and inappropriate verbiage.]
I’m sorry, I know people are doing some very good work, but let me tell you, their value add is it’s a bunch of centralized systems that all say, “But we’re going to stand up to the government.” I mean, that’s the value add and it never scales, it never works but we keep trying because we’ve got to do something because it’s a disaster out there. And honestly, anything is better than what we’ve got, but what we should be doing is DNSSEC and as long as you keep making this noise we can’t do it.
Stewart: So DNSSEC is up to what? Ten percent deployment?
Dan: DNSSEC needs a round of investment that makes it a turnkey switch. DNSSEC could be done [automatically] but every server just doesn’t. We [could] just transition the internet to it. You could do that. The technology is there but the politics are completely broken.
Stewart: Okay; last set of questions. You’re the Chief Scientist at WhiteOps and let me tell you what I think WhiteOps does and then you can tell me what it really does. I think of WhiteOps as having made the observation that the hackers who are getting into our systems are doing it from a distance. They’re sending bots in to pack up and exfiltrate data. They’re logging on, and bots look different from human beings when they type stuff, and the people who are trying to manage an intrusion remotely also look different from somebody who is actually on the network, and what WhiteOps is doing is saying, “We can find those guys and stop them.”
Dan: And it’s exactly what we’re doing. Look, I don’t care how clever your buffer overflow is; you’re not teleporting in front of a keyboard, okay. That’s not going to happen. So our observation is that we have this very strong signal, it’s not perfect because sometimes people VPN in, sometimes people make scripted processes.
Stewart: But they can’t keep a VPN up for very long?
Stewart: So this sounds so sensible and so obvious that I guess my question is how come we took this long to have that observation become a company?
Dan: I don’t know, but we built it. The reality is that it requires knowledge of a lot of really interesting browser internals. At WhiteOps we’ve been breaking browsers for years, so we’re basically taking all these bugs that never actually let you attack the user but that have completely different responses inside of a bot environment. That’s kind of the secret sauce.
Every browser is really a core object that reads HTML5, JavaScript, video – all the things you’ve got to do to be a web browser. Then there’s like this goop, right? It puts it on the screen, it has a back button, it has an address bar, and it lets you configure stuff. So it turns out that the bots use the core, not the goop.
Stewart: Oh yeah, because the core enables them to write one script for everything?
Dan: Yeah, so you have to think of bots as really terribly tested browsers and once you realize that it’s like, “Oh, this is barely tested, let’s make it break.”
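[EDIT: The “barely tested browser” idea can be sketched as a probe-and-compare check: feed the client quirky inputs a full, well-tested browser handles one way and a bot’s bare rendering core handles another, then count divergences. The probes, responses, and threshold below are all invented for illustration; real detection signals are far subtler.]

```python
# Hypothetical profile of how a real, fully tested browser answers each probe.
FULL_BROWSER_PROFILE = {
    "quirky_html_parse": "recovers",   # tolerant error recovery
    "odd_js_exception": "message-A",   # exact engine error text
    "paint_timing": "nonzero",         # something actually hit the screen
}

def looks_like_bot(probe_results):
    """Count probes whose answers diverge from a real browser's profile."""
    mismatches = sum(
        1 for probe, expected in FULL_BROWSER_PROFILE.items()
        if probe_results.get(probe) != expected
    )
    return mismatches >= 2  # untested engines tend to fail several probes at once

human = {"quirky_html_parse": "recovers", "odd_js_exception": "message-A",
         "paint_timing": "nonzero"}
bot = {"quirky_html_parse": "crashes", "odd_js_exception": "message-B",
       "paint_timing": "nonzero"}

assert not looks_like_bot(human)
assert looks_like_bot(bot)
```

The design point is Dan’s: you don’t need one perfect tell, you need many cheap quirks that a “terribly tested browser” gets wrong simultaneously.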
Stewart: Huh! I know you’ve been doing work with companies looking for intrusions. You’ve also been working with advertisers, now trying to find people who are basically engaged in click fraud. Any stories you can tell about catching people on well guarded networks?
Dan: I think one story I really enjoy – we actually ran the largest study of its nature into ad fraud that had ever been done. We found that there’s going to be about $6 billion of ad fraud [the study is at http://whiteops.com/botfraud], and we had this one case – so we tell the world we’re going to go ahead and run this test in August and find all the fraud. You know what? We lied. We do that sometimes.
We actually ran a test from a little bit in July all the way through September, and we watched this one campaign: 40 percent fraud; then when we said we were going to start, three percent fraud. Then when we said we were going to stop, back to 40. You just had this square wave. It was the most beautiful demo. We showed this to the customers – one of the biggest brands in the country – and they were just like, “Those guys did what?”
And here’s what’s great – for my entire career I’ve been dealing with how people break in. This bug, that bug, what’s wrong with Flash, what’s wrong with Java? This is the first time in my life I have ever been dealing with why. People are doing this fraud to make money. Let’s stop the checks from being written. It’s been incredibly entertaining.
Stewart: Oh, that it is; that’s very cool, and it is – I guess maybe this is the observation. We wasted so much time trying to keep people out of systems hopelessly; now everybody says, “Oh, you have to assume they’re in,” but that doesn’t mean you have the tools to really deal with them, and this is a tool to deal with people when they’re in.
Dan: There’s been a major shift from prevention to detection. We basically say, “Look, okay, they’re going to get in but they don’t necessarily know what perfectly to do once they’re in.” Their actions are fundamentally different than your legitimate users and they’re always going to be because they’re trying to do different things; so if you can detect properties of the different things that they’re doing you actually have signals, and it always comes down to signals in intelligence.
Stewart: Yeah; that’s right. I’m looking forward to NSA deploying WhiteOps technology, but I won’t ask you to respond to that one. Okay, Dan, this was terrific, I have to say. I’d rather be on your side of an argument than against you, but it’s been a real pleasure arguing this out. Thanks for coming in, Michael, Jason; I appreciate it.
Just to close up, the CyberLaw Podcast is open to feedback. Send comments to firstname.lastname@example.org; leave a message at 202 862 5785. I’m still waiting for an entertainingly abusive voicemail; we haven’t gotten one yet. This has been episode 70 of the Steptoe CyberLaw Podcast brought to you by Steptoe & Johnson. Next week we’re going to be joined by Catherine Lotrionte, who is the Associate Director of the Institute for Law, Science and Global Security at Georgetown. And coming soon we’re going to have Jim Baker, the General Counsel of the FBI; Rob Knake, a Senior Fellow for Cyber Policy at the Council on Foreign Relations. We hope you’ll join us next week as we once again provide insights into the latest events in technology, security, privacy in government.