Monthly Archives: July 2013

OWASP Top 10 with Dave Wichers, Drunken Security News – Episode 339 – July 18, 2013

The OWASP Top 10 is an awareness document for web application security, representing broad consensus about the most critical web application security risks as determined by the OWASP community. The OWASP Top 10 is one of the earliest and longest-running OWASP projects, first published in 2003, with updates produced in 2004, 2007, 2010, and now 2013.

Password Secrets of Popular Windows Applications

In today's Internet-driven world, all of us use one application or another, from browsers and mail clients to instant messengers. Most of these applications store sensitive information such as user names and passwords in a private location using proprietary methods. But many applications use simple, or merely obscure, methods to store these credentials, which can easily put your privacy in jeopardy: any spyware on your system can uncover these secrets.

In this context, this article throws light on those dark regions by exposing the secret storage locations and encryption mechanisms used by the most popular applications.
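To see just how flimsy this kind of "protection" can be, here's a hypothetical illustration: suppose an application stores a password Base64-encoded in a config file (the credential below is made up, and Base64 stands in for the sort of trivial encoding schemes the article describes). Recovering it takes exactly one line of PowerShell:

PS C:\> [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String('cGFzc3dvcmQxMjM='))
password123

Encoding is not encryption; anything on the system that can read the file can read the password.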

Interview with Team Onapsis, Schuyler Towne on X-Locks Project, Drunken Security News – Episode 338 – July 11, 2013

Selena Proctor, Alex Horan and Mariano Nunez join us from Onapsis.

Schuyler Towne is on a mission to recover as much information as possible about the lock-related patents that were lost in the patent office fire of 1836. His primary interest is in the history and the stories of the creators of the lost locks, but his goal is to conduct all of the research in public, using Zotero, so everyone can follow along and those particularly inclined can even participate. That rough research will remain available indefinitely, but he will go on to curate and organize the work for publication on the website. Depending on what is recovered, entire patents could potentially be restored to the patent record, or working locks 3D-printed from their drawings. A mystery could be solved, or history rewritten.

Interview with Matt Bergin, Kati Rodzon & Mike Murray’s Social Engineering War Stories, Drunken Security News – Episode 337 – July 4, 2013

Matt "Level" Bergin, age twenty four, works for CORE Security as a Senior Security Consultant where his day job consists of discovering, exploiting, and mitigating vulnerabilities in their client's network environments. Before joining CORE, Matt became well recognized in the industry through his activities in the US Cyber Challenge and publications of vulnerability research such as his discovery of the Microsoft IIS 7.5 FTP Heap Overflow.

Kati Rodzon is the manager of Security Behavior Design for MAD Security. Her last nine years have been spent studying psychology and ways to modify human behavior. From learning about the power of social pressure on groups, to how subtle changes in reinforcement can drastically change individual behavior, Kati has spent the better part of a decade learning how humans work and now applies that to security awareness.

Mike Murray has spent more than a decade helping companies to protect their information by understanding their vulnerability posture from the perspective of an attacker. Mike co-founded MAD Security, where he leads engagements to help corporate and government customers understand and protect their security organization.

Identity Officer

This morning, Dave Kearns of KuppingerCole revived an old conversation started by my friend Matt Pollicove of CTI back in 2006 about the potential need for an Identity Officer. I had some comments then, but I wanted to add another thought now that I'm older and a little wiser.

One of the things I've noticed over recent years is that big, brand name companies who are well-respected for their primary business and their ability to execute on internal IT projects have many little "messes" related to technology that nobody talks about. A mess could be a mistake (bad purchase, wrong implementer) or it could be something that started out OK and grew into a mess over time. One of the common messes out there is related to interconnectivity of various IAM solutions.

It looks like this: One group within the company bought Oracle or IBM for user account management and built a complex infrastructure around it that they're afraid to touch. Another bought SailPoint or Aveksa - maybe both - and incorporated 40% of the intended applications before the project stalled out. A third group is using Ping for Federation with partners while a fourth runs Microsoft FIM and ADFS to support other partners.

I recently spoke to the "Lead Architect for IAM" at one of the world's top banks. With a title like that, I figured he'd be in the middle of orchestrating the various interdependencies between IAM systems. When I mentioned an IAM brand name that I knew they had deployed, he said something like, "oh no, that's a different group". He knew it existed but didn't know much more about it.

In the above scenario, one obvious consideration is the time and money spent purchasing and implementing technologies with overlapping functionality. It's wasteful and inefficient. But there's a bigger problem with that scenario than cost and maintenance.

When the business wants to enable some new venture (new partnership, new regulation, M&A, etc.) it's extremely difficult to adapt to new requirements because of all the little messes that would need to be cleaned up. And which group should lead the effort? The access certification system is the newest and its owners have some political pull. But the provisioning system is larger, more established, and now supports the desired certification scenarios. Each of the four or five IAM systems has valuable data. How do you bring it all together to meet the immediate need?

I probably don't need to spell out where an Identity Officer could have made a positive impact in the above scenario. Reduced cost, reduced overhead, greater flexibility, speed to implement. I think Dave is on to something by reviving this topic. As a doctor of IAM, he's taking a holistic look at the identity needs of organizations. It's not just about technology or workflows. It's also about understanding executive ownership and aligning IAM with business needs. Organizational structure is a big part of that conversation.

Episode #168: Scan On, You Crazy Command Line

Hal gets back to our roots

With one ear carefully tuned to cries of desperation from the Internet, it's no wonder I picked up on a plea from David Nides on Twitter: he was looking for a way to find all the files on a file system whose timestamps were older than a given cutoff and move them off to another location.

Whenever I see a request to scan for files based on certain criteria and then copy them someplace else, I immediately think of the "find ... | cpio -pd ..." trick I've used in several other Episodes.

Happily, "find" has "-mtime", "-atime", and "-ctime" options we can use for identifying the files. But they all want their arguments to be in terms of number of days. So I need to calculate the number of days between today and the end of 2012. Let's do that via a little command-line kung fu, shall we? That will make this more fun.


$ days=$(( ($(date +%Y) - 2012)*365 + $(date +%j | sed 's/^0*//') ))
$ echo $days
447

Whoa nelly! What just happened there? Well, I'm doing math with the bash "$(( ... ))" operator and assigning the result to a variable called "days" so I can use it later. But what's all that line noise in the middle?

  • "date +%Y" returns the current year. That's inside "$( ... )" so I can use the value in my calculations.
  • I subtract 2012 from the current year to get the number of years since 2012 and multiply that by 365. Screw you, leap years!
  • "date +%j" returns the current day of the year, a value from 001-365.
  • Unfortunately the shell interprets values with leading zeroes as octal and errors out on values like "008" and "097". So I use a little sed to strip the leading zeroes.

Hey, I said it would be fun, not that it would necessarily be a good idea!
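That said, if you did want leap-year-accurate math, GNU date can do the whole calculation in seconds since the epoch (the -d option is GNU-specific, so treat this as a sketch for Linux folks):

$ days=$(( ($(date +%s) - $(date -d 2011-12-31 +%s)) / 86400 ))

Same "$days" variable, no octal headaches, and 2012's extra day is accounted for.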

But now that I've got my "$days" value, the answer to David's original request couldn't be easier:


$ find /some/dir -mtime +$days -atime +$days -ctime +$days | cpio -pd /new/dir

The "find" command locates files whose MAC times are all greater than our "$days" value-- that's what the "+$days" syntax means. After that, it's just a matter of passing the found files off to "cpio". Calculating "$days" was the hard part.

My final solution was short enough that I tweeted it back to David. Which took me all the way back to the early days of Command-Line Kung Fu, when Ed Skoudis still had hair and would tweet cute little CMD.EXE hacks that he could barely fit into 140 characters. And I would respond with bash code that would barely line wrap. Ah, those were the days!

Of course, Tim was still in diapers then. But he's come so far, that precocious little rascal! Let's see what he has for us this time!

Tim gets an easy one!

Holy Guacamole! This is FINALLY an easy one! Robocopy makes this super easy *and* it plays well with leap years. I feel like it is my birthday and I can finally get out of these diapers.

PS C:\> robocopy \some\dir \new\dir /MINLAD (Get-Date).DayOfYear /MINAGE (Get-Date).DayOfYear /MOV

Simply specify the source and destination directories and use /MOV to move the files. MINLAD will ignore files that have been accessed in the past X days (LAD = Last Access Date), and MINAGE does the same based on the creation date. All we need is the number of days since the beginning of the year. Fortunately, getting that number is super easy in PowerShell (I have no pity for Hal).

All Date objects have the property DayOfYear, which is (surprise, surprise) the number of days since the beginning of the year (Get-Member will show all the available properties and methods of an object). All we need is the current date, which we get with Get-Date.
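If you want to see it for yourself (the output will vary with the day you run it, naturally):

PS C:\> (Get-Date).DayOfYear                          # day number of the current year
PS C:\> Get-Date | Get-Member -MemberType Property    # everything else a DateTime offers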

DONE! That's all folks! You can go home now. I know you expected a long, complicated command, but we don't have one here. However, if you feel that you need to read more, you can go back and read the episodes where we cover some other options available with robocopy.

This command is so easy, simple, and short I could even fit it into a tweet!
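One last robocopy nugget before you go: per the documentation, /MINAGE and /MINLAD values of 1900 or greater are treated as literal YYYYMMDD dates rather than day counts, so if your cutoff is a calendar date you can skip the arithmetic entirely:

PS C:\> robocopy \some\dir \new\dir /MINLAD 20130101 /MINAGE 20130101 /MOV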