Monthly Archives: September 2016

CNIL Publishes New Rules on Biometric Access Control in the Workplace

On September 27, 2016, the French Data Protection Authority (“CNIL”) announced the adoption of two new decisions, Single Authorizations AU-052 and AU-053, that will now cover all biometric access control systems in the workplace. These two new decisions repeal and replace the previous biometric decisions adopted by the CNIL and lay down the CNIL’s new position on biometric systems used to control access to the premises, software applications and/or devices in the workplace.  

Background

Since 2006, the CNIL has distinguished between “traceless” and “traceable” biometric systems. Traceable biometric systems, such as systems based on fingerprint recognition, allow personal data to be captured and used without the knowledge of the individual. Conversely, traceless biometric systems (e.g., systems based on hand geometry recognition and finger vein pattern recognition) leave very few traces of data. As a result, the CNIL imposed stricter rules on the use of traceable biometric systems because of the higher risks that these systems posed to the individuals’ privacy (i.e., risks of identity theft).

Given new technical developments, the CNIL considers that this distinction is now irrelevant: all biometrics should be considered traceable. The CNIL now only differentiates biometric systems on the basis of the storage method used.

The new Single Authorizations AU-052 and AU-053 repeal and replace the CNIL’s previous Single Authorizations AU-007, AU-008, AU-019 and AU-027.

New Obligations

The CNIL’s Single Authorizations AU-052 and AU-053 distinguish between the two following types of biometric systems:

  • Biometric systems that allow individuals to retain control of their biometric template because it is stored on a device held by the individual (e.g., chip card or USB key) or in a database in a form that is unusable without the involvement of the individual (e.g., by providing the individual with a secret key to decrypt the template). These systems must comply with the requirements laid down in the CNIL’s Single Authorization AU-052.
  • Biometric systems that do not allow individuals to retain control of their biometric template. These systems are subject to stricter rules and must comply with the requirements laid down in the CNIL’s Single Authorization AU-053.

The CNIL makes it clear that, in a professional context, organizations should use biometric access control systems that allow individuals to retain control of their biometric template. If that is not possible, organizations must justify the implementation of another biometric system and complete an analysis grid.

The CNIL’s new Single Authorizations AU-052 and AU-053 anticipate the application of the EU General Data Protection Regulation (“GDPR”) in May 2018. They take into account the principles of privacy by design and privacy by default, as well as the requirement to conduct data protection impact assessments, which data controllers will have to comply with by May 25, 2018.

Next Steps

As a general rule, biometric systems require the CNIL’s prior authorization. However, organizations may use the CNIL’s simplified registration procedure if the biometric system complies with the CNIL’s requirements laid down in one of its new Single Authorizations.

Organizations that previously filed a simplified registration in line with the CNIL’s previous Single Authorizations (AU-007, AU-008, AU-019 and AU-027) have two years (i.e., until September 2018) to comply with the new Single Authorization AU-052 or AU-053 and file a new simplified registration, or request the CNIL’s specific authorization. As permitted by the GDPR, EU Member States may impose additional limitations on the processing of biometric data, as will be the case in France.

Debunking fuel in the gas tank, case closed.

Picking up from yesterday’s post:

Imagine a time when carburetors ruled the earth (or at least cars’ fuel systems), and a time before emissions controls extended to evaporating fuel vapor, say perhaps in the 70s when I began my career as a mechanic, working on cars of that era and older.  Back then, in ye olden days, fuel systems were open to the environment, both in cars and in the tanks at gas stations.  That meant that water vapor could condense in the fuel tanks and drip or run down the sides and pool at the bottom of the tanks.  This is why the fuel pickups in gas stations’ underground tanks were a few inches above the bottom, and why we always used water-detecting paste on the giant tank sticks used to measure the amount of fuel in the ground.  An inch or two of water at the bottom of the tank and no one cared as long as the amount didn’t increase rapidly- it would stay down there harmlessly.  Unless, of course, you got a fuel delivery which churned up everything on the bottom of the tanks: water, sediment, whatever.  Still, it would eventually settle back down- but if you happened to fill up your car while the muck was stirred up you could get the nasties, including water, into your car’s tank.  And no, most stations didn’t have great fuel filtration between the tank and the pumps.  To this day I avoid filling up my vehicles if I see a fuel truck in the gas station lot- I had to deal with too many dirty fuel systems to take the chance.  And even if you didn’t get water from a bad gas station fill-up, you could build up water from condensation on the roof of your fuel tank settling to the bottom.

Now that we have a couple of paths for getting water into your car’s gas tank, where does that take the sugar myth?  It doesn’t take a lot of water to dissolve sugar that finds its way into the tank, especially given the constant vibration and sloshing that happens in a moving vehicle, so now we can move the sugar solution along with the gasoline towards the engine.  We still have a fuel filter to deal with, but they were generally simple paper filters designed to stop solids, not liquids, so our mix of gasoline and sugar water wouldn’t get stopped there.  This assumes that the vehicle has a fuel filter at all- which is not a safe assumption if you go far enough back in time, or if you happen to be dealing with someone who bypassed their fuel filter “because it kept clogging up”.  (If you think no one would ever do something that dumb, you have probably never worked a helpdesk.)

And now the fuel hits the carburetor, where a little bowl acts as a reservoir for fuel before it finds its way into the intake system.  Carburetors are full of tiny orifices, the kind that don’t like dirt, or much of anything other than clean gasoline and clean air.  Sugar water can gum things up, block holes, or settle out into the bottom of the fuel bowl- and that’s where things are no longer theoretical.  I had to clean out a few carburetors with sticky goo in them in my “gas station mechanic” days, and I recall one where we dropped the gas tank and found an ugly mess in the tank.  Sugar in the tank could, under some circumstances, be annoying.  Not catastrophic but mildly disruptive, and a genuinely unpleasant thing to do to someone.

What’s the moral of the story?  I don’t think there is one, other than exaggeration and hyperbole feed urban legends whether they’re based on complete nonsense or a tiny grain of truth.

Bottom line, don’t put sugar in gas tanks.  Not just because it won’t work, but because it’s a rotten thing to do.


Jack

Debunked debunking, part 2

Another “debunked” automotive urban legend is the “Sugar in the gas tank will destroy your engine!!!11!” story.  Let’s take a look at this tale and examine a few angles folks often miss when discussing it.  This myth has been thoroughly debunked, by people both smart and not-so-smart, but let’s look at it again.

First and foremost, sugar does not dissolve in gasoline.  You might be able to stir it into some kind of suspension, but it won’t really dissolve.  (Sugar doesn’t dissolve well in alcohol, either, but that’s a topic for my other blog.)  That would seem to thoroughly debunk the story by itself, and in modern vehicles in good condition it pretty much does.

Modern, and in good condition… with those two words I just opened two interesting angles to the tale.

Second, modern (there’s that word again) vehicles have very thorough fuel filtering which will prevent sugar granules from making it anywhere near the engine.

And finally for this post, even if sugar did dissolve in gas (which it doesn’t) and sugar made it through the filter(s) (which it won’t), the sugared fuel would only flow through the fuel, intake, and exhaust systems.  I suppose it might make it into the lower parts of the engine if the pistons/rings/cylinder walls were junk but then the engine is already trashed.

Let’s talk about what could happen in the scenario above, assuming sugar did dissolve in gas and/or filtration didn’t stop it.  It is a safe bet that fuel injectors wouldn’t like it; they might gum up eventually as the sugar burned (caramelized?) due to engine heat.  I suppose, since we’re suspending disbelief, that sugar could build up on the valves and contribute to burned valves- but the operating temperatures of modern valves are extremely high, and since they’re designed to function at such temperatures I doubt it would be a problem; the sugar would burn off without building up.  Continuing with the fantasy, maybe turbochargers and catalytic converters wouldn’t enjoy the sugar solution- but again the extreme heat would burn the sugar somewhere in the process and probably burn it cleanly with no significant ill effects.

So there we have it, thoroughly debunked.  Except maybe not.  What if we scale back the expected damage from catastrophic to annoying, and go back in time?  In the first post on debunking, going back in time was also a key to understanding the battery myth.

The rest of this story comes tomorrow (really).


Jack

Evolution of Locky – A Cat & Mouse Game


In the on-going game of cat and mouse between cyber attackers and defensive internet security providers, the appearance of a new tactic from the Locky family of Ransomware comes as no surprise.

As we discussed in February this year, Locky targets victims through seemingly legitimate email attachments. Once the victim clicks on the attachment, the malicious macro begins encrypting the victim’s files.

Given the nature of this environment, security experts are constantly working on ways to stop Locky, coming up with solutions that will render it ineffective.

Distribution of the latest attack

In the latest development, cyber attackers have come up with new tactics to bypass security. The malware is still distributed via email attachments, but no longer uses a downloader Trojan. These emails have varying names and subject lines to attract the victims’ attention and usually contain ZIP files.

The malware skips the downloader Trojan and fetches the Locky variant in DLL format, which is then executed using Windows rundll32.exe. By using a script file and a DLL, instead of a Trojan and an .exe, Locky is not immediately detected and blocked, and the ransomware can begin its course.

To further ensure its success, cyber attackers have given Locky an added fall-back mechanism: the malware can still complete its actions even in cases where it can’t reach its command and control servers. The weak point is that, in this mode, the encryption key is the same for every computer.

These attacks appear to arrive in weekly waves and have already targeted victims in North and South America and Europe, as well as Africa and Asia.


In order to protect yourself, security experts suggest setting up filters for script files that arrive via email, as well as ensuring your antivirus is up to date. Advanced solutions such as Panda’s Adaptive Defence allow for active classification of every running application by leveraging Endpoint Detection & Response (EDR) technologies. This means that you have a greater chance of defending your network against today’s advanced threats.
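As a rough illustration of the script-file filtering the experts suggest, here is a minimal Python sketch. The extension deny-list and the ZIP handling are assumptions for illustration only, not a complete mail-security solution:

```python
import io
import zipfile
from pathlib import PurePosixPath

# Extensions commonly abused by Locky-era campaigns (an illustrative
# deny-list, not an exhaustive one).
RISKY_EXTENSIONS = {".js", ".jse", ".vbs", ".wsf", ".hta", ".dll", ".exe"}

def is_risky_name(name: str) -> bool:
    """Flag a filename whose extension is on the deny-list."""
    return PurePosixPath(name.lower()).suffix in RISKY_EXTENSIONS

def attachment_is_risky(filename: str, payload: bytes) -> bool:
    """Return True if an attachment, or anything inside a ZIP it
    carries, matches the deny-list."""
    if is_risky_name(filename):
        return True
    if filename.lower().endswith(".zip"):
        try:
            with zipfile.ZipFile(io.BytesIO(payload)) as zf:
                return any(is_risky_name(n) for n in zf.namelist())
        except zipfile.BadZipFile:
            return True  # malformed archive: treat as suspicious
    return False
```

A mail gateway hook would call `attachment_is_risky` for each attachment and quarantine the message on a match; checking inside ZIP containers matters here because that is exactly how this wave of Locky arrives.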

The post Evolution of Locky – A Cat & Mouse Game appeared first on CyberSafety.co.za.

CNIL Publishes Internet Sweep Results on Connected Devices

On September 23, 2016, the French Data Protection Authority (“CNIL”) published the results of the Internet sweep on connected devices. The sweep was conducted in May 2016 to assess the quality of the information provided to users of connected devices, the level of security of the data flows and the degree of user empowerment (e.g., user’s consent and ability to exercise data protection rights).

As we previously reported, the sweep was coordinated by the Global Privacy Enforcement Network, a global network of approximately 50 data protection authorities (“DPAs”). The CNIL and 24 other member DPAs, including the UK Information Commissioner’s Office, participated in the coordinated online audit. More than 300 connected devices were tested and audited around the world by the participating DPAs. The sweep revealed that, of these 300 devices:

  • 59 percent failed to provide users with clear and complete information on how their personal data is collected, used and disclosed;
  • 68 percent failed to provide information on how personal data is stored;
  • 72 percent failed to explain how users could delete their data from the device; and
  • 38 percent failed to include contact details for users with privacy concerns.

In France, the CNIL tested 12 connected devices in the following sectors: home automation (connected fire alarms and camera systems); health (connected scales and blood pressure monitors); and well-being (connected watches and activity bracelets), and found the following:

  • Users of connected devices are not sufficiently informed of the processing of their personal data. In particular, the CNIL found that the information was not specific to the connected device used but covered the entire product range of the supplier, and did not provide users with sufficient information to understand how their personal data will be used (e.g., whether or not the data will be shared with third parties, who it will be shared with, the purposes for which it will be shared, etc.).
  • Users have a satisfactory degree of control over their personal data. The CNIL found that the personal data collected by the devices appeared necessary for the performance of the service performed by that device and/or were subject to the user’s consent. In addition, three-quarters of the tested devices had security measures in place to prevent unauthorized access to the data collected or to the device itself (e.g., identification was required).

The CNIL, like the other DPAs, reserves the right to conduct more extensive testing by carrying out inspections on the data processing activities related to the use of connected devices.

Fox stealer: another Pony Fork



Image: “Gift for SweetTail-Fox-mlp” by Mad-N-Monstrous


A small data drop about another Pony fork: Fox stealer.
I first saw a sample of this malware at the beginning of September 2016, thanks to Malc0de. After figuring out the panel name and which advert it was tied to, we referred to it as PonyForx.

Advert :
2016-08-11 - Sold underground by a user going by the nickname "Cronbot"

--------
Стилер паролей и нетолько - Fox v1.0

Мы выпускаем продукт на продажу. Уже проходит финальная стадия тестирования данного продукта.

О продукте : 
1. Умеет все что умеет пони. + добавлен новый софт.
2. Актуален на 2016 год.
3. Написан на С++ без дополнительных библиотек.
4. Админка от пони.

Условия : 
1. Только аренда.
2. Распространяется в виде EXE и DLL.
3. Исходники продавать не будем.

Аренда 250$ в месяц.
Исходники 2000$ разово.

----Translated by Jack Urban : ----

Password stealer and more - Fox v.1.0
We are releasing the product for general sale. Final stage of testing for this product is already underway.
About the product:
1. Is able to do everything that pony does. + new software has been added.
2. Relevant for 2016.
3. Written in C++ without additional libraries.
4. Admin from pony.
Conditions:
1. For rent only.
2. Distributed as an EXE and DLL.
3. We will not be selling the source.
Rent is $250 a month.
The source code is a one-time fee of $2000.

--------

It's being loaded (with Locky Affid 13) by the Godzilla loader from the ScriptJS (aka AfraidGate) group.

MISP taxonomy tags reflecting ScriptJS activity in the last months
(note: this is not the first time this group has pushed a stealer; they were dropping Pony with their Necurs between August and December 2015 [1])

2016-09-26 - ScriptJS infection chain into Neutrino into Godzilla loader into PonyForx and Locky Affid 13
Here we can see the browsing history of the VM being sent to PonyForx (Fox stealer) C2

Fox stealer (PonyForx) fingerprint in Cuckoo

Sample:
Associated C2:
blognetoo[.]com/find.php/hello
blognetoo[.]com/find.php/data
blognetoo[.]com|104.36.83.52
blognetoo[.]com|45.59.114.126
Caught by ET rule:
2821590 || ETPRO TROJAN Win32.Pony Variant Checkin
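For defenders who want to sweep proxy or DNS logs for these indicators, here is a minimal Python sketch; the `match_log_line` helper and the refanging step are illustrative assumptions, not part of the original analysis:

```python
# The defanged indicators from this post, as published.
DEFANGED_IOCS = [
    "blognetoo[.]com",
    "104.36.83.52",
    "45.59.114.126",
    "master.districtpomade[.]com",
    "188.166.54.203",
    "js.travelany[.]com[.]ve",
    "185.80.53.18",
]

def refang(indicator: str) -> str:
    """Turn a defanged indicator ('[.]') back into a matchable string."""
    return indicator.replace("[.]", ".")

IOCS = {refang(i) for i in DEFANGED_IOCS}

def match_log_line(line: str) -> bool:
    """True if any known PonyForx C2 indicator appears in a log line."""
    return any(ioc in line for ioc in IOCS)
```

A simple substring match like this is noisy on short indicators, but it is enough to triage a day of logs for these specific domains and IPs.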

[1] ScriptJS's Pony:
master.districtpomade[.]com|188.166.54.203 - 2015-08-15 Pony C2 from ScriptJS
js.travelany[.]com[.]ve|185.80.53.18 - 2015-12-10 Pony C2 from ScriptJS

Read More:
http://pastebin.com/raw/uKLhTbLs - a few bits about ScriptJS

Korean Privacy Law Updated

On September 22, 2016, Korean law firm Bae, Kim & Lee LLC released a Legal Update outlining amendments to Korea’s Personal Information Protection Act (“PIPA”) and the Act on the Promotion of IT Network Use and Information Protection (“IT Network Act”).

The amendments to PIPA include:

  • notification requirements for third-party transfers; and
  • an obligation to submit to regular inspection by MOI.

Effective September 30, 2016, “companies that either process sensitive information or unique identifying information of 50,000 data subjects or more, or process personal information of 1 million data subjects or more should be prepared to implement the obligation to notify data subjects if personal information has been obtained indirectly from third parties [and] comply with MOI’s request for document review in connection with MOI’s regular inspection on the company’s security measures.”

Amendments to the IT Network Act include clarification of the statutory retention period applicable to unused data. These amendments address “the issue of how the IT service providers should handle personal data whose ‘statutory retention period’ has expired, but which data the IT service provider has a legal obligation to retain pursuant to other laws.”

Read Bae, Kim & Lee’s Legal Update.

Security Weekly #467 – It’s Not About the Gin

This week we interview Jon Searles and Will Genovese, the founders of the NESIT hacker space and organizers of Bsides Connecticut.

Security Weekly Web Site: http://securityweekly.com Follow us on Twitter: @securityweekly

Full Show Notes: http://wiki.securityweekly.com/wiki/index.php/Episode467#Interview:_Jon_Searles_and_Will_Genovese_from_BSidesCT_and_NESIT

Putting Infosec Principles into Practice

LinuxSecurity.com: When you’re dealing with a security incident, it’s essential that you – and the rest of your team – not only have the skills you need to comprehensively deal with an issue, but also have a framework to support you as you approach it. This framework means you can focus purely on what you need to do, following a process that removes any vulnerabilities and threats in a proper way – so everyone who depends upon the software you protect can be confident that it’s secure and functioning properly.

Department of Transportation Issues Cyber Guidance for Autonomous Cars

On September 20, 2016, the Department of Transportation, through the National Highway Traffic Safety Administration (“NHTSA”), released federal cyber guidance for autonomous cars entitled Federal Automated Vehicles Policy (“guidance”).

The guidance makes a number of recommendations, including that automated vehicles should be designed to comply with “established best practices for cyber physical vehicle systems.” To that end, the guidance recommends manufacturers follow “guidance, best practices and design principles” published by National Institute for Standard and Technology, NHTSA and other industry groups, including the Automotive Information Sharing and Analysis Center (“Auto-ISAC”). Manufacturers also are encouraged to engage in information sharing – sharing data recorded during driving for the purpose of reducing crashes and improving highway safety, as well as sharing cyber threat signatures. The guidance recommends manufacturers report “any and all discovered vulnerabilities” to the Auto-ISAC as soon as possible.

Signaling a phased approach to driverless vehicle policy, the guidance is voluntary. The guidance is only the first step, however, and should not be viewed as foreclosing future federal regulations over driverless vehicles. President Obama noted in an op-ed that the guidance guides “states on how to wisely regulate these new technologies, so that when a self-driving car crosses from Ohio into Pennsylvania, its passengers can be confident that other vehicles will be just as responsibly deployed and just as safe,” but also said that “my administration is rolling out new rules of the road for automated vehicles.” Indeed, the President warned, “make no mistake: If a self-driving car isn’t safe, we have the authority to pull it off the road. We won’t hesitate to protect the American public’s safety.” Notably, the guidance comes on the heels of reports that Chinese cybersecurity researchers were able to hack into a driverless car from 12 miles away and tamper with electronically controlled features of the car, including brakes and locks.

Untangling quantum entanglement

Symmetrical encryption is far quicker and less resource-intense than public/private key encryption, but has the downside that the symmetrical key needs to be distributed among parties. For this reason, we use public/private key encryption to secure the transfer of the symmetrical key, and then use symmetrical encryption to secure the actual data that needs to …
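The hybrid pattern can be sketched in Python. The XOR keystream below is a deliberately toy symmetric cipher, for illustration only and never for production use, and the comment marks where the public/private key wrapping of the session key would occur in a real system:

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a pseudo-random keystream (toy construction,
    for illustration only -- use a vetted cipher in practice)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def symmetric_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Hybrid pattern: generate a fresh symmetric key per session...
session_key = os.urandom(32)
# ...in a real system only these 32 bytes would be wrapped with the
# recipient's public key, while the bulk payload stays symmetric.
payload = b"the actual data that needs protecting" * 1000
ciphertext = symmetric_crypt(session_key, payload)
assert symmetric_crypt(session_key, ciphertext) == payload
```

The asymmetry of cost is the whole point: the expensive public-key operation touches only the 32-byte session key, while the cheap symmetric pass handles the arbitrarily large payload.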

Belgian Privacy Commission Issues Priorities and Thematic Dossier to Prepare for GDPR

On September 16, 2016, the Belgian Data Protection Authority (the “Privacy Commission”) published a 13-step guidance document (in French and Dutch) to help organizations prepare for the EU General Data Protection Regulation (“GDPR”).

The 13 steps recommended by the Privacy Commission are summarized below.

  • Awareness. Inform key persons and decision makers about the upcoming changes in order to assess the consequences of the GDPR on the company or organization.
  • Internal Records. Document what personal data is stored, where it came from and with whom it is shared. Record data processing activities and consider undertaking an information audit.
  • Privacy Notice. Review existing privacy notices and update them to comply with the GDPR.
  • Individuals’ Rights. Review current procedures to comply with individuals’ rights, including any procedures to delete or transfer personal data electronically.
  • Access Requests. Update existing procedures to address access requests and plan how individuals’ access requests will be handled within the new time limits imposed by the GDPR.
  • Legal Basis. Document data processing activities and identify the appropriate legal basis to carry out each type of data processing activity.
  • Consent. Review how consent is sought, collected and recorded, and ensure that procedures comply with the new requirements of the GDPR.
  • Children’s Personal Data. Develop mechanisms to verify the ages of individuals and gather parental or legal guardian consent for processing activities that involve children’s data.
  • Data Breach. Ensure appropriate procedures are in place to detect, investigate and report data breaches.
  • Data Protection by Design and Data Protection Impact Assessments. Become familiar with the concepts of Data Protection by Design and Data Protection Impact Assessment, and determine how to implement them within the organization.
  • Data Protection Officer. Appoint a Data Protection Officer (“DPO”), if required, or someone to take responsibility for data protection compliance. Review the position within the organization’s structure and governance arrangements.
  • International. Determine which data protection supervisory authority will be responsible for supervising your organization’s compliance with the GDPR.
  • Existing Contracts. Review existing contracts, in particular with data processors, and make the necessary changes to comply with the GDPR.

In addition, the Privacy Commission also published a thematic dossier on the GDPR (in French and in Dutch), split into three categories: (1) for data controllers, (2) for data processors, and (3) for individuals (to be published soon). For each category, the Privacy Commission offers a detailed overview of the GDPR’s fundamental principles and main concepts, including sanctions, scope of application, individuals’ rights, one-stop-shop mechanism, data transfers, accountability, appointment of a DPO, data security and data breach notifications. In addition, the thematic dossier will also include a FAQ section that collates the most frequently asked questions submitted by individuals and stakeholders via an online form.

New Jersey Moves Forward With Shopper Privacy Bill

On September 15, 2016, the New Jersey Senate unanimously approved a bill that seeks to limit retailers’ ability to collect and use personal data contained on consumers’ driver and non-driver identification cards. The bill, known as the Personal Information and Privacy Protection Act, must now be approved by the New Jersey Assembly.

Under the bill, retail establishments may scan an individual’s identification card (i.e., use an electronic device capable of deciphering, in an electronically readable format, information electronically encoded on the identification card) only for the following purposes:

  • to verify the authenticity of the identification card or to verify the identity of the person if the person pays for goods or services with a method other than cash, returns an item, or requests a refund or an exchange;
  • to verify the person’s age when providing age-restricted goods or services to the person;
  • to prevent fraud or other criminal activity if the person returns an item or requests a refund or an exchange and the business uses a fraud prevention service company or system;
  • to establish or maintain a contractual relationship;
  • to record, retain or transmit information as required by state or federal law;
  • to transmit information to a consumer reporting agency, financial institution or debt collector to be used as permitted by the Fair Credit Reporting Act, the Gramm-Leach Bliley Act and the Fair Debt Collection Practices Act; or
  • to record, retain or transmit information by a covered entity governed by the medical privacy and security rules pursuant to the Health Insurance Portability and Accountability Act of 1996.

The bill also would limit the types of information that retailers may scan from an individual’s identification card to name, address, date of birth, the state issuing the identification card and the identification card number. In addition, the bill (1) places limitations on retaining the relevant information; (2) imposes a data security requirement; (3) reiterates retailers’ obligation under New Jersey’s data breach notification law to notify affected residents and the relevant New Jersey regulator in the event of any breach of the security of the information; and (4) prohibits retailers from selling the relevant information to third parties.

Toolsmith In-depth Analysis: motionEyeOS for Security Makers



It's rather hard to believe, unimaginable even, but here we are. This is the 120th consecutive edition of toolsmith; every month for the last ten years, I've been proud to bring you insights and analysis on free and open source security tools. I hope you've enjoyed the journey as much as I have; I've learned a ton and certainly hope you have too. If you want a journey through the past, October 2006 through August 2015 are available on my web site here, in PDF form, and many years' worth have been published here on the blog as well.
I labored a bit on what to write about for this 10th Anniversary Edition and settled on something I have yet to cover: a physical security topic. To that end I opted for a very slick maker project using a Raspberry Pi 2, a USB web cam, and motionEyeOS. Per Calin Crisan, the project developer, motionEyeOS is a Linux distribution that turns a single-board computer into a video surveillance system. The OS is based on BuildRoot and uses motion as a backend and motionEye for the frontend.
  • Buildroot "is a simple, efficient and easy-to-use tool to generate embedded Linux systems through cross-compilation."
  • Motion (wait for it) is a program that monitors the video signal from cameras and is able to detect if a significant part of the picture has changed; in other words, it can detect motion.
  • motionEye is also Calin's project and is the web frontend for the motion daemon.

Installation was insanely easy, I followed Calin's installation guidelines and used Win32DiskImager to write the image to the SD card. Here's how straightforward it was in summary.
1) Download the latest motionEyeOS image. I used build 20160828 for Raspberry Pi 2.
2) Write the image to SD card, insert the SD into your Pi.
3) Plug a supported web camera into your Pi, then power up the Pi. Give it a couple of minutes after first boot per the guidelines: do not disconnect or reboot your board during these first two minutes. The initialization steps:
  • prepare the data partition on the SD card
  • configure SSH remote access
  • auto-configure any detected camera devices
4) Determine the IP address assigned to the Pi; DHCP is the default. You can do this with a monitor plugged into the Pi's HDMI port, via your router's connected devices list, or with a network scan.
For detailed installation instructions, refer to PiMyLifeUp's Build a Raspberry Pi Security Camera Network. It refers to a dated, differently named (motionPie) version of motionEyeOS, but provides great detail if you need it. There are a number of YouTube videos too, just search motionEyeOS.
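If you go the network-scan route, a quick TCP sweep of your subnet will turn up anything answering on port 80, where the motionEyeOS UI listens. This is a rough Python sketch; the `192.168.1.` prefix is an assumption, so substitute your own subnet:

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def port_open(host: str, port: int = 80, timeout: float = 0.3) -> bool:
    """Quick TCP connect test; motionEyeOS serves its UI on port 80."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan_subnet(prefix: str = "192.168.1.") -> list:
    """Try every host in a /24 and return those answering on port 80."""
    hosts = [prefix + str(i) for i in range(1, 255)]
    with ThreadPoolExecutor(max_workers=64) as pool:
        results = pool.map(port_open, hosts)
    return [h for h, ok in zip(hosts, results) if ok]
```

Anything the sweep returns that also serves the motionEye login page is your Pi; an off-the-shelf scanner like nmap does the same job if you prefer.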

Configuration is also ridiculously simple. Point your browser to the IP address for the Pi, http://192.168.248.20 for me on my wired network, and http://192.168.248.64 once I configured motionEyeOS to use my WiFi dongle.
The first time you login, the password is blank so change that first. In the upper left corner of the UI you'll see a round icon with three lines, that's the setting menu. Click it, change your admin and user (viewer) passwords STAT. Then immediately enable Advanced Settings.
Figure 1: Preferences

You'll definitely want to add a camera, and keep in mind, you can manage multiple cameras with one motionEyeOS device, and even multiple motionEyeOS systems with one master controller. Check out Usage Scenarios for more.
Figure 2: Add a camera

Once your camera is enabled, you'll see its feed in the UI. Note that there are unique URLs for snapshots, streaming and embedding.

Figure 3: Active camera and URLs
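As a hedged example of using those URLs, the sketch below pulls the current still image from a camera. The base address and the `/picture/<id>/current/` path are assumptions based on what the UI displays, so copy the exact snapshot URL from your own camera's URL panel:

```python
import urllib.request

# Assumed address of the motionEyeOS device -- substitute your own.
BASE = "http://192.168.248.64"

def snapshot_url(base: str, camera_id: int) -> str:
    """Build the per-camera still-image URL shown in the UI."""
    return "{}/picture/{}/current/".format(base, camera_id)

def save_snapshot(path: str = "snapshot.jpg", camera_id: int = 1) -> None:
    """Fetch the current frame from the camera and write it to disk."""
    with urllib.request.urlopen(snapshot_url(BASE, camera_id),
                                timeout=10) as resp:
        with open(path, "wb") as out:
            out.write(resp.read())
```

A cron job calling `save_snapshot` is a cheap way to build a timelapse alongside whatever motion-triggered recording you configure.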
When motion detection has triggered the camera, the video frame in the UI will be wrapped in orange-red. You can also hover over the video frame for additional controls such as full screen and immediate access to stored video.

There are an absolute plethora of settings options, the most important of which, after camera configuration, is storage. You can write to local storage or a network share; this quickly matters if you choose an always-on scenario versus motion-enabled recording.
Figure 4: Configure file storage
You can configure text overlay, video streaming, still images, schedules, and more.
Figure 5: Options, options, options
The most important variable of all is how you want to be notified.
There are configuration options that allow you to run commands, so you can script up a preferred process or use one already devised.
Figure 6: Run a command for notification

Best of all, you can make use of a variety of notification services including email, as well as Pushover and IFTTT via Web Hooks.
Figure 7: Web Hook notifications
There is an outstanding article on using Pushover and IFTTT on Pi Supply's Maker Zone. It makes it easy to leverage such services even if you haven't done so before.
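To give a concrete flavor of the Web Hook route, here's a minimal Python sketch of a notifier you could wire up yourself. The maker.ifttt.com trigger URL follows IFTTT's documented Maker Web Hook pattern; the event name and key are placeholders, and nothing here is specific to motionEyeOS itself.

```python
import json
from urllib import request

def ifttt_url(event, key):
    # IFTTT Maker Web Hook trigger endpoint pattern
    return "https://maker.ifttt.com/trigger/{0}/with/key/{1}".format(event, key)

def notify(event, key, camera="Camera1"):
    # POST a small JSON payload; IFTTT exposes it as value1 in the applet
    req = request.Request(
        ifttt_url(event, key),
        data=json.dumps({"value1": camera}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.read()

if __name__ == "__main__":
    notify("motion_detected", "YOUR_IFTTT_KEY")  # placeholders
```

You could point motionEyeOS's Web Hook notification at the same URL directly, or have the run-a-command option call a script like this one.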
The net result, after easy installation and a little bit of configuration, is your own motion-enabled CCTV system that costs very little compared to its commercial counterparts.
Figure 8: Your author entering his office under the watchful eye of Camera1
Purists will perhaps find image quality a bit lacking, but with the right camera you can use Fast Network Camera. Do be aware of the drawbacks, though (lost functionality).

In closing, I love this project; kudos to Calin Crisan. Makers and absolute beginners alike can easily create a great motion-enabled video/still camera setup, or a network of managed cameras with always-on video. The hardware is inexpensive and readily available. If you've not explored Raspberry Pi, this is a great way to get started. If you're looking for a totally viable security video monitoring implementation, motionEyeOS and your favorite IoT hardware (the project supports other boards too) are a perfect combo. Remember too that there are Raspberry Pi board-specific camera modules available.

Ping me via email or Twitter if you have questions (russ at holisticinfosec dot org or @holisticinfosec).
Cheers…until next time.

New York Announces Proposed Cybersecurity Regulation to Protect Consumers and Financial Institutions

On September 13, 2016, New York Governor Andrew Cuomo announced a proposed regulation that would require banks, insurance companies and other financial services institutions to establish and maintain a cybersecurity program designed to ensure the safety of New York’s financial services industry and to protect New York State from the threat of cyber attacks. 

The proposed regulation requires regulated financial institutions to take various actions, including:

  • adopting a written cybersecurity policy;
  • establishing a cybersecurity program;
  • designating a Chief Information Security Officer to oversee and enforce its new program and policy; and
  • implementing policies and procedures designed to ensure the security of information systems and nonpublic information accessible to, or held by, third parties, along with a variety of other requirements to protect the confidentiality, integrity and availability of information systems.

The proposed regulation is subject to a 45-day notice and public comment period. If adopted, this will be the first regulation of its kind in the U.S.

Final Rules for the Data Privacy Act Published in the Philippines

Recently, the National Privacy Commission (the “Commission”) of the Philippines published the final text of its Implementing Rules and Regulations of Republic Act No. 10173, known as the Data Privacy Act of 2012 (the “IRR”). The IRR has a promulgation date of August 24, 2016, and went into effect 15 days after its publication in the Official Gazette.

We previously reported on the preceding draft text of the IRR. There are several points of interest that were resolved in the final text, which presents a more practical framework than had been proposed in the draft IRR. Any changes to the final IRR will require a regulatory amendment by the Commission rather than an act of the legislature.

Some points of interest that have been resolved or finalized include the following:

  • The IRR has two separate defined terms, “personal data” and “personal information,” but the potential discrepancy between the two terms has been resolved. “Personal information” refers to information which can identify a particular individual, and is consistent with the definition provided in the statute. “Personal data” is defined as all types of “personal information,” which presumably includes both “ordinary” personal information and sensitive personal information.
  • The draft IRR had used the term “personal data” to describe “personal information” that has been input into an information and communication system, which would mean “personal information” that has been digitally and electronically formatted. This definition no longer appears in the final IRR. In addition, the terms “personal information” and “personal data” are now used more consistently in relation to their definitions. This may result in less ambiguity and a lower prospect of confusion from the use of the two terms.
  • The final IRR has now been made consistent with a provision in the original statute which stated that the Data Privacy Act would not apply to personal information collected in a foreign jurisdiction (in compliance with the laws or rules of that jurisdiction) which is being processed in the Philippines. The draft IRR had provided that, in such instances, the data privacy laws of the foreign jurisdiction would apply in relation to the collection of personal information, while the Philippine Data Privacy Act would apply to processing that takes place within the Philippines. This would have entailed a complex analysis as to where collection-related obligations under the foreign jurisdiction end and where processing-related obligations under Philippine law begin, and how the two sets of legal obligations might intersect.
  • The final IRR requires that, even where personal information has been collected in a foreign jurisdiction for processing in the Philippines, the Philippine requirements to implement information security measures will still apply. This will impose some security-related costs on that portion of the information-processing operations that take place within the Philippines.
  • The final IRR requires that sharing of personal data in the private sector proceeds according to a data sharing agreement. The data sharing agreement may be subject to review by the Commission on its own initiative or following a complaint of a data subject. The draft IRR might have been interpreted to require review by the Commission in all instances, which would have imposed a substantial burden on all sharing of personal data, as well as a burden on the resources of the Commission itself.
  • The final IRR sets forth rules on the internal organizational operations and structure of personal information controllers, such as requirements to (1) appoint a privacy officer, (2) maintain records of processing activities, (3) implement physical security measures and technical security measures, and (4) carry out regular monitoring for security breaches. However, these obligations only apply “where appropriate.” The draft IRR might have been interpreted to require compliance in all instances. Where and when these potentially complicated requirements will be “appropriate” will depend on a number of factors, including the nature of the personal data, the risks posed by the processing, the size and complexity of the organization and its operations, current best practices and the cost of security implementation.
  • The final IRR gives the data subject an additional right to object or withhold consent to processing. This appears to be a new right that did not appear in the original text of the statute. This right is substantially retained from the draft IRR, with changes to specifically allow the data subject to object to processing for direct marketing, automated processing or profiling.
  • The final IRR provides more clarity on the notification requirements in connection with a data breach. Individuals must be notified of data breaches only when both (1) sensitive personal information or information that may be used to enable identity fraud is involved; and (2) the personal information controller believes that the breach is likely to pose a real risk of serious harm to any affected data subject.
  • If the notification requirement does apply, the notification must be made within 72 hours, though notification may be delayed in certain limited circumstances. The final IRR stipulates the categories of content that must appear in the notification.
  •  The requirement under the draft IRR to notify affected individuals in the event of any breach that involves personal, sensitive or privileged information has been removed. That had been a material expansion of the circumstances under which a breach notification had to be made. By removing this requirement, the final IRR keeps the notification requirement within a relatively restricted range of circumstances. However, written reports of security incidents and personal data breaches have to be prepared and a summary has to be provided to the Commission on an annual basis. This amounts to a less onerous notification obligation.
  • In summary, the data breach notification requirement is now more clearly subject to a “risk-based approach” (i.e., the requirement to notify does not arise automatically, but arises instead on a case-by-case basis depending on an evaluation of the risk involved). Only data breaches that involve higher levels of risk must be notified.
  • The final IRR has requirements to register data processing operations and to notify the Commission of automated processing operations, but these now apply only in particular circumstances. The requirement to register with the Philippine data protection authority only applies to processing by personal information controllers and processors which employ 250 or more persons, or to processing that involves risk to the rights and freedoms of data subjects, takes place more than occasionally, or involves more than a de minimis amount (at least 1,000 individuals) of sensitive personal information. The requirement to notify individuals of data processing only applies to processing that is the sole basis of decision making that would significantly affect the data subject.
  • The draft IRR required both universal registration and notification. This would have both increased the burden of processing data and contrasted with the international trend (i.e., the new EU General Data Protection Regulation, which modifies the registration requirements of the previous EU Data Protection Directive).
  • In relation to the accountability principle, the final IRR makes generalized references to the possibility of indemnification on the basis of applicable provisions of Philippine civil law and criminal liability. The final IRR now avoids the discussion of the potential for joint liability, along with the personal information controller, on the part of personal information processors, privacy officers, employees and agents, which had appeared in the draft IRR.

The following additional items are worth noting:

  • The requirements in the final IRR to notify data subjects (at the time of the collection of their personal information) now include an obligation to provide “meaningful information” about the “logic” that will be involved in processing personal information. Requiring that this be done for each and every instance in which personal information is to be collected and processed, and in a way that would satisfy a regulatory authority and the lawyer drafting the notice, is challenging.
  • The final IRR contains a provision stating that personal data may not be retained in perpetuity in contemplation of a future use yet to be determined. This may have potential to impair the processing of “big data” in the Philippines.
  • The draft IRR had established a right of data portability. The final IRR seems to restrict the applicability of this right, by making it apply only where the data subject’s personal information is processed by electronic means and in a structured and commonly-used format. This would seem to enable data processors and controllers to avoid an obligation to comply with this right, by processing personal data using unstructured or unusual formats.
  • The draft IRR had prohibited the processing of privileged information (i.e., private communications made between an individual and his or her lawyer in preparation for litigation), unless the same requirements applicable to the processing of sensitive personal information had been satisfied. While this provision may be potentially problematic, the final IRR mitigates this by providing an exception for uses of privileged information in the context of court proceedings, legal claims and constitutional or statutory mandates. It is not clear if this exception will be adequate to cover all possible situations where an exception will be needed, but further amendments to the IRR could be made to address any shortcomings.
  • In relation to the accountability principle, the final IRR discusses the idea of liability, but does not discuss other aspects of the principle. In particular, the final IRR does not establish rules by which a personal information controller might establish that it observes good internal data handling practices and demonstrates that they comply with applicable standards, or by which the Commission would require production and review of these practices against its standards. The final IRR also does not discuss how to apply the accountability principle in the context of cross-border data transfers; while a provision of the IRR discusses data sharing, it does not appear to describe what a company must do to share data internationally in accordance with the IRR.

TalkTalk Appeal Against ICO Fine for Late Notification of Data Breach Dismissed by First-Tier Tribunal

On August 30, 2016, the First-tier Tribunal (Information Rights) (the “Tribunal”) dismissed an appeal from UK telecoms company TalkTalk Telecom Group PLC (“TalkTalk”) regarding a monetary penalty notice issued to it on February 17, 2016, by the UK Information Commissioner’s Office (“ICO”). The ICO had issued the monetary penalty notice to TalkTalk, for the amount of £1,000, for an alleged failure to report an October 2015 data breach to the ICO within the legally required time period.

The ICO’s decision to issue a monetary penalty notice was based on a telephone complaint received by TalkTalk from one of its customers on November 16, 2015. The customer then sent a letter of complaint to TalkTalk on November 18, 2015, and raised the matter with the ICO on the same day. On November 27, 2015, TalkTalk responded to a request from the ICO for further information, and stated that the incident was being investigated and that the ICO would be notified if and when TalkTalk determined that a data breach had occurred. On December 1, 2015, TalkTalk formally notified the ICO of the data breach.

The UK Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”) require internet service providers to notify the ICO of personal data breaches, while Commission Regulation No 611/2013 sets out the content of that notification and requires that any such breach be notified to the ICO no later than 24 hours after detection of the breach, where feasible.

The issue to be decided by the Tribunal was whether TalkTalk could rightly be said to have detected the personal data breach, or to have acquired sufficient awareness of the breach to notify the ICO, when TalkTalk received the customer’s letter of complaint on November 18, 2015.

TalkTalk argued that it only “detected” or acquired sufficient awareness of the personal data breach after TalkTalk had concluded its own investigation into the complaint raised by the customer, and that it was normal practice for notification to take place within 24 hours of the conclusion of an investigation and not within 24 hours of receipt of a complaint. TalkTalk stressed that, given that it has approximately 4 million customers and receives approximately 50 such complaints a month, it could not reasonably investigate each complaint, and potentially report any data breaches to the ICO, within 24 hours of receipt of each complaint.

The ICO argued that “detection” is distinct from “conclusive confirmation,” and that detection is deemed to have occurred when TalkTalk has acquired “sufficient awareness” that a personal data breach had occurred, so as to enable it to make a meaningful notification. The ICO also argued that, given the level of detail and supporting evidence that the customer provided in the complaint letter in November, the threshold was met well before TalkTalk concluded its internal investigations.

The Tribunal upheld the ICO’s decision to issue a monetary penalty notice on the basis that all of the information that TalkTalk was required to provide to the ICO under the Notification Regulation was available to TalkTalk as a result of the customer’s November letter of complaint, and none of the information provided to the ICO appeared to derive from the investigation carried out by TalkTalk after receipt of that letter. The Tribunal was careful to distinguish these facts from the situation where a customer makes a general complaint of a suspected data breach which would require further investigation before the notification obligation under PECR crystallizes.

Although the fine in contention in the proceedings was only £1,000, the ruling is expected to provide useful guidance as to the knowledge threshold for notification under PECR and, going forward, the interpretation of the data breach notification obligation under the EU General Data Protection Regulation, which will apply from May 2018.

Best toolsmith tool of the last ten years

As we celebrate Ten Years of Toolsmith and 120 individual tools covered in detail with the attention they deserve, I thought it'd be revealing to see who comes to the very top of the list for readers/voters.
I've built a poll from the last eight Toolsmith Tools of the Year to help you decide, and it's a hell of a list.
 Amazing, right? The best of the best.

You can vote in the poll to your right; it'll be open for two weeks.

Advocate General Advises Revision of PNR Agreement between EU and Canada

On September 8, 2016, Advocate General Paolo Mengozzi of the Court of Justice of the European Union (“CJEU”) issued his Opinion on the compatibility of the draft agreement between Canada and the European Union on the transfer of passenger name record data (“PNR Agreement”) with the Charter of Fundamental Rights of the European Union (“EU Charter”). This is the first time that the CJEU has been called upon to issue a ruling on the compatibility of a draft international agreement with the EU Charter.

Background

Starting in May 2010, the EU and Canada negotiated a PNR Agreement. The PNR Agreement provides that PNR data, which is collected from passengers for the purpose of reserving flights between Canada and the EU, is to be transferred to competent Canadian authorities and then used, retained and, where appropriate, further disclosed to prevent and detect terrorist offenses and other serious transnational criminal offenses. PNR data under the PNR Agreement includes passenger travel habits, payment details, dietary requirements and other information that might contain sensitive data on a passenger’s health, ethnic origin or religious beliefs.

Similar agreements were signed and concluded by the EU with the U.S. and Australia with the approval of the European Parliament. Although the draft PNR Agreement with Canada was signed on June 25, 2014, the European Parliament decided to refer the matter to the CJEU on account of concerns that the draft PNR Agreement could violate fundamental rights enshrined in the EU Charter and, in particular, the right to respect for privacy and the protection of personal data.

Opinion

In substance, the Advocate General found that, as currently drafted, certain provisions of the draft PNR Agreement were clearly incompatible with the EU Charter. This incompatibility resulted from (1) Canada’s ability to process PNR data beyond what is strictly necessary and independently of the stated purposes of the PNR Agreement; (2) the processing, use and retention by Canada of PNR data containing sensitive data; (3) the retention of PNR data for a maximum of five years without reference to the purpose pursued by the agreement; and (4) a lack of safeguards and oversight mechanisms for the subsequent transfer of PNR data to other foreign authorities. The Advocate General posited that in order for the draft PNR Agreement to be compatible with the EU Charter, it would need to be amended to provide certain guarantees, including a clear and precise rendering of the categories of PNR data included within the scope of the PNR Agreement, an exclusion of sensitive data from the scope of the PNR Agreement, and a limitation of the number of “targeted” persons to those individuals who can reasonably be suspected of participating in a terrorist offense or serious transnational crime.

The Advocate General issued the Opinion in light of the CJEU’s decisions in the Digital Rights Ireland and Schrems cases, which invalidated the Data Retention Directive and the European Commission’s Safe Harbor adequacy decision, respectively. While the Advocate General’s Opinion is not binding on the CJEU, the court’s judgments have historically tended to follow the Advocate General’s stated views. It remains to be seen how this Opinion will impact existing PNR Agreements the EU has in place with the U.S. and Australia.

Lisa Sotto Speaks on Cybersecurity: Supply and Demand (Part 3)

In Part 3 of Lisa J. Sotto’s discussion at Bloomberg Law’s Second Annual Big Law Business Summit, she speaks on supply and demand in the privacy and cybersecurity fields. Lisa, partner and head of Hunton & Williams LLP’s Global Privacy and Cybersecurity practice group, points out that “demand very much outweighs supply.” To be a successful lawyer in this field, Lisa emphasizes the need for experience, recognizing that, “there is so much nuance, [and data privacy is] culturally based so you cannot just open a book and understand what to do.” In the next 10 years, Lisa hopes to see more lawyers in the field who are trained to “manage a breach that implicates [global] data breach notification laws.”

View the third segment.

Access Part 1 and Part 2 of the cybersecurity videos from the Big Law Business Summit.

Toolsmith Release Advisory: Kali Linux 2016.2 Release

On the heels of Black Hat and DEF CON, 31 AUG 2016 brought us the second Kali Rolling ISO release aka Kali 2016.2. This release provides a number of updates for Kali, including:
  • New KDE, MATE, LXDE, e17, and Xfce builds for folks who want a desktop environment other than Gnome.
  • Kali Linux Weekly ISOs, updated weekly builds of Kali that will be available to download via their mirrors.
  • Bug fixes and OS improvements, such as HTTPS support in busybox, which now allows Kali installations to be preseeded securely over SSL.
All details available here: https://www.kali.org/news/kali-linux-20162-release/
Thanks to Rob Vandenbrink for calling out this release. 

Toolsmith Tidbit: Will Ballenthin’s Python-evtx

Andrew Case (@attrc) called out Will Ballenthin's (@williballenthin) Python-evtx on Twitter, reminding me that I'm long overdue in mentioning it here as well.
Will's description of Python-evtx from his website follows:
"python-evtx is a pure Python parser for recent Windows Event Log files (those with the file extension “.evtx”). The module provides programmatic access to the File and Chunk headers, record templates, and event entries. For example, you can use python-evtx to review the event logs of Windows 7 systems from a Mac or Linux workstation. The structure definitions and parsing strategies were heavily inspired by the work of Andreas Schuster and his Perl implementation Parse-Evtx."

Assuming you're running Python 2.7, install it via pip install python-evtx or download the source from GitHub: https://github.com/williballenthin/python-evtx
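As a quick taste of the API, here's a hedged sketch that walks a log's records and tallies EventIDs. The Evtx/records()/xml() calls follow Will's documented usage; the little regex helper is my own illustrative addition that pulls the EventID out of each record's XML string.

```python
import re
import sys

def event_ids(xml_records):
    # Pull the numeric EventID out of each record's XML string
    ids = []
    for xml in xml_records:
        m = re.search(r"<EventID[^>]*>(\d+)</EventID>", xml)
        if m:
            ids.append(int(m.group(1)))
    return ids

if __name__ == "__main__":
    import Evtx.Evtx as evtx  # pip install python-evtx

    with evtx.Evtx(sys.argv[1]) as log:
        records = [record.xml() for record in log.records()]
    # e.g., count successful logons (EventID 4624)
    print(event_ids(records).count(4624))
```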

FTC Seeks Input on GLB Safeguards Rule

On August 29, 2016, the Federal Trade Commission announced that it is seeking public comment on the Gramm-Leach-Bliley Act (“GLB”) Safeguards Rule. The GLB Safeguards Rule, which became effective in 2003, requires financial institutions to develop, implement and maintain a comprehensive information security program to safeguard customer information.

The FTC requests comments on several general questions pertaining to the GLB Safeguards Rule, such as:

  • Is there a continued need for specific provisions of the GLB Safeguards Rule?
  • What significant costs has the GLB Safeguards Rule imposed on consumers and how could it be modified to reduce those costs?
  • What benefits has the GLB Safeguards Rule provided to businesses and how could it be modified to increase those benefits?
  • What modifications to the GLB Safeguards Rule should there be to account for changes in technology or economic conditions?

The FTC also requests comments on several specific issues pertaining to the GLB Safeguards Rule. These include:

  • Should the elements of a comprehensive information security program include a response plan in the event of a breach? If so, what should such a plan contain?
  • Should the GLB Safeguards Rule be modified to include more specific and prescriptive requirements for information security programs?
  • Should the GLB Safeguards Rule be modified to reference or incorporate information security standards or frameworks such as the National Institute of Standards and Technology’s Cybersecurity Framework or the Payment Card Industry Data Security Standard?
  • Should the GLB Safeguards Rule include its own definitions of terms such as “financial institution”?
    • Should the term “financial institution” be expanded to include “entities that are significantly engaged in activities that the Federal Reserve Board has found to be incidental to financial activities?”
    • Should that definition of “financial institution” also include “activities that have been found to be closely related to banking or incidental to financial activities by regulation or order in effect after the enactment of the [GLB Safeguards Rule]?”

The FTC has invited interested parties to comment on the GLB Safeguards Rule by November 7, 2016.

View the FTC’s Federal Register notice seeking public comment on the GLB Safeguards Rule.