I recently took a fresh look at the "SWAMP", the Software Assurance Marketplace. It is a great idea and a valuable resource. The short (and incomplete) story is that SWAMP is a suite of software analysis tools integrated into a centralized, cloud-based software testing environment, and it is available to software developers, software tool developers, and researchers for free.
From their website:
“Software is a crucial component of daily living, affecting worldwide economic structures and the services we depend on every day. With the increasing rate of security breaches, it is clear that conventional network security solutions are no longer able to defend our privacy, corporate data, and critical banking information. Today’s applications need to be built more securely at the code level, and that code needs to be tested regularly.
The SWAMP was developed to make it much easier to regularly test the security of these applications and to provide an online laboratory for software assessment tool inventors to build stronger tools. Testing is often complicated and challenging, because comprehensive testing requires the use of several disparate tools with no central means of managing the process. The SWAMP is a no-cost, high-performance, centralized cloud computing platform that includes an array of open-source and commercial software security testing tools, as well as a comprehensive results viewer to simplify vulnerability remediation. A first in the industry, the SWAMP also offers a library of applications with known vulnerabilities, enabling tool developers to improve the effectiveness of their own static and dynamic testing tools. Created to advance the state of cybersecurity, protect critical infrastructures, and improve the resilience of open-source software, the SWAMP integrates security into the software development life cycle and keeps all user activities completely confidential.”
The Marketplace team includes some serious academic centers for technology: the Morgridge Institute and the Department of Computer Sciences at the University of Wisconsin-Madison, the Pervasive Technology Institute at Indiana University, and the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign. In my conversation with Bart Miller and Miron Livny of SWAMP, it was clear that this project was built for practical use in the real world. It is not an academic exercise; this is immensely practical and useful stuff.
There are many more details on their background page, including some impressive tech specs (at least I consider 700 cores, 5 TB of RAM, and 104 TB of HDD impressive).
We are going to try to get folks from SWAMP on the Security Weekly Podcast to discuss the marketplace in depth. Stay tuned for more on that.
Let me dig into this a little deeper, because the thought merits a fuller discussion.
Think about what you go through when you're testing a web application. I can speak to this type of activity, since it was my focus for a significant portion of my professional career. Essentially, the whole problem comes down to being able to define what the word "secure" means. Many organizations that I've witnessed stand up a software security program over the years follow the standard OWASP Top 10. It's relatively easy to understand, fairly well maintained, and relatively easy to test software against. It's hard to argue that the OWASP Top 10 hasn't become the de facto standard for determining whether a piece of software is secure.
Herein lies the problem. As many of you who do software security testing can attest, without at least a structured framework (a.k.a. a checklist) to test against, the testing process becomes never-ending. I don't know about you, but I've never had the luxury of taking all the time I needed; everything always needed to go live yesterday, and my team and I were always the speed bump on the way to production readiness. So we first settled on making sure none of the OWASP Top 10 were present in the software and applications we tested. Since this surfaced an unmanageable number of bugs, we narrowed scope down to just the "OWASP Top 2": if we could eliminate injection and cross-site scripting, the applications would be significantly more secure, and everything would be better.
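For readers who haven't done this kind of remediation, the fixes for that "Top 2" are well understood. Here is a minimal sketch (with hypothetical function names, using SQLite purely for illustration) contrasting the vulnerable pattern with the corrected one for each class:

```python
import sqlite3
import html

def find_user_unsafe(conn, username):
    # Injection: attacker-controlled input is concatenated into the SQL string.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % username).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver treats the input as data, never as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)).fetchall()

def render_greeting(username):
    # XSS fix: escape user input before it is embedded in HTML output.
    return "<p>Hello, %s</p>" % html.escape(username)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# The classic injection payload matches every row in the unsafe version,
# but is treated as a literal (and matches nothing) in the safe one.
print(find_user_unsafe(conn, "alice' OR '1'='1"))  # [(1,)]
print(find_user_safe(conn, "alice' OR '1'='1"))    # []
print(render_greeting("<script>alert(1)</script>"))
```

The point of the narrowed scope was exactly this: both bug classes have mechanical, teachable fixes, so eliminating them was a tractable goal under deadline pressure.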
Then another issue surfaced. After all that testing and box-checking, when we were fairly sure the application didn't have remote file includes, cross-site scripting (XSS), SQL injection, or any of the other critical stuff, we allowed the app to go live, and it quickly got hacked. The issue this caused for us was not only one of credibility, but also of confusion: how could the app be free of those critical vulnerabilities and still get hacked so easily?
Now back to the issue at hand.
The fact is that even when you've managed to avoid all the common programming mistakes and well-known vulnerabilities, you can still produce a vulnerable application. Look at what eBay is going through right now. Even though there may not be any XSS or SQLi in their code, they still have issues allowing people to take over accounts. Why? Because there is more to securing an application than making sure there aren't any coding mistakes. Fully removing the OWASP Top 10 (good luck with that!) from all your code bases may make your applications safer than they are now, but it won't make them secure. And therein lies the problem.
When you hand your application over to someone who is going to test it for code issues like the OWASP Top 10, and only that, you're going to miss massive bugs that may still lurk in your code. Heartbleed, anyone? Maybe there is a logic flaw in your code. Maybe there is a procedural mistake that allows someone to bypass a critical security mechanism. Maybe you've forgotten to remove your QA testing user from your production code. The thing is, you may not actually know if you only test for app security issues with traditional or even emerging tools. Static analysis? Nope. Dynamic analysis? Nope. Manual code review? Maybe.
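To make that concrete, here is a minimal, hypothetical sketch of the kind of bug I mean: a checkout routine with no injection, no XSS, nothing a scanner would flag, yet trivially abusable because one business rule is missing:

```python
def checkout(cart, prices):
    """cart: {item: quantity}. Returns the amount to charge."""
    total = 0
    for item, qty in cart.items():
        # Logic flaw: quantity is never validated, so a negative value
        # acts as a "refund" and drives the charge below zero.
        total += prices[item] * qty
    return total

prices = {"book": 30, "gift_card": 50}

# Normal use works exactly as intended, so functional tests pass.
assert checkout({"book": 2}, prices) == 60

# Business-logic abuse: no XSS, no SQLi, just a missing rule
# ("quantity must be positive") that only someone thinking about what
# the application should *never* do would think to test.
print(checkout({"book": 1, "gift_card": -3}, prices))  # -120
```

A static analyzer sees clean code, a dynamic scanner sees well-formed responses, and a functional test suite sees correct totals; only a tester reasoning about the business process catches it.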
The ugly truth is that unless you have someone who understands not only what the code should do under normal conditions, but also what it should never do, you will continue to ship applications with security issues. This is why automated scanners fail. This is why static analysis tools fail. This is why penetration testers can still fail, unless they're thinking outside the code and in terms of application functionality and performance.
The reality is that applications that simply can't be allowed to fail need to be tested not only by brilliant security and development minds, but also by someone who understands that beautiful combination of software development, security, and application business processes and design. Someone who looks at your application and says: "You know what would be interesting?"...
To my mind, this goes a long way toward explaining why there are so many failing software security programs out there in the enterprise. We seem to be checking all the right boxes and testing for all the right things, yet still coming up short. Maybe it's because the structural integrity hasn't been validated by the demolitions expert.
Test your applications and software. Go beyond what everyone tells you to check, and look deep into the business processes to understand how entire mechanisms can be abused or bypassed outright. That's how we're going to get a step closer to safer, more secure code.