Category Archives: cloud

Global WAN optimization market forecast to reach $1.4 billion by 2025

The WAN optimization market is expected to grow from $1,047.1 million in 2020 to $1,446.2 million by 2025, at a Compound Annual Growth Rate (CAGR) of 6.7% during the forecast period of 2020-2025, according to ResearchAndMarkets. Most cloud-based applications need good bandwidth and low latency for effective utilization. In large-scale WAN deployments, latency, bandwidth constraints, and packet losses are inevitable. WAN optimization enables enterprises and service providers to reduce costs through reduced …

The post Global WAN optimization market forecast to reach $1.4 billion by 2025 appeared first on Help Net Security.

External attacks on cloud accounts grew 630 percent from January to April

The McAfee report uncovers a correlation between the increased use of cloud services and collaboration tools, such as Cisco WebEx, Zoom, Microsoft Teams and Slack, during the COVID-19 pandemic and an increase in cyberattacks targeting the cloud. There are significant and potentially long-lasting trends, including an increase in the use of cloud services, access from unmanaged devices and the rise of cloud-native threats. These trends emphasize the need for new security delivery …

The post External attacks on cloud accounts grew 630 percent from January to April appeared first on Help Net Security.

Organizations plan to migrate most apps to the cloud in the next year

More than 88% of organizations use cloud infrastructure in one form or another, and 45% expect to migrate three quarters or more of their apps to the cloud over the next twelve months, according to the O’Reilly survey. The report surveyed 1,283 software engineers, technical leads, and decision-makers from around the globe. Of note, the report uncovered that 21% of organizations are hosting all applications in a cloud context. The report also found that …

The post Organizations plan to migrate most apps to the cloud in the next year appeared first on Help Net Security.

Principles of a Cloud Migration – Security W5H – The WHERE


“Wherever I go, there I am” -Security

I recently had a discussion with a large organization that had a few workloads in multiple clouds and was assembling a cloud-security-focused team to build out its security policy moving forward. It’s one of my favorite conversations to have, since I’m not just talking about Trend Micro solutions and how they can help organizations be successful, but also about how a business approaches the creation of its security policy to achieve a successful center of operational excellence. While I will talk more about the COE (center of operational excellence) in a future blog series, I want to dive into the core of the discussion: where do we add security in the cloud?

We started discussing how to secure these new cloud native services like hosted services, serverless, container infrastructures, etc., and how to add these security strategies into their ever-evolving security policy.

Quick note: If your cloud security policy is not ever-evolving, it’s out of date. More on that later.

A colleague and friend of mine, Bryan Webster, presented a concept that traditional security models have always been about three things: Best Practice Configuration for Access and Provisioning, Walls that Block Things, and Agents that Inspect Things. We have relied heavily on these principles since the first computer was connected to another. Here is the handy graphic he used to illustrate the last two points.

But as we move to secure cloud native services, some of these are outside our walls, and some don’t allow the ability to install an agent.  So WHERE does security go now?

Actually, it’s not all that different – just how it’s deployed and implemented. Start by removing the thinking that security controls are tied to specific implementations. You don’t need a hardware appliance to do intrusion prevention, much like you don’t need an installed agent to do anti-malware. There will also be a big focus on your configuration, permissions, and other best practices. Use security benchmarks like the AWS Well-Architected Framework, CIS, and SANS to help build an adaptable security policy that can meet the needs of the business moving forward. You might also want to consider consolidating technologies into a cloud-centric service platform like Trend Micro Cloud One, which enables builders to protect their assets regardless of what’s being built. Need IPS for your serverless functions or containers? Try Cloud One Application Security! Do you want to push security further left into your development pipeline? Take a look at Trend Micro Container Security for pre-runtime container scanning, or Cloud One Conformity for helping developers scan your Infrastructure as Code.
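To make the “scan your Infrastructure as Code” idea concrete, here is a minimal sketch of the kind of rule such tooling evaluates. This is not Conformity’s implementation; the template file name and the single encryption rule are assumptions for illustration only.

```python
# A minimal, illustrative Infrastructure-as-Code check: parse a CloudFormation
# template and flag S3 buckets that do not declare default encryption.
import json

def find_unencrypted_buckets(template_path):
    """Return logical IDs of AWS::S3::Bucket resources lacking BucketEncryption."""
    with open(template_path) as f:
        template = json.load(f)

    findings = []
    for logical_id, resource in template.get("Resources", {}).items():
        if resource.get("Type") != "AWS::S3::Bucket":
            continue
        if "BucketEncryption" not in resource.get("Properties", {}):
            findings.append(logical_id)
    return findings

if __name__ == "__main__":
    for bucket in find_unencrypted_buckets("template.json"):  # hypothetical file name
        print(f"{bucket}: no BucketEncryption property configured")
```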

Keep in mind – wherever you implement security, there it is. Make sure that it’s in a place to achieve the goals of your security policy using a combination of people, process, and products, all working together to make your business successful!

This is part of a multi-part blog series on things to keep in mind during a cloud migration project.  You can start at the beginning which was kicked off with a webinar here: https://resources.trendmicro.com/Cloud-One-Webinar-Series-Secure-Cloud-Migration.html.

Also, feel free to give me a follow on LinkedIn for additional security content to use throughout your cloud journey!


Principles of a Cloud Migration – Security W5H – The When


If you have to ask yourself when to implement security, you probably need a time machine!

Security is as important to your migration as the actual workload you are moving to the cloud. Read that again.

It is essential to plan and integrate security at every single layer of both architecture and implementation. What I mean by that is, if you’re doing a disaster recovery migration, you need to make sure that security is ready for the infrastructure, your shiny new cloud space, and the operations supporting it. Will your current security tools be effective in the cloud? Will they still be able to do their jobs there? Do your teams have a method of gathering the same security data from the cloud? More importantly, if you’re doing an application migration to the cloud, when you actually implement security matters a lot for your cost optimization as well.

NIST Planning Report 02-3

In this graph, it’s easy to see that the earlier you can find and resolve security issues, the more you lessen the workload of infosec and the more you reduce your costs of resolution. This can be achieved through a combination of tools and processes that empower development to take on security tasks sooner. I’ve also witnessed, time and time again, friction between security and application teams, often resulting in Shadow IT projects and an overall lack of visibility and trust.
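As one small, concrete example of shifting a security task earlier, here is a toy check that could run as a pre-commit hook or an early CI step, failing the build when a file appears to contain a hardcoded AWS access key ID. It is illustrative only, not a Trend Micro tool; the scanned paths and the single pattern are assumptions.

```python
# Toy "shift-left" check: fail fast if the given files appear to contain
# hardcoded AWS access key IDs (which start with "AKIA").
import re
import sys
from pathlib import Path

ACCESS_KEY_PATTERN = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def scan(paths):
    findings = []
    for path in paths:
        text = Path(path).read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if ACCESS_KEY_PATTERN.search(line):
                findings.append(f"{path}:{lineno}: possible hardcoded AWS access key")
    return findings

if __name__ == "__main__":
    problems = scan(sys.argv[1:])
    print("\n".join(problems))
    sys.exit(1 if problems else 0)  # non-zero exit fails the commit or pipeline stage
```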

Start there. Start with bringing these teams together, uniting them under a common goal: Providing value to your customer base through agile secure development. Empower both teams to learn about each other’s processes while keeping the customer as your focus. This will ultimately bring more value to everyone involved.

At Trend Micro, we’ve curated a number of security resources designed for DevOps audiences through our Art of Cybersecurity campaign.  You can find it at https://www.trendmicro.com/devops/.

Also highlighted on this page is Mark Nunnikhoven’s #LetsTalkCloud series, which is a live stream series on LinkedIn and YouTube. Seasons 1 and 2 have some amazing content around security with a DevOps focus – stay tuned for Season 3 to start soon!

This is part of a multi-part blog series on things to keep in mind during a cloud migration project.  You can start at the beginning which was kicked off with a webinar here: https://resources.trendmicro.com/Cloud-One-Webinar-Series-Secure-Cloud-Migration.html.

Also, feel free to give me a follow on LinkedIn for additional security content to use throughout your cloud journey!


Principles of a Cloud Migration – Security, The W5H – Episode WHAT?


Teaching you to be a Natural Born Pillar!

Last week, we took you through the “WHO” of securing a cloud migration here, detailing each of the roles involved in implementing a successful security practice during a cloud migration. Read: everyone. This week, I will be touching on the “WHAT” of security: the key principles required before your first workload moves. The Well-Architected Framework Security Pillar will be the baseline for this article, since it thoroughly explains security concepts in a best-practice cloud design.

If you are not familiar with the AWS Well-Architected Framework, go google it right now. I can wait. I’m sure telling readers to leave the article they’re currently reading is a cardinal sin in marketing, but it really is important to understand just how powerful this framework is. Wait, this blog is HTML-ready – here’s the link: https://wa.aws.amazon.com/index.en.html. It consists of five pillars that include best-practice information written by architects with vast experience in each area.

Since the topic here is security, I’ll start by giving a look into this pillar. However, I plan on writing about each pillar, and as I do, each one of the graphics above will become a link. Internet Magic!

There are seven design principles in the Security Pillar, as follows:

  • Implement a strong identity foundation
  • Enable traceability
  • Apply security at all layers
  • Automate security best practices
  • Protect data in transit and at rest
  • Keep people away from data
  • Prepare for security events

Now, a lot of these principles can be addressed by using native cloud services, and these are usually the easiest to implement. One thing the framework does not give you is suggestions on how to set up or configure these services. While it might reference turning on multi-factor authentication as a necessary step for your identity and access management policy, MFA is not on by default. Same thing with object encryption: it is there for you to use, but not necessarily enabled on the objects you create.
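The “not enabled by default” point is easy to verify for yourself. Below is a minimal sketch, assuming boto3 and working AWS credentials, that lists S3 buckets with no default encryption configuration; it illustrates the kind of check involved and is not a Conformity rule.

```python
# Minimal sketch: report S3 buckets that have no default encryption configured.
# Assumes boto3 is installed and AWS credentials are available.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        s3.get_bucket_encryption(Bucket=name)
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            print(f"{name}: no default encryption configured")
        else:
            raise
```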

Here is where I make a super cool (and free) recommendation on technology to accelerate your learning about these topics. We have a knowledge base with hundreds of cloud rules mapped to the Well-Architected Framework (and others!) to help accelerate your knowledge during and after your cloud migration. Let us take the use case above on multi-factor authentication. Our knowledge base article here details the four R’s: Risk, Reason, Rationale, and References on why MFA is a security best practice.

Starting with a risk level and detailing why this presents a threat to your configurations is a great way to begin prioritizing findings. The article also includes the relevant compliance mandates and Well-Architected pillar (obviously Security in this case), as well as descriptive links to the different frameworks for even more detail.

The reason this knowledge base rule is in place is also included. This gives you and your teams context for the rule and helps further drive your posture during your cloud migration. A sample reason for our MFA use case is as follows:

“As a security best practice, it is always recommended to supplement your IAM user names and passwords by requiring a one-time passcode during authentication. This method is known as AWS Multi-Factor Authentication and allows you to enable extra security for your privileged IAM users. Multi-Factor Authentication (MFA) is a simple and efficient method of verifying your IAM user identity by requiring an authentication code generated by a virtual or hardware device on top of your usual access credentials (i.e. user name and password). The MFA device signature adds an additional layer of protection on top of your existing user credentials making your AWS account virtually impossible to breach without the unique code generated by the device.”

If Reason is the “what” of the rule, Rationale is the “why,” supplying you with the justification for adoption. Again, this is perfect for confirming your cloud migration path and strategy along the way.

“Monitoring IAM access in real-time for vulnerability assessment is essential for keeping your AWS account safe. When an IAM user has administrator-level permissions (i.e. can modify or remove any resource, access any data in your AWS environment and can use any service or component – except the Billing and Cost Management service), just as with the AWS root account user, it is mandatory to secure the IAM user login with Multi-Factor Authentication.

Implementing MFA-based authentication for your IAM users represents the best way to protect your AWS resources and services against unauthorized users or attackers, as MFA adds extra security to the authentication process by forcing IAM users to enter a unique code generated by an approved authentication device.”

Finally, all the references for the risk, reason, and rationale are included at the bottom, which helps provide additional clarity. You’ll also notice remediation steps, the fifth ‘R,’ when applicable, which show you how to actually correct the problem.
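As a companion to the MFA rule discussed above, here is a minimal audit sketch, assuming boto3 and AWS credentials, that reports IAM users with console access but no MFA device. It is an illustration of the check, not the Conformity remediation steps.

```python
# Minimal sketch: list IAM users that can log in to the console but have no
# MFA device attached. Assumes boto3 and AWS credentials are configured.
import boto3
from botocore.exceptions import ClientError

iam = boto3.client("iam")

for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        name = user["UserName"]
        try:
            iam.get_login_profile(UserName=name)  # raises NoSuchEntity if no console password
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchEntity":
                continue  # no console login; MFA check not applicable here
            raise
        if not iam.list_mfa_devices(UserName=name)["MFADevices"]:
            print(f"{name}: console access enabled but no MFA device")
```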

All of this data is made available to the community as Trend Micro continues to be a valued security research firm helping to make the world safe for exchanging digital information. Explore all the rules we have available in our public knowledge base: https://www.cloudconformity.com/knowledge-base/.

This blog is part of a multi-part series dealing with the principles of a successful cloud migration.  For more information, start at the first post here: https://blog.trendmicro.com/principles-of-a-cloud-migration-from-step-one-to-done/


Why do I need a CASB for Shadow IT when I already have a SIEM?

Why does my organization need to have a Shadow IT solution when we already own a Next-Gen Firewall / Web Proxy and have all the logs in a Security Information and Event Management (SIEM) solution?

This is a question we are often asked by our customers. The answer is that MVISION Cloud CASB allows organizations to uncover Shadow IT usage that is not visible via a query in a SIEM or with Next-Generation Firewall (NGFW) / Secure Web Gateway (SWG) tools. NGFW and Web Proxies typically catalog web services using a category and a reputation score. So, a Russian email service, like mail.ru, would simply be categorized as “Web-based Email” with “Trustworthy” reputation. A typical output of a web reputation score from NGFW / SWG is shown below.

Source: WebRoot BrightCloud Threat Intelligence

What it doesn’t tell you is that mail.ru is hosted in Russia, that it does not encrypt user data at rest, and that it is a source of leaks to the Darknet. It’s definitely not the kind of site a security-conscious organization would want its employees using at work.

The reason for this discrepancy in cloud service assessment is that NGFW/SWG products primarily look at cloud services from a traditional cybersecurity perspective: Is the site a source for spam, web attacks, malware, etc.? MVISION Cloud CASB starts there, and also looks at the cloud service’s business risk. MVISION Cloud assigns each cloud service a risk score based on an assessment of 46 control points, covering over 240 risk attributes. Furthermore, McAfee MVISION Cloud maintains a detailed registry of over 26,000 cloud services, with approximately 100 new services added to the registry each month. For comparison, the registry of a leading NGFW vendor currently has a little over 3,000 services. The good news is that Shadow IT data discovered by MVISION Cloud can be consumed by an organization’s existing security stack to block user access or limit the scope of user activity within a service. Here’s how this service ranks in MVISION Cloud:

McAfee often gets asked the following question: If Shadow IT findings are based on web traffic log data stored in a SIEM, why can’t I find information about an organization’s Shadow IT usage directly from a SIEM console? The main reason is that a SOC analyst doesn’t know what he doesn’t know. If asked to “show me all PDF converters hosted outside of the US that are used on the organization’s network,” where does a SOC analyst even start? What does he search for?

The easier route is to utilize McAfee MVISION Cloud CASB and search the MVISION Cloud Registry for “Document Conversion” services to see which unsanctioned PDF converters are “in use.” The SOC analyst can then send the MVISION Cloud Registry data about the suspect services directly to a SIEM via API. This data can now be used to seed searches within the SIEM tool for further analysis by the SOC analyst.

Another scenario where MVISION Cloud makes a traditional SIEM more “cloud aware” is logging URL space for complex services. For example, if a SOC analyst wants to block Netflix and creates a rule to block all *.netflix.com URLs, he will be surprised to find that Netflix is not actually blocked, and users can still access the content. The reason for this is that most NGFW/SWG products know of only a handful of ways to get to a cloud service. MVISION Cloud, through its crowd-sourcing approach, knows of hundreds of ways to get to a cloud service and updates these as URLs change. Going back to the Netflix example, below is a screenshot from the MVISION Cloud console showing some of the other URLs associated with the video streaming service.

If a SOC analyst searches for *.netflix.com in a SIEM console, he will only get a partial view of all Netflix activity. The SOC analyst would need MVISION Cloud to figure out the *.nflxvideo.net domains and other ephemeral URL strings to get a complete view of the Netflix service on the organization’s network. Ultimately, MVISION Cloud for Shadow IT should be used as a complementary tool to an organization’s SIEM capability. It’s a symbiotic relationship. An organization’s SIEM is the source of Shadow IT data for MVISION Cloud, but it is MVISION Cloud that makes the SIEM tool cloud aware.
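Here is a toy sketch of the matching problem described above: a registry that maps one service to many domains catches traffic that a single *.netflix.com search would miss. The domain list and URLs are a tiny illustrative sample, not the MVISION Cloud Registry.

```python
# Toy illustration: mapping URLs to a cloud service via a multi-domain registry.
from urllib.parse import urlparse

# Hypothetical registry entry: one service, several of its known domains.
SERVICE_DOMAINS = {
    "Netflix": {"netflix.com", "nflxvideo.net"},
}

def service_for(url):
    host = urlparse(url).hostname or ""
    for service, domains in SERVICE_DOMAINS.items():
        if any(host == d or host.endswith("." + d) for d in domains):
            return service
    return None

# A SIEM search limited to *.netflix.com would miss the second URL entirely.
for url in ["https://www.netflix.com/browse", "https://example-cdn.nflxvideo.net/segment"]:
    print(url, "->", service_for(url))
```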

Keep reading about MVISION Cloud here.

The post Why do I need a CASB for Shadow IT when I already have a SIEM? appeared first on McAfee Blogs.

Top 10 Cloud Privacy Recommendations for Businesses

In the corporate world, privacy refers to employee/business data as well as customer/supplier data—you must safeguard both of them. Laws such as CCPA and GDPR, not to mention vertical market regulations, make it clear how important this issue is to regulators, who take into account the security tools in use and their settings during investigations. (Fines can be significantly lower if tools are well deployed.)

As businesses continue to accelerate to the cloud, there’s no better time to review all aspects of cloud data collection, use, storage, transfer and processing.

  1. Investigate shadow IT, unsanctioned cloud providers and THEIR security

The organization’s data can easily leak via shadow cloud services; for example, users converting a PDF of the employee phone list, translating a project plan, or using a cloud-based presentation tool or unmanaged collaboration services. The corporation is responsible for data loss from its employees, no matter how it occurs. So IT needs visibility into all cloud services, even those set up by individual users or small groups. Once you have a comprehensive picture of unsanctioned cloud usage, this information should be shared with the purchasing team to help them decide which services to approve.

  2. Integrate with global SSO

Global single sign-on services can ensure that users’ access is removed from all services when they leave the organization, as well as reduce the risk of data loss from password reuse. In a non-SSO service, users often call the helpdesk team when they’ve forgotten their passwords, so SSO has the added benefit of reducing call volume.

  3. Work with GRC and workshop how users use cloud

GRC (governance, risk and compliance) should be brought in to help define cloud use policies. Often, they are unsure how clouds are being used and what data is being uploaded, and therefore policies are general. Create a team including users, GRC and IT security to define policies for the real world by reviewing the possible actions that can be taken in each particular cloud service and ensure policies are defined for all eventualities.

  4. Review IaaS – Don’t assume DevOps did everything right

The fastest-growing area of cloud is IaaS—AWS, Azure and Google Cloud Platform. Here, it is very easy for developers to misconfigure the settings and leave data open to attackers.  Technology is needed to check for all IaaS services (we always find more than people believe they have) and their settings—ideally, this would be a system that can automatically change settings to secure options.
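To illustrate the kind of setting worth checking, here is a minimal sketch, assuming boto3 and AWS credentials, that flags S3 buckets without a complete Public Access Block configuration. It is one simple example of such a check, not a full posture-management product.

```python
# Minimal sketch: flag S3 buckets whose Public Access Block settings are
# missing or only partially enabled. Assumes boto3 and AWS credentials.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(config.values()):
            print(f"{name}: public access block only partially enabled: {config}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"{name}: no public access block configured")
        else:
            raise
```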

  5. Keep up to date with technology—serverless, containers, cloud email services, etc.

The cloud includes many technologies that are constantly evolving; therefore, security needs to change too. Developers are often at the forefront of technological advances—bringing in code from GitHub, running container systems that only live for a few minutes (even this isn’t too short a time to require safeguarding) and more. IT security needs to be in partnership with the development teams and deploy technologies to defend against the latest threats.

  6. Integrate with web gateway and DLP—don’t lose security as you move to cloud

After investing time and money over the last decade on security, you don’t want to lose that investment when moving to the cloud. As systems and data are moved skyward, you should deploy technologies that can integrate with your existing services and technology. For example, you shouldn’t have two different DLP models depending on the computing services used by your employees. Deploy systems that can integrate with each other, preferably with a single-pane-of-glass management system.

  7. Don’t assume CSPs will keep your logs forever

If the worst happens, you need to investigate the history of a data loss incident. CSPs will rarely save data logs forever—refer to your contract to find out how long they keep logs, and consider having your own logs so that forensic investigations can be executed even if the original data loss incident was some time ago.

  8. Consider differential policies based on location, device, etc.

Once data is in the cloud, the whole idea is to facilitate global working. Is that always appropriate? For example, what if an employee wants to download a sensitive corporate document via a cloud service to an unmanaged device? Consider the situations your employees will encounter, and form policy that provides the maximum amount of security required while causing the least amount of disruption possible.

  9. Promote the clouds you DO like to your users

Carrots work better than sticks to train users. Don’t just block the services you don’t like; widely promote the cloud services you approve of, those that conform to your security needs, performance indicators and capabilities. Promote them via the intranet, blogs and internal marketing, and redirect requests for unsupported services back to those you like.

  10. Privacy and security is everyone’s responsibility: Bring in other departments and users

Perhaps the last recommendation should be the first: Use every method available to train users, but before you do, work with those users and their representatives to define appropriate policies. The aim is to encourage users to use cloud services that are not only safe, but will allow them to be as productive as possible. The users themselves typically have great ideas of the services they’d like to use, why and how, so bring them in to help define the policies and work together with GRC.

Here’s to successful and secure cloud deployment, and to keeping your users’ and customers’ personal data as secure as you can in 2020 and beyond.

For more information, take a look at our additional resource on safeguarding your personal data in the cloud.

The post Top 10 Cloud Privacy Recommendations for Businesses appeared first on McAfee Blogs.

Top 10 Cloud Privacy Recommendations for Consumers

It’s Data Privacy Day, and when it comes down to it, most of us don’t know exactly how many organizations have our data—let alone how it’s being collected or what it is being used for. Unfortunately, the stakes are higher than ever for those who are unwilling to take appropriate safeguards to defend their personal data: the consequences include identity theft, financial loss, and more.

While the cloud presents a wealth of opportunity for increased productivity, connectivity and convenience, it also requires a new set of considerations for ensuring safe use. There are many, but here are the top ten:

1. Don’t reuse passwords.

Password reuse is a common problem, especially in consumer cloud services. If you reuse passwords, you only need one of your cloud services to be breached—once criminals have stolen your credentials through one service, they potentially have access to every account that shares those same credentials, including banking platforms, email and other services where sensitive data is stored. When using a cloud service for the first time, it’s easy to think that if the data you are using in that particular service isn’t confidential, then it doesn’t matter if you use your favorite password. But a good way to think of it is this: Many passwords, one breach. One password… (potentially) many breaches. If you’re concerned about being able to remember them, look into obtaining a password manager.

2. Don’t share folders, share files

Many cloud services allow collaboration or file sharing. If you only want to share a few files, share those and not a complete folder. It’s all too easy to over-share without realizing what else is in the folder—or to forget who you shared it with (or that you shared it at all!) and later add private files that were never meant to be disseminated.

3. Be careful with auto-sync (it could bring in malware)

If you share a folder with someone else, many cloud services provide auto-sync, so that when another user adds new files, they get synced to everyone in the share. The danger here is that if someone you are sharing with gets infected by malware, this malware could be uploaded to the cloud and downloaded to your devices automatically.

4. Be careful of services that ask for your data

When logging into a new service, you may be asked for some personal data; for example, your date of birth. Why should they ask, and what will they do with this information?  If they can tie that to your email address, and another service obtains your zip-code and a third service asks for your mobile number, you can see that anyone collating that information could have enough to try to steal your identity. If there’s no reason why a service should have that data, use a different service (or, at least, give them incorrect information).

5. Read EULA & privacy policies – who owns the data?

I know this sounds hard, but it is worth it: Does the cloud provider claim that they own the data you upload? This may give them the right, or at least enough rights in their own mind, to sell your data to data brokers. This is more common than you think—you should never use a service that claims it owns your data.

6. Think twice about mobile apps and their data collection

Many cloud services have a mobile app as a way to access their service. Before using a mobile app, look at the data it says it will collect. Often the app collects more data than would be collected if you were to access the service via browser.

7. If unsure, ask your IT department if they have reviewed the service.

Some organizations’ IT departments will have already reviewed a cloud service and decided if it is acceptable for corporate use. It’s in their interest to keep their users secure, especially as so many devices now contain both personal and business data. Ask them if they have reviewed a service before you access it.

8. Don’t use public Wi-Fi hotspots without using a VPN for encryption.

Public Wi-Fi can be a place for data interception. Always use a VPN or encryption technology to ensure data is encrypted between your device and cloud services when on a public Wi-Fi.

9. Enable multi-factor authentication.

Cloud services that are well designed will offer additional security services, such as multi-factor authentication. Use those, and any other security features that you can.

10. Don’t share accounts with friends and family.

It’s often second nature to share with our friends and family. But are they as concerned about privacy as you are? Don’t share accounts; otherwise, if they let their guard drop, your data could be compromised.

Check out more ways to take action and protect your data. 

Take a look at our additional resource for safeguarding your personal data in the cloud.

The post Top 10 Cloud Privacy Recommendations for Consumers appeared first on McAfee Blogs.

Happy Birthday TaoSecurity.com


Nineteen years ago this week I registered the domain taosecurity.com:

Creation Date: 2000-07-04T02:20:16Z

This was 2 1/2 years before I started blogging, so I don't have much information from that era. I did create the first taosecurity.com Web site shortly thereafter.

I first started hosting it on space provided by my then-ISP, Road Runner of San Antonio, TX. According to archive.org, it looked like this in February 2002.


That is some fine-looking vintage hand-crafted HTML. Because I lived in Texas, I apparently reached for the desert theme with the light tan background. Unfortunately, I didn't have the "under construction" GIF working for me.

As I got deeper into the security scene, I decided to simplify and adopt a dark look. By this time I had left Texas and was in the DC area, working for Foundstone. According to archive.org, the site looked like this in April 2003.


Notice I've replaced the oh-so-cool picture of me doing American Kenpo in the upper-left-hand corner with the classic Bruce Lee photo from the cover of The Tao of Jeet Kune Do. This version marks the first appearance of my classic TaoSecurity logo.

A little more than two years later, I decided to pursue TaoSecurity as an independent consultant. To launch my services, I painstakingly created more hand-written HTML and graphics to deliver this beauty. According to archive.org, the site looked like this in May 2005.


I mean, can you even believe how gorgeous that site is? Look at the subdued gray TaoSecurity logo, the red-highlighted menu boxes, etc. I should have kept that site forever.

We know that's not what happened, because that wonder of a Web site only lasted about a year. Still to this day not really understanding how to use CSS, I used a free online template by Andreas Viklund to create a new site. According to archive.org, the site appeared in this form in July 2006.


After four versions in four years, my primary Web site stayed that way... for thirteen years. Oh, I modified the content, SSH'ing into the server hosted by my friend Phil Hagen, manually editing the HTML using vi (and being careful not to touch the CSS).

Then, I attended AWS re:Inforce during the last week of June 2019. I realized that although I had tinkered with Amazon Web Services as early as 2010, and had been keeping an eye on it as early as 2008, I had never hosted any meaningful workloads there. A migration of my primary Web site to AWS seemed like a good way to learn a bit more about AWS and an excuse to replace my teenage Web layout with something that rendered a bit better on a mobile device.

After working with Mobirise, AWS S3, AWS CloudFront, AWS Certificate Manager, AWS Route 53, my previous domain name servers, and my domain registrar, I'm happy to say I have a new TaoSecurity.com Web site. The front page looks like this:


The background is an image of Milnet from the late 1990s. I apologize for the giant logo in the upper left. It should be replaced by a resized version later today when the AWS Cloudfront cache expires.
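For anyone curious how the S3-plus-CloudFront arrangement handles an update like that, here is a minimal sketch, assuming boto3: upload the replacement image to the S3 origin and invalidate the cached copy rather than waiting for the cache to expire. The bucket name, file names, and distribution ID are placeholders, not my actual configuration.

```python
# Minimal sketch: push a replacement image to the S3 origin and invalidate the
# CloudFront cache for that path. All names and IDs below are placeholders.
import time
import boto3

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

s3.upload_file(
    "logo-resized.png",              # local, resized replacement
    "example-website-bucket",        # placeholder S3 bucket backing the site
    "images/logo.png",
    ExtraArgs={"ContentType": "image/png"},
)

cloudfront.create_invalidation(
    DistributionId="EXAMPLE123",     # placeholder distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/images/logo.png"]},
        "CallerReference": str(time.time()),  # any unique string per request
    },
)
```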

Scrolling down provides information on my books, which I figured is what most people who visit the site care about.


For reference, I moved the content (which I haven't updated) about news, press, and research to individual TaoSecurity Blog posts.

It's possible you will not see the site, if your DNS servers have the old IP addresses cached. That should all expire no later than tomorrow afternoon, I imagine.

Let's see if the new site lasts another thirteen years.

Thoughts on Cloud Security

Recently I've been reading about cloud security and security with respect to DevOps. I'll say more about the excellent book I'm reading, but I had a moment of déjà vu during one section.

The book described how cloud security is a big change from enterprise security because it relies less on IP-address-centric controls and more on users and groups. The book talked about creating security groups, and adding users to those groups in order to control their access and capabilities.
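To make that concrete, here is a minimal sketch of the group-based pattern using AWS IAM via boto3. It is one example of the model the book describes, not the book's own code; the group name, user name, and policy are illustrative.

```python
# Minimal sketch of group-based access control in the cloud: create a group,
# grant it capabilities via a managed policy, then add a user to the group.
import boto3

iam = boto3.client("iam")

iam.create_group(GroupName="readonly-analysts")          # illustrative group name

# Capabilities belong to the group, not to individual users.
iam.attach_group_policy(
    GroupName="readonly-analysts",
    PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",   # AWS-managed read-only policy
)

# An individual's access is controlled by group membership.
iam.add_user_to_group(GroupName="readonly-analysts", UserName="alice")  # illustrative user
```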

As I read that passage, it reminded me of a time long ago, in the late 1990s, when I was studying for the MCSE, then called the Microsoft Certified Systems Engineer. I read the book at left, Windows NT Security Handbook, published in 1996 by Tom Sheldon. It described the exact same security process of creating security groups and adding users. This was core to the new NT 4 role-based access control (RBAC) implementation.

Now, fast forward a few years, or all the way to today, and consider the security challenges facing the majority of legacy enterprises: securing Windows assets and the data they store and access. How could this wonderful security model, based on decades of experience (from the 1960s and 1970s no less), have failed to work in operational environments?

There are many reasons one could cite, but I think the following are at least worthy of mention.

The systems enforcing the security model are exposed to intruders.

Furthermore:

Intruders are generally able to gain code execution on systems participating in the security model.

Finally:

Intruders have access to the network traffic which partially contains elements of the security model.

From these weaknesses, a large portion of the security countermeasures of the last two decades have been derived as compensating controls and visibility requirements.

The question then becomes:

Does this change with the cloud?

In brief, I believe the answer is largely "yes," thankfully. Generally, the systems upon which the security model is being enforced are not able to access the enforcement mechanism, thanks to the wonders of virtualization.

Should an intruder find a way to escape from their restricted cloud platform and gain hypervisor or management network access, then they find themselves in a situation similar to the average Windows domain network.

This realization puts a heavy burden on the cloud infrastructure operators. The major players are likely able to acquire and apply the expertise and resources to make their infrastructure far more resilient and survivable than their enterprise counterparts.

The weakness will likely be their personnel.

Once the compute and network components are sufficiently robust from externally sourced compromise, then internal threats become the next most cost-effective and return-producing vectors for dedicated intruders.

Is there anything users can do as they hand their compute and data assets to cloud operators?

I suggest four moves.

First, small- to mid-sized cloud infrastructure users will likely have to piggyback or free-ride on the initiatives and influence of the largest cloud customers, who have the clout and hopefully the expertise to hold the cloud operators responsible for the security of everyone's data.

Second, lawmakers may also need improved whistleblower protection for cloud employees who feel threatened by revealing material weaknesses they encounter while doing their jobs.

Third, government regulators will have to ensure that no cloud provider assumes a monopoly, and that no two providers form a duopoly. We may end up with the three major players and a smattering of smaller ones, as is the case with many mature industries.

Fourth, users should use every means at their disposal to select cloud operators not only on their compute features, but on their security and visibility features. The more logging and visibility exposed by the cloud provider, the better. I am excited by new features like the Azure network tap and hope to see equivalent features in other cloud infrastructure.

Remember that security has two main functions: planning/resistance, to try to stop bad things from happening, and detection/response, to handle the failures that inevitably happen. "Prevention eventually fails" is one of my long-time mantras. We don't want prevention to fail silently in the cloud. We need ways to know that failure is happening so that we can plan and implement new resistance mechanisms, and then validate their effectiveness via detection and response.

Update: I forgot to mention that the material above assumed that the cloud users and operators made no unintentional configuration mistakes. If users or operators introduce exposures or vulnerabilities, then those will be the weaknesses that intruders exploit. We've already seen a lot of this happening and it appears to be the most common problem. Procedures and tools which constantly assess cloud configurations for exposures and vulnerabilities due to misconfiguration or poor practices are a fifth move which all involved should make.
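As a small illustration of that fifth move, here is a minimal sketch, assuming boto3 and AWS credentials, of a recurring configuration check: flag security groups that allow inbound traffic from anywhere. A real assessment tool covers far more rules; this conveys only the flavor of the idea.

```python
# Minimal sketch of a recurring configuration assessment: report EC2 security
# groups with inbound rules open to the entire internet (0.0.0.0/0).
import boto3

ec2 = boto3.client("ec2")

for page in ec2.get_paginator("describe_security_groups").paginate():
    for group in page["SecurityGroups"]:
        for rule in group["IpPermissions"]:
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    port = rule.get("FromPort", "all")
                    print(f"{group['GroupId']} ({group['GroupName']}): open to 0.0.0.0/0 on port {port}")
```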

A corollary is that complexity can drive problems. When the cloud infrastructure offers too many knobs to turn, then it's likely the users and operators will believe they are taking one action when in reality they are implementing another.