
Context Matters: Turning Data into Threat Intelligence


1949, 1971, 1979, 1981, 1983 and 1991.

Yes, these are numbers. You more than likely even recognize them as years. However, without context you wouldn’t immediately recognize them as years in which Sicily’s Mount Etna experienced major eruptions.

Data matters, but only if it’s paired with enough context to create meaning.

While today’s conversations about threat intelligence tend to throw around a ton of impressive numbers and fancy stats, if the discussion isn’t informed by context, the numbers become noise. Context is how Webroot takes the wealth of information it gathers—data from more than 67 million sources, including crawlers, honeypots, and partner and customer endpoints—and turns it into actionable, contextual threat intelligence.

What defines contextual threat intelligence?

When defining contextual threat intelligence, it can be helpful to focus on what it is not. It’s not a simple list of threats that’s refreshed periodically. A list of known phishing sites may be updated daily or weekly, but given that the average lifespan of an in-use phishing site is mere hours, there’s no guarantee such lists are up to date.

“Some threat intelligence providers pursue the low-hanging fruit of threat intelligence—the cheap and easy kind,” says Webroot Sr. Product Marketing Manager Holly Spiers. “They provide a list of IP addresses that have been deemed threats, but there’s no context as to why or when they were deemed a threat. You’re not getting the full story.”

Contextual threat intelligence is that full story. It provides not only a constantly updated feed of known threats, but also historical data and relationships between data objects for a fuller picture of the history of a threat based on the “internet neighborhood” in which it’s active.

Unfortunately, historical relationships are another aspect often missing from low-hanging threat intelligence sources. Since threat actors are constantly trying to evade detection, they may use a malicious URL for a period before letting it go dormant while its reputation cools down. But because it takes more effort to start from scratch, it’s likely the actor will return to it before too long.

“Our Threat Investigator tool, a visualization demo that illustrates the relationships between data objects, is able to show how an IP address’s status can change over a period of time,” says Spiers. “Within six months, it may show signs of being a threat, and then go benign.”
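To make that idea concrete, here’s a minimal Python sketch of how a reputation timeline for a single IP address might be recorded and queried. The data, field names, and verdicts are invented for illustration; they are not Webroot’s actual schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Observation:
    observed: date   # when the verdict was recorded
    verdict: str     # "benign" or "malicious"
    reason: str      # why the verdict was assigned

# Illustrative history for one IP address: malicious for a stretch,
# then dormant/benign while its reputation "cools down".
history = [
    Observation(date(2019, 1, 10), "malicious", "hosted phishing kit"),
    Observation(date(2019, 2, 2),  "malicious", "linked to known C2 domain"),
    Observation(date(2019, 4, 15), "benign",    "no activity observed"),
]

def verdict_as_of(history, when):
    """Return the most recent verdict on or before a given date."""
    past = [o for o in history if o.observed <= when]
    return max(past, key=lambda o: o.observed).verdict if past else "unknown"

print(verdict_as_of(history, date(2019, 3, 1)))   # malicious
print(verdict_as_of(history, date(2019, 5, 1)))   # benign
```

Querying by date, rather than storing only the latest verdict, is what preserves the history a static threat list throws away.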

What are the elements of context?

Over the course of a year, millions of internet objects change state from benign to malicious and back numerous times as cyber criminals attempt to avoid detection. And because threats are often interconnected, being able to map their relationships allows us to better predict whether a benign object has the potential to turn malicious. It also helps us protect users from never-before-seen threats and even predict where future attacks may come from.

That’s where the power in prediction lies—in having contextual and historical data instead of looking at a static point in time.

Some elements that are needed to provide a deeper understanding of an interwoven landscape include:

  • Real-time data from real-world sources, supplemented by active web crawlers and passive sensor networks of honeypots designed to attract threats, provides the raw input needed to train machine learning models to spot threats.
  • The ability to analyze the relationships connecting data objects lets threat intelligence providers see how a benign IP address, for example, may be only one step away from a malicious URL, and predict with high confidence whether that IP address will turn malicious in the future.
  • Both live and historical data help in developing a trusted reputation score based on behavior over time and common reputational influencers such as age, popularity, and past infections. (A rough sketch of how such a score might be combined follows this list.)
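As promised above, here is a hedged Python sketch of how those reputational influencers might be combined into a single score. The weights, thresholds, and feature names are all invented for the example; a production system would derive them from live and historical data rather than hard-coding them.

```python
def reputation_score(age_days, daily_visits, past_infections, days_since_infection):
    """Toy reputation score in [0, 100]; higher means more trustworthy.

    Weights are illustrative only: older, more popular objects with a
    clean history score higher; recent infections drag the score down.
    """
    age_factor        = min(age_days / 365.0, 1.0)          # mature objects earn trust
    popularity_factor = min(daily_visits / 10_000.0, 1.0)   # widely used objects earn trust
    # Infections fade from the score over roughly six months.
    infection_penalty = past_infections * max(0.0, 1.0 - days_since_infection / 180.0)

    raw = 0.5 * age_factor + 0.5 * popularity_factor - 0.4 * infection_penalty
    return round(max(0.0, min(raw, 1.0)) * 100)

# An established, popular domain whose one infection is long past:
print(reputation_score(age_days=2000, daily_visits=50_000,
                       past_infections=1, days_since_infection=400))  # 100
# A brand-new, obscure domain with recent infections:
print(reputation_score(age_days=30, daily_visits=200,
                       past_infections=2, days_since_infection=10))   # 0
```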

Seeing the signal through the noise

Context is the way to turn terabytes of data into something meaningful that prompts action. Having the power to be able to dig into the relationships of internet objects provides the context that matters to technology vendors. For consumers of contextual threat intelligence, it means fewer false positives and the ability to prioritize real threats.

“Working with real-world vendors is key,” according to Spiers. “The reach of contextual threat intelligence and number of individuals it touches can grow exponentially.”

Interested in learning more about contextual threat intelligence? Read about the importance of data quality for a threat intelligence platform in our latest issue of Quarterly Threat Trends.


Webroot DNS Protection: Now Leveraging the Google Cloud Platform


We are excited to announce that Webroot® DNS Protection now runs on Google Cloud Platform (GCP). Leveraging GCP in this way will provide Webroot customers with improved security, performance, and reliability.

Security

Preventing denial of service (DoS) attacks is a core benefit of Webroot DNS Protection. Now, the solution also benefits from Google Cloud load balancers with built-in DoS protection and mitigation, stopping attack traffic before it ever hits the agent core.

“The big thing about Google Cloud is that it dynamically manages denial of service (DoS) attacks,” said Webroot Sales Engineer Jonathan Barnett. “That happens automatically, and we know Google has that figured out.”

Click here to learn why businesses need DNS protection.

Performance

With this release, Webroot DNS Protection now runs on Google Cloud’s high-redundancy, low-latency networks in 16 regions worldwide. That means there’s no need for a Webroot customer in Australia to have a DNS request resolved in Los Angeles when more convenient infrastructure exists close by.

“Google Cloud provides the ability to scale by adding new regions or new servers whenever necessary as load or need determines, nationally or internationally,” said Barnett. “This allows us to provide geolocation-appropriate answers for our customers, maximizing performance.”
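The effect of resolver proximity is easy to observe for yourself. The following minimal Python sketch times hostname resolution through whatever resolver your system is configured to use; it is purely illustrative and does not query Webroot’s infrastructure.

```python
import socket
import time

def resolution_time_ms(hostname, attempts=3):
    """Time hostname resolution via the system's configured resolver.

    Returns the fastest of several attempts, in milliseconds. A resolver
    in a nearby region will generally answer faster than one on another
    continent (note: repeat attempts may be served from a local cache).
    """
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        socket.getaddrinfo(hostname, 443)
        timings.append((time.perf_counter() - start) * 1000)
    return min(timings)

print(f"example.com resolved in {resolution_time_ms('example.com'):.1f} ms")
```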

Reliability

Because of GCP’s global infrastructure footprint, Webroot can quickly and easily provision more of Google’s servers in any region to ensure latency times remain low. 

And because those regional deployments can be programmed to auto-scale with spikes in traffic, even drastically increasing loads won’t increase wait times for requests.

According to Barnett, “Even if Webroot were to take on a large number of customers in a short time period, say with the closing of a deal to offer DNS solutions to an enterprise-level client with a number of subsidiaries, our environments would automatically scale with the additional load.”

One more note on the release 

Another key feature of the April DNS agent update is switching communications from port 53, which is typically associated with DNS requests, to port 443, the port used for HTTPS traffic secured by SSL/TLS.

The reason for this change is that, because port 443 carries routine encrypted traffic (connections to banking sites, payment pages, and the like), it is rarely constrained, modified, or controlled. This will reduce the need to configure firewalls or make other admin adjustments in order for Webroot DNS Protection to function as intended.
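To illustrate the general idea of carrying DNS over port 443, here is a short Python sketch. It uses Google’s public DNS-over-HTTPS JSON API purely as a stand-in; it is not Webroot’s agent protocol.

```python
import json
import urllib.request

def resolve_over_443(hostname):
    """Resolve a hostname over HTTPS (port 443) instead of classic UDP/53.

    Uses Google's public DNS-over-HTTPS JSON API as an illustration;
    Webroot's agent talks to its own resolvers.
    """
    url = f"https://dns.google/resolve?name={hostname}&type=A"
    with urllib.request.urlopen(url, timeout=5) as resp:
        answer = json.load(resp)
    return [record["data"] for record in answer.get("Answer", [])]

print(resolve_over_443("example.com"))  # e.g. ['93.184.216.34']
```

Because this traffic is indistinguishable from ordinary HTTPS at the port level, it passes through most firewalls without special rules.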

It’s good to be in good company

With Webroot DNS Protection now leveraging GCP, Google’s infrastructure powers your network-level protection: fewer outages, lower latency, and fewer bottlenecks. Ready to experience Webroot DNS Protection for yourself? Try it free for 30 days here.


The Importance of the MSP Sales Process


I’ve been in this business a long time, and I can honestly say that many MSPs lack a concrete sales process structure. That’s pretty worrisome because, let’s face it, you have to have a plan in order to succeed at just about anything. Imagine you’re an engineer working on server maintenance or a network infrastructure build—you wouldn’t do that without a plan, would you? Your sales strategy should be handled no differently. 

Dos and Don’ts for your Sales Process

First, let’s talk about some don’ts. Avoid taking a call and immediately giving a quote over the phone, as well as going straight to the customer site to conduct ad hoc assessments and sales presentations in the same breath. To build value, you need to stretch this into multiple touches, by which I mean multiple meetings. Sure, that’s more work for you up front, but it’s crucial for establishing trust with the client. You need to open and sustain a dialog about their needs so you can tailor a unique solution for them, without diving right into a pitch. By leading with careful consideration and attention to their needs, you can begin building a lasting relationship and, eventually, bring them a better offering.

Here’s how I recommend you structure your process.

Schedule an on-site strategy session with your client.

Meeting with a prospect face-to-face will demonstrate your investment in a trust relationship. Now, you have to listen to them. Don’t lead with a pitch. Let them tell you what their problems are, pay close attention to them as they express their needs, and take note of all their pain points.

This is also the ideal opportunity to truly grasp whether the demands are excessive or unreasonable for your capabilities. Each relationship you enter into with clients is a partnership that comes with shared responsibilities. Be more than a fulfill-and-deliver shop.

Perform an in-depth assessment and discovery.

You need to discover everything that’s on the client’s network and assess exactly where they stand. Don’t do this on the same day as that initial meet; schedule a second one. Take the extra time between the meetings to prepare more specific questions that will delve more deeply into the needs your prospect expressed. This will help show the client that you’re invested in their unique challenges.

When you come back, bring an engineer or assistant with you. You need someone who can interview different staff members and find out about the specific issues they face. Ask basic questions to understand how employees feel about where the company’s IT stands: What kind of issues are you having? What do you see wrong with your computer network? How could your network be improved? What would you like to see change?

As you’re doing your assessment and discovery, make sure to bring cybersecurity into the discussion. Clients often have a poor experience with managed cybersecurity, so this is your chance to feel out how else you can alleviate their pains (and set yourself apart from their current provider).

And, finally, book the third meeting. 

Make the pitch.

Ideally, your third meeting would be at your location. If there’s some reason you can’t do it in your own shop, take the prospect off-site for lunch at a restaurant that has private meeting rooms. Essentially, you want to avoid doing the presentation in their office, where they can easily get interrupted.

In this case, it will pay to be overly prepared. Again, if you listened closely, the prospect will have already told you what to focus on to help them succeed. Use that knowledge to craft the right message to deliver during this meeting.

Start by walking through the pain points they and their employees revealed. Talk over anything else you found in your discovery/assessment that could be improved. Have an itemized list, and then ask them if they agree with all the issues you’ve found.

Once you get agreement, then you can go into your sales pitch and present them with a well-tailored offering that can actually solve their challenges and help them grow. 

Ultimately, by listening to your prospect, exhibiting an understanding of their needs, and demonstrating your level of commitment to providing value and nurturing the relationship itself, you’ll be well on your way to building a meaningful, successful business partnership.

Download my Multi-Million Dollar MSP Sales Process that will guide you through the above steps like a pro. The last few pages of the document include links to helpful templates as well as worksheets for you to hit the ground running on this process.   

Keep crushing it!


What Defines a Machine Learning-Based Threat Intelligence Platform?


As technology continues to evolve, several trends are staying consistent. First, the volume of data is growing exponentially. Second, human analysts can’t hope to keep up—there just aren’t enough of them and they can’t work fast enough. Third, adversarial attacks that target data are also on the rise.

Given these trends, it’s not surprising that an increasing number of tech companies are building or implementing tools that promise automation and tout machine learning and/or artificial intelligence, particularly in the realm of cybersecurity. In this day and age, stopping threats effectively is nearly impossible without some next-generation method of harnessing processing power to bear the burden of analysis. That’s where the concept of a cybersecurity platform built on threat intelligence comes in.

What is a platform?

When you bring together a number of elements in a way that makes the whole greater or more powerful than the sum of its parts, you have the beginnings of a platform. Think of it as an architectural basis for building something greater on top. If built properly, a good platform can support new elements that were never part of the original plan.

With so many layers continually building on top of and alongside one another, you can imagine that a platform needs to be incredibly solid and strong. It has to be able to sustain and reinforce itself so it can support each new piece that is built onto or out of it. Let’s go over some of the traits that a well-architected threat intelligence platform needs.

Scale and scalability

A strong platform needs to be able to scale to meet demand for future growth of users, products, and functionality. Its size and processing power need to be proportional to usage. If a platform starts out too big too soon, it’s too expensive to maintain. But if it’s not big enough, it won’t be able to handle the burden its users impose. That, in turn, will affect the speed, performance, service availability, and overall user experience of the platform.

You also need to consider that usage fluctuates, not just over the years, but over different times of day. The platform needs to be robust enough to load balance accordingly, as users come online, go offline, increase and decrease demand, etc.

Modularity can’t be forgotten, either. When you encounter a new type of threat, or just want to add new functionality, you need to be able to plug that new capability into the platform without disrupting existing services. You don’t want to have to worry about rebuilding the whole thing each time you want to add or change a feature. The platform has to be structured in such a way that it will be able to support functionality you haven’t even thought of yet.
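One common way to achieve that kind of modularity is a registry of analyzers that share a small interface, so a new detection capability plugs in without touching the existing dispatch code. The Python sketch below is a generic illustration of the pattern, not Webroot’s internal design; the heuristics are deliberately toy-grade.

```python
# A minimal plugin-style registry: each analyzer implements the same
# one-argument signature and registers itself under an object kind.
ANALYZERS = {}

def register(kind):
    def wrap(fn):
        ANALYZERS[kind] = fn
        return fn
    return wrap

@register("url")
def analyze_url(obj):
    return {"risky": obj.endswith(".xyz")}            # toy heuristic

@register("ip")
def analyze_ip(obj):
    return {"risky": obj.startswith("203.0.113.")}    # toy heuristic (TEST-NET range)

def analyze(kind, obj):
    """Dispatch to whichever analyzer handles this kind of object.

    Adding support for a new object kind means registering one new
    function; nothing here has to change.
    """
    handler = ANALYZERS.get(kind)
    return handler(obj) if handler else {"risky": None}

print(analyze("url", "login-update.xyz"))   # {'risky': True}
print(analyze("ip", "8.8.8.8"))             # {'risky': False}
```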

Sensing and connection

A threat intelligence platform is really only as good as its data sources. To accurately detect and even predict new security threats, a platform should be able to take data from a variety of sensors and products, then process it through machine learning analysis and threat intelligence engines.

Some of the more traditional sensors are passive “honeypots”: devices that appear open to attack and that collect and return threat telemetry when compromised. Unfortunately, attack methods are now so sophisticated that some can detect the difference between a honeypot and a real-world endpoint, and can adjust their behavior accordingly so as not to expose their methods to threat researchers. For accurate, actionable threat intelligence, the platform needs to gather real-world data from real-world endpoints in the wild.

One of the ways we, in particular, ensure the quality of the data in the Webroot® Platform is by using each deployment of a Webroot product or service—across our home user, business, and security and network vendor bases—to feed threat telemetry back into the platform for analysis. That means each time a Webroot application is installed on some type of endpoint, or a threat intelligence partner integrates one of our services into a network or security solution, our platform gets stronger and smarter.

Context and analysis

One of the most important features a threat intelligence platform needs is largely invisible to end users: contextual analysis. A strong platform should have the capacity to analyze the relationships between numerous types of internet objects, such as files, apps, URLs, IPs, etc., and determine the level of risk they pose.

It’s no longer enough to determine if a given file is malicious or not. A sort of binary good/bad determination really only gives us a linear view. For example, if a bad file came from an otherwise benign domain that was hijacked temporarily, should we now consider that domain bad? What about all the URLs associated with it, and all the files they host?

For a more accurate picture, we need nuance. We must consider where the bad file came from, which websites or domains it’s associated with and for how long, which other files or applications it might be connected to, etc. It’s these connections that give us a three-dimensional picture of the threat landscape, and that’s what begins to enable predictive protection.
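One simplified way to model those connections is as a graph of internet objects in which risk propagates along associations. The Python sketch below uses invented objects and a naive per-hop decay; real contextual analysis would weigh edge types, history, and many more signals than this.

```python
from collections import deque

# Toy graph of internet objects: an edge means "associated with".
graph = {
    "bad_file.exe":         ["hijacked-domain.com"],
    "hijacked-domain.com":  ["bad_file.exe", "landing-page.html", "198.51.100.7"],
    "landing-page.html":    ["hijacked-domain.com"],
    "198.51.100.7":         ["hijacked-domain.com", "unrelated-site.com"],
    "unrelated-site.com":   ["198.51.100.7"],
}

def propagate_risk(graph, source, decay=0.5):
    """Breadth-first risk propagation from a known-bad object.

    Each hop away from the source carries a fraction (decay) of the
    previous hop's suspicion, so neighbors of bad objects are tainted
    without being flatly marked malicious.
    """
    risk = {source: 1.0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            candidate = risk[node] * decay
            if candidate > risk.get(neighbor, 0.0):
                risk[neighbor] = candidate
                queue.append(neighbor)
    return risk

for obj, score in sorted(propagate_risk(graph, "bad_file.exe").items(),
                         key=lambda kv: -kv[1]):
    print(f"{obj:22s} {score:.2f}")
```

In this toy run, the hijacked domain inherits most of the file’s suspicion while the unrelated site two hops away inherits very little, which matches the nuance the paragraph above calls for.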

The Bottom Line

When faced with today’s cyberattacks, consumers and organizations alike need cybersecurity solutions that leverage accurate threat telemetry and real-time data from real endpoints and sensors. They need threat intelligence that is continually re-analyzed for the greatest accuracy, by machine learning models that are trained and retrained, that can process data millions of times faster than human analysts, and that offer the scalability to handle new threats as they emerge. The only way to achieve that is with a comprehensive, integrated machine learning-based platform.
