Category Archives: Executive Perspectives

Modernizing FedRAMP is Essential to Enhanced Cloud Security

According to an analysis by McAfee’s cloud division of log data tracking the activities of some 200,000 government workers in the United States and Canada, the average agency uses 742 cloud services, on the order of 10 to 20 times more than the IT department manages. The use of unauthorized applications creates severe security risks, even though it often results simply from employees trying to do their work more efficiently.

By category, collaboration tools like Office 365 or Gmail are the most commonly used cloud applications, according to McAfee’s analysis, with the average organization running 120 such services. Cloud-based software development services such as GitHub and SourceForge are a distant second, followed by content-sharing services. The average government employee uses 16.8 cloud services, according to the 2019 Cloud Adoption and Risk Report. This lack of awareness creates a Shadow IT problem that needs to be addressed. One of the challenges is that not all storage or collaboration services are created equal, and users, without guidance from the CIO, might opt for an application that has comparatively lax security controls, claims ownership of users’ data, or is hosted in a country on which the government has placed trade sanctions.
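To make the scale of this visibility gap concrete, here is a minimal, purely illustrative sketch in Python of how a shadow IT tally like the one above might be derived from cloud-access logs. The log schema, service names and sanctioned-service list are hypothetical and are not McAfee’s actual telemetry or the MVISION Cloud API.

```python
from collections import defaultdict

# Hypothetical cloud-access log records: (user, agency, cloud_service).
# Field names and values are illustrative only, not a real log schema.
LOG_RECORDS = [
    ("alice", "Agency-A", "Office 365"),
    ("alice", "Agency-A", "GitHub"),
    ("bob",   "Agency-A", "Gmail"),
    ("bob",   "Agency-A", "FileDropXYZ"),   # hypothetical unsanctioned service
    ("carol", "Agency-B", "Office 365"),
]

# Services the IT department has actually reviewed and approved (assumed list).
SANCTIONED_SERVICES = {"Office 365", "Gmail", "GitHub"}

def shadow_it_report(records, sanctioned):
    """Tally distinct cloud services per agency and flag unsanctioned ones."""
    services_by_agency = defaultdict(set)
    for user, agency, service in records:
        services_by_agency[agency].add(service)

    for agency, services in sorted(services_by_agency.items()):
        shadow = services - sanctioned
        print(f"{agency}: {len(services)} distinct services in use, "
              f"{len(shadow)} unsanctioned: {sorted(shadow) or 'none'}")

shadow_it_report(LOG_RECORDS, SANCTIONED_SERVICES)
```

In practice a CASB does this at far greater scale and enriches each service with a risk rating, but the core idea is the same: compare what the logs show is actually in use against what IT has approved.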

To help address the growing challenge of security gaps in IT cloud environments, Congressmen Gerry Connolly (D-VA), Chairman of the House Oversight and Reform Committee’s Government Operations Subcommittee, and Mark Meadows (R-NC), Ranking Member of the Government Operations Subcommittee, recently introduced the Federal Risk and Authorization Management Program (FedRAMP) Authorization Act (H.R. 3941). The legislation would codify FedRAMP, the program that governs how cloud security solutions are deployed within the federal government; address agency compliance issues; provide funding for the FedRAMP Program Management Office (PMO); and more. The FedRAMP Authorization Act would help protect single clouds as well as the spaces between and among clouds. Since cloud services are becoming easier targets for hackers, McAfee commends these legislators for taking this important step to modernize the FedRAMP program.

FedRAMP provides a standardized approach to security assessment and monitoring for cloud products and services that agency officials use to make critical risk-based decisions. Cloud security solutions act as gatekeepers, allowing agencies to extend the reach of their cloud policies beyond their current network infrastructure. To monitor data authentication and protection within the cloud, cloud access security brokers, or CASBs, give organizations deeper visibility into their cloud environments. In today’s cybersecurity market there are many cloud security vendors, so organizations have many solutions to choose from when securing their cloud environments. McAfee’s CASB, MVISION Cloud, helps ensure that broad technology acquisitions maintain or exceed the levels of security outlined in the FedRAMP baselines.

McAfee supports the FedRAMP Authorization Act, which would bring FedRAMP back to its original purpose by providing funding for federal migration and mandating the reuse of authorizations. FedRAMP must be modernized to best serve government agencies and IT companies, and we look forward to working with our partners in Congress to move this legislation forward. Additionally, we have seen agencies overuse waivers to green-light technology that sacrifices security for expediency. As this bill works its way through the legislative process and toward the President’s desk for signature into law, we must find a better way to thread this needle and ensure that broad technology acquisitions maintain or exceed the levels of security outlined in the FedRAMP baselines.

The Cybersecurity Playbook: Why I Wrote a Cybersecurity Book

I ruined Easter Sunday 2017 for McAfee employees the world over. That was the day our company’s page on a prominent social media platform was defaced—less than two weeks after McAfee had spun out of Intel to create one of the world’s largest pure-play cybersecurity companies. The hack would have been embarrassing for any company; it was humiliating for a cybersecurity company. And, while I could point the finger of blame in any number of directions, the sobering reality is that the hack happened on my watch, since, as the CMO of McAfee, it was my team’s responsibility to do everything in our power to safeguard the image of our company on that social media platform. We had failed to do so.

Personal accountability is an uncomfortable thing. Defensive behavior comes much more naturally to many of us, including me. But, without accountability, change is hindered. And, when you find yourself in the crosshairs of a hacker, change—and change quickly—you must.

I didn’t intend to ruin that Easter Sunday for my colleagues. There was nothing I wanted less than to call my CEO and peers and spoil their holiday with the news. And, I didn’t relish having to notify all our employees of the same the following Monday. It wasn’t that I was legally obligated to let anyone know of the hack; after all, McAfee’s systems were never in jeopardy. But our brand reputation took a hit that day, and our employees deserved to know that their CMO had let her guard down just long enough for an opportunistic hacker to strike.

I tell you this story not out of self-flagellation or so that you can feel, “Hey, better her than me!” I share this story because it’s a microcosm of why I wrote a book, The Cybersecurity Playbook: How Every Leader and Employee Can Contribute to a Culture of Security.

I’m not alone in having experienced an unfortunate hack that might have been prevented had my team and I been more diligent in practicing the habits that minimize such risks. Every day, organizations are attacked the world over. And, behind every hack, there’s a story. There’s hindsight of what might have been done to avoid it. While the attack on that Easter Sunday was humbling, the way in which my McAfee teammates responded, and the lessons we learned, were inspirational.

I realized in the aftermath that there’s a real need for a playbook that gives every employee—from the frontline worker to the board director—a prescription for strong cybersecurity hygiene. I realized that everyone can play an indispensable role in protecting her organization from attack. And, I grasped that common sense is not always common practice.

There’s no shortage of cybersecurity books available for your consumption from reputable, talented authors with a variety of experiences. You’ll find some from journalists, who have dissected some of the most legendary breaches in history. You’ll find others from luminaries, who speak with authority as venerable forefathers of the industry. And you’ll find more still from technical experts, who decipher the intricate elements of cybersecurity in significant detail.

But, you won’t find many from marketers. So why trust this marketer with a topic of such gravity? Because this marketer not only works for a company that has its origins in cybersecurity but also found herself on her heels that fateful Easter Sunday. I know what it’s like to have to respond—and respond fast—when time is not on your side and your reputation is in the hands of a hacker. And, while McAfee certainly had a playbook to act accordingly, I realized that every company should have the same.

So, whether you’re in marketing, human resources, product development, IT or finance—or a board member, CEO, manager or individual contributor—this book gives you a playbook to incorporate cybersecurity habits into your routine. I’m not so naïve as to believe that cybersecurity will become everyone’s primary job. But, I know that cybersecurity is now too important to be left exclusively in the hands of IT. And, I am idealistic enough to envision a workplace where sound cybersecurity practice becomes so routine that all employees regularly do their part to collectively improve the defenses of their organization. I hope this book empowers action; your organization needs you in this fight.

Allison Cerra’s book, The Cybersecurity Playbook: How Every Leader and Employee Can Contribute to a Culture of Security, is scheduled to be released September 12, 2019 and can be preordered at amazon.com.

House Actions on Election Security Bode Well for 2020

As a U.S. cybersecurity company, McAfee supports legislation that aims to safeguard U.S. election security. Since the 2016 election, McAfee has seen the importance of improving and preserving election security; we even offered free security tools to local election boards prior to the 2018 elections and released educational research on how localities can best protect themselves in future elections. As the 2020 primary elections quickly approach, it is more important than ever that the federal government take steps to ensure our election infrastructure is secure and that states and localities have the resources they need to quickly upgrade and secure their systems.

The U.S. House of Representatives recently passed H.R. 2722, the Securing America’s Federal Elections (SAFE) Act, legislation introduced by Rep. Zoe Lofgren (D-CA) that would allocate $600 million for states to secure critical election infrastructure. The bill would require cybersecurity safeguards for hardware and software used in elections, prevent the use of wireless communication devices in election systems and require electronic voting machines to be manufactured in the United States. The SAFE Act is a key step to ensuring election security and integrity in the upcoming 2020 election.

Earlier this year, the House also passed H.R. 1, the For the People Act. During a House Homeland Security Committee hearing prior to the bill’s passage, the committee showed commitment to improving the efficiency of election audits and continuing to incentivize the patching of election systems in preparation for the 2020 elections. H.R. 1 and the SAFE Act demonstrate the government’s prioritization of combating election interference. It is exciting to see the House recognize election security, as securing elections is a multifaceted process and one vital to our nation’s democracy.

McAfee applauds the House for keeping its focus on election security and prioritizing the allocation of resources to states. We hope that Senate leadership will take up meaningful, comprehensive election security legislation so our country can fully prepare for a secure 2020 election.

Expanding Our Vision to Expand the Cybersecurity Workforce

I recently had the opportunity to testify before Congress on how the United States can grow and diversify the cyber talent pipeline. It’s great that members of Congress have this issue on their radar, but at the same time, it’s concerning that we’re still having these discussions. A recent (ISC)² study puts the global cybersecurity workforce shortage at 2.93 million. Solving this problem is challenging, but I offered some recommendations to the House Homeland Security Committee’s Subcommittee on Cybersecurity, Infrastructure Protection and Innovation.

Increase the NSF CyberCorps Scholarships for Service Program

The National Science Foundation (NSF), together with the Department of Homeland Security (DHS), designed a program to attract more college students to cybersecurity, and it’s working. Ten to 12 juniors and seniors at each of the approximately 70 participating institutions across the country receive free tuition for up to two years plus annual stipends. Once they’ve completed their cybersecurity coursework and an internship, they go to work for the federal government for the same amount of time they’ve been in the program. Afterwards, they’re free to remain federal employees or move elsewhere, and fortunately, a good number of them choose to stay.

Congress needs to increase the funding for this program (which has been flat since 2017) from $55 million to at least $200 million. Today the scholarships are available at 70 land grant colleges. The program needs to be opened up to more universities and colleges across the country.

Expand CyberCorps Scholarships to Community Colleges

Community colleges attract a wide array of students – a fact that is good for the cybersecurity profession. Some community college attendees are recent high school graduates, but many are more mature, working adults or returning students looking for a career change or skills training. A strong security operation requires differing levels of skills, so having a flexible scholarship program at a community college could not only benefit graduates but also provide the profession with necessary skills.

Furthermore, not everyone in cybersecurity needs a four-year degree. In fact, they don’t need to have a traditional degree at all. Certificate programs provide valuable training, and as employers, we should change our hiring requirements to reflect that reality.

Foster Diversity of Thinking, Recruiting and Hiring

Cybersecurity is one of the greatest technical challenges of our time, and we need to be as creative as possible to meet it. In addition to continually advancing technology, we need to identify people from diverse backgrounds – and not just in the standard sense of the term. We need to diversify the talent pool in terms of race, ethnicity, gender and age, all of which lead to creating an inclusive team that will deliver better results. However, we also should seek out gamers, veterans, people working on technical certificates, and retirees from computing and from other fields such as psychology and the liberal arts as well as engineering. There is no one background required to be a cybersecurity professional. We absolutely need people with deep technical skills, but we also need teams with diverse perspectives, capabilities and levels of professional maturity.

Public-Private Sector Cross Pollination

We also must develop creative approaches to enabling the public and private sectors to share talent, particularly during significant cybersecurity events. We should design a mechanism for cyber professionals – particularly analysts or those who are training to become analysts – to move back and forth between the public and private sector so that government organizations would have a continual refresh of expertise. This type of cross-pollination would help everyone share best practices on technology, business processes and people management.

One way to accomplish this would be for DHS to partner with companies and other organizations such as universities to staff a cadre of cybersecurity professionals – operators, analysts and researchers – who are credentialed to move freely between public and private sector service. These professionals, particularly those in the private sector, could be on call to help an impacted entity and the government respond to a major attack in a timely way. Much like the National Guard, such a flexible staffing approach could become a model of excellence for closing the skills gap.

We’re Walking the Talk

McAfee is proud to support the community by establishing programs that provide skills to help build the STEM pipeline, fill related job openings, and close gender and diversity gaps. These programs include an Online Safety Program, onsite training programs and internships for high school students. Our employees also volunteer in schools to help educate students on both cybersecurity risks and opportunities. Through volunteer-run programs across the globe, McAfee has educated more than 500,000 children to date.

As part of McAfee’s new pilot Achievement & Excellence in STEM Scholarship program, we’ll make three awards of $10,000 for the 2019-2020 school year. Twelve students from each of the three partner schools will be invited to apply, in coordination with each partner institution’s respective college advisor. Target students are college-bound high school seniors with a demonstrated passion for STEM fields who are seeking a future in a STEM-related path. This type of program can easily be replicated by other companies and used to support the growth and expansion of the workforce.

We’re Supporting Diversity

While we recognize there is still more to do in fostering diversity, we’re proud to describe the strides we’re making at McAfee. We believe we have a responsibility to our employees, customers and communities to ensure our workplace reflects the world in which we live. Having a diverse, inclusive workforce is the right thing to do, and after we became an independent, standalone cybersecurity company in 2017, we made and have kept this a priority.

 The steps we’re taking include:

  • Achieving pay parity between women and men employees in April 2019, making us the first pure-play cybersecurity company to do so.
  • In 2018, 27.1% of all global hires were female and 13% of all U.S. hires were underrepresented minorities.
  • In June 2018, we launched our “Return to Workplace” program for men and women who have paused their career to raise children, care for loved ones or serve their country. The 12-week program offers the opportunity to reenter the tech space with the support and resources needed to successfully relaunch careers.
  • Last year, we established the Diversity & Culture Council, a volunteer-led global initiative focused on creating an infrastructure for the development and maintenance of an integrated strategy for diversity and workplace culture.
  • McAfee CEO Chris Young joined CEO Action for Diversity & Inclusion, the largest group of CEOs and presidents committed to acting on driving an inclusive workforce. By taking part in CEO Action, Young personally commits to advancing diversity and inclusion through the coalition’s three-pronged approach, which starts with fostering safe workplaces.

Looking to the Future

While I’d love to see a future where fewer cybersecurity professionals are needed, I know that for the foreseeable future, we’ll need not only great technology but also talented people. Given that reality, we in the industry need to expand our vision and definition of what constitutes cybersecurity talent. The workforce shortage is such that we have to expand our concepts and hiring requirements. In addition, the discipline itself will benefit from a population that brings more experiences, skills and diversity to bear on a field that is constantly changing.

A Robust Federal Cybersecurity Workforce Is Key To Our National Security

The federal government has long struggled to close the cybersecurity workforce gap. The problem has continued to worsen as the number of threats increases against our networks, critical infrastructure, intellectual property, and the millions of IoT devices we use in our homes, in our offices and on our infrastructure. Without a robust cyber workforce, federal agencies will continue to struggle to develop and execute the policies needed to combat these ongoing issues.

The recent executive order on developing the nation’s cybersecurity workforce was a key step toward closing that gap and shoring up the nation’s cyber posture. The widespread adoption of NIST’s cybersecurity workforce framework, the development of a rotational program for federal employees to expand their cybersecurity expertise and the President’s Cup competition are all crucial to retaining and growing the federal cyber workforce. If we are to get serious about closing the federal workforce gap, we have to encourage our current professionals to stay in federal service and grow their expertise to defend against the threats of today and prepare for the threats of tomorrow.

Further, we must do more to bring individuals into the field by eliminating barriers to entry and increasing the educational opportunities available to people, so that there can be a strong, diverse and growing cybersecurity workforce in both the federal government and the private sector. Expanding scholarship programs through the National Science Foundation (NSF) and the Department of Homeland Security (DHS) for students who agree to work for federal and state agencies would go a long way toward bringing new, diverse individuals into the industry. Additionally, these programs should be expanded to include many types of educational institutions, including community colleges. Community colleges attract a different type of student than four-year institutions do, increasing diversity within the federal workforce while also tapping into a currently unused pipeline of cyber talent.

The administration’s prioritization of this issue is a positive step forward, and there has been progress made on closing the cyber skills gap in the U.S., but there is still work to be done. If we want to create a robust, diverse cyber workforce, the private sector, lawmakers and the administration must work together to come up with innovative solutions that build upon the recent executive order.

McAfee Playing an Ever Growing Role in Tackling Disinformation and Ensuring Election Security

As Europe heads to the polls this weekend (May 23-26) to elect Members of the European Parliament (“MEPs”) representing the 28 EU Member States, the threat of disinformation campaigns aimed at voters looms large in the minds of politicians. Malicious players have every reason to try to undermine trust in established politicians and push voters towards the political fringes, in an effort to destabilise European politics and weaken the EU’s clout in a tense geopolitical environment.

Disinformation campaigns are of course not a new phenomenon, and have been a feature of public life since the invention of the printing press. But the Internet and social media have given peddlers of fake news a whole new toolbox, offering bad actors unprecedented abilities to reach straight into the pockets of citizens via their mobile phones, while increasing their ability to hide their true identity.

This means that the tools to fight disinformation need to be upgraded in parallel. There is no doubt that more work is needed to tackle disinformation, but credit should also go to the efforts already being made to protect citizens from misinformation during elections. The European Commission has engaged the main social media players in better reporting around political advertising and in preventing the spread of misinformation, as a complement to the broader effort to tackle illegal content online. The EU’s foreign policy arm, the European External Action Service, has also deployed a Rapid Alert System involving academics, fact-checkers, online platforms and partners around the world to help detect disinformation activities and share information among member states about disinformation campaigns and methods, helping them stay on top of the game. The EU has also launched campaigns to make citizens more aware of disinformation and to improve their cyber hygiene, inoculating them against such threats.

But adding cybersecurity research, analysis and intelligence tradecraft to the mix is a vital element of an effective public policy strategy. Recently published research by Safeguard Cyber is a good example of how cybersecurity companies can help policymakers get to grips with disinformation.

Our recent engagement with the European Commission’s think tank, the EPSC, and with Safeguard Cyber is a good example of how policymakers and cyber experts can work together, and we encourage more such collaboration and exchange of expertise in the months and years ahead. McAfee Fellow and Chief Scientist Raj Samani told more than 50 senior-ranking EU officials in early May that recent disinformation campaigns are “direct, deliberate attacks on our way of life” that seek to disrupt and undermine the integrity of the election process. He urged policymakers to use cyber intelligence and tradecraft to understand the adversary, so that our politicians can make informed decisions on how best to combat the very real threat this represents to our democracies. In practice, this means close collaboration between best-in-class cybersecurity researchers, policymakers and social media players to gain a deeper understanding of the modus operandi of misinformation actors and respond more quickly.

As the spectre of disinformation is not going to go away, we need a better understanding of the actors involved, their motivations and, most importantly, the rapidly changing technical tools they use to undermine democracy. Each new insight into tackling disinformation will be put to good use in elections later this year in Denmark, Norway, Portugal, Bulgaria, Poland, Croatia and Austria.

Why AI Innovation Must Reflect Our Values in Its Infancy

In my last blog, I explained that while AI possesses the mechanics of humanness, we need to train the technology to make the leap from mimicking humanness with logic, rationality and analytics to emulating humanness with common sense. If we evolve AI to make this leap, the impact will be monumental, but it will require our global community to take a more disciplined approach to pervasive AI proliferation. Historically, our enthusiasm for and consumption of new technology has outpaced society’s ability to evolve legal, political, social and ethical norms.

I spend most of my time thinking about AI in the context of how it will change the way we live: how it will change the way we interact, impact our social systems, and influence our morality. These technologies will permeate society, and the ubiquity of their usage in the future will have far-reaching implications. We are already seeing evidence of how AI changes the way we live and interact with the world around us.

Think Google. It excites our curiosity and puts information at our fingertips. What is tripe – should I order it off the menu? Why do some frogs squirt blood from their eyes? What does exculpatory mean?

AI is weaving the digital world into the fabric of our lives and making information instantaneously available at our fingertips.

AI-enabled technology is also capable of anticipating our needs. Think Alexa. As a security professional I am a holdout on this technology, but the allure of it is indisputable. It makes the digital world accessible with a voice command. It understands more than we may want it to. Did someone tell Alexa to order coffee pods and toilet tissue, and if not, how did Alexa know to order toilet tissue? Maybe some things I just don’t want to know.

I also find it a bit creepy when my phone assumes (and gets it right) that I am going straight home from the grocery store letting me know, unsolicited, that it will take 28 minutes with traffic. How does it know I am going home? I could be going to the gym. It’s annoying that it knows I have no intention of working out. A human would at least have the decency to give me the travel time to both, allowing me to maintain the illusion that the gym was an equal possibility.

On a more serious note, AI-enabled technology will also impact our social, political and legal systems. As we incorporate it into more products and systems, issues related to privacy, morality and ethics will need to be addressed.

These questions are being asked now, but in anticipation of AI becoming embedded in everything we interact with, it is critical that we begin to evolve our societal structures to address both the opportunities and the threats that will come with it.

The opportunities associated with AI are exciting. AI shows incredible promise in the medical world, where it is already being used in some areas. There are already tools in use that leverage machine learning to help doctors identify disease-related patterns in imaging, and research is under way using AI to help fight cancer.

For example, in May 2018, The Guardian reported that skin cancer research using a convolutional neural network (CNN), an AI-based technique, detected skin cancer 95% of the time, compared to human dermatologists who detected it 86.6% of the time. Additionally, facial recognition in concert with AI may someday be commonplace in diagnosing rare genetic disorders that today may take months or years to diagnose.
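For readers curious what such a classifier looks like in code, below is a minimal, purely illustrative sketch in Python with Keras of a small convolutional network for binary image classification. It is not the architecture from the study cited above; the input size, layer sizes and placeholder training data are assumptions for illustration only.

```python
# Minimal illustrative CNN for binary image classification (e.g., lesion vs. benign).
# NOT the model from the study cited above; sizes and data are placeholders.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_classifier(input_shape=(128, 128, 3)):
    """A small convolutional network ending in a single sigmoid output."""
    return keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # probability of the positive class
    ])

model = build_classifier()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random placeholder arrays standing in for labeled medical images.
x_train = np.random.rand(32, 128, 128, 3).astype("float32")
y_train = np.random.randint(0, 2, size=(32, 1))
model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)
```

A real diagnostic system would be trained on large, carefully labeled image sets and validated clinically; the sketch only shows the basic shape of the technique.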

But what happens when the diagnosis made by a machine is wrong? Who is liable legally? Do AI-based medical devices also need malpractice insurance?

The same types of questions arise with autonomous vehicles. Today it is always assumed a human is behind the wheel in control of the vehicle. Our laws are predicated on this assumption.

How must laws change to account for vehicles that do not have a human driver? Who is liable? How does our road system and infrastructure need to change?

The recent Uber accident case in Arizona determined that Uber was not liable for the death of a pedestrian killed by one of its autonomous vehicles. However, the safety driver, who was watching TV rather than the road, may be charged with manslaughter. How does this change when the car’s occupants are no longer safety drivers but simply passengers in fully autonomous vehicles? How will laws need to evolve at that point for cars and other types of AI-based “active and unaided” technology?

There are also risks to be considered in adopting pervasive AI. Legal and political safeguards need to be considered, either in the form of global guidelines or laws. Machines do not have a moral compass. Given that the definition of morality may differ depending on where you live, it will be extremely difficult to train morality into AI models.

Today most AI models lack the ability to determine right from wrong, ill intent from good intent, morally acceptable outcomes from morally reprehensible outcomes. AI does not understand whether the person asking the questions, providing it data or giving it direction has malicious intent.

We may find ourselves on a moral precipice with AI. The safeguards or laws I mention above need to be considered before AI becomes more ubiquitous than it already is. AI will enable humankind to move forward in ways previously unimagined. It will also provide a powerful conduit through which humankind’s greatest shortcomings may be amplified.

The implications of technology that can profile entire segments of a population with little effort are disconcerting in a world where genocide has been a tragic reality, where civil obedience is coerced using social media, and where trust is undermined by those who use misinformation to sow political and societal discontent.

There is no doubt that AI will make this a better world. It gives us hope on so many fronts where technological impasses have impeded progress. Science may advance more rapidly, medical research may progress beyond current roadblocks, and daunting societal challenges around transportation and energy conservation may be solved. AI is another tool in our technological arsenal, and the odds are overwhelmingly in favor of it improving the global human condition.

But realizing its advantages while mitigating its risks will require commitment and hard work from many conscientious minds across different quarters of our society. We in the technology community have an obligation to engage key stakeholders across the legal, political, social and scientific communities to ensure that, as a society, we define the moral guardrails for AI before it becomes capable of defining them for, or in spite of, us.

As with all technology before it, AI’s social impacts must be anticipated and balanced against the values we hold dear. Like parents raising a child, we need to establish our values and insist that the technology reflect them now, while its growth is still in its infancy.
