If an employee who recently gave two weeks’ notice starts downloading large numbers of files from the company network and copying them to a thumb drive, it is entirely possible that they have no malicious intent. The employee could simply be saving innocuous files related to their employment record or examples of marketing pieces they created.
However, in a small number of cases, the employee could be attempting to take confidential product designs, sensitive legal information, private employee data or trade secrets with them to a rival company.
It can be difficult for a company to even spot these “insider risks,” much less distinguish between routine behavior and the outlier that could destroy a company’s competitive advantage or reputation.
That’s why Microsoft is offering a new Insider Risk Management solution within Microsoft 365 that uses machine learning to intelligently detect potentially risky behavior within a company. It also quickly identifies which activities are most likely to pose real security threats, even inadvertently.
Because mistakes are a larger source of actual risk than insider attacks, the solution was designed to help employees make the right choices and avoid common security lapses. To be effective, engineers knew, the solution also had to help people do their jobs rather than slow them down.
“Fundamentally, a company’s employees are usually trying to do the right thing,” said Bret Arsenault, Microsoft’s chief information security officer and corporate vice president. “But sometimes intention is different than outcome.”
A couple of years ago, the security threats keeping Arsenault awake at night weren’t limited to the hackers, cybercriminals or nation-state attackers that Microsoft employs a small army of experts and leading-edge technologies to thwart. He increasingly worried about the potential risks, largely unintentional but occasionally malicious, from employees who already have easy access to a company’s most sensitive information.
For instance, that could include someone who inadvertently keeps sensitive information in a folder that’s searchable by anyone in the company, making it vulnerable to theft. Or the person who just hits the wrong button and mistakenly emails a highly confidential document outside the company.
In a recent survey of cybersecurity professionals, 90 percent of organizations indicated that they felt vulnerable to insider risk, and two-thirds considered malicious insider attacks or accidental breaches more likely than external attacks. More than half of organizations reported that they had experienced an insider attack in the past year, according to an insider threat report from Crowd Research Partners.
“In the security industry there has been a disproportionate amount of focus on external adversaries,” Arsenault said. “But with thousands of employees logging into a company’s systems every day, the threat of users — whether with inadvertent or malicious intent — may be a higher risk scenario. And that’s when we realized we needed to expand our focus.”
Arsenault tasked engineers from his security team and Microsoft 365 with creating a solution that leverages machine learning to intelligently detect and prevent internal security breaches, and to eventually turn that into a solution for customers. But it had to be designed with Microsoft’s core principles in mind: respecting employee privacy, assuming positive intent at the outset and encouraging the free flow of information and collaboration within a company.
The Insider Risk Management solution combines the massive array of signals from Microsoft 365 productivity tools, Windows operating systems and Azure cloud services with machine learning algorithms that can identify anomalous and potentially risky behavior from people using those products.
Product engineers worked closely with internal security analysts, human resources and other experts within Microsoft — and consulted with workers’ advocates in countries that share Microsoft’s strong commitment to privacy — to ensure the solution struck the right balance in respecting employees’ privacy and workflows.
“We knew that insider risk was becoming a more pervasive and expensive challenge, but also that we had to have an entirely different lens for addressing it,” said Erin Miyake, Microsoft’s senior program manager for insider threats, who worked with human resources, compliance and product experts to develop the new solution.
To start, you’re looking at people who already have access to company assets as part of their jobs, so risky activity is harder to detect, she said.
Then, because you’re analyzing activity from people who are already in your workforce, it’s essential to balance risk management with company culture, privacy, fairness and compliance needs. Those considerations simply don’t come up when you’re protecting a company from faceless cybercriminals in distant countries, said Talhah Mir, principal program manager in the Microsoft 365 security and compliance team.
“Employees absolutely should have access to the things they need for their jobs and shouldn’t feel unnecessary friction,” Mir said. “This is really about taking all these signals that already exist in the background and reasoning over it at scale with machine learning to find that thread in that sea of information that identifies possibly suspicious activities.”
All initial reports of unusual behavior in the Insider Risk Management system can be anonymized at the outset — to protect reputations and prevent any bias from creeping into the process. But because data signals only get you so far, the tool also offers a collaboration platform for investigators, human resource experts or business managers to determine whether the unusual behavior might be malicious or just something outside a person’s normal workflow.
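As a rough illustration of that kind of pseudonymization (a hypothetical sketch, not Microsoft’s actual implementation), a keyed hash can replace user identities in early-stage alerts while keeping them consistent across reports:

```python
import hashlib
import hmac

# Hypothetical per-organization secret; in practice this would live in a
# secrets manager and be rotated, never hard-coded.
SECRET_KEY = b"example-org-secret"

def pseudonymize(user_id: str) -> str:
    """Map a user identity to a stable, opaque token so initial alerts
    can be reviewed without revealing who is involved."""
    digest = hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()
    return f"User-{digest[:8]}"
```

Because the same user always maps to the same token, investigators can still correlate multiple alerts about one person; only an authorized escalation step would reverse the mapping.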
Microsoft engineers working on the Insider Risk Management solution consulted with internal legal and human resources departments to delineate what thresholds would need to be met within Microsoft for anyone involved in an investigation to take necessary next steps.
“The system doesn’t pass any judgment or assume ill intent,” Mir said. “If there is an anomaly, you start from the place that the end user is probably just trying to get their job done, but we’re still going to trust and verify.”
The new solution uses machine learning algorithms to look for patterns of unusual and potentially risky behavior, which might be downloading hundreds of sensitive files from a SharePoint site, copying files to a USB device, disabling security software or emailing sensitive files outside of the company. It leverages Microsoft Graph and other services to look for anomalous signals across Windows, Azure and Office products such as SharePoint, OneDrive, Teams and Outlook.
None of those activities are inherently threatening, as employees do these things each day as part of their jobs. But the patterns become more meaningful as the system draws information from other sources, such as classification and labeling tools offered in Office 365 that can be used to flag sensitive documents and datasets.
That allows the algorithms to begin to distinguish between the risks posed by the employee who might be downloading uncontroversial presentations or documents — perhaps because they’re about to embark on a sales trip — and the employee who’s downloading highly confidential designs for a product under development.
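As an illustrative sketch of that kind of baseline comparison (the weights and thresholds here are made up; the product’s actual models are not public), an analyzer might flag a user whose daily download volume sits far above their own historical norm, counting sensitivity-labeled files more heavily:

```python
from statistics import mean, stdev

# Hypothetical weight: a file carrying a sensitivity label counts 5x.
SENSITIVE_WEIGHT = 5

def weighted_volume(normal: int, sensitive: int) -> int:
    """Combine counts of downloaded files into one activity measure,
    weighting sensitivity-labeled files more heavily."""
    return normal + SENSITIVE_WEIGHT * sensitive

def is_anomalous(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's weighted volume if it sits more than `threshold`
    standard deviations above the user's own baseline."""
    if len(history) < 2:
        return False  # too little history to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > threshold
```

Comparing each person against their own baseline, rather than a company-wide average, is what lets the same raw action look routine for one employee and anomalous for another.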
The system can also indicate if downloaded files contain customer banking or credit card information, which would be a red flag for potential identity theft. And, with the proper permissions, an analyst can see the contents of downloaded files to further assess how harmful an outside leak of that information might be.
The Insider Risk Management solution can also plug into third-party human resources software, for instance, to bring in other pertinent data, such as whether an employee has recently resigned.
The algorithms factor in all of that information and assign each unusual activity a numerical “risk score,” which helps people tasked with managing insider risk easily see where they need to focus additional attention.
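A toy version of such scoring, with entirely made-up activity names and weights (the real product’s scoring model is not public), might combine observed activities with contextual signals like this:

```python
# Hypothetical weights for the kinds of activities described above.
ACTIVITY_WEIGHTS = {
    "bulk_download": 30,
    "usb_copy": 20,
    "security_software_disabled": 40,
    "external_email_of_sensitive_file": 25,
}

def risk_score(activities: list[str],
               involves_sensitive_data: bool,
               recently_resigned: bool) -> int:
    """Combine observed activities and contextual signals into a
    single 0-100 score for triage."""
    score = sum(ACTIVITY_WEIGHTS.get(a, 0) for a in activities)
    if involves_sensitive_data:
        score *= 2   # sensitivity-labeled content raises the stakes
    if recently_resigned:
        score += 25  # HR signal: the employee is on the way out
    return min(score, 100)  # cap so scores stay comparable
```

Collapsing many signals into one number is what lets an analyst sort a long queue of alerts and spend time only on the few that score highest.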
That mirrors solutions such as the Azure Secure Score and Azure Security Center, which help Microsoft customers protect their data stored in the cloud by monitoring for, identifying and prioritizing the most serious security vulnerabilities. Those could include mistakes in the way a customer configures a firewall, which could allow a hacker to gain access. The approach also reflects the shared responsibility that both enterprises and cloud providers have to protect data in the cloud from all threats.
Microsoft’s own digital risk security team initially developed the insider risk machine learning algorithms as an in-house solution to better detect potential insider risks from the data that’s already generated by its 150,000 employees around the world. The anomaly detection, which uses audit logs from existing tools, is part of a long line of technologies that have enabled the company to provide better security in ways that are relatively frictionless for employees, Arsenault said.
To make the solution viable within Microsoft, Arsenault said, it needed to be able to detect aberrant behavior in a way that doesn’t undermine the culture of productivity, innovation and trust that has made the company successful.
“People usually go where they’re invited, but they stay where they feel like they belong. I didn’t want to do anything to change that,” Arsenault said.
Microsoft 365 product engineers worked with the internal security team to scale the solution and ensure it met other customers’ needs.
“Customers are really starving for a solution here because they just don’t have a great way to deal with this today,” said Alym Rayani, Microsoft’s senior director for Microsoft 365 + Security. “The things that we and other vendors do around threat protection from outside attackers who are after money or customer information have definitely helped with that problem. The thing people haven’t tackled as well from a technology perspective are these insider risks, which can come in a variety of flavors.”
That’s why the Microsoft teams wanted to offer an end-to-end solution that enables companies to not only detect, but also investigate and prevent potential security breaches or data leaks. If an insider risk analyst follows protocol to move an investigation beyond the anonymous phase, he or she can use the tool to invite other people, such as an employee’s manager, to collaborate and offer additional information.
A manager can clarify whether the anomalous behavior was something the employee was asked to do or is within the scope of their job, which closes the case quickly and efficiently.
The Insider Risk Management solution is also designed to prevent inadvertent security breaches through education. If an employee is storing sensitive documents in a SharePoint site that’s not secure or taking shortcuts with security controls, someone who sees an alert can head off potential problems by sending a reminder to that employee about company policy or recommending online training in best practices.
Other internal security monitoring tools require companies to install special software to gather telemetry from laptops or to pipe data into a separate server to be analyzed, Mir said. But Microsoft’s Insider Risk Management solution can leverage signals from the same productivity tools that millions of employees already use worldwide.
“Our solution is largely turnkey,” Mir said. “If our customers already have the depth of Microsoft 365 and everything it offers in terms of productivity and security and compliance, you can just come in, configure the solution to your requirements and get started.”
Top image: Erin Miyake, Microsoft senior program manager for insider threats (left), and Talhah Mir, principal program manager in the Microsoft 365 security and compliance team (right), collaborated with security, human resources and compliance experts across the company to develop the Insider Risk Management solution. Photo by Scott Eklund/Red Box Pictures.
- Learn more: Intelligent compliance and risk management solutions
- Watch video: Insider Risk Management from Microsoft 365
- Learn more: Microsoft at RSA Conference 2020
- Read more: Leverage AI and machine learning to address insider risks
- Read more: Navigate data protection and risk in the cloud era
- Watch video: Navigate data protection and risk in the cloud era (Insider Risk Management demo starts at 22 minutes)
- Read more: Securing the cloud: Inside the high-tech, high-stakes race to keep the cloud safe, secure and empowering for all
Jennifer Langston writes about Microsoft research and innovation. Follow her on Twitter.