Microsoft is turning its attention to cybersecurity after unveiling an AI-powered Copilot assistant for Office apps. Microsoft Security Copilot is a new cybersecurity assistant designed to help defenders identify breaches and make sense of the massive volumes of signals and data available to them every day.
Copilot, which is powered by OpenAI's GPT-4 generative AI and Microsoft's own security-specific model, appears as a simple prompt box like any other chatbot. You might ask, "What are all the security incidents in my enterprise?" and it will provide a summary. Behind the scenes, however, it draws on the 65 trillion daily signals Microsoft collects through its threat intelligence gathering, along with security-specific skills, to help security professionals hunt down threats.
Microsoft Security Copilot is intended to supplement rather than replace the work of a security analyst, and it even includes a pinboard feature so coworkers can collaborate and share information. Security personnel can use Security Copilot to assist with incident investigations or to quickly summarize events for reporting purposes.
Copilot takes natural language inputs, so security experts might request a summary of a specific vulnerability, feed in files, URLs, or code snippets for analysis, or request incident and alert information from other security tools. All prompts and responses are kept, providing investigators with a complete audit trail.
Results can be pinned and summarized on a shared workspace, allowing colleagues to work on the same threat analyses and investigations at the same time. “This is like having individual workspaces for investigators and a shared notebook with the ability to promote things you’re working on,” Chang Kawaguchi, an AI security architect at Microsoft, tells The Verge in an interview.
A prompt book feature is one of the most intriguing parts of Security Copilot. It's essentially a set of steps or automations that users can bundle into a single, easy-to-use button or prompt. This might include a shared prompt for reverse engineering a script, so security researchers don't have to wait for someone on their team to handle that type of analysis. Security Copilot can even be used to build a PowerPoint slide outlining incidents and attack vectors.
As with Bing, Microsoft clearly cites its sources when security researchers ask for information on the latest vulnerabilities. The company draws on data from the Cybersecurity and Infrastructure Security Agency, the National Institute of Standards and Technology's vulnerability database, and its own threat intelligence.
That doesn't mean Microsoft's Security Copilot will always be correct. "We know these models make mistakes from time to time, so we're providing the ability to ensure we have feedback," says Kawaguchi. The feedback loop is far more involved than the thumbs-up or thumbs-down buttons found in Bing. "It's a little more complicated than that because there are a lot of ways it could go wrong," Kawaguchi explains. Microsoft will let users respond with specifics about what is wrong in order to better understand any hallucinations.
"I don't think anyone can guarantee zero hallucinations," says Kawaguchi, "but what we're trying to do through things like exposing sources, providing feedback, and grounding this in data from your own context is ensure that folks can understand and validate the data they're seeing. Because there is no right answer in some of these examples, having a probabilistic answer is significantly better for the organization and the individual conducting the investigation."
While Microsoft's Security Copilot looks like the prompt-and-chatbot interface of Bing, it is confined to security-related queries. You won't be able to pull up the latest weather forecast or ask Security Copilot its favorite color. "This is very intentionally not Bing," says Kawaguchi. "We don't consider this a chat experience. We think of it as a notebook experience rather than a freeform conversation or general purpose chatbot."
Microsoft’s aggressive AI drive
Security Copilot is the latest example of Microsoft's aggressive AI push. Microsoft 365 Copilot looks set to change Office documents forever, and Microsoft-owned GitHub is transforming its own Copilot into a chat-based companion to help engineers write code. Microsoft's Copilot ambitions show no sign of slowing down, so we can expect this AI assistant technology to spread across the company's products and services.
Microsoft is beginning to preview the new Security Copilot with "a few customers" today, and no timeline has been set for a wider rollout. "We're not yet talking about a timeline for general availability," Kawaguchi says. "Because so much of this is about learning and learning responsibly, we believe it's critical to get it to a small group of people and begin that process of learning, as well as to make this the best possible product and ensure we're delivering it responsibly."