🛡️

Microsoft Security Copilot

AI-powered security operations: prompts, plugins, SCU management, and SOC workflows

What is Security Copilot?

Microsoft Security Copilot is an AI-powered security analysis tool that helps security teams defend their organizations at machine speed and scale. It is built on Microsoft's large language model infrastructure and enriched with security-specific intelligence: 65+ trillion daily signals from Microsoft's global threat intelligence network and expertise from tracking 300+ threat actors. With that foundation, Security Copilot transforms how SOC teams investigate incidents, hunt for threats, and strengthen security posture.

Unlike generic AI tools, Security Copilot is purpose-built for cybersecurity. It understands security context, speaks the language of analysts, and integrates natively across the Microsoft security stack: Defender XDR, Sentinel, Purview, Entra ID, and Intune. It doesn't replace your analysts; it makes every analyst on your team perform like your best analyst.

The SOC Before and After

Picture a typical SOC at 2 AM. An alert fires: a suspicious sign-in from an unfamiliar country, followed by a mailbox rule creation and a large file download. The on-call analyst needs to piece together the story. Was this a real compromise or a traveling executive? They open five different portals. They write KQL queries to correlate sign-in logs with email activity. They cross-reference the IP against threat intelligence feeds. They check if the user's device shows any malware indicators. Forty-five minutes later, they have a preliminary assessment.

Now picture that same alert with Security Copilot. The analyst types: "Summarize incident 4291: what happened, which entities are involved, and is this user's behavior anomalous?" In seconds, Copilot returns a structured narrative: the user signed in from Lagos, Nigeria (first time from this country), created an inbox rule to forward emails externally, and downloaded 2.3 GB from SharePoint. Copilot flags that the IP is associated with a known credential-harvesting infrastructure, the mailbox rule matches patterns seen in BEC campaigns, and the user has no travel history to Nigeria in Entra ID. It recommends immediate containment: revoke sessions, reset credentials, disable the forwarding rule.

What took 45 minutes now takes 3. That's not a marginal improvement; it's a fundamental transformation of how security operations work. Multiply that across dozens of incidents per day, and you begin to understand why Security Copilot isn't just another tool in the SOC. It's a force multiplier that changes the equation between attackers and defenders.

Ecosystem Integration

Microsoft Security Copilot is not a single tool. It is an AI layer that sits on top of Microsoft's security stack, using data, signals, and actions from multiple Microsoft security solutions through built-in integrations and plugins.

From Connected SOC to Intelligent SOC

Microsoft’s security platform is built on the idea that defenders need a unified view across identity, endpoints, cloud, data, and applications. Six pillars converge into one ecosystem:

🤖

Artificial Intelligence

AI-powered SOC

⚔️

Extended Detection & Response

Protection across platforms

📡

SIEM

Flexible detection across digital estate

☁️

Cloud Security

Protection from code to runtime

🔍

Exposure Management

Reduced exposure across digital estate

🎯

Threat Intelligence

Comprehensive threat insights

SIEM, XDR, cloud security, exposure management, and threat intelligence all converge into one ecosystem. But even with this integrated foundation, humans still shoulder the responsibility of connecting signals and determining next steps. An analyst looking at a Defender XDR incident must manually correlate it with Sentinel detections, enrich it with threat intelligence, check exposure data, and decide on response actions across multiple portals.

Security Copilot changes that. By embedding AI directly into this unified SOC, we gain intelligence that understands context, recognises threat patterns, and guides analysts with clear next-step reasoning. It reads across all six pillars simultaneously - correlating XDR alerts with SIEM detections, enriching them with threat intelligence, factoring in exposure data, and recommending response actions - all in a single natural-language interaction. It turns a connected SOC into an intelligent SOC.

The Path Forward: Autonomous Protection

And that intelligence is key as we move toward the next evolution: autonomous protection. Today, Copilot assists analysts with recommendations and guided response. Tomorrow, trusted AI agents will be able to detect, investigate, and contain threats independently - with human oversight at decision points. The journey from manual SOC → connected SOC → intelligent SOC → autonomous SOC is the trajectory that Microsoft’s unified SecOps vision is building toward. Security Copilot is the bridge between where SOCs are today and where they need to be.

What Copilot Does for Your SOC

Each capability represents hours saved per incident, blind spots eliminated, and expertise gaps closed. These aren't theoretical features - they're workflows that enterprise SOC teams use every day.

Incident Summarization

Copilot ingests all alerts, entities, timelines, and evidence attached to an incident and produces a clear, executive-ready narrative in seconds. Junior analysts can triage complex multi-stage attacks that previously required senior expertise. Microsoft reports SOC teams using Copilot resolve incidents 22% faster on average.

Threat Intelligence Analysis

Tap into Microsoft Defender Threat Intelligence (MDTI) to instantly profile threat actors, map IOCs to campaigns, and understand if threats are actively targeting your industry. Ask "What do we know about Storm-1567?" and get a complete actor profile with TTPs, infrastructure, and recommended detections.

Script & Code Analysis

Paste an obfuscated PowerShell script, a suspicious command line, or a malware payload. Copilot deobfuscates, explains what each line does, identifies malicious intent, and maps it to MITRE ATT&CK techniques. What takes a reverse engineer 30 minutes takes Copilot 10 seconds.
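The most common deobfuscation step is easy to illustrate: PowerShell's `-EncodedCommand` flag takes a Base64-encoded UTF-16LE string. The sketch below (plain Python, shown only to make the manual step Copilot automates concrete; it is not Copilot's internal tooling) round-trips a harmless sample the way an analyst would decode a captured command line:

```python
import base64

# PowerShell -EncodedCommand payloads are Base64-encoded UTF-16LE strings.
# Encode a harmless sample command the way an attacker's loader would:
command = 'Write-Output "hello"'
encoded = base64.b64encode(command.encode("utf-16-le")).decode("ascii")

# The manual deobfuscation step: decode the captured payload back to text.
decoded = base64.b64decode(encoded).decode("utf-16-le")
print(decoded)  # Write-Output "hello"
```

Real payloads add layers of string concatenation, aliasing, and compression on top of this, which is where automated analysis saves the most time.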

KQL Query Generation

Describe what you're looking for in plain English ("Find all devices that ran PowerShell encoded commands in the last 48 hours") and Copilot generates production-ready KQL for Sentinel or Defender advanced hunting. It even explains the query logic so you learn while you work.
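As an illustration, the encoded-command prompt above might yield a query along these lines (table and column names are from the Defender XDR advanced hunting schema; the exact query Copilot returns will vary with context):

```kql
DeviceProcessEvents
| where Timestamp > ago(48h)
| where FileName in~ ("powershell.exe", "pwsh.exe")
| where ProcessCommandLine has_any ("-enc", "-encodedcommand")
| project Timestamp, DeviceName, AccountName, ProcessCommandLine
```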

Guided Response & Remediation

After investigation, Copilot recommends specific remediation steps based on the incident context - not generic checklists, but tailored actions: "Revoke active sessions for this user, reset their credentials, remove the malicious inbox rule, and quarantine the downloaded files." Actionable, sequenced, complete.

Custom Promptbooks

Build reusable multi-step investigation playbooks that codify your team's best practices. A senior analyst designs the promptbook once; every analyst on the team can execute it consistently. Share across the SOC to standardize triage quality and reduce training time for new hires.

Prompt Engineering for Security

Effective prompting is key to getting the best results from Security Copilot. Follow these best practices to craft prompts that deliver actionable insights.

Be Specific: Include specific details like time ranges, IP addresses, or user names. "Show me sign-ins for user@contoso.com in the last 24 hours" is better than "show me sign-ins."
Provide Context: Tell Copilot what you're investigating and why. "I'm investigating a potential phishing attack; analyze the email headers for malicious indicators."
Iterate Naturally: Build on previous responses. Follow up with "Now show me all related alerts for the same user" or "Can you generate a KQL query for this?"
Use Skill References: Reference specific skills like "/AskGPT", "/GetIncident", or "/AnalyzeScript" to target specific capabilities.
# Incident Investigation
"Summarize incident #12345 including timeline, affected entities, and severity."

# Threat Hunting
"Search for any devices that communicated with IP 198.51.100.23 in the last 7 days."

# Script Analysis
"Analyze this PowerShell script for malicious behavior: [paste script]"

# KQL Generation
"Write a KQL query to find all failed sign-in attempts from outside the US in the last 48 hours."

# Threat Intelligence
"Tell me about the threat actor Storm-0558 and their recent tactics."

# Posture Assessment
"What are the top 5 security recommendations for improving our Microsoft Secure Score?"

Two Ways to Use Security Copilot

Security Copilot offers two distinct experiences. Understanding when to use each is key to maximising your investment.

🖥️

Standalone Experience

Access via securitycopilot.microsoft.com. A full-featured portal for open-ended security investigations, threat intelligence research, script analysis, and multi-step promptbook execution.

Best For

  • Open-ended incident investigation across multiple products
  • Complex threat intelligence research and actor profiling
  • Building and executing multi-step promptbooks
  • Script and malware analysis (paste and analyze)
  • KQL query generation for Sentinel and Defender
  • Cross-product security posture assessment
🔗

Embedded Experience

Copilot appears as a side panel inside Microsoft security products. The context is scoped to the product you are working in, giving you AI assistance right where you need it.

Available In

  • Defender XDR - Summarize incidents, analyze scripts, generate KQL, guided response
  • Microsoft Sentinel - Summarize incidents with AI
  • Microsoft Entra - Investigate risky users, app risk, lifecycle workflows
  • Microsoft Intune - Device queries, policy management, troubleshooting
  • Microsoft Purview - DLP alert investigation, insider risk, eDiscovery, compliance
  • Defender for Cloud - Analyze, delegate, and remediate recommendations
  • Defender Threat Intelligence - Threat actor and campaign research
  • Azure Firewall - IDPS signature enrichment and recommendations
💡 Developer Scenarios: Security Copilot also supports developer workflows - build custom agents and plugins, automate security tasks, and integrate Copilot into your own tools. See Security Copilot Developer documentation.

Zero Trust Principles for Security Copilot

Deploying Security Copilot should follow Zero Trust principles: verify explicitly, use least privilege, and assume breach. Because Copilot accesses sensitive security data through on-behalf-of (OBO) authentication, a compromised admin account with Copilot access could expose your entire security posture to an attacker.

Step 1: Identity & Access Policies

  • Require MFA for all admin and SecOps staff
  • Block legacy authentication (clients that don't support modern auth)
  • Require compliant devices via Intune
  • Force password change on high-risk sign-ins (E5)
  • Create separate Conditional Access policies for privileged users
  • Include security tools in CA scope (Defender XDR, Entra, Intune, Purview)

Step 2: Least Privilege Access

  • Assign Copilot Contributor (not Owner) for most SOC staff
  • Reserve Copilot Owner for admins who manage plugins and settings
  • Review role assignments with Microsoft Entra PIM (time-bound, approval-required)
  • Use Privileged Access Management for just-in-time access to sensitive tasks
  • Remove default "All users" from Copilot Contributor if doing staged rollout

Step 3: Secure Privileged Devices

  • Deploy privileged access workstations (PAWs) for SOC analysts
  • Enable application control, credential guard, and exploit guard
  • Update Intune compliance policies to require hardened device configs
  • Transition security groups to the new PAW compliance policy

Step 4: Threat Protection & Third-Party Access

  • Deploy Defender XDR + Sentinel to detect compromised admin accounts
  • Validate threat protection for Azure, AWS, and GCP workloads
  • Secure third-party plugin access - these live outside the Microsoft trust boundary
  • Add third-party security products to dedicated CA policies
  • Monitor plugin usage with Defender for Cloud Apps session controls
💡 Deployment Strategy: For large organisations, use a phased deployment: Evaluate (small pilot group) → Pilot (next wave with protections validated) → Full Deployment (all SecOps staff). Assign Copilot roles only after identity, device, and threat protections are confirmed for each user. See Apply Zero Trust principles to Security Copilot.

Security Copilot Labs

AI-powered security operations labs: activate Copilot, build promptbooks, automate threat intelligence, and design end-to-end SOC workflows.

01
Beginnerโฑ 60 min ยท 8 steps

Enable Security Copilot & First Investigation

Activate Security Copilot with SCU capacity planning, configure data source plugins for Sentinel and Defender XDR, run your first natural-language investigation prompts, review Copilot-generated incident summaries, and evaluate response accuracy.

02
Intermediateโฑ 90 min ยท 12 steps

Build Custom Promptbooks for Incident Triage

Design a multi-step triage promptbook with sequential investigation stages, create reusable prompt templates with parameterized inputs, share promptbooks with the SOC team via the Copilot library, and measure Mean Time to Triage improvements against baseline.

03
Intermediateโฑ 100 min ยท 14 steps

Automate Threat Intelligence Summarization

Connect threat intelligence plugins (Microsoft TI, MDTI), build prompts for IOC enrichment and CVE impact analysis, create a daily threat briefing workflow that aggregates intelligence across sources, and integrate Copilot outputs with Sentinel watchlists.

04
Advancedโฑ 150 min ยท 18 steps

Build an End-to-End SOC Workflow with Copilot

Design a complete SOC workflow: automated alert triage with Copilot, evidence collection across Defender products, guided investigation with cross-product KQL queries, executive report generation with AI-written summaries, and post-incident lessons-learned documentation.

05
Advancedโฑ 120 min ยท 14 steps

Embedded Experiences & Zero Trust Deployment

Explore Copilot embedded in Defender XDR, Sentinel, Entra, Intune, and Purview. Apply Zero Trust principles with Conditional Access, least privilege roles, PIM, and phased deployment.

Learning Resources

Security Copilot FAQ

How much does Security Copilot cost?

Security Copilot uses Security Compute Units (SCUs) as its billing model. Key details:

  • SCU-based pricing: You provision SCU capacity (minimum 1 SCU) and pay hourly. Each SCU supports approximately 4–5 complex prompts per hour.
  • Scaling: You can increase or decrease SCU capacity at any time through the Azure portal or admin centre. Scale up during incident investigations, scale down during quiet periods.
  • Cost estimate: At approximately $4/SCU/hour, a single SCU running 24/7 costs ~$2,920/month. Most SOC teams start with 1–3 SCUs and adjust based on usage patterns.
  • Token usage: Each prompt consumes tokens based on input size, plugin calls, and response complexity. Promptbooks with multiple steps consume more tokens than single prompts.
  • No per-user licensing: SCU capacity is shared across all authorised users in your tenant. You pay for compute, not seats.

Tip: Start with 1 SCU for evaluation, monitor usage in the Copilot analytics dashboard, then right-size your capacity based on actual consumption patterns.
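The monthly figure follows directly from the hourly rate. A quick sanity check, using the approximate $4/SCU/hour quoted above:

```python
scu_rate_per_hour = 4.00             # approximate $/SCU/hour, as quoted above
avg_hours_per_month = 24 * 365 / 12  # 730 hours in an average month
monthly_cost = scu_rate_per_hour * avg_hours_per_month

print(f"${monthly_cost:,.0f}/month")  # $2,920/month for one SCU running 24/7
```

Scaling down overnight or on weekends reduces this proportionally, which is why right-sizing against the analytics dashboard matters.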

Get started with Security Copilot

What data sources and plugins does Security Copilot integrate with?

Security Copilot connects to your security ecosystem through plugins. Each plugin provides tools that Copilot can invoke during investigations:

  • Microsoft first-party: Defender XDR (incidents, devices, hunting), Microsoft Sentinel (KQL queries, incidents, watchlists), Entra ID (user details, risk events, sign-in logs), Intune (device compliance, app inventory), Purview (DLP alerts, sensitivity labels), Threat Intelligence (IOC lookups, threat articles)
  • Third-party plugins: ServiceNow, Splunk, CrowdStrike, Google Security Operations, and others through the plugin marketplace
  • Custom plugins: Build your own plugins using OpenAPI specifications to connect Copilot to internal tools, custom APIs, or proprietary data sources
  • Web search: Optional internet search for OSINT and public threat intelligence

All plugins respect your existing RBAC: Copilot can only access data that the current user has permission to see. Enable only the plugins your team needs to optimise token usage.

Manage plugins

What are promptbooks and how do I use them?

Promptbooks are reusable, multi-step prompt sequences that automate complex security workflows. Think of them as saved investigation procedures that run multiple prompts in sequence:

  • Built-in promptbooks: Microsoft provides pre-built promptbooks for incident triage, vulnerability impact assessment, threat actor profiling, and suspicious script analysis
  • Custom promptbooks: Create your own by chaining prompts with input parameters (e.g., "Given incident ID [X], summarise the attack story, list affected users, check their risk scores, and recommend containment actions")
  • Parameters: Promptbooks accept input variables so the same workflow works for any incident, user, or IOC
  • Team sharing: Share promptbooks with your SOC team so all analysts follow consistent investigation procedures
  • Metrics: Track Mean Time to Triage (MTTT) improvements by comparing promptbook-assisted investigations vs. manual

Promptbooks are the highest-ROI feature of Security Copilot: they codify expert analyst knowledge into repeatable workflows that junior analysts can execute consistently.
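Conceptually, a parameterized promptbook behaves like a prompt template with substituted inputs. A minimal sketch in plain Python (the promptbook name and prompts here are hypothetical illustrations, not a Copilot API):

```python
# Hypothetical triage promptbook: the same prompt sequence is reused
# for any incident by filling in the {incident_id} parameter.
TRIAGE_PROMPTBOOK = [
    "Summarize incident {incident_id}: attack story and timeline.",
    "List the users and devices affected in incident {incident_id}.",
    "Recommend containment actions for incident {incident_id}.",
]

def render_promptbook(template: list[str], **params: str) -> list[str]:
    """Fill the promptbook's input parameters before execution."""
    return [step.format(**params) for step in template]

prompts = render_promptbook(TRIAGE_PROMPTBOOK, incident_id="4291")
print(prompts[0])  # Summarize incident 4291: attack story and timeline.
```

In Copilot itself the analyst only supplies the parameter values; the sequencing and execution happen in the promptbook runner.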

Using promptbooks

Is my security data used to train the AI model?

No. Microsoft provides strong data privacy guarantees for Security Copilot:

  • No model training: Your security data, prompts, and responses are never used to train the underlying foundation models (GPT-4.1, etc.)
  • Tenant isolation: Your data stays within your Microsoft 365 tenant boundary and is not accessible to other customers
  • Data residency: Copilot follows your existing Microsoft 365 data residency settings. If your tenant is in the EU, your prompts are processed in the EU.
  • Compliance: Security Copilot is covered by SOC 2 Type 2, ISO 27001, and inherits your tenant's Microsoft 365 compliance certifications
  • Session data: Copilot sessions are stored for 90 days for your review, then deleted. Admins can purge sessions at any time.
  • Audit logging: All Copilot usage is logged in the Microsoft 365 audit log for compliance and governance

Privacy and data security

Can Security Copilot write and explain KQL queries?

Yes, KQL assistance is one of Copilot's most powerful capabilities:

  • Natural language to KQL: Describe what you want to find in plain English (e.g., "Show me all failed sign-ins from anonymous IPs in the last 24 hours") and Copilot generates the corresponding KQL query for Sentinel or Defender XDR advanced hunting
  • KQL to plain English: Paste any existing KQL query and Copilot explains what it does in clear, non-technical language, useful for reviewing analyst-written detections or understanding inherited rules
  • Query optimisation: Copilot can suggest performance improvements for slow queries, recommend better operators, and fix syntax errors
  • Context-aware: When used within a Sentinel incident or Defender XDR investigation, Copilot generates queries scoped to the relevant tables, timeframes, and entities
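For instance, the "failed sign-in attempts from outside the US" prompt shown earlier might translate into a Sentinel query along these lines (SigninLogs column names come from the standard Entra ID data connector; the exact query Copilot generates will vary):

```kql
SigninLogs
| where TimeGenerated > ago(48h)
| where ResultType != "0"          // non-zero result codes are failures
| where Location != "US"
| summarize FailedAttempts = count() by UserPrincipalName, IPAddress, Location
```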

This capability dramatically reduces the KQL learning curve for junior analysts and accelerates hunting for experienced analysts.

Using Security Copilot

What roles and permissions are required?

Security Copilot access requires multiple layers of permissions:

  • Copilot access: A Security Administrator must enable Copilot and assign access to specific users or groups in the Copilot settings
  • Security roles: Users need at least Security Reader role for read-only investigations or Security Operator for actions. More specific roles (e.g., Sentinel Contributor) unlock deeper capabilities.
  • RBAC enforcement: Copilot respects all existing role-based access controls. Users can only query data and take actions their roles permit. A Security Reader cannot isolate a device even through Copilot.
  • Plugin permissions: Some plugins require additional permissions (e.g., the Intune plugin needs Intune Reader role)
  • SCU provisioning: Only Global Administrator or Security Administrator can provision or modify SCU capacity

Best practice: Start with a pilot group of senior SOC analysts, validate the experience, then expand access to Tier 1 analysts with appropriate guard rails.

Authentication and access