AI Security

Is It Safe to Use GitHub Copilot With Sensitive Code?

Every time Copilot autocompletes a line, your code travels to Microsoft's servers. Here's exactly what gets sent, what gets kept, and what to do about it.

7 min read · Updated Mar 2026

Is it safe to use GitHub Copilot with sensitive code? The answer depends entirely on which plan you're on, which settings you've configured, and what "sensitive" means in your context. This guide covers exactly what Copilot sends to Microsoft, what it retains, what two critical vulnerabilities in 2025–2026 revealed about the attack surface, and how to protect yourself regardless of plan.

What GitHub Copilot Actually Sends to Microsoft

GitHub Copilot is not a local tool. Every time it generates a suggestion, it sends a prompt to Microsoft Azure servers for processing. That prompt contains:

High risk: The current file content

The file you're actively working in — including any hardcoded values, connection strings, or API keys that happen to be present.

High risk: Surrounding context from open tabs

Copilot reads other files open in your editor to improve suggestions. If your .env file is open, Copilot may read it.

Medium risk: File paths and repository structure

File paths and names are included in telemetry data sent to GitHub and Microsoft.

Low risk: User engagement data

Which suggestions you accept or dismiss, latency metrics, and usage patterns — shared with Microsoft and OpenAI.

How Data Handling Differs by Plan

| Plan | Code Retained? | Used for Training? | Safe for Sensitive Work? |
|---|---|---|---|
| Free | Yes (by default) | Yes (by default) | ✗ No — opt out first |
| Individual / Pro | Opt-out available | Opt-out available | ⚠️ Only with settings disabled |
| Business | No | No | ✓ With caveats |
| Enterprise | No | No | ✓ Strongest protections |

The caveat that applies to every plan: your code still leaves your machine and is processed on Microsoft Azure. "Not retained" means it isn't stored after the session — not that it never touched a server.

Two Security Incidents That Changed the Picture

In 2025–2026, two critical vulnerabilities proved that Copilot's code context is an active attack surface.

CamoLeak — CVSS 9.6 (Aug 2025)

A vulnerability in Copilot Chat's image rendering could be exploited to silently exfiltrate private source code and secrets. AWS keys, zero-day details, and private code could be extracted without the developer noticing. GitHub patched it by disabling image rendering entirely in Copilot Chat.

RoguePilot — Repo Takeover (Feb 2026)

Orca Security disclosed a vulnerability in which a malicious prompt hidden in a GitHub issue's HTML comments could achieve full repository takeover by exfiltrating the GITHUB_TOKEN through Copilot's context ingestion. Both vulnerabilities have since been patched, but they exposed a fundamental risk in how Copilot handles context.
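The attack relied on instructions that are invisible in GitHub's rendered view but still present in the raw text an AI assistant ingests. As an illustrative defense-in-depth measure — a sketch, not GitHub's actual fix — untrusted issue or PR text can be screened for hidden HTML comments before it reaches any AI context:

```python
import re

# Matches HTML comments, which GitHub renders as invisible but which
# survive verbatim in the raw Markdown an AI assistant ingests.
HTML_COMMENT = re.compile(r"<!--.*?-->", re.DOTALL)

def hidden_payloads(issue_body: str) -> list[str]:
    """Return any HTML-comment blocks hidden in untrusted issue text."""
    return HTML_COMMENT.findall(issue_body)

def strip_hidden(issue_body: str) -> str:
    """Drop HTML comments before the text reaches an AI context."""
    return HTML_COMMENT.sub("", issue_body)
```

The same screening applies to any pipeline that feeds third-party text (issues, PR descriptions, commit messages) into a model: flag or strip content a human reviewer cannot see.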

The .copilotignore File — Your Most Important Protection

Regardless of plan, limit what Copilot reads using a .copilotignore file. This works like .gitignore — list patterns you want excluded from Copilot's context.

# .copilotignore — add to your project root
*.env
*.env.*
config/secrets.yml
config/database.yml
**/credentials/*
**/*.pem
**/*.key
**/id_rsa*
secrets/
.aws/credentials

Critical: .gitignore does not protect you from Copilot. Files ignored by git may still be read if they are open in your editor. Use .copilotignore explicitly.
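To sanity-check coverage before trusting the ignore file, you can approximate the matching locally. The sketch below uses rough .gitignore-style semantics — the exact matching rules of any given Copilot client may differ, and the patterns here deliberately err toward matching too much rather than too little:

```python
from fnmatch import fnmatch
from pathlib import PurePosixPath

def load_patterns(text: str) -> list[str]:
    """Parse .copilotignore-style lines, skipping blanks and # comments."""
    return [ln.strip() for ln in text.splitlines()
            if ln.strip() and not ln.strip().startswith("#")]

def is_ignored(relpath: str, patterns: list[str]) -> bool:
    """Approximate gitignore-style matching for a repo-relative path."""
    path = PurePosixPath(relpath)
    for pat in patterns:
        if pat.endswith("/"):
            # directory pattern, e.g. "secrets/": ignore anything inside it
            if pat.rstrip("/") in path.parts[:-1]:
                return True
            continue
        # fnmatch's "*" also crosses "/", which is looser than gitignore --
        # acceptable for an audit, where false positives are the safe side
        tail = pat[3:] if pat.startswith("**/") else pat
        if fnmatch(relpath, pat) or fnmatch(relpath, tail) or fnmatch(path.name, tail):
            return True
    return False
```

Walk your repository with `pathlib.Path.rglob("*")` and print any file whose name looks sensitive (`.env`, `.pem`, `credentials`) for which `is_ignored` returns False — those are the gaps in your ignore file.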

Copilot vs Fully Local Alternatives

| Tool | Code Leaves Machine? | Training Use | For Sensitive Work? |
|---|---|---|---|
| GitHub Copilot Free | Yes — always | Yes (by default) | ✗ No |
| GitHub Copilot Business | Yes — ephemeral | No | ⚠️ With caveats |
| Continue.dev + Ollama | Never | None | ✓ Yes |
| Tabnine (air-gapped) | Never | None | ✓ Yes |
| VS Code IntelliSense | Never | None | ✓ Yes |

Don't Forget Copilot Chat — Sanitize Logs Before Pasting

One scenario developers miss: pasting error logs or stack traces into Copilot Chat. Application logs often contain API keys, database URLs, customer emails and IP addresses. The same risk that applies to ChatGPT applies here.

Run any logs through ResourceCentral's Log Sanitizer before pasting into Copilot Chat. For Python tracebacks specifically, the Python Log Cleaner also strips file paths that reveal your server structure.
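A minimal version of that sanitization step can also be done locally with a few regexes. This is a hedged sketch, not the tool above: the patterns cover common secret shapes (database URLs, AWS access key IDs, emails, IPv4 addresses, absolute Unix paths) and will miss others:

```python
import re

# Order matters: redact database URLs before the generic path pattern
# can mangle them. These regexes are deliberately aggressive -- an
# over-redacted log is safer than a leaked one.
PATTERNS = [
    (re.compile(r"\b(postgres|postgresql|mysql|mongodb|redis)://\S+"), r"\1://[REDACTED]"),
    (re.compile(r"\bAKIA[0-9A-Z]{16}\b"), "[AWS_KEY]"),
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[EMAIL]"),
    (re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"), "[IP]"),  # also catches some version strings
    (re.compile(r"(?:/[\w.-]+){2,}"), "[PATH]"),           # absolute Unix paths
]

def sanitize(log: str) -> str:
    """Redact common secret shapes before pasting a log into an AI chat."""
    for rx, repl in PATTERNS:
        log = rx.sub(repl, log)
    return log
```

Pipe logs through this before they ever touch a clipboard; eyeball the output afterwards, since no fixed pattern list catches every secret format.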


FAQ

Is it safe to use GitHub Copilot with sensitive code?

Conditionally. On Business and Enterprise plans GitHub does not retain your code or use it for training. On Free and Individual plans your code may be collected unless you opt out. On all plans your code leaves your machine and is processed on Microsoft Azure — it is never truly local.

Does GitHub Copilot send your code to Microsoft?

Yes. Every suggestion involves sending surrounding code context to Microsoft Azure. On Business and Enterprise this is ephemeral and not retained. On Free and Individual it may be retained and used for training unless you disable snippet collection in GitHub settings.

How do I stop Copilot reading my .env file?

Add a .copilotignore file to your project root and include *.env and any other sensitive patterns. Do not rely on .gitignore — files ignored by git can still be read by Copilot if they are open in your editor tab.

Should I use Copilot on client work or NDA-bound projects?

Check your contract. Many NDAs prohibit transmitting source code to third-party services. Even on Business plans where data is not retained, your code still transits Microsoft infrastructure. When in doubt, use a fully local alternative like Continue.dev + Ollama or Tabnine's air-gapped deployment.

Is GitHub Copilot Free safe for personal private repos?

Not without opting out first. On the Free plan, code snippets may be collected and used for product improvement by default. Go to GitHub Settings → Copilot and disable "Allow GitHub to use my code snippets from the code editor for product improvements." Even with this disabled, code still leaves your machine for processing — the opt-out controls retention and training use only.
