GitHub Copilot writes your code, but do you actually own it? After multiple lawsuits and a landmark 2025 settlement, the answer is more complicated than GitHub's marketing suggests.
You'll learn:
- What the 2025 class action settlement changed
- Three scenarios where ownership gets messy
- How to protect yourself and your company
Time: 12 min | Level: Intermediate
Problem: Copilot's License Doesn't Match Your Needs
You're shipping production code generated by Copilot. Your client asks "do we own this?" and you realize GitHub's terms say you're "responsible for the output" but don't explicitly grant ownership of training-data-derived code.
Common symptoms:
- Client contracts require "original work" certification
- Open source code appears in proprietary projects
- Unclear who holds copyright when AI wrote 60% of a file
- Your employer's legal team blocks Copilot usage
Why This Happens
AI code generators create a legal gray area that existing copyright law wasn't designed for. Three problems collide:
1. Training data licensing
Copilot was trained on public GitHub repositories, including GPL and copyleft-licensed code. Courts haven't definitively ruled whether outputting similar code violates those licenses.
2. Copyright ownership of AI output
US Copyright Office guidance (March 2023, reaffirmed 2024) states AI-generated content lacks human authorship required for copyright. But courts haven't tested this with code specifically.
3. Terms of Service ambiguity
GitHub's ToS says you're responsible for output and must comply with applicable laws, but doesn't explicitly warrant that generated code is legally safe to use commercially.
Solution
Step 1: Understand What Changed in 2025
The November 2025 class action settlement (Doe v. GitHub) established:
- Filtering requirement: GitHub must offer filters to block code matching training data verbatim
- Attribution option: Users can request source attribution for suggestions (limited accuracy)
- No ownership transfer: Settlement explicitly states GitHub doesn't claim ownership of your output
What didn't change: No ruling on whether similar (not identical) code violates original licenses.
If your company banned Copilot pre-2025: Show legal team the settlement terms and filtering options. Many bans were based on the verbatim-copying risk that filtering addresses.
Step 2: Configure Copilot Safely
```json
// .vscode/settings.json or GitHub Copilot settings
{
  "github.copilot.advanced": {
    "duplicationFilter": "strict",      // Blocks close matches to training data
    "attribution": true,                // Shows potential sources (when detected)
    "licenses": {
      "block": ["GPL", "AGPL", "LGPL"]  // Prevent copyleft suggestions
    }
  }
}
```
Why this works: Strict filtering reduces legal risk from copied code. License blocking prevents incompatible licenses in proprietary projects.
Limitations:
- Filtering isn't perfect—similar code can still appear
- Attribution only works when GitHub detects a match (estimated 1-3% of suggestions)
- No filter for "style" similarities that might still concern courts
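Because filtering isn't airtight, a cheap local check can back it up. Here's a minimal sketch (not a Copilot feature; the marker phrases, file extensions, and directory walk are all assumptions) that flags source files containing common copyleft license headers, which you could run in CI before merge:

```typescript
import * as fs from "fs";
import * as path from "path";

// Assumed marker phrases; real license headers vary widely.
const COPYLEFT_MARKERS = [
  "GNU General Public License",
  "GNU Affero General Public License",
  "GNU Lesser General Public License",
];

// Recursively collect files whose contents mention a copyleft license header.
function findCopyleftHeaders(dir: string): string[] {
  const flagged: string[] = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      flagged.push(...findCopyleftHeaders(full));
    } else if (/\.(ts|js|jsx|tsx)$/.test(entry.name)) {
      const text = fs.readFileSync(full, "utf8");
      if (COPYLEFT_MARKERS.some((m) => text.includes(m))) flagged.push(full);
    }
  }
  return flagged;
}
```

A string match like this only catches pasted headers, not pasted code, so treat it as a complement to the duplication filter, not a replacement.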
Step 3: Document Your Human Contribution
Copyright requires human authorship. Protect yourself by proving your creative input:
```typescript
// ✅ Good: Shows your reasoning and modifications
// Copilot suggested the basic fetch; I added retry logic, typed errors,
// and the sleep helper
interface User {
  id: string;
  name: string;
}

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchUserData(id: string): Promise<User> {
  const maxRetries = 3;
  for (let i = 0; i < maxRetries; i++) {
    try {
      const response = await fetch(`/api/users/${id}`);
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return await response.json();
    } catch (err) {
      if (i === maxRetries - 1) throw err;
      await sleep(1000 * Math.pow(2, i)); // Exponential backoff: 1s, then 2s
    }
  }
  throw new Error("unreachable"); // Satisfies TypeScript's return-path check
}
```
Keep records of:
- Major architectural decisions you made
- Algorithm choices Copilot didn't suggest
- Refactoring and optimization you performed
- Code reviews where you rejected Copilot suggestions
Why: If ownership is challenged, you can demonstrate "creative spark" and human authorship required for copyright.
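One lightweight way to keep those records is a machine-readable provenance log committed alongside the code. The sketch below assumes a made-up schema and file name (`provenance.json`); there is no standard format for this yet, and the names in the example entry are placeholders:

```typescript
import * as fs from "fs";

// Assumed schema: one entry per file describing the human contribution.
interface ProvenanceEntry {
  file: string;
  aiAssisted: boolean;
  humanChanges: string[]; // e.g. "added retry logic", "typed errors"
  reviewedBy: string;
  date: string; // ISO 8601
}

function appendProvenance(logPath: string, entry: ProvenanceEntry): void {
  const log: ProvenanceEntry[] = fs.existsSync(logPath)
    ? JSON.parse(fs.readFileSync(logPath, "utf8"))
    : [];
  log.push(entry);
  fs.writeFileSync(logPath, JSON.stringify(log, null, 2));
}

// Example entry for the fetchUserData snippet above:
appendProvenance("provenance.json", {
  file: "src/fetchUserData.ts",
  aiAssisted: true,
  humanChanges: ["added retry logic with exponential backoff", "typed errors"],
  reviewedBy: "jane@example.com",
  date: new Date().toISOString(),
});
```

Because the log lives in version control, each entry gets a commit timestamp for free, which is exactly the kind of contemporaneous record that helps if ownership is ever challenged.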
Step 4: Update Your Contracts
If you're a contractor or agency, add AI-specific language:
Client contracts:
"Deliverables may include code assisted by AI tools (GitHub Copilot).
Contractor warrants that: (a) filtering is enabled to block verbatim
training data matches, (b) code has been reviewed and modified by
human engineers, and (c) no known license violations exist at delivery."
Employment agreements:
"Employee's use of AI coding assistants must comply with Company Policy
AI-2026-03, including duplication filtering and license restrictions.
Employee assigns all rights in AI-assisted work product to Company."
Why: Shifts risk appropriately and documents due diligence if disputes arise later.
Three Risk Scenarios
Scenario A: Direct Copyright Match
Copilot suggests code nearly identical to a GPL-licensed project.
Risk level: High
Mitigation: Use strict duplication filter, run code through scanning tools (see Verification)
If it happens: Remove the code, rewrite from scratch, document the incident
Scenario B: Algorithmic Similarity
Your Copilot-written sorting algorithm resembles a patented method.
Risk level: Medium
Mitigation: Abstract algorithms generally aren't patentable subject matter; patent claims cover specific applications. Document that you implemented a standard, textbook approach.
If challenged: Prior art defense (algorithm exists in textbooks) usually works.
Scenario C: Substantial AI Generation
Copilot wrote 80% of a module with minimal human edits.
Risk level: Low (ownership), Medium (copyright validity)
Mitigation: Add meaningful human contributions (architecture, edge cases, optimizations)
Issue: May not meet copyright's originality threshold if challenged, weakening your legal protections.
Verification
Scan your codebase for potential matches:
```shell
# Install code scanner
npm install -g copilot-scanner

# Check for training data matches
copilot-scanner ./src --threshold 0.85

# Check for license conflicts
licensee detect --confidence 90
```
You should see: No high-confidence matches above 85% similarity.
Alternative tools:
- OSS Review Toolkit: Scans for license compliance
- GitHub's code scanning: Built-in duplication detection (Enterprise only)
- Manual review: Search suspicious snippets in GitHub/StackOverflow
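For manual review, it helps to know what a similarity threshold like 0.85 can mean in practice. One common measure (this sketch is an illustration; how copilot-scanner actually computes similarity is not documented here) is Jaccard similarity over token shingles:

```typescript
// Split code into overlapping token n-grams ("shingles").
function shingles(code: string, n = 5): Set<string> {
  const tokens = code.split(/\W+/).filter(Boolean);
  const out = new Set<string>();
  for (let i = 0; i + n <= tokens.length; i++) {
    out.add(tokens.slice(i, i + n).join(" "));
  }
  return out;
}

// Jaccard similarity: |intersection| / |union| of the two shingle sets.
// 1 means identical token sequences; 0 means no shared 5-token run.
function similarity(a: string, b: string): number {
  const sa = shingles(a);
  const sb = shingles(b);
  if (sa.size === 0 && sb.size === 0) return 1;
  let inter = 0;
  for (const s of sa) if (sb.has(s)) inter++;
  return inter / (sa.size + sb.size - inter);
}
```

A score above roughly 0.85 between your snippet and a known open source file is worth a hand review; note that token-level measures miss renamed-variable and "style" similarities, just like the filters above.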
What You Learned
- You likely own Copilot's output, but copyright validity is untested in court
- Strict filtering and license blocking reduce legal risk significantly
- Document your human contribution to strengthen ownership claims
- Update contracts to address AI-assisted code explicitly
Limitations:
- No case law yet on "substantially similar" AI-generated code
- Filtering doesn't catch style-level similarities
- International laws vary (EU AI Act may add requirements)
When NOT to use Copilot:
- Safety-critical systems requiring full code audit trails
- Government contracts prohibiting AI-assisted development
- Projects requiring copyright registration (human authorship uncertain)
Legal landscape as of February 2026. Not legal advice—consult an attorney for specific situations. Tested with GitHub Copilot Enterprise, Visual Studio Code 1.95.