On Thursday, April 24, GitHub will change the default on Copilot's data collection settings. If you use Copilot Free, Pro, or Pro+, your code snippets, prompts, outputs, and contextual data will be used to train Microsoft and GitHub's AI models unless you opt out.1
The change was announced March 25 in an update to GitHub's privacy statement and terms of service.1 Before April 24, the default was off. After April 24, the default is on. If you previously opted out, your preference is preserved. Everyone else gets enrolled automatically.
What Gets Collected
GitHub's updated terms specify "inputs, outputs, code snippets, and associated context" from Copilot interactions.1 That means your prompts, the code Copilot generates for you, and the surrounding code context that Copilot reads to make its suggestions.
This is not your stored repositories. GitHub clarifies that the collection covers interaction data, not private repository content at rest.1 But the distinction is thinner than it sounds. If you ask Copilot to refactor a function, it reads the function, reads the surrounding file, and generates a replacement. All three are now training data.
Who's Protected, Who Isn't
Copilot Business and Enterprise users are exempt. Their organizations control data policies through existing Data Protection Agreements, and no prompt telemetry is collected from enterprise seats.2
Copilot Free, Pro, and Pro+ users are affected: individual developers, freelancers, open-source contributors, and anyone building side projects on a personal account. The same week GitHub paused new signups and tightened usage limits, it expanded what it collects from the users who remain.
The two-tier structure is deliberate. Enterprise clients would walk. Individual developers, GitHub is betting, will not.
The Pattern
We wrote about this in The Opt-Out Illusion last month. The playbook: launch with privacy-friendly defaults to build trust, then flip the defaults once the user base is locked in. The opt-out exists. The friction is the point. Most people never change defaults. GitHub knows this. Every platform that runs this play knows this.
The timing is also the pattern. This lands the same week as the Copilot plan restrictions and two days after Anthropic tested removing Claude Code from its Pro tier. The AI coding tool market is simultaneously rationing access and expanding data collection. Less service, more extraction.
How to Opt Out
- Go to github.com/settings/copilot
- Under Privacy, uncheck the options that allow GitHub to use your data for product improvements and AI model training
- Save. Changes take effect immediately.
If you don't see these settings, you're on a Business or Enterprise plan and are already exempt. Opting out does not affect the quality of Copilot's suggestions for you.1
Do It Before Thursday
If you opt out before April 24, nothing changes for you. If you do nothing, the default flips and your interactions start training models. GitHub says "once you opt out, we stop collecting from that point forward."1 It does not say it deletes what was collected before you opted out.
Disclosure
This article was written with the assistance of Claude, an AI made by Anthropic. Anthropic competes with Microsoft/GitHub in the AI coding tool market. Our editorial standards, including source verification and AI disclosure, are described on our ethics page.
Sources
- GitHub Blog / Changelog — "Updates to our Privacy Statement and Terms of Service: How we use your data," March 25, 2026.
- GitHub Community Discussion #147437 — User discussion and confirmation of the opt-out process and tier exemptions, April 2026.