CyberSecurityNews

Mozilla Criticizes Microsoft for Installing Copilot on Windows Without User Consent


Mozilla has publicly criticized Microsoft for deploying its AI assistant, Copilot, onto Windows systems without user consent, a practice the Firefox maker describes as prioritizing corporate revenue over user rights.

In a blog post titled “Old Habits Die Hard,” Mozilla accused Microsoft of using automatic installs, hardware defaults, and deceptive UI design to aggressively push Copilot across the Windows ecosystem.

At the core of Mozilla’s complaint is Microsoft’s decision to auto-install the M365 Copilot app on any Windows device running Microsoft 365 desktop apps, without prompting users or obtaining their consent.

Beyond software, Microsoft introduced a dedicated physical Copilot key on Copilot+ PC keyboards, with no straightforward mechanism to remap it to another function.

Copilot was also pinned to the Windows 11 taskbar by default, and Microsoft had planned to embed the AI assistant directly into the Windows notification center, the Settings app, and File Explorer, some of the most fundamental surfaces of the operating system.

These deployment tactics triggered significant user backlash, which Mozilla argues ultimately forced Microsoft’s hand. In March 2026, Microsoft announced it would pull back Copilot integration from Photos, Notepad, Snipping Tool, and Widgets, a rollback framed as a commitment to integrating AI “where it’s most meaningful.”

Mozilla’s position is clear: Microsoft’s sudden pivot toward being “intentional” about Copilot is an admission that the company repeatedly made choices to serve its business interests at the expense of its users.

Mozilla’s criticism extends well beyond Copilot. The organization points to a documented history of Microsoft using deceptive design patterns, or “dark patterns,” to override user choice across Windows.

Independent research commissioned by Mozilla previously exposed how Microsoft deliberately complicates the process of changing default browsers, and how Windows UI routes users back to Microsoft Edge even after they have explicitly selected a different browser.

Additional examples from the Windows 11 rollout include the taskbar Search bar being hardcoded to open Microsoft Edge regardless of the user’s default browser, and applications like Microsoft Outlook and Teams ignoring default browser settings entirely to open links in Edge.

Notably, Microsoft excluded the European Economic Area from automatic Copilot installation, a detail that strongly suggests legal and regulatory pressure, not user-centric design, is what shapes these decisions.

In contrast, Mozilla has introduced a centralized AI Controls panel in Firefox 148 that includes a single “Block AI Enhancements” toggle to disable every AI feature simultaneously, with each feature also individually controllable.

Critically, user preferences persist across browser updates, meaning AI features cannot silently re-enable themselves after a major upgrade — a direct architectural contrast to Microsoft’s approach.

Mozilla has also deployed AI features such as on-device language translations and alt-text generation in PDFs — all optional and user-directed. The broader message from Mozilla is unambiguous: AI should operate on the user’s terms, not the platform vendor’s.

Microsoft’s Copilot rollback, while a step in the right direction, underscores a growing concern in the cybersecurity and privacy communities: when dominant platform vendors use their control over infrastructure to bypass user consent, it sets a dangerous industry precedent.

With AI features increasingly touching sensitive work files, identity systems, and cloud services, the stakes of unchecked default deployments extend directly into enterprise security risk.

Mozilla’s public rebuke signals that the user consent debate is far from over and that pressure from both users and rival platforms will remain a critical check on Big Tech’s AI ambitions.
