Copilot Vulnerability Lets Attackers Bypass Audit Logs and Gain Hidden Access

A critical vulnerability in Microsoft’s M365 Copilot allowed users to access sensitive files without leaving any trace in audit logs, creating significant security and compliance risks for organizations worldwide.

The flaw, discovered in July 2024, remained largely hidden from customers despite being classified as an “important” vulnerability by Microsoft.

Simple Exploit with Serious Consequences

The vulnerability stemmed from a fundamental flaw in how Copilot handles audit logging. Under normal circumstances, when users ask M365 Copilot to summarize documents, the system records these access events in the audit log, a critical security feature for tracking file access.

However, the researcher discovered that simply asking Copilot not to provide a link to the file prevented these access events from ever being recorded.

[Screenshot: audit log entries showing "Resource Accessed" events]

“Just like that, your audit log is wrong. For a malicious insider, avoiding detection is as simple as asking Copilot,” explained the researcher who discovered the flaw.

The bypass was so easy to trigger that it could occur accidentally during routine Copilot interactions, meaning many organizations likely have incomplete audit logs without realizing it.
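One way for defenders to gauge whether their own logs show signs of this gap is to scan an exported copy of the unified audit log for Copilot interaction records that list no accessed resources. The Python sketch below is a minimal illustration, assuming a JSON-lines export and field names (a RecordType of "CopilotInteraction", an AuditData payload containing CopilotEventData.AccessedResources) modeled on the unified audit log schema; verify those names against an actual export before acting on the results.

```python
import json
import sys


def load_records(path):
    """Yield audit records from a JSON-lines export of the unified audit log."""
    with open(path, "r", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                yield json.loads(line)


def flag_copilot_gaps(records):
    """Return Copilot interaction records that list no accessed resources.

    Field names (RecordType, AuditData, CopilotEventData, AccessedResources)
    are assumptions based on the unified audit log schema for Copilot events;
    adjust them to match your actual export.
    """
    suspicious = []
    for rec in records:
        if rec.get("RecordType") != "CopilotInteraction":
            continue
        audit_data = rec.get("AuditData")
        if isinstance(audit_data, str):
            # AuditData is frequently stored as a nested JSON string
            audit_data = json.loads(audit_data)
        event = (audit_data or {}).get("CopilotEventData", {})
        if not event.get("AccessedResources"):
            suspicious.append({
                "time": rec.get("CreationDate") or rec.get("CreationTime"),
                "user": rec.get("UserIds") or rec.get("UserId"),
            })
    return suspicious


if __name__ == "__main__":
    gaps = flag_copilot_gaps(load_records(sys.argv[1]))
    print(f"{len(gaps)} Copilot interactions with no accessed resources recorded")
```

An empty result does not prove the logs are complete, since the whole point of the flaw is that some accesses were never written, but a large number of resource-less Copilot records is a signal worth investigating.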

The implications extend far beyond simple security concerns. Organizations subject to regulatory requirements like HIPAA rely on comprehensive audit logs to demonstrate compliance with technical safeguard requirements.

Legal proceedings often depend on audit trails as crucial evidence, and the US government has previously emphasized audit logging as an essential security feature.

The vulnerability was reported to Microsoft through the Microsoft Security Response Center (MSRC) portal on July 4th, but the handling process deviated significantly from the company's published guidelines.

Rather than following their established reproduction and development phases, Microsoft appeared to quietly fix the issue while the report remained in “reproducing” status.

Most concerning was Microsoft’s decision not to issue a CVE (Common Vulnerabilities and Exposures) number or notify customers about the flaw.

When questioned about this approach, Microsoft cited its policy of issuing CVEs only for "critical" vulnerabilities, despite having classified this issue as "important." The company reportedly had no plans to disclose the vulnerability's existence to affected customers.

Microsoft deployed the fix on August 17th, 2024, automatically pushing the mitigation to Copilot systems without requiring user intervention.

However, organizations that used Copilot before this date may have compromised audit logs with no way of knowing which accesses went unrecorded.
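For organizations trying to scope that exposure, one starting point is simply bounding the window: any Copilot interaction recorded before the fix was deployed could have involved file accesses that never made it into the log. The sketch below reuses the record-format assumptions from the earlier example and treats the August 17th, 2024 deployment date cited here as the cutoff.

```python
from datetime import datetime, timezone

# Fix deployment date cited in this report; interactions logged before it are in scope.
FIX_DATE = datetime(2024, 8, 17, tzinfo=timezone.utc)


def pre_fix_interactions(records):
    """Count Copilot interactions per user recorded before the fix was deployed.

    Field names mirror the assumptions in the earlier sketch; adjust them to
    match your actual audit log export.
    """
    counts = {}
    for rec in records:
        if rec.get("RecordType") != "CopilotInteraction":
            continue
        raw = rec.get("CreationDate") or rec.get("CreationTime") or ""
        try:
            when = datetime.fromisoformat(raw.replace("Z", "+00:00"))
        except ValueError:
            continue  # skip records with missing or unparseable timestamps
        if when.tzinfo is None:
            when = when.replace(tzinfo=timezone.utc)
        if when < FIX_DATE:
            user = rec.get("UserIds") or rec.get("UserId") or "unknown"
            counts[user] = counts.get(user, 0) + 1
    return counts
```

The resulting per-user counts do not identify missing accesses; they only show whose Copilot activity falls inside the affected window and therefore merits closer review.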

The researcher's decision to publicly disclose the vulnerability stemmed from Microsoft's refusal to inform customers about the audit log gaps.

This situation raises broader questions about Microsoft’s transparency practices and how many other security issues may be silently addressed without customer notification.

For organizations that rely heavily on audit trails for security monitoring and regulatory compliance, this incident highlights the risks of trusting automated AI systems with critical security functions. It also underscores the need for comprehensive security auditing practices that extend beyond vendor-provided logging mechanisms.

