Microsoft confirmed that a Copilot security bug was allowing the AI assistant to read and summarize emails labeled as confidential. According to a report from Bleeping Computer, the bug bypassed Microsoft's data loss prevention policies, which are meant to protect sensitive information.

The bug was discovered in late January (tracked as CW1226324) and specifically affects Copilot Chat and the "work tab" feature. It let Copilot read and summarize emails in the Sent and Drafts folders, including messages that were explicitly labeled as confidential and should have had restricted access.

Copilot Chat is Microsoft's answer to Google Gemini and ChatGPT. It's meant to be context-aware and can interact with Microsoft 365 apps like Word, Excel, PowerPoint and Outlook. The company began rolling it out to Microsoft 365 business customers in September 2025.