However, some experts warned that the speed at which companies compete to add new AI features meant these kinds of mistakes were inevitable.
Copilot Chat can be used within Microsoft programs such as Outlook and Teams, which handle email and chat, to get answers to questions or summarise messages.
“We identified and addressed an issue where Microsoft 365 Copilot Chat could return content from emails labelled confidential authored by a user and stored within their Draft and Sent Items in Outlook desktop,” a Microsoft spokesperson told BBC News.
“While our access controls and data protection policies remained intact, this behaviour did not meet our intended Copilot experience, which is designed to exclude protected content from Copilot access,” they added.
“A configuration update has been deployed worldwide for enterprise customers.”
The blunder was first reported by tech news outlet Bleeping Computer, which said it had seen a service alert confirming the issue.
It cited a Microsoft notice saying “users’ email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat”.
The notice added that a work tab within Copilot Chat had summarised email messages stored in a user’s drafts and sent folders, even when they had a sensitivity label and a data loss prevention policy configured to prevent unauthorised data sharing.
Reports suggest Microsoft first became aware of the error in January.
Its notice about the bug was also shared on a support dashboard for NHS workers in England – where the root cause is attributed to a “code issue”.
A section of the notice on the NHS IT support site implies the health service has been affected.
But the NHS told BBC News that the contents of any draft or sent emails processed by Copilot Chat remained with their creators, and that patient information had not been exposed.
