Concerns have been raised over the security of Copilot Studio, Microsoft’s automated chatbot creation tool, despite its rapid enterprise adoption since its release less than nine months ago. Security researcher Michael Bargury, a former senior security architect at Microsoft, highlighted that bots created or modified with Copilot Studio lack security measures by default.
At the Black Hat USA conference in Las Vegas, Bargury demonstrated how developers could inadvertently introduce security vulnerabilities into copilots created with the tool. He emphasized how easily mistakes could be made that lead to data exfiltration or bypassed security policies.
The adoption of Copilot Studio has been on the rise, with a 60% increase in the last quarter, according to Microsoft CEO Satya Nadella. The number of organizations using Copilot Studio has also grown to 50,000, including well-known names like Carnival, Cognizant, Eaton, KPMG, Majesco, and McKinsey.
Initially, there were several security flaws in bots created with Copilot Studio, such as publicly accessible bots without authentication requirements and the ability to easily impersonate users. These vulnerabilities could potentially lead to data leaks and the bypassing of security controls.
Bargury discovered that copilots designed to interact with public SharePoint sites could also access private ones, posing a significant security risk. He also found tens of thousands of Copilot Studio bots openly exposed on the web, leaving them vulnerable to remote attacks.
Microsoft has since addressed these issues and introduced new admin controls to prevent the creation of insecure copilots. Admins are advised to update their implementations to protect their organizations from potential security breaches.
Despite the productivity benefits of tools like Microsoft Copilot, Bargury stressed the importance of balancing productivity with security. He believes that Microsoft is working towards enhancing the security of Copilot Studio to prevent future vulnerabilities.
In his Black Hat session, Bargury outlined 15 security issues he identified with Copilot Studio, including unreliable input, data leakage scenarios, oversharing sensitive data, unexpected execution paths, and destructive copilot actions. He also introduced CopilotHunter, a security tool that scans for open Copilot Studio bots and accesses the data behind them.
Overall, the growing popularity of low-code chatbot creation tools like Copilot Studio underscores the need for stronger default security measures to prevent data breaches and unauthorized access. Microsoft’s commitment to improving Copilot Studio’s security will be crucial to the safe use of AI assistants in enterprise environments.

