Microsoft’s Copilot is gaining popularity as an AI productivity assistant among major global enterprises. However, Michael Bargury, the Chief Technology Officer at Zenity, has raised concerns about the cybersecurity risks associated with this new technology.
Despite being a fan of Copilot and finding it extremely beneficial in his own work, Bargury highlighted the potential security vulnerabilities that come with its extensive access to enterprise systems. Copilot has the ability to delve deep into various platforms such as emails, messaging applications, and files, which enhances its usability for users but also makes it an attractive target for cyber attackers.
Bargury pointed out, “It has access to your emails, your calendar, your Teams messages, all of your files, and if you bring in plug-ins it can actually work on your behalf. It has access to everything you have access to, even the things you write to yourself.” This level of access opens up opportunities for malicious actors to exploit the system.
In his research, Bargury demonstrated how easy it is to take control of Microsoft Copilot by simply sending a single email. He explained, “I can get Copilot to tell you whatever I want it to tell you.” This discovery underscores the need for heightened security measures when utilizing AI technology in sensitive business environments.
As Copilot continues to be integrated into more enterprise workflows, the potential for cyberattacks targeting this AI assistant becomes a real concern. Organizations must prioritize cybersecurity protocols and ensure that proper safeguards are in place to protect against unauthorized access and manipulation of sensitive data.
Ultimately, while Copilot offers significant productivity benefits, it is essential for organizations to be vigilant about the potential risks associated with its widespread use. By addressing these cybersecurity concerns proactively, businesses can leverage the full potential of AI technology while safeguarding their valuable information from malicious threats.

