6 Prompts You Don't Want Employees Putting in Microsoft Copilot
Author: Varonis - BleepingComputer
People have access to way too much data. The average employee can access 17 million files on their first day of work. When you can’t see and control who has access to sensitive data, one compromised user or malicious insider can inflict untold damage. Most granted permissions also go unused and are considered high risk, meaning sensitive data is exposed to people who don't need it.
Copilot’s security model scopes its answers to a user's existing Microsoft 365 permissions. Users can ask Copilot to summarize meeting notes, track down sales assets, and identify action items, saving an enormous amount of time.
2. What bonuses were awarded recently?
6. Show me all files containing sensitive data.
Copilot’s retrieval capabilities may tempt disgruntled, departing, or politically devious employees to fire off prompts for sensitive pricing lists, salaries, roadmaps, and other corporate crown jewels that aren't appropriately secured. Copilot effectively hands EVERY employee a smart eDiscovery search tool. The key to managing this risk is a mature, flexible M365 group security program. Easier said than done.
Security teams and investigators can access user Copilot interactions through the compliance audit portal and eDiscovery. Because these interactions are logged, they can also be covered by monitoring rules and policies that trigger alerts and blocking actions.
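To make that monitoring concrete, here is a minimal sketch of sifting an exported audit log for Copilot activity. It assumes you have exported records as a JSON array from the audit portal, and that the field names (RecordType, UserId) and the CopilotInteraction record type match your tenant's export format; verify against your own data before relying on it.

```python
import json
from collections import Counter

# Minimal sketch: summarize Copilot interactions from an exported audit log.
# Assumes "audit_export.json" is a JSON array of audit records and that the
# field names below match your export's schema -- verify before relying on it.

COPILOT_RECORD_TYPE = "CopilotInteraction"  # assumed record type for Copilot events

with open("audit_export.json", encoding="utf-8") as f:
    records = json.load(f)

copilot_events = [r for r in records if r.get("RecordType") == COPILOT_RECORD_TYPE]

# Count interactions per user to spot unusually heavy Copilot searching.
per_user = Counter(r.get("UserId", "unknown") for r in copilot_events)

for user, count in per_user.most_common(10):
    print(f"{user}: {count} Copilot interactions")
```

A spike in one user's interaction count is exactly the kind of signal a monitoring rule would alert on.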
If you think your corporate data access permissions are locked down, I still recommend daily test searches to verify. One of my tricks is to create a user with minimal default access and no group memberships. A daily search for ‘new’ items from that account tells you what is wide open (a sketch of that check follows below). I am not a security expert, but a lifetime of eDiscovery taught me that if you give them access, someone will use it inappropriately.
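Here is a minimal sketch of that canary-account check, assuming you can export the test user's daily search results to a plain text file of one visible item path per line (the file names and the diff-against-yesterday approach are illustrative, not a product feature).

```python
from pathlib import Path

# Minimal sketch of the canary-account check: compare what a minimally
# privileged test user can see today against yesterday's snapshot.
# Assumes each snapshot is a text file with one visible item path per line,
# produced however you run the daily search as the canary user.

def load_snapshot(path: str) -> set[str]:
    p = Path(path)
    return set(p.read_text(encoding="utf-8").splitlines()) if p.exists() else set()

yesterday = load_snapshot("canary_yesterday.txt")
today = load_snapshot("canary_today.txt")

newly_exposed = sorted(today - yesterday)
if newly_exposed:
    print(f"{len(newly_exposed)} newly visible items -- permissions may be too open:")
    for item in newly_exposed:
        print("  ", item)
else:
    print("No new items visible to the canary account.")
```

Anything that shows up in the diff is content a brand-new user with no group memberships can already reach, which is a strong hint it is open to everyone.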