Information Security

Privacy Concerns with Onboard AI: Microsoft Copilot



Continuing our series on privacy concerns with onboard AI, we turn to Microsoft Copilot. On Copilot+ PCs running Windows 11, Copilot uses dedicated on-device hardware to run its AI features natively, and this tight integration has been flagged as a potential issue for user privacy.

We’ve highlighted a few privacy concerns associated with Microsoft Copilot:

  1. Data Usage and Storage: Microsoft Copilot uses data from Microsoft Graph, which includes emails, chats, documents, and other organizational data. While this data is used to enhance productivity, there are concerns about how this data is stored and protected.
  2. Screen Capture Feature: Some Copilot-integrated devices have a feature called “Recall” that periodically takes screenshots of a user’s screen. Although these screenshots are encrypted and stored locally, there are concerns about the potential for misuse and the impact on user privacy.
  3. Compliance and Regulatory Concerns: Microsoft Copilot is designed to comply with regulations like GDPR and the EU Data Boundary. However, there are ongoing discussions about how well these compliance measures protect user data in practice.
  4. User Control and Transparency: Users have some control over what data is collected and how it is used, but there are concerns about the transparency of these processes and whether users fully understand the implications.
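For the Recall feature mentioned above, users and administrators can also turn off snapshot capture at the policy level. The following is a Windows-only sketch based on Microsoft's Recall management guidance, which describes a "DisableAIDataAnalysis" policy value under the WindowsAI key; confirm the key path and value name against Microsoft's current documentation for your Windows build before relying on it:

```shell
:: Sketch: disable Recall snapshot saving for the current user by
:: setting the WindowsAI policy value DisableAIDataAnalysis to 1.
:: Key path and value name are assumptions drawn from Microsoft's
:: "Manage Recall" guidance; verify them for your Windows build.
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsAI" /v DisableAIDataAnalysis /t REG_DWORD /d 1 /f

:: Confirm the value was written
reg query "HKCU\Software\Policies\Microsoft\Windows\WindowsAI" /v DisableAIDataAnalysis
```

In managed environments, the equivalent Group Policy setting is "Turn off saving snapshots for Windows"; a sign-out or restart may be required before the change takes effect.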

You can opt out of data collection for Microsoft Copilot by following these steps:

For General Data Collection: Microsoft provides clear notices and controls within Copilot, Bing, and Microsoft Start to allow users to opt out of having their data used to train AI models.

You can manage these settings through your Microsoft Account, ensuring you have control over your data.

For Dynamics 365 and Power Platform: If you have opted into data sharing for Copilot AI features, you can withdraw your consent at any time by going to the Power Platform admin center. Navigate to Settings > Tenant Settings and turn off the Data sharing for Dynamics 365 Copilot and Power Platform Copilot AI Features toggle.

For Security Copilot: If you opt out of data sharing, Security Copilot will delete all customer data shared within 30 days. Your data will be retained by your tenant if you have an active subscription and have not requested its deletion.

When you opt out of data collection for Microsoft Copilot, several things happen to ensure your data is no longer used:

  1. Data Deletion: Any data shared with Microsoft for the purpose of improving Copilot features will be deleted. For example, in the case of Security Copilot, all customer data shared will be deleted within 30 days.
  2. Cease Data Sharing: Microsoft will stop using your data to train AI models or improve Copilot features. This means your data will no longer be reviewed or used to enhance the service.
  3. Data Retention: Your data will still be retained by your tenant if you have an active subscription and have not requested its deletion. This ensures that your data remains accessible to you but is not used for any other purposes.
  4. Control and Transparency: You maintain control over your data, and Microsoft commits to handling it according to your preferences and its published privacy policies.

If you value your data and privacy, it is best to opt out of data collection and data sharing. You can always search for privacy-related articles and tools that help defend your privacy and data. Always remain vigilant about how the systems you use handle your data and privacy.