Microsoft 365 Copilot information governance guidance
Microsoft 365 Copilot, including Copilot Chat (referred to as Copilot from this point), is a tool that uses artificial intelligence (AI). Health and care organisations may use Copilot to help with day-to-day tasks.
Copilot is available in both a free and a paid-for version. This guidance relates to the paid-for enterprise version, which integrates with Microsoft 365 apps.
Copilot has many uses, for example: answering questions about the contents of documents, summarising work meetings, or automating simple tasks.
Copilot uses 2 main sources of information:
- information which is publicly available on the web
- information held in Microsoft 365 apps which the person using Copilot already has permission to see
The second source is likely to include personal data and may include data that is considered sensitive.
Guidance for patients and service users
The use of health and care information by Copilot
Copilot can be used by health and care organisations to make admin tasks, such as searching documents and taking notes, more efficient. The IT systems that hold your health and care records do not use Microsoft 365. For example, the IT system used by your GP practice to record your medical notes is not provided by Microsoft.
Your data may, however, be used by Copilot if it is held in the Microsoft 365 applications used by your health or care organisation. For example:
- Microsoft Outlook may include emails which are sent about patients or service users
- Microsoft Excel may include lists of individuals receiving care
- Microsoft Word may include letters, for example a letter from a hospital consultant to a patient
Health and care organisations will review what information they hold in Microsoft 365 systems before they start using Copilot. This will help them to decide whether Copilot should be able to access that information. If not, they will move the information so that Copilot cannot access it.
Health and care organisations will check that Copilot is only used in the ways they intended.
Copilot does not impact the care you receive
The use of Copilot will not directly impact your care. Copilot will never be used to decide the care you receive. It will not replace a healthcare professional’s expertise or affect their decisions about your care.
Your rights
Data protection laws give you rights over how your personal data is used. These rights still apply when an organisation uses Copilot. You should see your health or care organisation’s privacy notice for more information about your individual rights.
Guidance for health and care professionals
For general information about Copilot, how it works and its benefits, please see the Copilot hub page.
The use of health and care information by Copilot
Microsoft 365 is not deployed as a clinical system. While Copilot agents may be able to interact with other local systems, they should not access patient health record systems or triage systems.
We therefore do not expect extensive data about your patients and service users to be used by Copilot. This will, however, depend on how your organisation currently uses Microsoft 365 applications. The same applies to any personal data stored within your Microsoft 365 environment, such as HR records. You should always ensure that access to personal data is restricted to only those who require it.
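If your organisation wants to support this kind of access review with tooling, the sketch below shows one possible approach: listing who is a member of a Microsoft 365 group, and therefore who can see its content, via the Microsoft Graph API. This is a minimal illustration rather than an endorsed method. It assumes an app registration with the GroupMember.Read.All application permission, uses the msal and requests Python libraries, and all identifiers shown are placeholders.

```python
# A minimal sketch: list who can see a Microsoft 365 group's content, so access
# to files containing personal data can be reviewed before Copilot is rolled out.
# Assumes an app registration with the GroupMember.Read.All application
# permission; all identifiers below are placeholders.
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-client-id>"
CLIENT_SECRET = "<your-app-client-secret>"
GROUP_ID = "<group-to-review>"

# Acquire an app-only token for Microsoft Graph.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# Page through the group's members, printing each name and sign-in address.
url = (f"https://graph.microsoft.com/v1.0/groups/{GROUP_ID}/members"
       "?$select=displayName,userPrincipalName")
while url:
    page = requests.get(url, headers=headers, timeout=30).json()
    for member in page.get("value", []):
        print(member.get("displayName"), member.get("userPrincipalName"))
    url = page.get("@odata.nextLink")
```

The same pattern extends to other Graph endpoints, for example listing the permissions on individual SharePoint or OneDrive items, if a more granular review is needed.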
If you are part of NHS.net Connect, your use must comply with its Acceptable Use Policy. If you do not use NHS.net Connect, your organisation should have its own Acceptable Use Policy, which is likely to have similar requirements.
Training Copilot
Copilot comes pre-trained and does not use any organisational data to train it further.
The quality of the outputs of Copilot
Copilot warns users after each prompt that responses produced by generative AI are not guaranteed to be 100% factual. Copilot can provide useful drafts and summaries to help users achieve more, and gives users the chance to review AI-generated content rather than fully automating these tasks. As a user, it is your responsibility to check the accuracy of Copilot outputs before sharing them with others or using them in official documentation, particularly where personal data is involved. This includes verifying information or statistics against an independent second source where possible.
What Copilot should not be used for
Copilot must not be used for clinical decision-making, diagnostics, or as a substitute for a healthcare professional’s expertise. The tool can support clinical administration but should not be used to provide individuals with care, and therefore cannot be relied upon to inform treatment decisions, determine patient care pathways, or interpret health and care data. There is a clinical safety case and hazard log available for Copilot.
Copilot may be used by staff to seek specialist advice, such as legal or compliance advice. In these cases, staff should always check any guidance they receive from Copilot with a suitably qualified expert to confirm that it is accurate.
It is important to note that large language models can introduce new functions unintentionally, and can allow users to take a tool beyond its intended use. This can lead to Copilot inadvertently being used in ways that are not approved by your organisation. It is therefore important for your organisation to monitor use on an ongoing basis.
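As one illustration of ongoing monitoring, the sketch below summarises Copilot-related activity per user from an audit log export. This is a minimal example rather than an endorsed method: it assumes a CSV exported from the audit search in the Microsoft Purview portal, with UserIds and Operations among its column names (check your own export, as names can vary), and the file name is a placeholder.

```python
# A minimal sketch: count Copilot-related audit events per user from a CSV
# exported from the Microsoft Purview audit search. Column names (UserIds,
# Operations) are assumptions; verify them against your own export.
import csv
from collections import Counter

def summarise_copilot_use(path: str) -> Counter:
    """Count audit events per user where the operation name mentions Copilot."""
    per_user = Counter()
    with open(path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            if "copilot" in row.get("Operations", "").lower():
                per_user[row.get("UserIds", "unknown")] += 1
    return per_user

if __name__ == "__main__":
    for user, count in summarise_copilot_use("audit_export.csv").most_common():
        print(f"{user}: {count} Copilot-related events")
```

Unexpected accounts or unusual spikes in activity can then be followed up through your organisation's normal IG processes.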
Further help
You should speak to your information governance (IG) team or your Data Protection Officer (DPO) if you need support relating to the use of information by Copilot. You can also speak to your Caldicott Guardian about ethical, lawful and appropriate uses of information.
These IG pages provide clear and consistent IG advice and guidance to patients and service users, health and care staff and IG professionals. NHS England convenes a working group to check and challenge the guidance.
Last edited: 14 May 2026 8:39 am