
Microsoft 365 Copilot guidance for information governance professionals

This guidance sets out the IG implications of using Microsoft 365 Copilot in health and care settings

Assessing the information held in your tenant

If you are adopting Copilot, you are encouraged to assess what information you hold in your tenant to confirm which data sets Copilot may access and use. You can use your Information Assets and Flows Register (IAFR) to help with this task, updating it where necessary.


Data Protection Impact Assessments (DPIA)

If you are processing personal data using Copilot, a DPIA is recommended given the novelty of the technology and its potential to process large amounts of data. It is important to review different use cases for Copilot, as the risk can vary depending on its intended use. A template DPIA for Copilot has been provided to help you understand and assess the processing. You should record in your DPIA the information you have assessed as being held in your tenant. You may decide to have separate DPIAs for specific uses of Copilot.

It may also be necessary to update DPIAs for other processes or projects where Copilot will be used as part of those processes, for example, if you have a DPIA for a research project that will be supported by Copilot, ensure that the DPIA for that project also reflects this. See guidance from the Health Research Authority (HRA) for further information on when a DPIA is required for research projects.


Updates to your privacy notice and transparency materials

You should update your privacy notice for patients, service users and staff to inform individuals about your use of Copilot where this involves processing of personal data.

You should also assess what other methods of communicating this data use are needed to meet your organisation’s transparency requirements. This could include posters, emails or information leaflets.

We have developed a privacy notice template that you can use to develop your wider privacy notices, together with the recommended wording below to include for Copilot:

[Your organisation’s name] uses Copilot, an AI tool, to help with certain tasks such as:

  • [List your uses of Copilot]

Copilot does not replace the expertise of your health and care professional and is never used to make decisions about the care you receive. The rights set out in this privacy notice will apply when it is used.


The location of the data processed and stored by Copilot

When generating responses, Copilot processes data held in the shared tenant in its original location and does not store it elsewhere.

Copilot saves and stores prompts and responses in the individual user’s mailbox. These remain within your organisation’s Microsoft tenant and are stored in line with your wider Microsoft setup.


The lawful basis for the use of the data

The lawful basis for processing personal information will vary for each task and it is important that you assess this on a case-by-case basis. As the purpose of Copilot is to aid staff in performing certain tasks within the Microsoft 365 environment, the lawful basis for using Copilot will typically be the same lawful basis which applies to the underlying task. For example:

  • if Copilot is being used to support a function of a health or care organisation, such as scheduling appointment reminders, the lawful basis is likely to be Article 6(1)(e) public task
  • if, as part of the health or care function, it is processing a special category of data, the lawful basis is likely to be Article 9(2)(h) health or social care services (in addition to the Article 6 basis above)
  • if Copilot is assisting with writing a summary of an employee's terms of employment, the lawful basis is likely to be Article 6(1)(b) contractual obligation
  • if Copilot is summarising reports of employee salary details for tax or national insurance purposes, the lawful basis is likely to be Article 6(1)(c) legal obligation
  • if Copilot is summarising information about a cohort of consented research participants, the lawful basis is likely to be Article 6(1)(e) public task (see guidance from the HRA on the legal basis for processing data for research)

The security of the data in Copilot

Because Copilot is integrated with your Microsoft tenant, it applies all existing Microsoft security, compliance and privacy controls that you have already deployed in your tenant. Organisations are responsible for monitoring their own privacy settings.

If you are on the NHS.net Connect shared tenant, find information on how to do this.


Data minimisation

Ensuring data minimisation (for example, that Copilot only accesses and uses the data that is strictly necessary for its task) depends on a number of factors:

  • your role-based access controls (RBAC) must ensure that an individual user's Microsoft account can only access information relevant to that individual's tasks and duties
  • users must appropriately limit their use of Copilot by only using it for approved tasks and entering specific and narrow prompts
  • records past their retention period must be reviewed or deleted in line with retention policies

You can achieve this by:

  • reviewing your tenant security setup to ensure that RBAC is in place and applied appropriately
  • making sure that users are aware of the relevant acceptable use policy (AUP) for Copilot through local communications (NHS.net Connect users are bound by the national Acceptable Use Policy for Copilot)
  • ensuring users receive appropriate training on how to manage access permissions in your Microsoft 365 tenant
  • conducting regular audits on both access controls and retention (see the sketch after this list)
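
To support the final point above, the sketch below illustrates one way a periodic review could be run against an exported access and retention report. It is a minimal sketch under stated assumptions: the file name (permissions_report.csv), its columns (ItemUrl, SharedWith, LastModified), the list of broad groups and the eight-year retention period are all hypothetical and should be adapted to the reports and retention schedules your organisation actually uses.

    # Illustrative sketch only: the file name, column names, group labels and
    # retention period below are assumptions, not a standard Microsoft export.
    import csv
    from datetime import datetime, timedelta

    BROAD_GROUPS = {"Everyone", "Everyone except external users", "All Staff"}
    RETENTION_PERIOD = timedelta(days=8 * 365)  # example retention period only

    def review_report(path: str) -> None:
        today = datetime.today()
        with open(path, newline="", encoding="utf-8") as handle:
            for row in csv.DictReader(handle):
                shared_with = {group.strip() for group in row["SharedWith"].split(";")}
                last_modified = datetime.strptime(row["LastModified"], "%Y-%m-%d")

                # Flag items visible to broad groups (access control review)
                overshared = shared_with & BROAD_GROUPS
                if overshared:
                    print(f"Review access: {row['ItemUrl']} shared with {', '.join(sorted(overshared))}")

                # Flag records past the assumed retention period (retention review)
                if today - last_modified > RETENTION_PERIOD:
                    print(f"Review retention: {row['ItemUrl']} last modified {row['LastModified']}")

    if __name__ == "__main__":
        review_report("permissions_report.csv")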

Ensuring no automated decision-making takes place

The National Acceptable Use Policy (AUP), which is applicable to NHS.net Connect users, is designed to prevent use of Copilot in a way that constitutes automated decision-making. Local AUPs must do the same and can use the national policy as a basis. It is important that this is socialised locally (see 'The correct use of Copilot by staff' below). Copilot returns results to the user, who can choose to use or disregard them. This human check is essential to ensure that no automated decision-making takes place.


The correct use of Copilot by staff

Organisations should consider local arrangements for socialising the expectations around Copilot use. You could do so through staff communications, training or local policies.

The AUP should be socialised with users of Copilot.

In addition to socialising expectations, organisations should implement audit arrangements which may involve reviewing how people are using Copilot and its outputs on a regular basis.
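
As an illustration of what such a review could look like in practice, the sketch below summarises Copilot activity per user and per host application from an exported audit log. It is a minimal sketch under stated assumptions: the export file name (copilot_audit_export.csv) and the column names (UserId, Operation, AppHost) are hypothetical and will need adapting to whatever audit export your tenant provides.

    # Illustrative sketch only: the file name and column names are assumptions.
    import csv
    from collections import Counter

    def summarise_copilot_use(path: str) -> None:
        per_user = Counter()
        per_app = Counter()
        with open(path, newline="", encoding="utf-8") as handle:
            for row in csv.DictReader(handle):
                # Keep only records that relate to Copilot interactions
                if "copilot" not in row["Operation"].lower():
                    continue
                per_user[row["UserId"]] += 1
                per_app[row.get("AppHost") or "unknown"] += 1

        print("Copilot interactions per user:")
        for user, count in per_user.most_common():
            print(f"  {user}: {count}")

        print("Copilot interactions per host application:")
        for app, count in per_app.most_common():
            print(f"  {app}: {count}")

    if __name__ == "__main__":
        summarise_copilot_use("copilot_audit_export.csv")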


Individuals’ rights

The processes for complying with individuals’ rights under the UK General Data Protection Regulation (UK GDPR) are not significantly altered by the use of Copilot. Information continues to be stored in your Microsoft tenant and your existing processes will likely be sufficient, though this may need to be assessed on a case-by-case basis. Further information and considerations can be found in the template Copilot DPIA.


Records management

The addition of Copilot to your Microsoft suite will not significantly change your records management practices and should not cause issues. It may, however, highlight existing issues. For example, Copilot may surface data that an individual has access to but should not have (or did not know they had access to). For NHS.net Connect users, this may occur across the shared tenant, meaning that users in other organisations may be able to surface your data if it is public. Staff should report any access to data that they should not have or do not need through their organisation’s information incident reporting process.

To prevent this, organisations are encouraged to check their privacy settings. For organisations using NHS.net Connect, read guidance on how to do this.


The National Data Opt-Out (NDOO)

The NDOO may apply to disclosures of data beyond individual care, for example, for research and planning purposes.

The NDOO does not apply to specific tools that are used to support processing (i.e. an individual cannot use the NDOO to opt out of the processing of their data by Copilot). However, it may apply to the project, programme or purpose that Copilot is supporting in specific circumstances. For example:

  • where you are using Copilot to create a summary for a research project; and
  • the research project involves the disclosure of confidential patient information without consent under Section 251 support, and is therefore subject to the NDOO;
  • then you must ensure that Copilot does not use the disclosed data relating to individuals who have opted out to produce the summary

By contrast, where the data being used in Copilot is not reliant on section 251 support (for example, because it has been rendered anonymous or provided with consent), the NDOO does not apply.

This would not prevent you from using Copilot to support a task associated with the same individual’s care, such as writing them a letter about an appointment, as the opt-out would not apply to that purpose.

You should ensure that for any processing that is subject to the NDOO, the relevant DPIA reflects use of Copilot where appropriate and details how you will exclude relevant data from Copilot’s searches.
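
Where data is prepared for a purpose to which the NDOO applies, one practical approach is to remove opted-out individuals from the extract before it is stored anywhere Copilot can reach. The sketch below is illustrative only: the file names and the NhsNumber column are assumptions, and the list of opted-out NHS numbers is assumed to have already been obtained through the Check for National Data Opt-outs service.

    # Illustrative sketch only: file names and column names are assumptions.
    import csv

    def exclude_opted_out(cohort_path: str, opt_out_path: str, output_path: str) -> None:
        """Remove opted-out individuals from a cohort extract before it is
        stored anywhere that Copilot can access."""
        # NHS numbers with a registered national data opt-out, assumed to have been
        # retrieved already via the Check for National Data Opt-outs service
        with open(opt_out_path, newline="", encoding="utf-8") as handle:
            opted_out = {row["NhsNumber"] for row in csv.DictReader(handle)}

        with open(cohort_path, newline="", encoding="utf-8") as source, \
             open(output_path, "w", newline="", encoding="utf-8") as target:
            reader = csv.DictReader(source)
            writer = csv.DictWriter(target, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                # Keep only individuals who have not registered an opt-out
                if row["NhsNumber"] not in opted_out:
                    writer.writerow(row)

    if __name__ == "__main__":
        exclude_opted_out("cohort.csv", "opt_outs.csv", "cohort_ndoo_applied.csv")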

See Understanding the national data opt-out for more information on when the NDOO applies.


Further information

The Information Commissioner’s Office has produced a consultation series on how data protection laws apply to the development and use of generative AI.


Guidance for patients and service users


Guidance for health and care professionals

Last edited: 14 May 2026 8:42 am