Use and adoption of AI ambient scribes in general practice


This guidance covers the benefits and risks practices need to consider before using AI to assist with note taking.

It takes you through the areas a practice should consider before implementing an AI ambient scribe (or, if one is already in use, the areas to review and update as necessary), with a checklist appendix of the actions to undertake to ensure all risks are being managed.

AI ambient scribes use generative AI to convert spoken words into text and other outputs. They are designed to support clinical documentation and workflows. AI scribes can potentially benefit GPs and practices in several ways, including reducing workload, saving time during the consultation, and allowing the GP to focus on the patient while the scribe captures what is being discussed (for the GP to then review before uploading or accepting into the patient’s record). Some AI scribes will also automatically create other clinical documentation using templates or suggest clinical coding. The main AI scribes currently being used by practices in London are Heidi, Tortus and Anima. Accurx Scribe is also due to be released soon.

However, the use of AI scribes can also bring with it clinical safety, legislative and regulatory compliance risks, including the risk that the spoken word output can be misunderstood or misrepresented and could be entered into the patient record without an informed review before it is added. This guidance sets out the actions we advise to ensure that these risks are mitigated. Before implementing an AI scribe in your practice, engage early with your Data Protection Officer (DPO) and Integrated Care Board (ICB) for support on ensuring compliance with legal and regulatory requirements.

All use of AI within a specific general practice should be agreed by the practice management and implemented safely. Practices may want to have an AI policy to manage this. The Information Commissioner’s Office (ICO) provides guidance on adopting AI technologies in compliance with data protection legislation. NHS England (NHSE) recently released guidance on the use of AI-enabled ambient scribing products in health and care settings, and the British Medical Association (BMA)’s General Practitioners Committee for England (GPCE) has released a brief note on the use of artificial intelligence (AI) software in general practice.

Liability

The NHSE guidance states that “NHS organisations may still be liable for any claims arising out of the use of AI products particularly if it concerns a non-delegable duty of care between the practitioner and the patient”. This risk can be mitigated by having clear and comprehensive contracting arrangements with suppliers setting out their roles, responsibilities and liabilities. If considering a free AI scribe tool or one provided initially for free, check that the appropriate contracts and data processing agreement/relevant data protection contractual clauses are in place. The ICO provides guidance on what should be included in a contract and/or a data processing agreement.

Clinical safety

Clinical safety is a clinical risk management activity. Digital clinical safety assurance is the process by which health IT used by care professionals is assured as safe and meets the required national standards. There are two standards issued by the NHS: DCB0129 and DCB0160.

Practices need to manage clinical risk when using IT solutions including AI by:

  • Assigning a Clinical Safety Officer (CSO) within the practice or PCN to oversee clinical risk management – this must be a qualified healthcare professional with sufficient training in clinical safety and clinical risk management. NHSE provides Digital Clinical Safety training.
  • Checking that the supplier of the AI scribe complies with the DCB0129 clinical safety standard, which requires suppliers of digital health solutions in England to verify the safety of their products. The practice will need to read the supplier’s clinical safety documentation to understand the risks and then re-evaluate and form its own risk assessment in the form of a DCB0160.
  • Completing a DCB0160 (check with your DPO/ICB if they have an exemplar DCB0160 for the product/tool you want to use):
    • Conduct a clinical risk assessment to analyse the product’s functions, its architecture and failure modes, and whether there are any potential sources of harm that may have a clinical impact or present a clinical risk to a patient, eg missing key information in the transcript, delayed outputs, incorrect information. These are described as hazards and documented in a hazard log with the potential clinical impacts, their mitigations and controls, a final assessment of risk and the actions taken to ensure the system is clinically safe.
    • Produce a clinical safety case. The analysis from the clinical risk assessment is presented as a structured argument, supported by a body of evidence, to provide a compelling, comprehensive and valid case that the system is safe to use.
    • Implement risk controls and a monitoring framework, including regular reviews and audits to assess for any new risks or issues, keeping a note of any incidents and changes in the clinical hazard log and updating the clinical safety case report where necessary.

Data protection compliance

Transparency to patients

  • Update practice privacy notices to state which AI tools are being used, how they work, how patient data will be used (including how it will be stored and for how long), and patients’ right to refuse a recording or withdraw consent. Make the information available in your practice’s public areas, on practice websites and through social media channels and your patient participation groups.
  • Health and care professionals should inform patients at the start of a consultation if an AI tool will be used. If the patient does not want the AI tool used, it should be confirmed that it is deactivated during the consultation.
  • Practices may wish to consider suffixing their consultation entry with a line stating that the entry was made using AI scribing software, and/or tag the consultation with the SNOMED code Audio dictation (24771000000105).

Complete a Data Protection Impact Assessment (DPIA)

Complete a Data Protection Impact Assessment (DPIA) to identify and assess any risks to individuals’ personal data, to ensure compliance with UK data protection legislation and to map the data flow describing how the data is collected, stored and shared. ICBs or your DPO may have an exemplar DPIA for the AI scribe you want to use that you can review and amend for your practice.

Security

Practices should ensure that appropriate security measures are in place, including all data being processed in the UK, and consider how access to the data/AI tool is managed. Completing the DPIA will help define the security measures in place, assess any organisational or technological security risks and identify mitigation measures.

Retention of data

Practices should check that they understand how and where the AI scribe will retain data, how long it is retained for, when it will be deleted and whether the practice or the clinician controls the retention period. If recordings or transcripts are stored, these must be included in any Data Subject Access Request response provided. Most AI scribes will delete the transcript after it has been reviewed and copied to the patient record.

Training

Inform and train all staff about the use of the AI scribe, what the risks are and how these are being managed. Ensure that this training and information is available for locums working in the practice.

Bias and accuracy in transcriptions

AI systems are trained on data sets that may not be diverse enough to represent the wide range of accents within the UK population. This can produce embedded bias due to accent, pronunciation or other factors, presenting a risk that AI scribes may generate information that is incorrect or misleading. If the AI scribe translates the language of the consultation, it is recommended that the clinician reviewing the transcript for accuracy is fluent or confident in the language being transcribed so that they can identify and correct errors. This risk can be mitigated by ensuring that all outputs are reviewed by the healthcare professional involved in the consultation, and any inaccuracies corrected.

Right not to be subject to automated decision-making

UK GDPR Article 22 gives people the right not to be subject to solely automated decision-making. A process will not be considered solely automated if someone interprets the result of an automated decision before applying it to the individual. All outputs from an AI scribe must therefore be reviewed by a clinician, both to comply with data protection responsibilities and to meet clinical responsibility.

Regulatory compliance

Check if the AI scribe you want to use is considered a medical device

The NHSE guidance states that “ambient scribing products that inform medical decisions and have simple/low functionality (for example, products that solely generate text transcriptions that are easily verified by qualified users) are likely not medical devices. However, the use of Generative AI for further processing, such as summarisation, would be treated as high functionality and likely would qualify as a medical device.” The Medicines and Healthcare products Regulatory Agency (MHRA) has guidance on when software applications (apps) are considered to be a medical device and how they are regulated. If it is a medical device, it needs to be registered with the MHRA, and the Yellow Card reporting mechanism used to report any issues. You can search the Public Access Registration Database (PARD) to check if a product is registered as a medical device.

CQC inspections

Where digital technology is being used for one of the activities regulated by the CQC, this may be included in their inspections. A completed DPIA will demonstrate that the practice considered how the technology could be deployed safely and effectively in the care pathway and what improvements to care it might deliver. Keeping a record of the training provided, and having processes to report issues and incidents as part of clinical safety management, can also be used as evidence.

Useful guidance and links

Appendix – A checklist for GPs and practices before implementing an AI ambient scribe:

✔ Do you have a clear and comprehensive contract in place with the supplier of the AI scribe?

✔ Have you engaged with your Data Protection Officer (DPO) and/or Integrated Care Board (ICB) on information governance and cyber security?

  • They may have exemplar Data Protection Impact Assessments (DPIAs) and DCB0160s for the AI scribe (see Appendix B for details of the current support available from your ICB and GP DPO).

✔ Have you assigned a Clinical Safety Officer (CSO) role at practice or PCN level to oversee clinical risk management?

  • This must be a qualified healthcare professional with sufficient training in clinical safety and clinical risk management. NHSE provides Digital Clinical Safety training.

✔ Have you reviewed the supplier’s DCB0129 and completed a DCB0160 for the AI scribe?

  • Check with your ICB and/or GP DPO if they have an exemplar DCB0160 on the AI scribe you can review.

✔ Have you checked if the AI ambient scribe is classed as a medical device?

  • You can search the Public Access Registration Database (PARD) to check if a product is registered as a medical device.

✔ Have you reviewed which functionality you want to use, including any integration with your practice clinical systems (EPR, CBT)?

✔ Have you completed a Data Protection Impact Assessment (DPIA) to consider risks and their mitigations?

  • Check with your ICB and/or GP DPO if they have an exemplar DPIA on the AI scribe you can review.

✔ Have you informed and trained your staff on the AI scribe?

  • What it does and how it works.
  • How to check and review the output.
  • Ensure the training is available to new colleagues and locums.
  • Keep a record of all training provided.

✔ Have you informed your patients?

  • Update your practice privacy notice with the use of the AI scribe.
  • Update patients through practice websites, social media channels, patient participation groups, posters/information on screens in the surgery etc.
  • Remind staff to inform patients at the start of a consultation that the tool will be used, and check that staff know how to deactivate it if the patient does not want it used.

✔ Have you implemented a monitoring framework with regular reviews and audits of the use of the AI scribe?

 

Appendix B – Support from London ICBs for GP Practices

This will be updated as new information is received.

North Central London (NCL)

The NCL ICB general practice website has a separate page on AI in practice, which sets out a structured, step-by-step process for GP practices to follow, including:

  • Clinical safety support from the NCL ICB Clinical Safety Officers on completing a DCB0160.
  • Confirmation from the technical implementation group assessment that the tool is safe to use in NCL and whether there are any other requirements, eg firewall access.
  • The GP DPO provides support on completing the DPIA and can provide a list of FAQs on AI.
  • Guidance on other considerations, eg workflow changes.

North West London (NWL)

NWL ICB have taken Heidi, Tortus and Anima through their technology pathway, and these are NWL-approved products with exemplar DPIAs and DCB0160s available for NWL GP practices to review. Accurx Scribe will go through the same pathway. Please contact the GP DPO nhsnwl.icb-dpo-gp@nhs.net for further information.

North East London (NEL)

NEL ICB have been running a six-month AI pilot with Heidi across 42 practices, with two users per practice, which they are evaluating. In the pilot, the NEL DPO completed a DPIA and took a novel approach to clinical safety, applying an overarching hazard log, supported with mitigations, across all practices. Each practice was then asked to formalise its own DCB0160 with the same provider, paying a nominal amount to engage with and participate in the DCB0160 process, and will have a follow-up hazard review at 3-6 months. Each practice was also given an implementation plan to support the hazards identified. NEL ICB are mirroring this process with the Accurx scribe. Please contact the NEL GP DPO via Itservicedesk.nelicb@nhs.net for further information.

South East London (SEL)

Please contact the SEL GP DPO (gpdpo@selondonics.nhs.uk) in the first instance for assistance/advice on a DPIA.

South West London (SWL)

Please contact the SWL GP DPO in the first instance for assistance/advice on a DPIA – swl.gpdpo@swlondon.nhs.uk.

London region

A portal is being developed to share exemplars, and work is planned at a London regional level to look at the clinical safety aspects of new AI.

NHS England

NHSE stated in their guidance that they are developing exemplars. When these become available, the guidance will be updated.