Practice Ambient AI Use Policy

  • Version Number: 1.0
  • Drafted by: Dr Rob Seal on 31st January 2025
  • Reviewed by: Dr Devin Gray on 4th March 2025
  • Reviewed by: Dr Lavan Baskaran
  • Approved by: Dr Lavan Baskaran 
  • Review date: 1st July 2025
 

1. Purpose

This policy outlines the framework for the safe, ethical, and compliant use of the ambient AI transcription tool, Heidi Health, in clinical settings, ensuring patient safety, data security, and high-quality care documentation. Ambient AI technology refers to artificial intelligence systems that passively capture and process consultations to support clinical documentation.

 

2. Key Principles

  • Patient consent and transparency: patients must be informed of the use of ambient AI technology and have the option to opt out. Clear consent mechanisms must be in place.
  • Clinical oversight: AI-generated documentation must be reviewed and approved by the clinician before being added to the patient’s record.
  • Data protection and confidentiality: All data processing must comply with UK GDPR, the Data Protection Act 2018, and NHS Digital standards.
  • Bias and accuracy: The AI system should be regularly monitored for accuracy, potential bias, and alignment with clinical best practices. The ultimate responsibility for what is saved in the clinical record lies with the clinician using the tool.
  • Accountability: There are clear governance structures to oversee the use of AI, including incident reporting and system audits.
 

3. Scope

All clinical staff may use the tool once they have completed sufficient training, can demonstrate that they understand the benefits and limitations of the tool, and understand how to use it safely from a clinical governance and data protection perspective.

 

4. Consent and Patient Communication

  • Patients must be informed prior to transcription that the consultation will utilise an AI transcription tool. Consent must be recorded within the patient’s notes before proceeding.
  • Patients should be informed that the AI tool assists in documentation, but the clinician retains final control over the record.
  • The practice website and premises should display additional information about the tool for patients to read and refer to.
  • The practice should engage with patients regarding use of the tool, including through the Patient Participation Group (PPG) where possible.
  • Opt-Out Process: Patients must be offered the option to opt out of having their consultation transcribed by the AI tool. Opt-out requests will be honoured without impacting the quality of care. Staff must document these requests in the patient record.
 

5. Data Privacy and Security

  • Authentication: Multi-factor authentication (MFA) with an NHS.net email address is mandatory for accessing the tool to prevent unauthorised use.
  • Data retention: Transcription data must be deleted after 24 hours, unless longer retention is required for documented legal or clinical reasons. This must be configured in the Heidi Health user settings.
  • Data protection compliance: Compliance with UK GDPR, the Data Protection Act 2018, and other relevant regulations is mandatory. A Data Protection Impact Assessment (DPIA) must be completed and formally documented.
 

6. Clinical Governance

  • Accuracy responsibility: Clinicians are fully responsible for reviewing and approving AI-generated notes before finalising them in patient records. See Clinical Safety Case Report for full details.
  • Framework alignment: AI use must comply with DCB0160 standards and related NHS governance requirements.
 

7. Identified Risks and Mitigation Strategies

Over-reliance on AI

  • Risk: Clinicians may overlook errors or nuances due to over-reliance on AI.
  • Mitigation: Reinforce clinician responsibility for validation and provide regular training on AI limitations.

Patient distrust

  • Risk: Patients may feel uneasy about the use of AI in their care.
  • Mitigation: Engage patients with clear communication, transparency about safeguards, and opt-out options.

Technical failures

  • Risk: System outages or errors could disrupt workflows.
  • Mitigation: Revert to usual manual note-taking processes.

Inconsistent data handling

  • Risk: Failure to follow data retention and deletion policies could result in regulatory non-compliance.
  • Mitigation: Automate data deletion and conduct regular compliance audits.

Misinterpretation of context

  • Risk: AI may misinterpret or fail to capture nuanced language.
  • Mitigation: Emphasise clinician review of all transcriptions.

Bias in AI output

  • Risk: AI algorithm may introduce bias if training data lacks diversity.
  • Mitigation: Heidi Health actively seeks to minimise this risk; the practice will conduct regular evaluations for bias in generated notes.

Increased administrative burden

  • Risk: Extensive post-editing could increase workload, for example needing to add codes separately.
  • Mitigation: Test the AI tool thoroughly before implementation and gather ongoing feedback to address inefficiencies.

Lengthening documentation

  • Risk: AI-generated notes may be overly detailed, leading to lengthier documentation that takes longer for clinicians to review or read.
  • Mitigation: Train staff to customise and streamline transcriptions, editing the detail setting and focusing on clinically relevant content.

Reduction in clinical coding

  • Risk: Clinical coding is not transferred when pasting into the notes, which risks a reduction in data quality.
  • Mitigation: Train staff to be aware of this risk, and conduct audits to ensure continued good note-taking practice.
 

8. Vendor Compliance

  • Compliance verification: the AI tool vendor must provide documentation confirming compliance with UK GDPR, NHS standards, and other relevant legal requirements. Certifications and compliance documents will be maintained as part of the practice's governance records. See DCB0129.
  • Ongoing collaboration: regular communication with the team at Heidi Health will ensure updates to the tool meet evolving legal and clinical needs.
 

9. Training Requirements

Training will be provided using Heidi Health’s training materials, tailored to the tool’s use. Completion of the training will be recorded in the staff training matrix for accountability. Refresher sessions will address updates and reinforce best practices, and updates in functionality will be discussed at regular intervals during clinical practice meetings.

 

10. Monitoring and Compliance

  • Audits: Conduct regular audits of compliance with data retention policies, and review patient feedback.
  • Incident reporting: Any breaches or concerns must be reported immediately to the practice management team.
  • Continuous improvement: the practice will apply learning from audits, incidents and feedback to refine the use of ambient AI technology and identify and address emerging risks.
 

11. Conclusion

The use of ambient AI in GP consultations has the potential to enhance the efficiency and quality of clinical documentation. By adhering to this policy and the clinical safety standards set out in DCB0160, the tool will be implemented with robust safeguards to ensure compliance, accuracy, and patient trust.