Legal and Privacy Resources

Stay informed on the legal and privacy standards shaping AI in healthcare. 

OntarioMD’s Practice Hub is excited to provide you with curated resources on regulations, best practices, and evolving legal standards specific to AI in healthcare. With additional materials covering cybersecurity essentials, you’ll find practical guidance to help you understand how to safeguard sensitive data and stay compliant in an evolving landscape.

The Personal Health Information Protection Act (PHIPA) is Ontario’s health-specific privacy legislation, which protects patients’ personal health information (PHI). PHIPA places specific requirements on Health Information Custodians (HICs), a group that includes physicians, to ensure that PHI is collected, used, and disclosed appropriately. For doctors, this means taking steps to safeguard patient data and adhering to clear rules about patient consent, data sharing, and security.
 
The Personal Information Protection and Electronic Documents Act (PIPEDA) is a federal privacy law that regulates how private-sector organizations collect, use, and disclose personal information in the course of commercial activities. While it primarily governs businesses, PIPEDA can also apply to healthcare practices if they engage in activities outside Ontario or in partnership with organizations across Canada.

PHIPA and PIPEDA both protect personal information but have different scopes. PHIPA, Ontario’s provincial law, governs health information specifically and is the primary regulation for healthcare practices handling patient data, whereas PIPEDA applies to private-sector organizations, including healthcare organizations that engage in commercial activities or operate interprovincially or outside Canada.

When using AI tools, your practice must comply with PHIPA standards for managing, securing, and sharing patient data, and vendors may also need to meet PIPEDA standards if they handle data outside Ontario. Where an AI tool involves data processing or storage outside the province, you may need to ensure compliance with both PHIPA’s requirements for handling PHI and PIPEDA’s broader privacy protections. In practice, understanding both laws can help you remain compliant when working with AI vendors and helps ensure that data privacy is safeguarded across jurisdictions.

PHIPA is central to the regulatory requirements with respect to the adoption of AI scribe technology. PHIPA is designed to protect the privacy and security of patients’ PHI, setting standards for data protection that AI vendors and healthcare providers must meet.

PHIPA, along with other Canadian privacy standards, provides protection for patient data. These laws require AI tools to secure patient data and restrict its use to authorized purposes only, ensuring that confidentiality is maintained.

Here are some key considerations under PHIPA that are especially relevant when using AI tools in a clinical setting:

  • Consent: PHIPA requires that physicians obtain informed, valid consent before collecting, using, or disclosing PHI, including data processed by AI tools. For AI scribes, this means ensuring patients are fully informed about how their data may be used, including any potential third-party use for algorithmic training. Physicians may wish to consider updating existing consent forms to explicitly address these AI-specific uses and risks.
  • Security: As a physician, you are required to have reasonable security safeguards in place to protect PHI against unauthorized access, use, disclosure, or destruction. When considering an AI scribe for your practice, you must ensure that the system you choose safeguards patient data, including encrypting audio recordings, securing all data both in transit and at rest, and restricting access to audio recordings, transcripts, or any other data generated by the AI system (a simple illustration of encryption at rest follows this list). AI scribe vendors must also comply with PHIPA, and those in the province’s Vendor of Record Program have already been vetted for PHIPA compliance.
  • Data Processing and Retention: PHIPA specifies that physicians must maintain patient records that are accurate, complete, and retained in accordance with its standards. Physicians should ensure that any documentation produced by an AI scribe adheres to these requirements. This includes knowing how and where data is stored, how long it’s retained, and how it’s securely disposed of when no longer needed. Patients also have the right to request corrections to their medical records for any errors.
  • Data De-identification and Re-identification Risks: Even when patient data is de-identified, there remains a risk of re-identification, especially if data is shared or used for algorithmic training purposes. 
  • Access: A patient has a right of access to any record of personal health information you hold about them. The AI scribe you use may create additional transitory records that you would not have made if you were documenting directly in your EMR (such as an audio recording, transcript, or draft note). Once you have completed your SOAP note or finalized the note generated by the AI scribe, you will want to ensure that any transitory records, including those processed by the vendor, are deleted. If the vendor stores that information and a patient asks for a copy of those transitory records (and they still exist and have not yet been deleted), the records would be subject to the patient’s right of access.
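To make the security point above more concrete, below is a minimal, illustrative sketch of what “encryption at rest” means for an audio recording, using Python’s third-party cryptography package. The file names and key handling here are hypothetical and purely for illustration; they are not how any particular AI scribe vendor implements its safeguards, and they are no substitute for a vendor’s own PHIPA-compliant security measures.

```python
# Illustrative sketch only: symmetric encryption of a recording at rest.
# File names and key handling are hypothetical; real products manage keys,
# transport security (TLS), and access control on the vendor side.
from cryptography.fernet import Fernet

# In practice the key would live in a secure key store, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical file name for illustration.
with open("visit_recording.wav", "rb") as f:
    plaintext = f.read()

encrypted = cipher.encrypt(plaintext)  # unreadable without the key
with open("visit_recording.wav.enc", "wb") as f:
    f.write(encrypted)

# Only someone holding the key can decrypt, which is how encryption at rest
# restricts access to the recording.
restored = cipher.decrypt(encrypted)
assert restored == plaintext
```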

Laws and regulations of general application (privacy, medical regulatory, etc.) continue to apply. Currently, there is no comprehensive legislation in Canada that regulates AI, but discussions and initiatives to address the ethical and regulatory challenges posed by AI in various sectors, including healthcare, are underway. The Artificial Intelligence and Data Act (AIDA) is draft legislation that could establish a regulatory framework for AI in Canada. If enacted, AIDA and the potential regulations under it might set the foundation for the responsible design, development, and deployment of AI systems. In its current form, AIDA places an emphasis on safety, non-discrimination, and accountability for businesses involved in AI technologies and, overall, represents a possible step towards comprehensive AI regulation in the country. It is unclear when, or if, AIDA will be enacted.

To ensure your practice meets regulatory standards when using AI, especially in Ontario, here are several critical steps you can take to maintain compliance under PHIPA and related privacy laws:

  • Look for PHIPA-Compliant Tools: Choose tools from AI vendors who confirm their technology complies with PHIPA (Vendor of Record participants have already been reviewed and vetted for PHIPA compliance). This includes implementing security measures that protect PHI and limiting data use to authorized purposes.
  • Obtain Valid Patient Consent: PHIPA requires that patients provide informed, valid consent before the first time an AI scribe is used, and seeking express consent is also recommended. Update consent forms to explain how AI tools interact with patient data, any potential third-party involvement, and associated risks.
  • Research and Implement Security & Privacy Safeguards: PHIPA requires reasonable security safeguards to protect PHI against unauthorized access, use, or destruction. You should confirm that your chosen AI scribe includes encryption (for data at rest and in transit), access logs, and regular security updates. Some vendors may also hold certifications (e.g., SOC 2 or ISO 27001), which can help demonstrate adherence to more stringent security standards.
  • Perform Regular Audits: Regular audits can help verify that both your practice and any AI tools in use remain compliant with privacy laws. This may include reviewing access logs, checking for unauthorized access, assessing data storage practices, and identifying any non-compliance in vendor operations (a simple sketch of an access-log review follows this list).
  • Create Data Retention and Disposal Policies: PHIPA specifies how long PHI should be retained, requiring records to remain accessible for as long as necessary to allow patients to exhaust any recourse available to them. Work with your vendor to establish retention limits for AI-generated data and secure disposal practices that prevent unauthorized access after your contract ends; these policies can also be discussed and implemented in your daily practice with other physicians and staff.
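As a rough illustration of the access-log review mentioned in the audit step above, the sketch below flags log entries made by users outside an authorized list or outside business hours. The log format, column names, authorized user list, and file name are all hypothetical assumptions for this example; real EMR and AI scribe systems each have their own audit-log exports and review tools.

```python
# Illustrative sketch only: the kind of access-log review a regular audit
# might include. Columns, users, and file names are hypothetical.
import csv
from datetime import datetime

AUTHORIZED_USERS = {"dr.smith", "nurse.lee", "admin.patel"}  # hypothetical
BUSINESS_HOURS = range(7, 20)                                # 07:00-19:59

def review_access_log(path: str) -> list[dict]:
    """Return log entries that warrant a closer look."""
    flagged = []
    with open(path, newline="") as f:
        # Expects columns: timestamp, user, record_id, action
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if row["user"] not in AUTHORIZED_USERS:
                row["reason"] = "user not on authorized list"
                flagged.append(row)
            elif ts.hour not in BUSINESS_HOURS:
                row["reason"] = "access outside business hours"
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    for entry in review_access_log("ai_scribe_access_log.csv"):  # hypothetical file
        print(entry["timestamp"], entry["user"], entry["record_id"], "-", entry["reason"])
```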

By following these tips, you can take a robust approach to PHIPA compliance, helping to protect patient data and maintain high standards of privacy and security when integrating AI tools into your practice!

PHIPA mandates that patients provide valid consent before the first time an AI scribe is used for documentation purposes (recording conversations and note-taking) with each patient. Seeking express consent is also recommended.

This consent must be valid and meaningful, which means that you should:

  • Update your privacy policies to include references to AI, de-identification of personal health information for AI use, and data storage. 
  • Update your privacy notices and/or signage to give notice of the recording and retention of data and the general use of AI in patient encounters.
  • Consider using a consent form with simple, understandable language explaining the purpose and potential uses of any recordings. Patients should be fully informed of how their personal information is being used, collected, and shared. You can also use our free Patient Consent Toolkit.
  • Provide patients with the ability to ‘opt out’ of the use of your AI scribe during visits without it impacting their healthcare. 
  • Review your AI scribe output after each patient encounter to ensure notes and documentation summarized by the AI scribe are accurate and complete. Recordkeeping remains your responsibility as a physician. 

Policies and consent forms are not a substitute for a proper, informed discussion with your patients about the risks and benefits of using an AI scribe.

Carefully review any consent form offered by a vendor to check if it is easy for your patients to understand. If the AI scribe makes a recording, you will need the consent form to include consent for you to record the interaction. Be careful not to use a form that asks your patients to give the vendor consent to use or disclose their data for the vendor’s own quality improvement, product development, marketing or training of the AI model. 

OMD provides a sample consent in our Patient Consent Toolkit.

Under PHIPA, health information custodians are responsible for how their patients’ personal health information (PHI) is collected, used, and disclosed, including by third-party service providers like AI scribe vendors. De-identification will likely be considered a “use” of PHI under PHIPA, and therefore any de-identification and subsequent use must be PHIPA compliant. PHIPA permits custodians to de-identify data without consent, but only for purposes related to providing care or managing their practice, not for a vendor’s benefit, such as training or improving its algorithms, unless there is authorization.

If physicians authorize a vendor to use de-identified data, they should remain transparent about these practices, including describing them in their privacy policies and notices. This notification should explain the purposes of the use, and physicians should take reasonable steps to keep PHI protected and secure, so that individuals’ privacy is protected and re-identification risks are minimized.

Under PHIPA, de-identified data may sometimes be used for secondary purposes, like improving AI algorithms, without express patient consent. However, there always remains a risk of re-identification, so it’s important that any de-identified data is stored and processed securely by your vendor. Check what privacy and security protocols your vendor has in place. To maintain transparency, consider informing patients about any use of their de-identified data, even for secondary purposes.
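To illustrate why de-identification does not eliminate re-identification risk, below is a minimal sketch that strips direct identifiers from a record. The field names are hypothetical and this is not a complete de-identification method: the remaining quasi-identifiers (age, postal code, visit date) can still be combined to re-identify someone, which is why vendor-side safeguards and transparency with patients remain important.

```python
# Illustrative sketch only: removing direct identifiers is not, by itself,
# full de-identification. Field names and the example record are hypothetical.

DIRECT_IDENTIFIERS = {"name", "health_card_number", "phone", "email", "address"}

def strip_direct_identifiers(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

visit = {  # hypothetical example record
    "name": "Jane Doe",
    "health_card_number": "1234-567-890",
    "postal_code": "M5V 2T6",
    "age": 47,
    "visit_date": "2024-03-12",
    "note": "Follow-up for hypertension; medication adjusted.",
}

print(strip_direct_identifiers(visit))
# Quasi-identifiers (postal_code, age, visit_date) survive, illustrating the
# residual re-identification risk described above.
```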

The type of support and/or updates a vendor will provide both at the outset and throughout the term of your contract will be laid out in the terms of your agreement. If you are having issues with your current AI scribe, you should start by contacting your vendor, as they will be most likely to help resolve the problem. Use your vendor’s designated email or telephone number. If you’re looking to enhance or integrate your AI scribe into your clinic’s workflow, you can contact OntarioMD at support@ontariomd.com. We can also assist with any general questions you may have on AI scribes.

The level of involvement and support a vendor provides is an important element to keep in mind in relation to the needs of your practice – do you prefer to have a vendor who provides more guidance, or are you tech-savvy and feel confident with more independence? Check if the contract specifies the vendor’s response times and staffing, including what, if any, resources there are for troubleshooting ongoing issues.
 
Typically, the contract will stipulate that updates to your AI tool will happen periodically, but check if there is a timeframe given, and what notice the vendor is required to give you. To make sure your AI tool remains up to date, look for a clause specifying any regular maintenance, support, and updates. This can include both scheduled updates to improve performance and security, and support for any technical issues that arise.

To effectively address patient concerns about privacy with AI tools, transparency and empathy are key. Be upfront about the AI tool’s benefits and privacy safeguards – it is good for patients to know the benefits an AI scribe can have on the quality of their care, while also laying out how their sensitive information is protected. Many patients may worry about how their data will be used. Clearly explain that their data will only be used for the purposes they’ve consented to, such as documentation, and will not be sold or shared with third parties unless explicitly authorized by them.
 
Before the first time an AI scribe is used, you must seek valid consent from the patient. This is an opportunity to have an informed discussion about the use of AI tools, including patients’ right to opt out of their use without any impact on their quality of care.

By being transparent, empathetic, and informative, you help patients feel more comfortable with AI tools, showing them that their privacy is paramount.

If you suspect you or your clinic have experienced a data or cybersecurity breach, it is important to act quickly to mitigate any real or potential damage. In some cases, breach reporting is mandatory under PHIPA. At a high level, you and/or your practice should:

  • First, identify and contain the breach to stop any further unauthorized access, then report it as required under PHIPA to the appropriate authorities.
  • Notify affected patients as soon as possible, outlining the extent of the breach and actions taken to protect their data.
  • Where applicable, notify and work with the AI vendor to assess what led to the breach and review or update your security practices to prevent future incidents.
  • Depending on the breach, you may need to notify the Information and Privacy Commissioner of Ontario (IPC).

To learn more about how to prevent and respond to cybersecurity risks, you and your staff can also consider taking OntarioMD’s free Privacy & Security training, which comprehensively covers the do’s and don’ts of protecting personal health information from breaches and security incidents.

Advice to the Profession
