PsyFi Technologies
PsyFi Team

Patient Privacy and AI Therapy Notes: How to Talk to Patients About It

Practical scripts and guidance for therapists who need to explain AI-assisted documentation to patients, address privacy concerns, and handle consent conversations with confidence.

Tags: AI, patient privacy, therapy notes, consent, documentation, behavioral health, HIPAA

You started using AI to help with your clinical notes. Now a patient asks: "Wait, is a computer reading what I say in here?"

That question lands differently in a therapy room than it does in a dentist's office. The therapeutic relationship depends on trust, vulnerability, and the feeling that what happens in session stays between you and the patient.

So how do you talk about AI documentation without undermining the relationship you have worked hard to build?

This post offers practical language you can use, common patient concerns you should expect, and guidance on when transparency helps versus when too much detail creates unnecessary anxiety.

Why this conversation matters

Most clinicians do not look forward to explaining AI tools to patients. It feels awkward. It can raise questions you are not sure how to answer.

But the conversation matters for several reasons:

  • Informed consent requires patients to understand how their information is handled
  • Patients who learn about AI documentation from a third party may feel blindsided
  • Proactive transparency builds trust rather than eroding it
  • Some licensing boards and payers are beginning to issue guidance on AI disclosure

The good news: most patients handle this conversation better than clinicians expect.

What patients are actually worried about

When patients hear "AI" in the context of therapy, their concerns tend to fall into a few categories:

"Is a robot listening to my sessions?"

Patients often imagine a live AI sitting in the room, analyzing them in real time. Clarifying that AI helps with documentation after the session — not during — resolves this quickly.

"Will my information end up somewhere it shouldn't?"

This is a data security question. Patients want to know that their records are not being sent to a general-purpose AI service, stored on a public server, or used to train other AI models.

"Does this change what you write about me?"

Some patients worry that AI-generated notes will be more clinical, more impersonal, or more revealing than what a therapist would write on their own.

"Can I opt out?"

Patients want to know they have a choice. Even when they ultimately agree, knowing they could say no matters.

Scripts for introducing AI documentation

Here are some sample approaches you can adapt for your practice:

During intake

"I want to let you know that I use a documentation tool that helps me write my session notes. It's HIPAA-compliant, meaning it follows the same privacy rules as the rest of your medical record. The AI helps me organize my notes faster so I can spend more time focused on you during our sessions. Do you have any questions about that?"

When adding AI tools to an existing practice

"I wanted to let you know about a change I've made to my documentation process. I've started using an AI tool that helps me draft my session notes. Your information is still protected under the same privacy standards — nothing about your rights or my confidentiality obligations has changed. The main difference is that I can get notes done faster, which honestly means I'm less behind on paperwork."

When a patient asks directly

"Good question. The AI helps me structure my notes after our session. It doesn't listen in or record anything on its own. I still review and edit everything before it becomes part of your chart. Think of it like a very organized assistant that helps me with the writing part."

What to say versus what NOT to say

Do say:

  • "I review and approve everything before it goes into your chart"
  • "Your information is protected under HIPAA, the same as it always has been"
  • "You can ask to see your notes at any time"
  • "You have the right to request corrections"

Do not say:

  • "Don't worry about it" (dismissive)
  • "It's just like ChatGPT" (raises more questions than it answers)
  • "The AI is better at notes than I am" (undermines clinical judgment)
  • "Everyone is doing this now" (pressures rather than informs)

Addressing the data security question

When patients ask where their information goes, be specific:

  • The tool processes information in a HIPAA-compliant environment
  • Data is encrypted in transit and at rest
  • Patient information is not used to train general AI models
  • Access is limited to you and authorized staff, just like any other part of the record

If you use PsyFiGPT, you can explain that the platform is designed specifically for behavioral health documentation with privacy as a core requirement, not an afterthought.

Patient rights with AI-generated notes

Patients retain all the same rights they have with any clinical documentation:

  • Right to access: patients can request copies of their records, including AI-assisted notes
  • Right to request amendments: if a patient believes something in a note is inaccurate, they can request a correction
  • Right to an accounting of disclosures: patients can ask who has accessed their records
  • Right to restrict uses: patients can request limitations on how their information is shared

These rights exist regardless of whether notes were handwritten, typed, or AI-assisted. The documentation method does not change patient protections.

When patients opt out

Some patients will say they are not comfortable with AI documentation. That is their right.

Here is how to handle it:

  1. Acknowledge the concern without defensiveness
  2. Clarify what opting out means practically (you will write notes manually)
  3. Document the patient's preference in their chart
  4. Do not treat it as a disruption or a problem

Sample language:

"That's completely fine. I'll continue writing your notes the way I always have. If you ever change your mind or want to know more, just let me know."

Most clinicians find that the number of patients who opt out is small — typically under five percent. But having a clear process shows that you take the concern seriously.

Updating your consent forms

If you are using AI documentation tools, your informed consent should include language about it. This does not need to be a separate form in most cases. A brief addition to your existing notice of privacy practices or informed consent document is usually sufficient.

Key elements to include:

  • That AI-assisted tools may be used for documentation purposes
  • That the clinician reviews all AI-generated content
  • That patient data is handled in accordance with HIPAA
  • That the patient has the right to ask questions or opt out

PsyFi Assist includes consent language templates that you can adapt for your practice, covering both intake automation and AI documentation disclosure.

The bigger picture

Talking to patients about AI documentation is not fundamentally different from other transparency conversations you already have — explaining limits of confidentiality, discussing recording policies, or describing how you handle subpoenas.

The key is to be straightforward, specific, and open to questions.

Most patients care about two things: that you are still paying attention in session, and that their information is safe. When you can answer both of those clearly, the AI conversation becomes a non-issue.

If you are looking for tools that make this easier — both the documentation itself and the patient communication around it — PsyFiGPT handles clinical notes with privacy built in, and PsyFi Assist streamlines the intake and consent process so nothing falls through the cracks.


Have questions about implementing AI documentation in your practice? Contact us — we are happy to help you think through the workflow.