PsyFi Technologies
PsyFi Team

Private AI for Therapists: How to Vet Vendors on Encryption, PHI, and HIPAA Before You Sign

A HIPAA-focused vendor due-diligence checklist for practice owners evaluating AI tools: BAAs, PHI handling, encryption standards, breach notification, and what 'privacy-first' actually means.

Tags: PsyFiGPT, HIPAA, Privacy, AI Security, Compliance, Buyer Guide

Quick Answer

Before adopting any AI tool in a behavioral health practice, require a signed Business Associate Agreement (BAA), confirm that Protected Health Information (PHI) is never sent to third-party AI models without a BAA in place, verify encryption at rest and in transit, and ask exactly how long data is retained and under what conditions humans can access it. "Privacy-first" is a marketing claim — a BAA, a documented data flow, and a clear breach-notification policy are facts.


Why Practice Owners Need a Higher Bar Than Other AI Buyers

Most AI vendor due-diligence guides are written for general business buyers. Behavioral health practices face a different set of obligations.

You are a HIPAA Covered Entity. That status is not optional, and it applies to every technology vendor who touches patient data on your behalf. A standard SaaS privacy policy is not a substitute for a BAA. A tool that is excellent for a law firm or marketing agency may expose your practice to regulatory penalties, civil liability, and reputational harm if it lacks proper PHI safeguards.

The Office for Civil Rights (OCR) has issued settlements and corrective action plans specifically tied to insecure third-party software use. The average HIPAA settlement in recent years has exceeded $1 million. Small practices are not exempt.

This guide gives you a structured checklist — built around HIPAA requirements and the realities of AI product architecture — to evaluate any AI tool before you sign a contract or enter a single note.


What "Private AI" Actually Means for a Therapy Practice

The phrase "private AI" is used to market everything from fully on-premise models to standard cloud tools with a checkbox privacy policy. As a practice owner, you need a working definition that ties to your legal obligations.

For behavioral health purposes, private AI means:

  • No PHI leaves your control without a signed BAA covering the receiving party.
  • PHI is not used to train third-party models, ever, unless the patient has explicitly consented in writing.
  • You can demonstrate to an auditor exactly where data goes, how long it stays, and who can access it.
  • You receive timely breach notification — within the 60-day window required by HITECH — if a vendor incident involves your patients' data.

Encryption is one piece of this picture, not the whole picture. A tool can encrypt data at rest and in transit and still be non-compliant if it lacks a BAA, retains PHI indefinitely, or routes session data through unprotected analytics pipelines.


The HIPAA AI Vendor Checklist for Practice Owners

Work through each section when evaluating any AI tool that may touch clinical content, intake data, scheduling records, or any information that could identify a patient.

1. Business Associate Agreement (BAA)

  • Does the vendor offer a signed BAA before you input any PHI?
  • Does the BAA specifically cover the AI functionality you intend to use (not just cloud storage or billing)?
  • Is the BAA available in the standard contract or does it require enterprise negotiation?
  • Does the vendor list its own subprocessors and confirm subprocessors are also under BAA?

A vendor who hesitates on a BAA, or who says their tool is "HIPAA compliant" without offering a BAA, should be disqualified from consideration for any PHI-touching workflow.

2. PHI Handling and Data Flow

  • Can the vendor provide a written data flow diagram showing exactly where PHI travels?
  • Is PHI sent to any third-party AI model (e.g., OpenAI, Anthropic, Google) for inference or processing?
  • If third-party AI models are used, does the vendor have a BAA with those providers, and have they confirmed those providers will not use your data for training?
  • Are there product tiers where PHI handling differs? (Free or trial tiers commonly have weaker protections.)

3. Encryption Standards

  • Is data encrypted in transit using TLS 1.2 or higher?
  • Is data encrypted at rest? What encryption standard (AES-256 is the current baseline)?
  • Are database backups encrypted under the same standard?
  • Who controls the encryption keys — the vendor or your practice?
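The transit half of this checklist is something you can verify yourself rather than take on faith. Below is a minimal sketch, using only Python's standard library, of a client that refuses to negotiate anything older than TLS 1.2 against a vendor endpoint; `vendor.example.com` is a placeholder, not a real host.

```python
import socket
import ssl

def make_strict_context() -> ssl.SSLContext:
    """Build a client context that refuses anything older than TLS 1.2."""
    ctx = ssl.create_default_context()  # certificate verification on by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to a vendor endpoint and report the TLS version it negotiates.

    Raises ssl.SSLError if the server only offers TLS 1.0/1.1.
    """
    with socket.create_connection((host, port), timeout=5) as sock:
        with make_strict_context().wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()

# negotiated_tls_version("vendor.example.com")  # placeholder hostname
```

A handshake failure here is not proof of non-compliance, but it is a concrete finding to raise with the vendor in writing. Encryption at rest, by contrast, cannot be probed from outside; for AES-256 you need the vendor's documentation.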

4. Retention, Deletion, and Minimum Necessary

  • What is the default data retention period for session data, notes, and uploaded files?
  • Can you configure retention to match your practice's policies?
  • Is there a self-service deletion path, or does deletion require a support ticket?
  • How long does full deletion propagate, including from backups?
  • Does the tool apply a minimum-necessary principle — or does it log everything by default?

5. Access Controls and Audit Logs

  • Is role-based access control (RBAC) available so staff only see data relevant to their role?
  • Are access logs available for your review?
  • Can you see a history of who accessed patient records within the tool?
  • What access does vendor staff have to your data, and under what circumstances?

6. Breach Notification

  • What is the vendor's documented incident response procedure?
  • Will you receive notification of a potential breach within 60 days as required by HITECH?
  • Has the vendor experienced any reportable breaches? What was the outcome?
  • Does the BAA specify the vendor's notification obligations?
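The 60-day HITECH window is simple calendar arithmetic, which makes it easy to track internally once a vendor reports an incident. A minimal sketch:

```python
from datetime import date, timedelta

# HITECH requires a Business Associate to notify the Covered Entity
# no later than 60 days after discovering a breach.
HITECH_NOTICE_WINDOW = timedelta(days=60)

def notification_deadline(discovered: date) -> date:
    """Latest date your vendor may notify you of a breach discovered on `discovered`."""
    return discovered + HITECH_NOTICE_WINDOW
```

For example, a breach discovered on March 1, 2024 must be reported to you no later than April 30, 2024. Note that your own downstream obligations to patients and HHS start their own clocks from there.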

7. Training Use and Human Review

  • Is your data ever used to train or fine-tune AI models?
  • Does the vendor conduct human review of session content for quality assurance or safety purposes?
  • If human review occurs, is it covered under the BAA and subject to the same access controls?
  • Can you opt out of training or analytics data collection while retaining full product functionality?

Red Flags in AI Privacy Marketing

The following statements appear frequently in AI vendor marketing. None of them, on its own, constitutes HIPAA compliance.

  • "We take privacy seriously." This is a values statement, not a policy.
  • "Your data is encrypted." Encryption is necessary but not sufficient.
  • "We don't sell your data." Sale is not the primary HIPAA concern — training use and third-party sharing are.
  • "HIPAA compliant." Compliance is a process and a set of controls, not a certification. Ask what specifically they mean.
  • "We use enterprise-grade security." This is undefined.
  • "Your data stays private." Ask for the data flow and retention policy in writing.

Any vendor that responds to your specific technical questions with marketing language rather than documented policies is a risk.


Vendor Evaluation Matrix

Use this matrix when comparing tools side by side. Score each category independently before reaching an overall decision.

Criterion             | Acceptable                                          | Caution                 | Disqualify
BAA availability      | Offered in standard contract                        | Available on request    | Not offered
PHI to third-party AI | No PHI sent, or vendor holds BAA with subprocessor  | Unclear data flow       | PHI sent without BAA
Encryption at rest    | AES-256, vendor documented                          | Stated but unspecified  | Not documented
Retention policy      | Configurable, documented                            | Fixed but stated        | Not disclosed
Training use          | Opt-out confirmed in writing                        | Opt-out available       | Opt-in or undisclosed
Deletion              | Self-serve, timely propagation                      | Support ticket required | Not possible
Breach notification   | 60-day SLA in BAA                                   | Promised verbally       | Not addressed
Audit logs            | Available to practice                               | Limited                 | Not available
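The matrix's roll-up rule is worth making explicit: any single disqualifying rating ends the evaluation, regardless of how well the vendor scores elsewhere. One hypothetical way to encode that (criterion names and rating labels are illustrative):

```python
def evaluate(scores: dict[str, str]) -> str:
    """Roll up per-criterion ratings; a single 'disqualify' ends the evaluation."""
    ratings = set(scores.values())
    if "disqualify" in ratings:
        return "disqualify"
    if "caution" in ratings:
        return "follow up before signing"
    return "acceptable"

vendor = {
    "BAA availability": "acceptable",
    "PHI to third-party AI": "caution",  # unclear data flow: get it in writing
    "Encryption at rest": "acceptable",
}
```

Here `evaluate(vendor)` flags the vendor for follow-up: one unclear data flow is enough to hold the contract until it is documented.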

How Architecture Affects PHI Risk

The way an AI tool is built determines where your PHI exposure actually lives. Two tools can make similar privacy claims while having fundamentally different risk profiles.

Client-side vs. server-side processing

Tools that process data on your device or within your own cloud environment do not route PHI to the vendor's servers. Tools that use server-side inference send data to the vendor (and often to the underlying AI model provider) on every interaction.

Pass-through AI vs. in-house models

Some behavioral health AI tools act as a thin wrapper over a general-purpose model like GPT or Claude. Your clinical content is sent to that third-party provider on every request. Others run inference on their own infrastructure under a BAA. These are fundamentally different architectures from a HIPAA standpoint.

Memory and context persistence

AI tools with persistent memory or long-term context features store conversation content — often indefinitely — to improve responses over time. For behavioral health use, this creates a PHI retention risk that standard consumer AI tools are not designed to address. Ask specifically whether session content is stored after the session ends and for how long.

PsyFiGPT was built to address exactly this architecture problem. It provides AI-powered clinical documentation — including session notes, treatment summaries, and correspondence — without sending PHI to third-party AI models. The architecture is designed from the ground up for practices that need to maintain HIPAA compliance without abandoning the efficiency gains AI offers.


AI Tools Across the Practice: Where PHI Exposure Lives

Clinical documentation is the most obvious PHI risk, but practice owners should map every workflow where AI is being used or considered.

Intake and scheduling

Intake forms and scheduling workflows collect sensitive information before the first session: presenting concerns, insurance data, medication history, and referral reasons. AI tools that automate intake processing must be evaluated with the same rigor as clinical documentation tools.

PsyFi Assist handles AI-powered intake and scheduling, including therapist matching, in a HIPAA-conscious framework. It gives practices the efficiency of automated intake without routing patient-submitted data through unprotected consumer AI pipelines.

Clinical reporting and analytics

Progress notes, treatment outcome measurements, and aggregate reporting all touch PHI. AI tools that generate clinical reports or analyze practice-level outcome data require their own BAA review and data flow assessment.

PsyFi Reports provides clinical report generation and analytics built for behavioral health practices. It allows practice owners to derive operational insights without manual export to general-purpose spreadsheet or BI tools that lack PHI safeguards.


Conducting Your Own Risk Assessment

HIPAA requires Covered Entities to conduct periodic risk assessments. Adding a new AI tool is a triggering event for a risk assessment update. A minimal AI-specific risk assessment should include:

  1. Asset identification. List every patient data type the tool will touch.
  2. Threat modeling. Identify ways PHI could be exposed through this tool (data breach, unauthorized access, training ingestion, improper disposal).
  3. Current controls. Document what the vendor provides (BAA, encryption, access controls).
  4. Residual risk. Identify gaps between your required safeguards and the vendor's actual controls.
  5. Mitigation plan. Document steps to close gaps before deployment.
  6. Documentation. Keep records. OCR requests documentation during audits and investigations.

You do not need to be a security engineer to complete this process. You do need written answers from your vendor for each item above.
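The six steps above can be kept as a living record rather than a one-time memo. A minimal sketch of an AI-specific risk register (all field values are hypothetical examples, not a complete safeguard list):

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    asset: str                    # 1. patient data type the tool will touch
    threats: list[str]            # 2. ways PHI could be exposed
    vendor_controls: list[str]    # 3. safeguards the vendor documents in writing
    required_controls: list[str]  # safeguards your practice's policies require

    def residual_gaps(self) -> list[str]:
        # 4. residual risk: required safeguards the vendor does not provide
        return [c for c in self.required_controls if c not in self.vendor_controls]

register = [
    RiskItem(
        asset="session notes",
        threats=["training ingestion", "unauthorized vendor access"],
        vendor_controls=["BAA", "AES-256 at rest"],
        required_controls=["BAA", "AES-256 at rest", "configurable retention"],
    ),
]

# 5./6. items with open gaps feed the mitigation plan and stay on file for OCR
mitigation_plan = {i.asset: i.residual_gaps() for i in register if i.residual_gaps()}
```

In this example the register surfaces one open gap (configurable retention for session notes) that must be closed, or formally accepted and documented, before deployment.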


Next Steps for Your Practice

Evaluating AI vendors for a behavioral health practice requires asking questions that most general-purpose software buyers never think to ask. The checklist above gives you the foundation.

Start with the BAA. If a vendor cannot provide one, the conversation ends there. Then work through data flow, retention, encryption, access controls, and breach notification. Put every material answer in writing before you sign.

If you are looking for a place to start with purpose-built tools that were designed for this compliance environment from day one, explore what PsyFi Technologies has built for behavioral health practices:

  • PsyFiGPT for HIPAA-conscious clinical documentation — AI-assisted notes, summaries, and correspondence without PHI passing through third-party AI models.
  • PsyFi Assist for AI-powered intake, scheduling, and therapist matching with patient data handled appropriately.
  • PsyFi Reports for clinical report generation and practice analytics without manual export to unsecured tools.

The goal is the same as it has always been in behavioral health: do right by your patients while running a sustainable practice. The right AI tools make both easier, not harder.

Frequently Asked Questions

What is a Business Associate Agreement and why does every AI vendor need one?
A BAA is a legally required contract under HIPAA between a Covered Entity (your practice) and any vendor who creates, receives, maintains, or transmits PHI on your behalf. Without a BAA, using a vendor to process PHI is itself a HIPAA violation — regardless of how the vendor describes their security.
Can I use a general-purpose AI tool like ChatGPT for clinical documentation if I remove patient names?
Removing names is not sufficient de-identification under HIPAA's Safe Harbor standard. HIPAA requires removal of 18 specific identifiers, and even de-identified notes can be re-identified in context. More practically, a general-purpose tool without a BAA should not be used for any workflow involving clinical content.
What is the difference between 'HIPAA compliant' and 'HIPAA certified'?
There is no official HIPAA certification. Any vendor claiming 'HIPAA certification' from a third party is referring to a voluntary audit or assessment, not a government credential. The meaningful question is whether they offer a BAA and can document the specific controls that support compliance.
Does encryption alone make an AI tool HIPAA compliant?
No. Encryption is a required safeguard but one component among many. HIPAA's Security Rule requires administrative, physical, and technical safeguards. Encryption addresses part of the technical safeguard requirement. A BAA, access controls, audit controls, and a breach notification policy are also required.
How long can a vendor retain PHI?
There is no single HIPAA-mandated retention period for vendor data. Your BAA should specify the vendor's retention policy. Standard practice is that vendors should retain PHI only as long as necessary to perform the contracted service and then return or securely destroy it.
What happens if a vendor experiences a breach of my patients' data?
Under HITECH, your Business Associate must notify you within 60 days of discovering a breach. You are then responsible for notifying affected patients and, depending on the size of the breach, the Department of Health and Human Services and potentially the media.
Should I avoid AI tools entirely in my practice?
No. AI offers real efficiency gains in documentation, scheduling, and reporting that can reduce administrative burden and support better patient care. The goal is not to avoid AI but to adopt it through tools built for behavioral health compliance rather than adapted from general-purpose consumer products.