The Microphone You Forgot to Mention

There’s a moment happening in exam rooms across the country that nobody’s talking about — at least not to the patient.

A person sits down across from their doctor and does something vulnerable: they tell the truth. Symptoms they’ve been embarrassed about. Medications they’ve been afraid to mention. A family history they’ve never said out loud. They make eye contact with their physician — real eye contact, maybe for the first time in years — and they take it as a sign that they’re being heard.

They are being heard. By the doctor. And by a server farm somewhere in another state.

An ambient AI transcription tool is recording every word. The practice deployed it quietly, trained the staff on it efficiently, and never got around to telling the patient it existed.

On April 8, that oversight became a federal lawsuit.

Filed, Not Hypothetical

Washington v. Sutter Health, 4:26-cv-03012-KAW, Northern District of California. Three named plaintiffs. A putative nationwide class. Claims under the federal Wiretap Act, California’s Invasion of Privacy Act, and the Confidentiality of Medical Information Act.

The complaint is clean and it is mean. It doesn’t allege that the AI made errors, or that data was breached, or that anyone was harmed in a way you can put a number on. It alleges something simpler: the recording happened without consent. That’s the whole case. And under the statutes cited, that’s enough.

Read that again, because the simplicity is the point.

What “Ambient AI” Actually Means

The marketing language around these tools is doing real work — and not for you.

“Ambient clinical documentation.” “Intelligent note-taker.” “AI scribe.” The vocabulary is soft and ergonomic. It suggests a background presence, a helpful assistant, something almost atmospheric.

What the technology actually does: the microphone opens, audio is captured, that audio leaves your building, it lands on a vendor’s server, it is transcribed, it is stored, and — depending on the contract you signed — it may be used to train the model you’re paying for.

That is a recording device. The right analogy isn't an exam room with a diligent note-taker. It's an exam room with a recording device the patient never saw.
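To make that pipeline concrete, here is a minimal sketch of the data flow. Every name in it is invented; there is no real vendor, endpoint, or SDK here. What matters is the shape: once the microphone opens, the audio does not stay in the room.

```python
# Hypothetical sketch of an ambient-scribe data flow. All names invented.
from dataclasses import dataclass, field

@dataclass
class VendorServer:
    """Stand-in for the vendor's side: transcribe, store, maybe train."""
    allows_training_use: bool                    # a contract term, not a toggle
    stored_audio: list[bytes] = field(default_factory=list)
    training_corpus: list[bytes] = field(default_factory=list)

    def receive(self, chunk: bytes) -> str:
        self.stored_audio.append(chunk)          # audio now lives off-premises
        if self.allows_training_use:
            self.training_corpus.append(chunk)   # and may train the model
        return f"[transcript of {len(chunk)} bytes]"

def run_encounter(vendor: VendorServer, mic_chunks: list[bytes]) -> list[str]:
    # The microphone is open for the whole encounter; every chunk leaves
    # the building the moment it is captured.
    return [vendor.receive(chunk) for chunk in mic_chunks]

if __name__ == "__main__":
    vendor = VendorServer(allows_training_use=True)
    notes = run_encounter(vendor, [b"patient: the symptoms started in March..."])
    print(notes, f"(vendor now holds {len(vendor.stored_audio)} audio chunk(s))")
```

Note what never appears in that sketch: any step where the patient learns the pipeline exists.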

In a single-party consent state, this is a solved problem. The doctor consents, and that’s enough. But in California, Florida, Illinois, Maryland, Massachusetts, Pennsylvania, Washington, and a growing list of others, every party to a conversation must consent. The patient counts. The customer on your sales call counts. Their silence is not a yes.

The Legal Stack, Unpacked

The Washington complaint is short for a class action — about 30 pages — and it is structured like a precision instrument. Here is what it actually argues:

California Invasion of Privacy Act (Penal Code § 632). Recording a confidential communication without all-party consent: $5,000 per recording. A hospital system runs thousands of patient encounters daily. The math is not subtle. The same statute reaches any business running an AI voice tool on calls with California consumers.

Confidentiality of Medical Information Act (Civil Code § 56). Streaming patient audio to a third-party vendor is a “disclosure” under § 56.10. That disclosure requires a written authorization under § 56.11 — one that names the vendor, defines the purpose, sets a duration, and preserves the patient’s right to revoke. The hospital had none. Damages: $1,000 per disclosure, up to $3,000 in punitives, up to $1,000 in attorneys’ fees. Per. Disclosure.

Federal Wiretap Act (18 U.S.C. § 2511). This is the nationwide hook. Intentional interception of an oral communication: $10,000 per violation. But there’s a second bite: if the vendor uses the intercepted audio to improve its model — a clause present in a remarkable number of AI scribe contracts — that use is a separate violation under § 2511(1)(d). The feature you paid extra for becomes the second cause of action.

Intrusion upon seclusion. The common law privacy tort that ties the bow. The most private setting the law recognizes. A recording device the subject never knew about. Commercial motives dressed as clinical ones. Punitive damages available.
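The per-encounter stacking is easy to skim past in prose, so here is the back-of-the-envelope arithmetic using the statutory figures above. The daily encounter count is invented, the CMIA punitives and fees are taken at their caps, and no court awards the maximum on every claim; the order of magnitude is the point.

```python
# Hypothetical exposure math using the per-violation figures cited above.
CIPA_PER_RECORDING = 5_000                   # Cal. Penal Code § 632 / § 637.2
CMIA_PER_DISCLOSURE = 1_000 + 3_000 + 1_000  # damages + punitives cap + fees cap
WIRETAP_PER_VIOLATION = 10_000               # 18 U.S.C. § 2511 (civil remedy)

per_encounter = CIPA_PER_RECORDING + CMIA_PER_DISCLOSURE + WIRETAP_PER_VIOLATION
# If the vendor also used the audio to improve its model, the complaint's
# theory adds a second Wiretap violation under § 2511(1)(d):
with_training_use = per_encounter + WIRETAP_PER_VIOLATION

ENCOUNTERS_PER_DAY = 2_000                   # invented mid-sized hospital system
print(f"per encounter:           ${per_encounter:,}")        # $20,000
print(f"with training-use claim: ${with_training_use:,}")    # $30,000
print(f"one day, system-wide:    ${ENCOUNTERS_PER_DAY * with_training_use:,}")
```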

The architecture is modular. It was built to travel.

Your Call Center Is the Same Fact Pattern

If you do not run a hospital, here is the sentence that should have your attention:

Replace “exam room” with “inbound sales call.”

Every analytical move in the Washington complaint works identically for an AI voice agent on a customer service line. The audio leaves your premises. It hits a vendor’s server. It may train a model. The consumer is in a two-party consent state. You did not get affirmative consent before the AI started listening.

That is a CIPA case. That is a Wiretap Act case.

The plaintiffs’ bar already built this infrastructure. The website-pixel CIPA wave — the Meta Pixel cases, the chatbot cases, Tejon v. Zeus — proved that statutory-damages privacy claims against well-funded defendants are worth bringing. AI voice is a cleaner fact pattern than pixels. The damages are larger. The defendants are better resourced.

The Washington complaint is the template. File it under “competitors’ problem” at your own risk.

The Fix Is Not the Hard Part

The consent itself takes about thirty seconds to obtain. The hard part is accepting that the current moment — the one where the AI activates before disclosure runs — has to be redesigned.

For healthcare: One line on the intake screen. A verbal confirmation before the clinician enters. A CMIA authorization that names the AI vendor, scopes the disclosure, sets an end date, and includes a revocation right. A standing policy that the scribe never activates before the disclosure is complete. An indicator — visual or auditory — that the recording is live.

For voice AI in business: A recorded disclosure at call start, before the AI engages. An affirmative yes, timestamped, stored. A vendor contract that prohibits use of audio for model training without separate authorization. A data retention policy that you control, not one you inherit from the vendor’s terms.

The formula in both cases is identical: Notice. Consent. Documentation. And the operational discipline to make sure the microphone never opens until all three have run.
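As a sketch of what that discipline looks like in software, consider the gate below: it refuses to open the microphone without a stored, timestamped affirmative yes. Every name is hypothetical and the storage is a stand-in; the fail-closed shape is what matters.

```python
# Hypothetical consent gate: notice, then consent, then documentation,
# and only then a live microphone. All names invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    party_id: str            # the patient or caller
    disclosure_version: str  # which notice script they actually heard
    affirmed_at: datetime    # the timestamped affirmative yes

class ConsentRequiredError(Exception):
    """Raised when the gate is asked to record without a consent on file."""

CONSENT_STORE: dict[str, ConsentRecord] = {}  # stand-in for a system you own

def record_affirmative_yes(party_id: str, disclosure_version: str) -> None:
    # Notice has run and the party said yes: document it before anything else.
    CONSENT_STORE[party_id] = ConsentRecord(
        party_id, disclosure_version, datetime.now(timezone.utc)
    )

def open_microphone_for(party_id: str) -> None:
    # The gate: no documented yes, no recording. Fail closed.
    consent = CONSENT_STORE.get(party_id)
    if consent is None:
        raise ConsentRequiredError(f"no affirmative consent on file for {party_id}")
    print(f"recording live for {party_id} (heard {consent.disclosure_version} "
          f"at {consent.affirmed_at.isoformat()})")

if __name__ == "__main__":
    record_affirmative_yes("patient-001", "intake-disclosure-v3")
    open_microphone_for("patient-001")         # opens: notice, yes, record
    try:
        open_microphone_for("patient-002")     # never opens
    except ConsentRequiredError as err:
        print(f"blocked: {err}")
```

The design choice worth copying is the default: when the consent record is missing, the system raises rather than records.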

What to Do Before Friday

Pull your most recent AI voice or AI scribe deployment.

Find the moment between “the other person enters the conversation” and “the AI starts recording.”

Three questions:

  1. How long is that gap?
  2. What disclosure, if any, runs inside it?
  3. Where is the affirmative yes stored — and who owns that record?

If you can’t answer all three in five minutes, you’ve found your exposure.

The microphone is probably already open.
