Why Your Medical Record AI Should Run On Your Own Computer


Every AI-powered medical record summarization tool on the market today—EvenUp, Supio, Wisedocs, Precedent, Superinsight, and the rest—operates the same way. You upload your client's protected health information to their cloud servers. Their AI processes it on their infrastructure. Then you download the result.

This workflow has become so normalized that most firms never pause to consider what they are actually doing: transmitting the most sensitive category of personally identifiable information across the internet to servers they do not control, cannot audit, and have no physical access to.

There is a better way. And as both HIPAA enforcement and state bar ethics opinions tighten around cloud-stored client data, the argument for on-premise processing is becoming harder to ignore.

What Happens When You Upload Medical Records to the Cloud

When you send a PDF of medical records to a cloud-based AI platform, the following things occur:

  1. The file is transmitted over the internet, typically encrypted in transit via TLS.
  2. It is received by the vendor's servers—usually hosted on AWS, Google Cloud, or Azure.
  3. The records are processed by one or more AI models, often including third-party models from OpenAI, Anthropic, or Google.
  4. Extracted text and structured data may be stored temporarily or permanently on the vendor's infrastructure.
  5. In some cases, your data may be used to improve the vendor's AI models, unless you have explicitly opted out.

Each step in this chain introduces a point of risk. Not theoretical risk—the kind of risk that has already resulted in data breaches affecting millions of healthcare records.

The HIPAA Question

HIPAA's Security Rule requires covered entities and their business associates to implement safeguards protecting the confidentiality, integrity, and availability of electronic protected health information (ePHI). Law firms processing medical records typically fall under the business associate umbrella.

Cloud-based AI vendors generally address HIPAA compliance by executing Business Associate Agreements (BAAs) with their customers. However, a BAA is a contractual instrument, not a technical one. It does not prevent a breach. It merely establishes liability after one occurs.

Key HIPAA Consideration

Under the HIPAA Security Rule (45 CFR 164.312), entities must implement access controls, audit controls, integrity controls, and transmission security. When data resides on a third-party cloud platform, the law firm is relying entirely on the vendor's implementation of these controls—controls they cannot independently verify.
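Those control categories are concrete enough to sketch. The following is a minimal illustration of what a locally verifiable audit control can look like: a hash-chained log written to the firm's own disk. The `audit.jsonl` path, field names, and chaining scheme are assumptions made for this example, not a description of any vendor's implementation.

```python
import datetime
import hashlib
import json
import pathlib

AUDIT_LOG = pathlib.Path("audit.jsonl")  # illustrative path on the firm's own disk

def record_access(user: str, file_path: str, action: str) -> dict:
    """Append one audit entry for an ePHI file access."""
    entry = {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "file": file_path,
        "action": action,  # e.g. "read", "summarize", "export"
    }
    # Chain each entry to the entire prior log so after-the-fact edits
    # to the history are detectable by recomputing the hashes.
    prev = AUDIT_LOG.read_bytes() if AUDIT_LOG.exists() else b""
    entry["chain"] = hashlib.sha256(
        prev + json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    with AUDIT_LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

The point is not the specific scheme; it is that when the log lives on hardware the firm controls, the firm can actually verify it.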

The Sub-Processor Problem

Most cloud AI platforms do not run their own AI models. They call APIs from providers like OpenAI, Anthropic, or Google. This means your client's medical records may traverse multiple third-party systems, each with its own data handling policies, retention schedules, and security posture.

When you signed a BAA with your medical chronology vendor, did that BAA extend to every sub-processor in the chain? In many cases, the answer is unclear. And "unclear" is not a comfortable position when a state bar disciplinary committee is reviewing your data handling practices.

State Bar Ethics and Cloud Storage of Client Data

The American Bar Association's Formal Opinion 477R (revised 2017) addresses a lawyer's duty to protect client communications and data. While it acknowledges that cloud computing is permissible, it imposes a duty of competence that requires lawyers to understand the technology they use and its associated risks.

"A lawyer generally may transmit information relating to the representation of a client over the internet without violating the Model Rules of Professional Conduct where the lawyer has undertaken reasonable efforts to prevent inadvertent or unauthorized access."

The phrase "reasonable efforts" is the operative standard. What constitutes "reasonable" is evolving, and it is evolving toward more stringent requirements. Several state bar associations have issued opinions expanding on this duty.

The direction of these opinions is clear: the more sensitive the data, the greater the burden on the attorney to justify the chosen method of storage and processing. Medical records—containing diagnoses, treatments, mental health history, substance abuse records, and other deeply personal information—represent the highest tier of sensitivity.

The On-Premise Alternative

On-premise processing eliminates the entire cloud risk chain. When medical record AI runs locally on your own hardware:

  - Records are never transmitted over the internet to a vendor's servers.
  - No sub-processor ever touches the underlying files, so there is no chain of third-party data handling policies to audit.
  - Access controls and audit logs run on infrastructure the firm can inspect and verify directly.
  - Nothing is retained outside the firm's control, and nothing can be used to improve a vendor's models.
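As a concrete illustration of what "runs locally" means, here is a minimal sketch that extracts dates of service from record text using only the Python standard library. The sample text and the date pattern are simplistic placeholders, and a real chronology engine does far more, but the essential property holds: nothing in this pipeline touches a network.

```python
import re

# Illustrative only: a real medical chronology extractor is far more involved.
DATE = re.compile(r"\b(\d{1,2})/(\d{1,2})/(\d{4})\b")

def dates_of_service(record_text: str) -> list[str]:
    """Return unique MM/DD/YYYY dates in order of first appearance, processed locally."""
    seen: list[str] = []
    for m in DATE.finditer(record_text):
        d = m.group(0)
        if d not in seen:
            seen.append(d)
    return seen

sample = (
    "Office visit 03/14/2022: lumbar strain. "
    "Follow-up 04/02/2022; MRI ordered. Re-check 03/14/2022."
)
print(dates_of_service(sample))  # → ['03/14/2022', '04/02/2022']
```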

Why Every Competitor Is Cloud-Only

You may be wondering: if on-premise is clearly the more secure option, why does every other medical record AI vendor operate exclusively in the cloud?

The answer is business model, not technology. Cloud-based delivery allows vendors to:

  - Ship updates to one centralized platform instead of supporting software installed on customer machines.
  - Meter usage and charge recurring subscription fees tied to infrastructure they control.
  - Retain customer data on their own servers and, in some cases, use it to improve their models.

These are advantages for the vendor, not the customer. The cloud model exists because it is more profitable for the company selling the service, not because it is better or safer for the firm using it.

Practical Requirements for On-Premise AI

Modern on-premise medical record AI does not require specialized hardware. Solutions like MedRecords AI run on any standard laptop or desktop computer. The software installs locally, so your document library is stored and managed on your own machine rather than on a vendor's platform, while AI inference is handled through secure cloud APIs like AWS Bedrock. This hybrid architecture keeps data storage under your direct control without requiring expensive GPU hardware.
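A minimal sketch of that hybrid pattern, assuming the AWS Bedrock `converse` API via `boto3`: the model ID and prompt below are placeholders, and the client is passed in as a parameter so the function can be exercised without AWS credentials.

```python
def summarize_locally_stored_record(record_text: str, bedrock_client, model_id: str) -> str:
    """Send only the text needed for one inference call; storage stays local.

    bedrock_client is expected to expose the Bedrock Runtime `converse` method,
    e.g. boto3.client("bedrock-runtime"). Injecting it keeps the function testable.
    """
    resp = bedrock_client.converse(
        modelId=model_id,
        messages=[{
            "role": "user",
            "content": [{"text": f"Summarize this medical record:\n\n{record_text}"}],
        }],
    )
    return resp["output"]["message"]["content"][0]["text"]
```

In production the caller would supply `boto3.client("bedrock-runtime")` and a Bedrock model ID covered by the firm's AWS Business Associate Agreement; the document library itself never moves off the local machine.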

The hardware you already own is almost certainly sufficient. Any computer running Windows 10 or later with 8 GB of RAM can run on-premise medical record AI effectively. There is no need for a workstation-class machine, specialized GPU, or any capital expenditure beyond the software license itself. Inference costs are minimal and can be billed directly to clients as a case expense.
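Whether a given machine clears that bar can be checked with nothing but the standard library. This is a best-effort sketch: the 8 GB threshold comes from the requirement above, the Windows branch uses the Win32 `GlobalMemoryStatusEx` call via `ctypes`, and the POSIX branch uses `sysconf`.

```python
import ctypes
import os
import platform

MIN_RAM_GB = 8  # illustrative threshold from the stated requirement

def total_ram_gb() -> float:
    """Best-effort total physical RAM in GB, cross-platform, stdlib only."""
    if platform.system() == "Windows":
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [("dwLength", ctypes.c_ulong),
                        ("dwMemoryLoad", ctypes.c_ulong),
                        ("ullTotalPhys", ctypes.c_ulonglong),
                        ("ullAvailPhys", ctypes.c_ulonglong),
                        ("ullTotalPageFile", ctypes.c_ulonglong),
                        ("ullAvailPageFile", ctypes.c_ulonglong),
                        ("ullTotalVirtual", ctypes.c_ulonglong),
                        ("ullAvailVirtual", ctypes.c_ulonglong),
                        ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]
        stat = MEMORYSTATUSEX()
        stat.dwLength = ctypes.sizeof(stat)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
        return stat.ullTotalPhys / 1024**3
    # POSIX (Linux, and macOS where sysconf exposes these keys)
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3

def meets_minimum() -> bool:
    return total_ram_gb() >= MIN_RAM_GB
```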

The Trajectory of Enforcement

HIPAA enforcement has intensified steadily over the past five years. The HHS Office for Civil Rights collected over $4 million in penalties in 2024 alone for violations related to inadequate ePHI protections. State attorneys general have also begun pursuing independent actions under state health privacy laws.

Simultaneously, AI-specific data privacy regulations are emerging at both the state and federal level. The intersection of AI processing and healthcare data is receiving particular scrutiny. Firms that proactively adopt architectures that keep PHI under their direct control will be better positioned as this regulatory landscape continues to develop.

The Bottom Line

Cloud-based AI is convenient. It requires no installation, no hardware, and no technical overhead. But convenience has a cost, and in this case, that cost is measured in regulatory risk, ethical exposure, and the fundamental loss of control over your clients' most sensitive information.

On-premise medical record AI is not merely a privacy-conscious choice. It is increasingly becoming the professionally responsible one.

Process Medical Records Without Handing Them to a Vendor

MedRecords AI stores and manages your records on your own computer. No vendor platform. No third-party access to your document library. Try it free.

Start Free Demo