
Three Reasons AI in Healthcare Isn’t All It’s Cracked Up to Be

Tim Wetherill, MD
January 28, 2025

Artificial Intelligence (AI) is often hailed as the savior of healthcare, the answer to endless waiting rooms, indecipherable bills, and an unhealthy dependency on fax machines. But let’s take a moment to breathe. Before we declare AI the cure-all, let’s ask ourselves: can it really handle the chaos we call modern medicine? Or is it just a tech bro with delusions of grandeur, promising us miracles while misspelling “pneumonia”?

Here’s the thing: AI is brilliant in theory. It’s already helping doctors read X-rays, design new drugs, and even assist in surgeries. But saying AI will fix healthcare is like saying a treadmill will fix obesity—there’s much more to the story. And right now, healthcare is a minefield. Here are three big reasons AI might stumble before it can truly soar.

1. The Data Is a Dumpster Fire

First, we have to consider the data. Imagine reading your medical record. Actually, don’t—unless you fancy a migraine. It’s a maze of jargon, errors, and contradictory statements. One moment the doctor’s prescribing a medication; the next, they’re un-prescribing it; and by the end, it’s unclear if you ever needed it at all. It’s not a medical record—it’s a hospital’s cluttered stream of consciousness.

The problem isn’t just mistakes. It’s the sheer volume of useless information clogging up the system. A three-day hospital stay can generate a thousand pages of notes. A thousand! That’s more than War and Peace, and at least Tolstoy had the decency to make it coherent.

AI relies on good data to learn and improve, and our medical records are in no state to provide it. Feed them to a model as-is and, sure, you’ll end up with something, but it probably won’t be grounded in fact. An AI that doesn’t account for how dirty medical records are will simply churn out more confusion than clarity.

2. People Follow the Money

Next, we have to be realistic about the corporate side of healthcare. Hate to burst your bubble, but healthcare isn’t purely altruistic. For all its noble goals, it’s also a business—a big, bloated business where the bottom line often takes precedence over the patient. AI isn’t immune to this. In fact, it’s the perfect tool for squeezing out even more profit.

Take electronic medical records (EMRs). Powered by early AI functionality, they were supposed to streamline care, but in reality they’re nudging doctors toward pricier diagnoses by suggesting codes with a higher payout. Why call it pneumonia when you can call it sepsis and charge three times as much? It’s like a dodgy mechanic telling you your car needs a new transmission when all it really needs is a bit of air in the tires.

And it’s not just theoretical. In 2020, the EHR vendor Practice Fusion admitted to taking kickbacks from an opioid maker in exchange for alerts in its software designed to prompt more opioid prescriptions. Let that sink in. AI wasn’t helping doctors; it was nudging them toward prescriptions that lined someone’s pockets—and ruined countless lives. If AI isn’t designed with the right intent, we’re just handing a scalpel to a corporate boardroom and hoping for the best.

3. We’re Not Getting the Full Story

Lastly, medicine isn’t just about numbers and test results. It’s about listening to people. And people, as it turns out, aren’t great at recounting their own medical histories. They’ll ramble, contradict themselves, and occasionally throw in a random detail that sends you down the wrong path entirely.

A patient might say, “My arm hurts when I lift it. Or when it’s by my side. Or when I’m asleep. Also, my fingers feel weird, but only on Tuesdays.” And it’s up to the doctor to figure out whether this is a simple muscle strain or the early stages of cancer. AI, on the other hand, tends to break stories into neat little chunks: from that same patient, it might extract “left arm,” “pain,” “elbow,” and so on. But life isn’t neat. It’s messy, inconsistent, and full of nuance that machines just don’t get. And the story isn’t just told with words and numbers.

Did you know doctors smell you? Not in a creepy way, but if the room smells like you haven’t showered in weeks, we take note. We also watch your movements, your demeanor, and how you behave in the exam room. Trauma surgeons watch as you’re wheeled to the CT scanner: if you cross your legs in a relaxed way, we can relax too, because it tells us you don’t have an injury requiring emergency surgery. These subtle cues can sometimes be more valuable than an MRI; it’s uncanny how predictable these small body movements can be. Can AI pick up on these hidden cues?

AI might get better at this over time, but for now, it’s like trying to explain sarcasm to a robot. Sure, it might understand the words, but the meaning? Completely lost. And in healthcare, missing the meaning can have life-or-death consequences.

AI is a Tool, Not a Miracle

Let’s be clear: AI isn’t the villain. It’s not going to become self-aware and start diagnosing everyone with gout just for fun. It’s a tool—a powerful one—but it needs to be used wisely. That means respecting how dirty the data is, focusing on actual health outcomes instead of profits, and recognizing that some parts of medicine may always require a human touch.

With time, AI can and will transform healthcare. Right now, it has the potential to serve as a co-pilot, and in many cases it already does. To make the most of it, we need tools that acknowledge dirty data rather than sweeping it under the rug. Take the time to understand how your AI is designed and what lessons and data it relies on.

To learn more about Machinify and how it uses AI to improve claims processing, schedule a demo today.
