Your AI Chatbot Isn't a Doctor—It Just Plays One on the Internet

Where does ChatGPT get its information? Sure, it's fast, but is it always right?

9/30/2025 · 4 min read

Let's be real—we've all done it. That weird rash pops up, and before you can say "WebMD," you're three clicks deep into the internet, convinced you have a rare parasitic infection. Welcome to modern healthcare, where everyone has a medical degree from the University of Google. 🎓

But here's the thing: not all online health information is created equal. And with AI chatbots like ChatGPT, Gemini, Grok, and Claude becoming popular sources for medical advice, we need to talk about what's actually safe and what might land you in the ER.

The Rise of Dr. ChatGPT 🤖

AI chatbots have become surprisingly popular for health questions. A recent survey found that one in six adults uses AI chatbots for medical information at least once a month. That's a lot of people asking their phones about their symptoms!

And honestly? I get the appeal. AI chatbots are available 24/7, they don't judge you for asking about that embarrassing rash, and they're free. No waiting room, no copay, no awkward small talk about the hot AZ weather.

But here's where things get scary.

Where AI Learned "Medicine" (Hint: It's Not Med School) 📚

AI chatbots learn by reading tons of sources and then answering questions based on what they've "read." Sounds great, right?

Wrong. Because AI doesn't just learn from medical textbooks and peer-reviewed journals. According to research from Concurate.com, here's where ChatGPT actually gets its information:

  • Third-Party Blogs & Expert Reviews: 34% (Who determines if the "expert" is actually an expert?)

  • Review Platforms: 18% (Think Yelp, but for everything)

  • Wikipedia: 15% (Anyone can edit this, by the way)

  • News & Industry Publications: 15% (At least some of this might be legit!)

  • Vendor Product Pages: 6% (Companies selling you stuff)

  • Product-Specific Blogs: 5% (More blogs!)

  • Reddit & Community Threads: 4% (Yes, the same place where people debate whether hot dogs are sandwiches)

  • YouTube Review Videos: 3% (Home of cat videos and "doctors" who graduated from the University of My Opinion)

Notice what's missing? Peer-reviewed medical journals. Evidence-based medical guidelines. Actual medical textbooks.

Would you trust medical advice based mostly on blogs and review sites? I hope not! But that's essentially what you might be getting when you ask AI for health advice.

When AI Goes Dangerously Wrong ⚠️

Still think AI medical advice is harmless? Let me share a terrifying real-life example.

A 60-year-old man wanted to reduce the amount of salt in his diet—a perfectly reasonable health goal. So he asked ChatGPT for advice. The AI suggested sodium bromide as a substitute for table salt.

Here's the problem: sodium bromide is toxic to humans.

The man ended up in my world—the emergency department—with bromide toxicity (bromism), suffering from paranoid delusions and other serious symptoms. All because he trusted AI advice without checking with a real doctor.

You can read the full scary story here.

This isn't just a "whoops" moment. This was life-threatening.

Why AI Makes a Terrible Doctor 🩺

Look, AI chatbots have some serious limitations when it comes to healthcare:

They're inconsistent. Ask the same question twice, get different answers. Your real doctor wouldn't tell you one thing on Monday and something completely different on Tuesday.

They can't see you. AI can't look at your rash, check your breathing, or notice the worried look that tells an experienced doctor something's wrong.

They hallucinate. No, not like that. AI can make up false information or cite studies that don't exist. Imagine receiving advice based on research that never took place. 😬

They don't know YOU. AI can't understand your personal health history, medications, allergies, or living situation. Good healthcare is personal.

They make mistakes with real consequences. Such as recommending toxic substances for salt reduction. Need I say more?

Where You Should Actually Get Health Info ✅

The internet does have great health resources; you just need to know where to look. Here are my trusted recommendations:

🏥 THE SAFE ZONE: Your Trusted Health Sources

MedlinePlus - Run by the National Library of Medicine with reliable, up-to-date info

National Institutes of Health (NIH) - The gold standard for medical research

Centers for Disease Control and Prevention (CDC) - Your go-to for diseases, vaccines, and public health

Mayo Clinic - One of the most respected medical institutions in the world

FamilyDoctor.org - Created by actual family doctors for patients

These sites are written and reviewed by real medical professionals, not Reddit users with strong opinions.

Can You Ever Use AI for Health Stuff? 🤔

Here's my honest take as an ER doctor: AI can be a useful tool, but it should never replace actual medical care.

✅ DO use AI to:

  • Understand the medical terms your doctor used

  • Learn basic info about a condition you've already been diagnosed with

  • Generate questions to ask your healthcare provider

  • Get general health and wellness information

❌ DON'T use AI to:

  • Diagnose yourself

  • Decide whether you need to see a doctor

  • Choose treatments or medications

  • Get advice for severe or urgent symptoms

Think of AI as a starting point for conversation with your doctor, not as a replacement for medical expertise.

Before You Click Again

The internet can be excellent for health information—if you use the right sources. Stick to government health agencies, major medical institutions, and sites run by professional medical organizations.

When it comes to AI chatbots, be smart. They can help you understand information, but they shouldn't be making your healthcare decisions. They don't have the training, the ability to examine you, or the knowledge of your personal health history that a real doctor has.

And please, if you're ever unsure about health advice you find online—whether from AI, a website, or your brother-in-law's Facebook post on using peptides—run it by your actual healthcare provider. That's what we're here for.

Your health is too important to trust to sources that learned medicine from Reddit. You deserve better than that.

The next time you're tempted to ask ChatGPT about that weird pain in your side, take a breath. Visit a trusted health website instead. Even better? Call your doctor's office. We went to medical school, we know your history, and we took an oath to "do no harm."

Your health journey shouldn't start with "I asked the internet." It should start with "I talked to my doctor."

Stay healthy, stay smart, and for the love of all that is medical—don't take sodium bromide. 🙏

Remember: the best medical advice comes from medical professionals, not algorithms trained on blog posts.