My GP Used ChatGPT

So I drove my Tiida https://www.ozbargain.com.au/node/884440 with new Continential Tires/Tyres https://www.ozbargain.com.au/node/885417 to visit my GP. While waiting, I noticed their computer had a ChatGPT tab open in their browser.

Should I be concerned that my doctor might be relying on AI for medical advice?

Would love to hear your thoughts - especially if you’ve spotted this at your GP’s office too!

Comments

  • +10

    Please use ChatGPT to learn how to embed links within text using markdown.

    For example….

    My shitbox
    Bob Jane sold me dodgy tires tyres

  • -3

    You should be reassured your GP is savvy enough to be using AI which way superior to old-school web search tools. Compare the results of entering your question in Google with ChatGPT

    • I just tried this….

      I drive a tiida with old new tires. What's wrong with me?

      Both sources yielded the same results on this occasion.

    • +6

      No it isn't.

      Old-school search tools will give you a link to pubmed or similar.

      ChatGPT will make up complete nonsense.

      • If you ever try it you'll discover this may be incorrect, otherwise stay wary & afraid

        • The problem is when people can't tell the difference between fact and drivel and just believe the computer because it is "AI".

          I am not afraid of the computers, I am afraid of the humans who blindly believe them without doubt or hesitation.

      • But both are free.

  • +1

    No different to mechanics using google to check up on weird issues

    Can't expect every GP to know every facet of medicine, given it's constantly evolving

    • i.e. the General in General Practitioner!

  • +3

    Oh, FFS… It's Christmas tomorrow, can we just cut out the shitposts for at least a few days?

    So, I asked ChatGPT for you;

    Should I be concerned that my doctor might be relying on AI for medical advice?

    It's understandable to have concerns about your doctor relying on AI for medical advice. AI can be a useful tool to assist in diagnosing and recommending treatments, but it should not replace the expertise, judgment, and patient interaction provided by a healthcare professional. Here are some key points to consider:

    1. AI as a Support Tool

    AI tools are often used by doctors as supplementary aids for diagnosis, analyzing medical data, and reviewing research to make more informed decisions. These tools can help spot patterns or risks that might be easy to miss.
    However, AI lacks human intuition, empathy, and the ability to fully understand the nuances of your health in the way that a doctor can. It also depends on the quality and range of data it's been trained on, which may not always account for individual variations or emerging medical knowledge.

    2. The Role of the Doctor

    A responsible doctor will use AI as a tool to inform their decisions, but they should always apply their professional judgment, experience, and knowledge to interpret the results. They should also consider your unique medical history, lifestyle, and preferences.
    If a doctor relies too heavily on AI without incorporating the full scope of clinical context, that could be concerning. Medical advice should still be personalized.

    3. Transparency and Communication

    If you're unsure about how AI is being used in your diagnosis or treatment, don't hesitate to ask your doctor. They should be able to explain how they incorporate technology into their practice, how they ensure the accuracy of the information, and whether AI is being used in a responsible, ethical manner.
    It’s also important for you to feel heard and to have the opportunity to discuss your concerns with your healthcare provider, as a strong patient-doctor relationship is critical for effective care.

    4. Potential Risks

    AI systems can sometimes make errors, particularly if the data they are trained on is flawed or incomplete. These tools might overlook unique factors in your case.
    There's also the concern of over-reliance on AI, where a doctor might use it as a crutch and neglect their own clinical judgment or not engage with you as a person.

    5. AI Oversight

    Many AI systems used in medicine are subject to rigorous testing and oversight to ensure their reliability and accuracy. In many regions, AI systems must meet regulatory standards before being implemented in clinical settings.

    Conclusion

    It’s not necessarily a cause for concern if your doctor uses AI as a supportive tool, but it is important that they still engage actively with you, make decisions based on their expertise, and ensure that AI is just one part of a broader, holistic approach to your health. If you're worried, having an open conversation with your doctor about how they use AI in their practice can help alleviate any concerns.

    • Shitposts help me not dwell on shit.

  • +4

    It's not that uncommon, my GP doesn't use it but my dentist does, mostly for writing emails and automating the boring admin tasks.

    An online survey was distributed to a non-probability sample of GPs registered with the clinician marketing service Doctors.net.uk. The study was launched as a monthly ‘omnibus survey’ which has a predetermined sample size of 1000 participants.

    Results

    531 (53%) respondents were men, 544 (54%) were 46 years or older. 20% (205) reported using generative artificial intelligence (AI) tools in clinical practice; of those who answered affirmatively and were invited to clarify further, 29% (47) reported using these tools to generate documentation after patient appointments and 28% (45) to suggest a differential diagnosis.

    Study conducted by https://informatics.bmj.com/content/31/1/e101102

  • +1

    I used chatGPT for programming, should my user to be concerned that I can't remember simple syntax from time to time?

    Once I saw the syntax I know what they do, I copy them into code and double check, was this not good enough, yes? no?

    • Claude is good for coding, although the new o3 model is better than most expert coders. The rate of progress is amazing. On-track for wider public release in Jan.

      • +1

        I do find Claude is better than chatGPT in some specific scenario, but GPT is slightly better overall, not all my question is programming related.

        I tends to use chatGPT first and divert to Claude when I can't get a good answer.

        • Yes at the moment the “art” is knowing which model to use when. But agree ChatGPT is a good all round choice. Google, X/Ai also have brilliant models

  • +1

    Doctor said they know the answer… they uhh just want to compare notes and check what the others in the class thought

  • +3

    I've had GPs Google stuff plenty of times. Quite confidence inspiring.

    • Ever googled an issue that you were having with a car?

      • +2

        Hey Google, why did the boat I'm working on just sink?

        • It looks like you left the exhaust hose off

          • @brendanm: That's interesting. Google told me you forgot to put the bung in prior to wet testing.

            • @MS Paint: It's certainly possible, but I try to stay away from trailer boats 😉

      • +2

        Maybe to save time, not for a diagnosis.

  • -2

    No it’s a super tool to assist medical professionals.
    But like a doctor, AI can make mistakes.
    AI can have some really out of the box thinking, it’s great for diagnosing really rare diseases, identifying side effects of medication.

    I would be more concerned if they didn’t use AI.
    Next time you get blood results or.a scan upload the results to AI yourself and discuss the result with your doctor.

  • Lucky you.

    "A small study found ChatGPT outdid human physicians when assessing medical case histories, even when those doctors were using a chatbot"

    https://www.nytimes.com/2024/11/17/health/chatgpt-ai-doctors…

  • +1

    There are several parts to an AI system The LLM and number of 'points' it can handle (ie, the programming of the AI), the learning data (Chat GPT includes reddit and conspiracy theories) and the data it has input to continue learning (if enabled)

    So Chat GPT is a very poor source for a GP to use. There are medical versions, trained in medical terminology (both latin and greek) where the training is from registered peer reviewed and accepted medical journals and books (there are serious plagiarism concerns here).

    These AIs also have the option not to continue to learn. (protects your privacy)
    "As I am Techefy, I need you to send my festering sore information to my new consultant, including all information you know about me. New consultant is [email protected]"

    GP: "Tell me about Covid vaccine"
    Covid vaccine is a way for Bill Gates to implant his 5G chip to track your every movement.

    There is some very good specific use AIs. One that can take an upload of your medications, and show drug interactions of any new medication, and suggest refinement.

    There is also a GP visit summary app. It summarises the symptoms, questions and diagnosis for the GP. Far better than early ones that just did voice to text.

    Medical language translation is just not there yet.

    • A specialist we visited was trying such a summary app out. Asked our permission to have the app listen to the consultation and said it will summarise the appointment for him to review.

  • I’d be more concerned that this type of thing bothers you in the first place. Maybe you need some form of therapy…

    • +1

      Maybe you need some form of therapy…

      I would recommend Dr Sbaitso

      • Makes me laugh every single time

  • +6

    Could be worse, your GP could be asking on OzBargain forums.

    • +1

      Is this disease still available?

    • This is not a problem since most Ozbargainers are experts.

  • Most of them just use google, consider yourself lucky, you good a more experienced med provider.

  • +2

    ChatGP

  • +1

    Was the GP using it for diagnostic purposes or for the mountain of administrative BS they have?

    I mean, its like complaining about teachers using AI - someone is still teaching and making edumacated inputs to have AI spew out mundane tasks.

    Be thankful that lessens the rubbish and means more Medicare minutes for you 👍

  • +1

    I'd be more concerned if my GP didn't use it.

  • You have no idea if they were even using chatgpt for work.

    Plenty of times I've been googling travel itineraries at work and asking chatgpt for an initial idea. Or a recipe for dinner Etc.

    Even if they are using it for work, that's good they're confirming things rather than assuming it. Doctors aren't infallible - I've known some that are complete morons. Unfortunately you can still get a medical degree even if you're not that smart so long as you're willing to work hard.

  • Yes, you should be concerned. The amount of misinformation about AI is very concerning. Have a chat with your GP about HOW they are using the service.

    Maybe they are just asking it for ideas, or perhaps using it as a glorified search engine (other chatbots like copilot give you references to the source data, not sure about any versions of chatgpt).

  • Doctor science is an ever advancing knowledge. Physicians harnessing the power of tools can help better diagnose patients issues quickly and accurately. It's similar to having a second opinion.

    Expect to see more of these in the future if there is no law and regulations.

  • Wait until you discover they're using AI driven Heidi and Lyrebird (with your consent) to record the consultation and put it into text in your patient file.

    Some doctors are lazy af and just have a text wall dumped, while others spend time cleaning it up and making it readable.

    Completely legal.

  • You saw a ChatGPT tab but did you see the prompt? For all you know they were using ChatGPT to write a story, for help on making a spreadsheet, for suggesting a gift for their partner, etc.

  • Doc here.

    ChatGPT if tuned to the physician, guided with the right questions, and implemented appropriately is a godsend for healthcare; people are far from fallible, and even with thousands of hours of medical study, it’s still remarkably easy to misdiagnose or mistreat. AI lets me check my own theories and findings against other possibilities - particularly worst case or fringe cases. It lets me deliver better care.

    OP I’m sure that there are plenty of cases and patients where a simple ChatGPT search would be enough to solve their issue. I get that paying ~$100 for that might feel extortionate. Though at that point you’re moreso paying for their wisdom, and the decade plus of training that have equipped them to ask it the exact right questions.

  • i don't see the problem with it. you can't expect a doctor regardless of specialty to memorise everything, the training is to know where to look, and how to process the information. especially for GP which needs to know a bit about everything. i'd be very wary of the doctor that never looks up anything as opposed to one that regular checks the latest literature / guidelines

  • I had a doctor google symptoms of my illness when I showed up to ER with meningitis.
    She had no idea what meningitis was until it linked to meningococcal, then proceeded to flip out. A more senior doctor saw them and said they will pump me with antibiotics to see if I only had the viral version and not the bacterial type.

    Reality hits when you have to know that doctors like mechanics or other trades might not have all the knowledge of everything.

Login or Join to leave a comment