
A Cautionary Study Reveals ChatGPT’s Limitations in Providing Accurate Drug Information

In this post:

  • Study finds ChatGPT often inaccurate in drug info, caution advised.
  • ChatGPT’s medical advice lacks reliability, pharmacists warn.
  • AI in healthcare: Study highlights ChatGPT’s knowledge gaps.

A recent study has raised concerns about the reliability of the free version of ChatGPT, OpenAI’s popular chatbot, in providing accurate medication-related information. Research conducted by pharmacists at Long Island University’s College of Pharmacy reveals that ChatGPT’s responses to drug-related questions are often inaccurate or incomplete. The findings underscore the need for caution among patients and healthcare professionals relying on ChatGPT for pharmaceutical information.

The study involved posing 39 drug-related questions to ChatGPT, out of which only 10 received satisfactory answers. The remaining 29 questions were either inaccurately addressed, incomplete, or both. This inconsistency in responses highlights a significant gap in the chatbot’s ability to provide reliable drug information.

Inadequacy in addressing pharmaceutical queries

The research team, led by Sara Grossman, evaluated ChatGPT’s performance by comparing its responses with those of pharmacists who answered the same set of questions. The questions were real inquiries received by the drug information service of Long Island University’s College of Pharmacy from January 2022 to April 2023. Out of the initial 45 questions, six were excluded due to the lack of literature-based answers.

ChatGPT failed to directly address 11 of the questions, gave inaccurate responses to 10, and gave incomplete answers to 12, with some responses falling into more than one category. The researchers also noted that ChatGPT, when asked, supplied references in only eight responses, and every one of those references cited non-existent sources. This raises concerns about how the chatbot sources and verifies the information it presents.


One particularly alarming instance was ChatGPT’s incorrect assertion that no drug interaction exists between the antiviral drug Paxlovid and the antihypertensive drug verapamil. In reality, combining these drugs can lead to an excessive lowering of blood pressure, posing a serious health risk. This error is especially concerning because many users may not be aware of ChatGPT’s data limitations, potentially leading to harmful consequences.

Implications for AI in healthcare

The study’s findings are critical in the context of ChatGPT’s popularity and widespread use. Launched in November 2022, ChatGPT rapidly became one of the fastest-growing consumer internet applications, drawing around 1.7 billion visits globally in October 2023 alone. Its free version, however, is limited to training data available up to September 2021, leaving it potentially out of touch with more recent medical developments.

While the paid version of ChatGPT now includes real-time internet browsing capabilities, the study primarily focused on the free version to reflect the experience of the general population. The researchers emphasize the importance of verifying ChatGPT’s responses with trusted healthcare professionals or reliable medical sources.

This study serves as a crucial reminder of the limitations of AI in sensitive areas like healthcare. While AI technologies like ChatGPT offer significant advantages in terms of accessibility and convenience, their current state demands careful scrutiny, especially when it comes to matters of health and well-being. Users are advised to treat information from such sources with caution and always consult qualified healthcare professionals for medical advice.


The Long Island University study is a wake-up call about the limits of AI in healthcare, particularly when it comes to providing accurate drug information. As AI continues to integrate into everyday life, users should stay aware of those limits and approach its use in critical areas like healthcare with informed caution.
