Gold 101.3 FM, the UAE’s No.1 Malayalam radio station, reports that a new study has raised concerns about the reliability of AI chatbots when responding to medical and cancer-related queries.

Researchers from the Lundquist Institute for Biomedical Innovation at Harbor-UCLA Medical Center examined responses from five widely used AI chatbots: Gemini, Meta AI, ChatGPT, Grok, and DeepSeek. The study focused on topics such as cancer treatments, vaccines, and stem cells.

According to the findings, nearly half of the responses related to cancer treatment were considered “problematic” by medical experts. Many of these responses suggested alternative treatments in place of standard medical options such as chemotherapy, raising concerns about potential risks to patients.

The researchers used a testing method they describe as “straining,” in which chatbots were prompted with high-risk, myth-based questions on subjects such as 5G and cancer, vaccine safety, and steroid use. The goal was to assess how the systems handle misleading or harmful prompts.

The study, published in BMJ Open, reported that 49.6% of responses were problematic, with 19.6% rated as highly problematic. While overall performance was similar across most of the chatbots, one system, Grok, reportedly produced a higher rate of highly problematic answers.

Lead researcher Nick Tiller noted that the study aimed to reflect how ordinary users interact with AI tools, often treating them like search engines and entering belief-based or misleading queries.

The findings add to growing concerns about the use of AI in healthcare contexts. Previous research has shown that while AI systems can perform well on medical exams, they often struggle with real-world clinical reasoning. Another study, published in JAMA Network Open, found that AI chatbots misdiagnosed a large proportion of cases in early clinical scenarios.

Experts say the results highlight the need for caution when using AI tools for health-related advice, emphasizing that such systems should not replace professional medical guidance.