News

But three of those FDA employees told CNN that Elsa just makes up nonexistent studies, something commonly referred to in AI as "hallucinating."
Insiders at the Food and Drug Administration are ringing alarm bells over the agency's use of an AI to fast-track drug ...
Insiders tell CNN the FDA’s AI is “hallucinating” studies and can’t access key documents. Agency leaders insist the AI is getting better, and use is not mandatory.
The federal agency introduced Elsa last month, boasting about the AI tool's ability to increase efficiency at the FDA.
FDA officials say the assistant is flawed, just as the Trump administration stresses AI adoption in healthcare.
The FDA's generative AI, Elsa, has a massive hallucination problem, according to the agency's employees themselves.
With reports that FDA’s AI Elsa is “confidently hallucinating” studies that don’t exist, the use of AI to streamline drug ...
Despite ambitions to streamline regulatory review, FDA’s Elsa platform has been prone to hallucinations, prompting internal scrutiny and questions about AI reliability and governance.
In recent years, Artificial Intelligence (AI) has made significant strides across various sectors, providing innovative solutions and enhancing efficiency. However, ...
There’s no question that AI needs governance and guardrails. For all the good it can bring, it also heightens risk.
Hallucinations are a known problem with generative AI models, and Elsa is no different, according to Jeremy Walsh, the FDA's head of AI.