IMPORTANT: Keep in mind that generative AI tools can produce biased and inaccurate content. Any content generated by AI should be thoroughly verified for accuracy through human review and additional research.
AI "hallucination"
The official term in the field of AI is "hallucination." This refers to the fact that a generative AI tool sometimes "makes stuff up." That happens because these systems are probabilistic, not deterministic.
Which models are less prone to hallucination?
This problem of making things up can be made less likely by a technique known as "grounding": connecting external sources of data (such as the Internet) to the model, so it responds based on what it finds in those sources. But it's still not perfect. Keep in mind that the sources the model draws on could themselves contain misinformation or disinformation, so you still need to verify the output and evaluate any sources it references.
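To make the idea of grounding concrete, here is a minimal sketch (in Python) of the pattern behind it, often called retrieval-augmented generation: relevant passages are retrieved first, and the model is then asked to answer only from those passages and to cite them. The tiny document store, the keyword-overlap retriever, and the call_language_model placeholder are all assumptions made for illustration, not any particular product's API.

# A minimal sketch of "grounding" (retrieval-augmented generation).
# Idea: retrieve relevant passages first, then ask the model to answer
# ONLY from those passages and to cite which one it used.
# `call_language_model` is a hypothetical placeholder for whatever
# chat/completion API you actually use.

# A tiny stand-in for an external data source (web pages, PDFs, a database).
SOURCES = {
    "doc1": "The library's AI guide warns that chatbots can fabricate citations.",
    "doc2": "Grounding connects a model to external data so answers are based on retrieved text.",
    "doc3": "The campus shuttle runs every 15 minutes on weekdays.",
}

def retrieve(question: str, k: int = 2) -> list[tuple[str, str]]:
    """Rank sources by naive word overlap with the question (toy retriever)."""
    q_words = set(question.lower().split())
    scored = []
    for doc_id, text in SOURCES.items():
        overlap = len(q_words & set(text.lower().split()))
        scored.append((overlap, doc_id, text))
    scored.sort(reverse=True)
    return [(doc_id, text) for _, doc_id, text in scored[:k]]

def build_grounded_prompt(question: str) -> str:
    """Assemble a prompt that restricts the model to the retrieved passages."""
    passages = retrieve(question)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in passages)
    return (
        "Answer using ONLY the passages below and cite the passage ID you used. "
        "If the passages don't contain the answer, say you don't know.\n\n"
        f"Passages:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

if __name__ == "__main__":
    prompt = build_grounded_prompt("What does grounding connect a model to?")
    print(prompt)
    # response = call_language_model(prompt)  # hypothetical model call

Run as a script, this only prints the assembled prompt; the point is that the model's answer is constrained to text you can inspect and verify, which is why grounding reduces (but does not eliminate) hallucination.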
Despite the hype, AI technologies — particularly those based on machine learning — are probabilistic systems. They rely on patterns and probabilities to make decisions rather than exact, deterministic rules.
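As a rough illustration of what "probabilistic" means here, the short Python sketch below samples a next word from a made-up probability table; run it a few times and the continuation changes, and occasionally the fluent-sounding but fabricated option is chosen. The words and probabilities are invented purely for demonstration and do not come from any real model.

# A toy illustration of "probabilistic, not deterministic":
# a language model assigns probabilities to possible next words and samples
# from them, so the same prompt can produce different continuations.
import random

# Made-up probabilities for demonstration only.
next_word_probs = {
    "Paris": 0.55,      # plausible and correct
    "Lyon": 0.25,       # plausible but wrong
    "Atlantis": 0.20,   # fluent-sounding fabrication ("hallucination")
}

prompt = "The capital of France is"
for _ in range(5):
    word = random.choices(
        population=list(next_word_probs),
        weights=list(next_word_probs.values()),
    )[0]
    print(f"{prompt} {word}")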
Generative AI tools are widely known to make up citations to sources that don't exist. This is one common form of "hallucination."
It's better to use gen AI tools for . . .
other writing and text-related tasks
AI tools are not designed to be search engines, even if they can search the web.
Student Guide to ChatGPT by University of Arizona Libraries (Nicole Hennig, Michelle Halla, Nicole Pagowsky, and Niamh Wallace), 2025, licensed under a Creative Commons Attribution 4.0 International License.