
Student Guide to Generative AI

Basic tips and advice to consider when using generative AI tools and chatbots in an academic setting.

Fact-checking is always needed

AI "hallucination"
The official term in the field of AI is "hallucination," which refers to the fact that these tools sometimes make things up. This happens because the systems are probabilistic, not deterministic.
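One way to picture "probabilistic, not deterministic" is that a generative model chooses each next word by sampling from a probability distribution, so the same prompt can produce different answers on different runs. The toy vocabulary and probabilities below are made up purely for illustration; real models work over far larger vocabularies.

```python
import random

# Toy illustration: pick the "next word" by sampling from a
# probability distribution, the way a generative model does.
# The words and probabilities are invented for this example.
NEXT_WORD_PROBS = {"Paris": 0.90, "Lyon": 0.07, "Mars": 0.03}

def sample_next_word(probs, rng):
    """Sample one word according to its assigned probability."""
    words = list(probs)
    weights = [probs[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random(0)  # fixed seed so the run is repeatable
answers = [sample_next_word(NEXT_WORD_PROBS, rng) for _ in range(10)]
# Most samples will be the high-probability word, but a
# low-probability word can still come out -- one intuition for
# why confident-sounding wrong answers appear.
```

Because the output is sampled rather than looked up, even a well-trained model can occasionally emit a low-probability (and wrong) continuation.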


Which models are less prone to hallucination?
Hallucination can be made less likely by a technique known as "grounding": connecting external sources of data (like the Internet, for example) to the model, so that it responds based on what it finds in those sources. But grounding is still not perfect. Keep in mind that the Internet sources the model uses could themselves contain misinformation or disinformation, so you still need to verify the output and evaluate any sources it references.
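A minimal sketch of the grounding idea: retrieve relevant text from external sources, then put that text into the prompt so the model is asked to answer from it rather than from memory. The source texts, overlap scoring, and prompt format below are illustrative assumptions, not any real tool's API.

```python
# Hypothetical "external sources" a grounded system might retrieve from.
SOURCES = [
    "The library's Summon Quick Search covers books and articles.",
    "Google Scholar indexes scholarly literature across disciplines.",
    "Perplexity AI combines a language model with a web search engine.",
]

def retrieve(question, sources):
    """Rank sources by simple word overlap with the question (toy scoring)."""
    q_words = set(question.lower().split())
    scored = [(len(q_words & set(s.lower().split())), s) for s in sources]
    scored.sort(reverse=True)
    return [s for score, s in scored if score > 0]

def grounded_prompt(question, sources):
    """Build a prompt that asks the model to answer only from retrieved text."""
    context = "\n".join(retrieve(question, sources))
    return f"Answer using only these sources:\n{context}\n\nQuestion: {question}"

prompt = grounded_prompt("What does Google Scholar index?", SOURCES)
```

Note that the model's answer can only be as reliable as the retrieved sources, which is why verification is still needed even with grounding.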


Despite the hype, AI technologies — particularly those based on machine learning — are probabilistic systems. They rely on patterns and probabilities to make decisions rather than exact, deterministic rules.
~ Tshilidzi Marwala, United Nations University, in "AI is Not a High-Precision Technology..."


Citations Provided by Chatbots

Generative AI tools are widely known to make up citations that don't exist. This is called "hallucination."

  • The tool might give you articles by an author who usually writes about your topic, or even identify a journal that publishes on your topic, but the title, page numbers, and dates are completely fictional.
  • You can try to see if any are valid by searching in Library Search on our home page or in Google Scholar, but chances are the sources do not exist.

It's better to use gen AI tools for . . .

  • Brainstorming and getting creatively unstuck
  • Editing and constructive criticism of your writing
  • Explaining concepts at multiple difficulty levels
  • Summarizing long texts
  • Other writing and text-related tasks

AI tools are not designed to be search engines - even if they can search the web.

  • It's better to use the Library's Summon Quick Search, library databases, or even Google Scholar when doing research.
  • Some AI tools can search the web, like Perplexity AI. It combines a language model with a search engine and provides links to its sources, so you can fact-check. It doesn't include all the scholarly resources you would find in a Library database search or Google Scholar, but it can be a complementary tool for finding web search results using natural language.
  • Even if an AI tool searches the web, you must still fact check the output it gives you. 

References


Student Guide to ChatGPT by University of Arizona Libraries (Nicole Hennig, Michelle Halla, Nicole Pagowsky, and Niamh Wallace), 2025, licensed under a Creative Commons Attribution 4.0 International License.