AI chatbots offer possibilities and perils to researchers seeking library resources

Joanne McIntyre

Artificial intelligence assistants are an exciting new tool for researchers, scholars and students. But alongside the potential of these powerful new technologies, problems are also showing up for the University Libraries.

When researchers use AI chatbots to discover materials such as books and articles, there is a danger that the AI will generate a false citation. Parts of the citation might sound real, but the entire source could be invented by the chatbot. Submitting such a citation as a request to the Libraries' Interlibrary Loan (ILL) service can lead to delays and unnecessary work.

“Librarians who work in ILL are experts at finding rare, unique and hard-to-locate citations,” said Joanne McIntyre, resource sharing librarian and head of resource sharing at the Libraries. “We’re really tenacious, we don’t give up. This can be a problem when the citation we’ve been given is an AI hallucination.”

False citations can happen due to how AI generates its content.

“Large language models (LLMs), which power tools like ChatGPT and Copilot, are really great at generating language that sounds plausible,” explained Natalia Tingle Dolan, business research librarian at the University Libraries. “They work probabilistically, by generating the word that is the next most likely to appear after the word they just generated. This makes them sound very convincing, but it does not make them accurate.”
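As a rough illustration of Tingle Dolan's point, here is a toy sketch of next-word generation. The word table, probabilities and function names are invented purely for illustration and bear no relation to how any real chatbot is actually built.

```python
import random

# Toy next-word probabilities, invented purely for illustration.
# A real large language model learns these patterns from vast amounts
# of text; this hand-written table only mimics the basic idea of
# "pick a likely next word."
NEXT_WORD_PROBS = {
    "the": {"article": 0.5, "journal": 0.3, "author": 0.2},
    "article": {"was": 0.6, "appeared": 0.4},
    "was": {"published": 0.7, "cited": 0.3},
    "published": {"in": 0.9, "by": 0.1},
}

def generate(start_word: str, max_words: int = 5) -> str:
    """Generate a plausible-sounding phrase one word at a time."""
    words = [start_word]
    for _ in range(max_words):
        options = NEXT_WORD_PROBS.get(words[-1])
        if not options:
            break
        # Sample the next word in proportion to its probability --
        # nothing here checks whether the resulting claim is true.
        next_word = random.choices(list(options),
                                   weights=list(options.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(generate("the"))  # e.g. "the article was published in"
```

The output reads like the start of a plausible sentence about a publication, yet nothing in the process checks whether such a publication exists, which is how fluent but fabricated citations can emerge.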

The University Libraries recommend that students, faculty, researchers and scholars alike approach AI tools with curiosity and critical thinking, and double-check any sources the tools provide.

“If you know or suspect that a citation was generated using AI, please let us know when you request materials,” said McIntyre. “You can use the Libraries Catalog, OneSearch (the search bar on the libraries website), or even speak with a subject librarian to help you determine whether a citation is real or a hallucination.”

Natalia Tingle Dolan

AI tools are evolving all the time. Earlier versions of these chatbots had no real-time access to the internet; they could draw only on the data they were trained on. Now they can run web searches and retrieve the latest information, or the latest misinformation.

“We, as individuals and as a society, don’t have a mental model of how to use these tools, what they are good at doing and what they are bad at doing,” said Tingle Dolan. “One way to think about it is we’re back in the late ’90s when the first search engines became available to the public. We had no idea how to use them. We had to figure it out even as each service was evolving.”

Tingle Dolan added: “We’re at a very exciting time. This technology is very powerful, but it is changing a lot right now. As you explore the capabilities of these tools, keep an open mind and be willing to adjust your perceptions, both initially and ongoing.”