AI Chatbots Cause Librarian Fatigue with Non-Existent References and Hallucinations (from page 20260125)
External link
Keywords
- AI chatbots
- librarians
- misinformation
- fake citations
- ChatGPT
- hallucinations
Themes
- AI chatbots
- librarians
- misinformation
- citation issues
- trust in technology
Other
- Category: technology
- Type: blog post
Summary
AI chatbots like ChatGPT are driving a significant rise in requests for non-existent books and research articles, frustrating librarians. Sarah Falls of the Library of Virginia reports that approximately 15% of reference questions are AI-generated queries, often involving fictitious citations. Many users trust the AI over experienced librarians and respond with confusion and disbelief when told the sources do not exist. The problem predates chatbots: academic papers have contained fake citations for years. Users often mistake AI’s authoritative tone for reliability, which complicates efforts to correct the misinformation.
Signals
| name | description | change | 10-year | driving-force | relevancy |
| --- | --- | --- | --- | --- | --- |
| Increased Librarian Exhaustion | Librarians report feeling overwhelmed by AI-generated reference queries. | Shift from traditional reference requests to increased AI-generated inquiries. | Librarians may become overburdened or seek new systems to manage AI interactions. | The rise of AI chatbots contributing to misinformation and false citations. | 4 |
| Public Trust in AI Over Humans | People increasingly trust AI over human experts for information and references. | Change from reliance on human expertise to blind trust in AI outputs. | Potential decline in the role of human experts in information dissemination. | Perception of AI as an authoritative source despite inaccuracies. | 5 |
| Emergence of Fake Academic Resources | AI-generated academic and literary works that do not actually exist are presented to the public as real. | Shift from traditional publications to reliance on potentially fictitious AI-generated works. | Possible proliferation of fake literature affecting academic integrity and library systems. | Increased accessibility and use of AI to generate content without verification. | 5 |
| Misunderstanding of AI Limitations | Users develop misconceptions about how to prompt AI for reliable information. | Shift from realistic expectations of AI outputs to overconfidence in its capabilities. | Growth in disillusionment when AI fails to deliver credible information. | Misunderstanding of how AI works, leading to ineffective user strategies. | 4 |
| Decline in Research Quality | Research outputs are increasingly littered with AI-generated citations and references that don’t exist. | Transition from high-quality research to incorporating AI-generated inaccuracies. | Research could suffer from reduced credibility and increased retractions. | AI’s influence on scholarly work leading to reliance on faulty references. | 5 |
Concerns
| name | description |
| --- | --- |
| Reliance on AI for Information | People increasingly trust AI chatbots over human experts, undermining the authority of librarians and credible sources. |
| Disinformation Spread | The proliferation of fake citations and non-existent sources generated by AI leads to widespread misinformation. |
| Erosion of Critical Thinking | As users depend on AI for research, they may neglect critical thinking skills, leading to blind acceptance of false information. |
| Impact on Librarians’ Roles | Librarians are overwhelmed and exhausted due to the rising number of misleading AI-generated inquiries about non-existent materials. |
| Hallucination Phenomenon in AI | AI chatbots frequently produce fabricated information, causing confusion and frustration for users seeking accurate references. |
| Potential for Academic Integrity Issues | The emergence of false references could compromise the integrity of academic research and publishing. |
| Decreased Trust in Human Expertise | The tendency to believe AI over trained professionals may weaken societal trust in expert knowledge. |
| Lack of Understanding of AI Limitations | Users are not aware of the limitations of AI tools, leading to misguided reliance and poor research outcomes. |
Behaviors
| name | description |
| --- | --- |
| Trust in AI over Human Experts | A growing number of individuals prefer AI-generated information over the expertise of librarians or researchers. |
| Request for Non-Existent References | People frequently ask librarians for books or citations that were fabricated by AI systems, leading to confusion and misinformation. |
| Increased Use of AI for Research | There is a growing trend of users beginning their research with generative AI, which often provides inaccurate or fictitious information. |
| Belief in AI Authority | Users tend to perceive AI chatbots as more authoritative than human sources, which shapes their trust in information retrieval. |
| Misguided AI Prompting Techniques | Individuals believe that specific prompts can prevent AI from hallucinating, leading to false confidence in AI outputs. |
Technologies
| name | description |
| --- | --- |
| AI Chatbots | AI-driven conversational agents such as ChatGPT, Grok, and Gemini that assist users in finding information. |
| Generative AI (GenAI) | AI systems capable of generating text and other content, often producing hallucinations or false information. |
| Large Language Models (LLMs) | Advanced AI models trained on vast datasets for natural language processing and understanding. |
| AI in Research Assistance | Utilization of AI tools to aid in academic and historical research, affecting the reliability of sources (see the verification sketch after this table). |
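The "AI in Research Assistance" entry above turns on whether an AI-supplied reference can be checked before it lands on a librarian's desk. The source post does not describe any tooling, so the following is only a minimal sketch, assuming one way to triage a suspect citation: query Crossref's public works API and see whether any registered record roughly matches. The sample citation string and match handling are illustrative assumptions, not material from the article.

```python
# Illustrative sketch only: check whether a citation string supplied by a
# chatbot matches any real record in Crossref's public works API.
# The sample citation below is hypothetical, not taken from the source post.
import json
import urllib.parse
import urllib.request


def crossref_lookup(citation: str, rows: int = 5) -> list[dict]:
    """Run a bibliographic search on Crossref and return candidate matches."""
    query = urllib.parse.urlencode({"query.bibliographic": citation, "rows": rows})
    url = f"https://api.crossref.org/works?{query}"
    with urllib.request.urlopen(url, timeout=30) as response:
        payload = json.load(response)
    candidates = []
    for item in payload.get("message", {}).get("items", []):
        candidates.append(
            {
                "title": (item.get("title") or [""])[0],
                "doi": item.get("DOI", ""),
                "year": (item.get("issued", {}).get("date-parts") or [[None]])[0][0],
            }
        )
    return candidates


if __name__ == "__main__":
    # Hypothetical AI-supplied reference to verify.
    citation = "Smith, J. (2021). Community archives and digital memory."
    matches = crossref_lookup(citation)
    if not matches:
        print("No Crossref record found; the reference may be hallucinated.")
    else:
        print("Closest Crossref candidates (verify titles and DOIs manually):")
        for match in matches:
            print(f"- {match['title']} ({match['year']}) DOI: {match['doi']}")
```

Crossref only indexes works with registered DOIs, so an empty result is a cue for human follow-up (for example, asking a librarian), not proof of fabrication, and a loose title match still needs manual confirmation.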
Issues
| name | description |
| --- | --- |
| AI-generated misinformation | The rise of AI-generated content is leading to widespread misinformation, with users requesting non-existent sources and publications. |
| Trust in AI over human experts | An increasing number of individuals trust AI chatbots more than trained professionals like librarians for accurate information. |
| Librarians’ workload and burnout | Librarians are experiencing increased exhaustion due to the influx of queries about fictional references created by AI. |
| Challenges in academic integrity | The proliferation of AI-generated fake citations raises concerns about citation accuracy and academic integrity. |
| User strategies for improving AI reliability | Users believe they can improve AI outputs with specific prompts, indicating a misunderstanding of AI capabilities. |