Hallucinations in Scholarly LLMs: A Conceptual Overview and Practical Implications

dc.contributor.author: Gaur, Manas
dc.date.accessioned: 2026-01-22T16:19:12Z
dc.description.abstract: Large Language Models are progressively being incorporated into academic processes. They assist with literature reviews, citation creation, and knowledge extraction, improving accessibility and productivity in academic research. However, they also come with a critical challenge: hallucination, the generation of content that is factually incorrect, fabricated, or unsupported by evidence. In scholarly communication, such hallucinations often manifest as non-existent citations, misrepresented research findings, or inaccurately contextualized inferences that together undermine academic integrity and the reliability of scientific discourse. This paper positions hallucinations in LLMs within the context of scholarly communication, classifies the major types of hallucinations, and examines their underlying causes and potential consequences. It outlines practical mitigation approaches, including retrieval-augmented generation (RAG) for evidence grounding, citation verification techniques, and neurosymbolic methods that leverage knowledge graphs for fact-checking. The paper further emphasizes the need for human–AI collaboration in designing hallucination-aware scholarly tools that foster responsible and verifiable AI use in research. By highlighting both the risks and opportunities of LLMs in academia, this work intends to raise awareness and offer development guidance for trustworthy AI systems serving scholarly applications. This discussion develops a conceptual framework for the creation of reliable, transparent, and fact-based AI-driven research assistants that augment, rather than distort, scholarly communication.
dc.format.extent: 9 pages
dc.genre: journal articles
dc.genre: preprints
dc.identifier: doi:10.13016/m2vgrz-oust
dc.identifier.uri: http://hdl.handle.net/11603/41556
dc.language.iso: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: UMBC Ebiquity Research Group
dc.title: Hallucinations in Scholarly LLMs: A Conceptual Overview and Practical Implications
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-5411-2230
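
As a rough illustration of the mitigation direction named in the abstract above (grounding generated citations against a trusted corpus and verifying them before acceptance), the short Python sketch below checks a model-emitted citation against a small bibliographic index. The corpus entries, the token-overlap (Jaccard) scoring, and the 0.6 threshold are illustrative assumptions for this record, not the paper's implementation.

    # Minimal sketch of retrieval-grounded citation verification.
    # Corpus, scoring, and threshold are illustrative assumptions,
    # not the paper's actual method.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Reference:
        title: str
        year: int

    # Stand-in bibliographic corpus; in practice this would be a real
    # index or knowledge graph rather than a hard-coded list.
    CORPUS = [
        Reference("Attention Is All You Need", 2017),
        Reference("Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks", 2020),
    ]

    def _tokens(text: str) -> set[str]:
        return {t.lower().strip(".,:;") for t in text.split()}

    def best_match(cited_title: str) -> tuple[Reference | None, float]:
        """Return the corpus entry with the highest Jaccard token overlap."""
        cited = _tokens(cited_title)
        best, best_score = None, 0.0
        for ref in CORPUS:
            cand = _tokens(ref.title)
            score = len(cited & cand) / len(cited | cand)
            if score > best_score:
                best, best_score = ref, score
        return best, best_score

    def verify_citation(cited_title: str, cited_year: int, threshold: float = 0.6) -> str:
        """Flag a model-generated citation as verified, mismatched, or unsupported."""
        ref, score = best_match(cited_title)
        if ref is None or score < threshold:
            return "unsupported: no corpus entry matches this title"
        if ref.year != cited_year:
            return f"mismatched: found '{ref.title}' ({ref.year}), not {cited_year}"
        return f"verified: '{ref.title}' ({ref.year})"

    if __name__ == "__main__":
        # One real citation and one fabricated one, as an LLM might emit them.
        print(verify_citation("Attention Is All You Need", 2017))
        print(verify_citation("Neurosymbolic Hallucination Audits in Science", 2023))

In a deployed pipeline the in-memory corpus would be replaced by a real bibliographic service or knowledge graph, and the string-overlap score by a proper retriever; the control flow (retrieve a candidate reference, compare its metadata, flag the discrepancy) is the part that the abstract's RAG and citation-verification discussion points to.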

Files

Original bundle

Name: HalluinationsinScholarlyCommuniations.pdf
Size: 587.74 KB
Format: Adobe Portable Document Format