Hallucinations in Scholarly LLMs: A Conceptual Overview and Practical Implications
| dc.contributor.author | Gaur, Manas | |
| dc.date.accessioned | 2026-01-22T16:19:12Z | |
| dc.description.abstract | Large Language Models are progressively being incorporated into academic processes. They assist with literature reviews, citation creation, and knowledge extraction, improving accessibility and productivity in academic research. However, they also come with a critical challenge: the problem of hallucination, the generation of content that is factually incorrect, fabricated, or unsupported by evidence. In scholarly communication, such hallucinations often manifest as non-existent citations, misrepresented research findings, or inaccurately contextualized inferences that together undermine academic integrity and the reliability of scientific discourse. This paper positions hallucinations in LLMs within the context of scholarly communication, classifies the major types of hallucinations, and examines their underlying causes and potential consequences. It outlines practical mitigation approaches, including retrieval-augmented generation (RAG) for evidence grounding, citation verification techniques, and neurosymbolic methods that leverage knowledge graphs for fact-checking. The paper further emphasizes the need for human–AI collaboration in designing hallucination-aware scholarly tools that foster responsible and verifiable AI use in research. By highlighting both the risks and opportunities of LLMs in academia, this work intends to raise awareness and offer development guidance for trustworthy AI systems serving scholarly applications. This discussion develops a conceptual framework for the creation of reliable, transparent, and fact-based AI-driven research assistants that augment, rather than distort, scholarly communication. | |
| dc.format.extent | 9 pages | |
| dc.genre | journal articles | |
| dc.genre | preprints | |
| dc.identifier | doi:10.13016/m2vgrz-oust | |
| dc.identifier.uri | http://hdl.handle.net/11603/41556 | |
| dc.language.iso | en | |
| dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
| dc.relation.ispartof | UMBC Faculty Collection | |
| dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department | |
| dc.rights | Attribution 4.0 International | |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
| dc.subject | UMBC Ebiquity Research Group | |
| dc.title | Hallucinations in Scholarly LLMs: A Conceptual Overview and Practical Implications | |
| dc.type | Text | |
| dcterms.creator | https://orcid.org/0000-0002-5411-2230 | |
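The abstract above lists citation verification as one mitigation approach. As an illustration only (not taken from the paper), the sketch below checks a model-produced citation by resolving its DOI against the public Crossref REST API and comparing the registered title with the claimed title; the helper names, similarity threshold, and placeholder DOI are assumptions made for this example.

```python
# Illustrative citation-verification sketch (not from the paper): resolve a DOI via the
# public Crossref REST API and compare the registered title with the claimed title.
import json
import urllib.error
import urllib.request
from difflib import SequenceMatcher

CROSSREF_WORKS = "https://api.crossref.org/works/"  # Crossref REST endpoint

def fetch_registered_title(doi: str):
    """Return the title Crossref has on record for `doi`, or None if the DOI does not resolve."""
    try:
        with urllib.request.urlopen(CROSSREF_WORKS + doi, timeout=10) as resp:
            record = json.load(resp)
    except urllib.error.HTTPError:
        return None  # an unregistered DOI is a strong signal of a hallucinated citation
    titles = record.get("message", {}).get("title", [])
    return titles[0] if titles else None

def citation_looks_consistent(claimed_title: str, doi: str, threshold: float = 0.8) -> bool:
    """Heuristic check: does the claimed title roughly match the title the DOI resolves to?

    The 0.8 similarity threshold is an assumption for this sketch, not a value
    recommended by the paper; borderline cases should go to a human reviewer.
    """
    registered = fetch_registered_title(doi)
    if registered is None:
        return False
    similarity = SequenceMatcher(None, claimed_title.lower(), registered.lower()).ratio()
    return similarity >= threshold

if __name__ == "__main__":
    # Placeholder values only; substitute a citation extracted from model output.
    print(citation_looks_consistent("A Hypothetical Paper Title", "10.1000/placeholder-doi"))
```

In keeping with the human–AI collaboration the abstract emphasizes, a check like this is best used to flag suspect citations for human review rather than to accept or reject them automatically.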
Files
Original bundle
- Name: HalluinationsinScholarlyCommuniations.pdf
- Size: 587.74 KB
- Format: Adobe Portable Document Format
