Effects of UI Abstraction on Non‑Expert Prompting Workflows in Generative AI
Department
University of Baltimore. Yale Gordon College of Arts and Sciences
Program
Master of Science in Interaction Design and Information Architecture
Rights
Attribution-NoDerivs 3.0 United States
This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by The University of Baltimore for noncommercial research and educational purposes.
Subjects
Interactive Design
Large Language Models (LLMs)
artificial intelligence
Human AI Interaction
generative AI
Artificial Intelligence
human-computer interaction
user experience design
large language models
web design
User Research
user experience
interaction design
Information Resources (General) - General works
Abstract
Generative AI tools offer powerful capabilities but often demand that users craft precise prompts, effectively requiring humans to "think like machines." This thesis investigates whether thoughtful interface design can instead enable generative AI to "speak like us," aligning with natural human communication. We conducted an exploratory study with 10 non-expert participants, observing each as they completed a prompt-writing task in three Figma-based prototype interfaces of varying abstraction, ranging from a freeform text input to a highly guided prompt form. Our findings show that the level of UI abstraction strongly shaped participants' prompting experience. Preferences varied with familiarity and comfort: the freeform input felt most intuitive and expressive to some participants, whereas others benefited from structured templates that yielded clearer prompts and greater confidence. These results suggest that a one-size-fits-all prompting solution is suboptimal. We discuss implications for the UX design of AI-powered tools, emphasizing the need for adaptive, user-centered prompt interfaces that accommodate varying experience levels and support more natural human–AI interaction.
