Using Language Groundings for Context-Sensitive Text Prediction

Date

2016

Citation of Original Publication

Timothy Lewis, Amy Hurst, Matthew E. Taylor, & Cynthia Matuszek, Using Language Groundings for Context-Sensitive Text Prediction, EMNLP Workshop on Uphill Battles in Language Processing: Scaling Early Achievements to Robust Methods, 2016.

Rights

This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please contact the author.

Abstract

In this paper, we present the concept of using language groundings for context-sensitive text prediction via a semantically informed, context-aware language model. We report initial findings from a preliminary study investigating how users react to a communication interface driven by context-based prediction with a simple language model. We suggest that the results support further exploration with a more informed semantic model and more realistic contexts.
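
To give a rough sense of what context-based prediction with a simple language model can mean, the sketch below reranks next-word candidates from a toy bigram model using a set of context keywords. The class name, boost scheme, and training sentences are illustrative assumptions only; they are not the model, data, or interface described in the paper.

```python
from collections import defaultdict


class ContextAwarePredictor:
    """Toy next-word predictor: bigram counts whose scores are boosted
    when a candidate word overlaps with the current context keywords.
    Illustrative sketch only; not the paper's actual model."""

    def __init__(self):
        # bigrams[prev][curr] = count of "prev curr" in the training text
        self.bigrams = defaultdict(lambda: defaultdict(int))

    def train(self, sentences):
        for sentence in sentences:
            tokens = sentence.lower().split()
            for prev, curr in zip(tokens, tokens[1:]):
                self.bigrams[prev][curr] += 1

    def predict(self, prev_word, context_keywords, k=3, boost=2.0):
        # Score candidates by bigram count, multiplied by a boost factor
        # when the candidate also appears in the supplied context.
        candidates = self.bigrams[prev_word.lower()]
        scored = {
            word: count * (boost if word in context_keywords else 1.0)
            for word, count in candidates.items()
        }
        return sorted(scored, key=scored.get, reverse=True)[:k]


if __name__ == "__main__":
    predictor = ContextAwarePredictor()
    predictor.train([
        "I would like some water",
        "I would like some coffee",
        "I would like some help",
    ])
    # With a kitchen-like context, "water" and "coffee" outrank "help".
    print(predictor.predict("some", context_keywords={"water", "coffee"}))
```

A semantically informed model would replace the keyword-overlap boost with scores derived from grounded perception of the user's environment, but the reranking structure is the same.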