Authors: Richards, Luke E.; Matuszek, Cynthia
Date: 2021-04-09
Citation: Luke E. Richards and Cynthia Matuszek. "Learning to Understand Non-Categorical Physical Language for Human Robot Interactions." In Proceedings of the RSS 2019 Workshop on AI and Its Alternatives in Assistive and Collaborative Robotics (RSS: AI+ACR). http://iral.cs.umbc.edu/Pubs/RichardsMatuszekRSSws2019.pdf
Handle: http://hdl.handle.net/11603/21316
Abstract: Learning the meaning of language with respect to the physical world in which a robot operates is a necessary step for shared-autonomy systems in which natural language is part of a user-specific, customizable interface. We propose a learning system in which language is grounded in visual percepts without pre-defined category constraints by combining CNN-based visual identification with natural language labels, moving toward making it possible for people to use language as a high-level control system for low-level world interactions and allowing a system to operate on shared visual/linguistic embeddings. We evaluate the efficacy of this learning against a well-known object dataset, and report preliminary results that outline the feasibility of pursuing a visual-feature approach to domain-free language understanding.
Extent: 5 pages
Language: en-US
Rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless covered by a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Title: Learning to Understand Non-Categorical Physical Language for Human Robot Interactions
Type: Text
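
As a rough illustration of the shared visual/linguistic embedding the abstract describes, the following is a minimal sketch in PyTorch. It is a toy under stated assumptions, not the paper's implementation: the encoder architectures, embedding dimension, vocabulary size, and cosine-embedding loss are all hypothetical placeholders; only the general idea of mapping CNN image features and natural language labels into one shared space, without a fixed category inventory, comes from the abstract.

    import torch
    import torch.nn as nn

    class VisualEncoder(nn.Module):
        # Small stand-in CNN that maps an RGB image to a vector in the
        # shared space (the paper's actual CNN is not specified here).
        def __init__(self, embed_dim=64):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.proj = nn.Linear(32, embed_dim)

        def forward(self, images):
            return self.proj(self.conv(images).flatten(1))

    class LanguageEncoder(nn.Module):
        # Averages learned word embeddings of a free-form label, so no
        # pre-defined category list constrains the vocabulary.
        def __init__(self, vocab_size=1000, embed_dim=64):
            super().__init__()
            self.embed = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")

        def forward(self, token_ids, offsets):
            return self.embed(token_ids, offsets)

    # Train both encoders so matching image/label pairs land near each
    # other in the shared visual/linguistic space.
    vis_enc, lang_enc = VisualEncoder(), LanguageEncoder()
    loss_fn = nn.CosineEmbeddingLoss()
    opt = torch.optim.Adam(
        list(vis_enc.parameters()) + list(lang_enc.parameters()), lr=1e-3
    )

    images = torch.randn(4, 3, 64, 64)      # toy batch of 4 images
    tokens = torch.randint(0, 1000, (12,))  # concatenated token ids of 4 labels
    offsets = torch.tensor([0, 3, 6, 9])    # start of each label in `tokens`
    targets = torch.ones(4)                 # +1 = image and label match

    opt.zero_grad()
    loss = loss_fn(vis_enc(images), lang_enc(tokens, offsets), targets)
    loss.backward()
    opt.step()

In a sketch like this, language labels for novel objects can be embedded and compared against visual features by cosine similarity, which is one plausible reading of using language as a high-level interface over low-level perception.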