Assistive Multimodal Interfaces for Improving Web Accessibility
Type of Work: conference papers and proceedings preprints (4 pages)
Citation of Original Publication: Ravi Kuber, Wai Yu, Philip Strain, Emma Murphy and Graham McAllister, "Assistive Multimodal Interfaces for Improving Web Accessibility," http://www.dcs.gla.ac.uk/haptic/haptics%20web%20pages_files/Kuber%20et%20al..pdf
Rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless covered by a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract: Multimodal interfaces have been used to help blind and partially sighted people access visualization tools such as graphs and numerical tables. However, few studies have explored improving Web access and navigation through a multimodal approach. This paper describes a novel approach to this Web accessibility problem: an assistive tool consisting of a multimodal interface and a content-aware Web browser plug-in. A force-feedback mouse and a real-time audio rendering tool form the basis of the multimodal interface. The Web plug-in continuously monitors the on-screen mouse position and detects objects in its vicinity; haptic and audio feedback then inform users when they are close to an image or hyperlink. The paper describes the multimodal interface and the Web plug-in, and presents results from a pilot study on the usability of the system.
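The abstract's core mechanism, a plug-in that watches the cursor and triggers cues near images and hyperlinks, can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: all names (`Element`, `distance_to_box`, `nearby_elements`) and the 20-pixel threshold are assumptions for the sake of the example.

```python
# Hypothetical sketch of a proximity check like the one the plug-in performs:
# given the on-screen mouse position and the bounding boxes of page elements,
# report any image or hyperlink within a threshold distance so that haptic
# and audio cues could be triggered. Names and threshold are illustrative.

from dataclasses import dataclass

@dataclass
class Element:
    kind: str           # "image" or "hyperlink"
    x: int              # top-left corner of bounding box
    y: int
    w: int              # width and height of bounding box
    h: int

def distance_to_box(px: float, py: float, el: Element) -> float:
    """Shortest distance from the point (px, py) to the element's box."""
    dx = max(el.x - px, 0, px - (el.x + el.w))
    dy = max(el.y - py, 0, py - (el.y + el.h))
    return (dx * dx + dy * dy) ** 0.5

def nearby_elements(px: float, py: float, elements, threshold: float = 20):
    """Return elements close enough to the cursor to warrant feedback."""
    return [el for el in elements if distance_to_box(px, py, el) <= threshold]

elements = [
    Element("image", 100, 100, 50, 50),
    Element("hyperlink", 300, 200, 120, 15),
]
# Cursor just above the image: only the image is within the threshold.
for el in nearby_elements(110, 90, elements):
    print(f"cue: {el.kind}")   # a real plug-in would fire haptic/audio feedback
```

In the actual system this check would run continuously as the mouse moves, with the force-feedback mouse and audio renderer substituted for the `print` call.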