PocketMenu: Non-Visual Menus for Touch Screen Devices

It’s a chilly Sunday afternoon. You are out for a walk, listening to music on your MP3 player, and you want to select the next song. How do you do that?

A few years ago you probably didn’t even take the MP3 player out of your pocket. You just used your fingers to feel for the shape of the next button and pressed it.

Today, we no longer own dedicated MP3 players; we use our smartphones instead. And since most input on modern smartphones happens via large touch screen displays, you need to take the phone out of your pocket, unlock the screen, and visually locate the button in order to press it.

The PocketMenu addresses this problem by providing haptic and auditory feedback that allows in-pocket input. It combines clever ideas from previous research on touch screen interaction for users with sensory and motor impairments in a novel way.

All menu items are laid out along the screen bezel, so the bezel serves as a haptic guide for the finger. Additional speech and vibration output allow the user to identify the items and obtain more information. Watch the video to see exactly how the interaction works.
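
To make the interaction concrete, here is a minimal Android/Kotlin sketch of how such a bezel-guided, eyes-free menu could be implemented. This is not the actual PocketMenu code, and the post does not say which platform it runs on: the class name, menu items, and timing values below are made up for illustration, and the sketch uses the classic Vibrator and TextToSpeech calls (deprecated on newer Android versions, but still available).

```kotlin
// Illustrative sketch only (not the authors' implementation): menu items are
// stacked along the left screen bezel; sliding the finger along the edge
// announces each item via vibration and speech, lifting the finger selects it.
import android.content.Context
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import android.view.MotionEvent
import android.view.View

class PocketMenuView(context: Context) : View(context) {

    // Hypothetical MP3-player menu; the real PocketMenu items may differ.
    private val items = listOf("Play/Pause", "Next song", "Previous song", "Volume up", "Volume down")
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val tts = TextToSpeech(context) { /* default language is fine for this sketch */ }
    private var currentIndex = -1

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN, MotionEvent.ACTION_MOVE -> {
                // Map the finger's vertical position along the bezel to an item slot.
                val slot = (event.y / height * items.size).toInt().coerceIn(0, items.size - 1)
                if (slot != currentIndex) {
                    currentIndex = slot
                    vibrator.vibrate(40)  // short pulse marks the item boundary
                    tts.speak(items[slot], TextToSpeech.QUEUE_FLUSH, null)  // speech identifies it
                }
            }
            MotionEvent.ACTION_UP -> {
                // Lifting the finger confirms the selection.
                if (currentIndex >= 0) onItemSelected(items[currentIndex])
                currentIndex = -1
            }
        }
        return true
    }

    private fun onItemSelected(item: String) {
        // Hook the actual player action in here (e.g. skip to the next song).
    }
}
```

Because the finger follows the physical edge of the device, counting the vibration pulses at item boundaries is enough to find the desired item without looking at the screen.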

In a field experiment, we compared the PocketMenu concept with VoiceOver, the state-of-the-art screen reader that ships with the iPhone. Participants had to control an MP3 player while walking down a road with the device in their pocket. The PocketMenu outperformed VoiceOver in terms of completion time, selection errors, and subjective usability.

This work will be presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which takes place in September 2012 in San Francisco. The paper is available here (pdf).


TouchOver Map


Touch screens and interactive maps are both commonplace nowadays; we find them together in modern smartphones and location-based services. In the HaptiMap project (FP7-ICT-224675), we aim at making maps and location-based services accessible. Thus, my colleagues Benjamin Poppinga, Charlotte Magnusson, Kirsten Rassmus-Gröhn, and I investigated how to make maps on touch screens accessible for visually impaired users.

We developed TouchOver Map, a simple prototype aimed at investigating the feasibility of speech and vibration feedback. It allows users to explore a map non-visually – currently the street network, to be exact. Our approach is dead simple: as long as the user touches a street, the phone vibrates and the street name is spoken.
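
For illustration, here is a minimal sketch of this behaviour in the same Android/Kotlin style. It is not the TouchOver Map implementation: the street data structure, the hit tolerance, and the vibration pattern are assumptions; streets are simply treated as polylines in screen coordinates that are hit-tested against the finger position.

```kotlin
// Illustrative sketch only (not the TouchOver Map implementation): streets are
// polylines in screen coordinates; while the finger is near a street the phone
// vibrates, and the street name is announced when the finger enters it.
import android.content.Context
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import android.view.MotionEvent
import android.view.View
import kotlin.math.hypot

data class Street(val name: String, val points: List<Pair<Float, Float>>)

class TouchOverMapView(context: Context, private val streets: List<Street>) : View(context) {

    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val tts = TextToSpeech(context) { /* default language is fine for this sketch */ }
    private var currentStreet: Street? = null
    private val tolerancePx = 24f  // how close the finger must be to "touch" a street

    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_UP) {
            vibrator.cancel()
            currentStreet = null
            return true
        }
        val hit = streets.firstOrNull { distanceToPolyline(event.x, event.y, it.points) < tolerancePx }
        when {
            hit == null -> {               // finger is off the street network
                vibrator.cancel()
                currentStreet = null
            }
            hit != currentStreet -> {      // finger entered a (new) street
                currentStreet = hit
                vibrator.vibrate(longArrayOf(0, 80, 80), 0)  // keep pulsing while on the street
                tts.speak(hit.name, TextToSpeech.QUEUE_FLUSH, null)
            }
        }
        return true
    }

    // Smallest distance from (x, y) to any segment of the street's polyline.
    private fun distanceToPolyline(x: Float, y: Float, pts: List<Pair<Float, Float>>): Float {
        if (pts.size < 2) return Float.MAX_VALUE
        return pts.zipWithNext().minOf { (a, b) -> distanceToSegment(x, y, a, b) }
    }

    private fun distanceToSegment(x: Float, y: Float, a: Pair<Float, Float>, b: Pair<Float, Float>): Float {
        val (ax, ay) = a
        val (bx, by) = b
        val dx = bx - ax
        val dy = by - ay
        val lenSq = dx * dx + dy * dy
        // Project the point onto the segment, clamping to the endpoints.
        val t = if (lenSq == 0f) 0f else (((x - ax) * dx + (y - ay) * dy) / lenSq).coerceIn(0f, 1f)
        return hypot(x - (ax + t * dx), y - (ay + t * dy))
    }
}
```

In this sketch, the street name is announced once when the finger enters a street, while the vibration continues for as long as the street is touched.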

To evaluate how well people can understand street layouts with TouchOver Map, we conducted a user study. Eight sighted participants explored the map while the device was covered by an empty box.

While the participants explored the map, they were asked to reproduce it on a piece of paper. Although the results are far from perfect, the participants were able to reproduce the streets and their relationships.

Obviously, our study has a number of limitations. There were only a few testers, and none of them was blind. TouchOver Map also only displayed streets and no other geographic features. Finally, the non-visual rendering could clearly benefit from fine-tuning and clever filtering of the features to display. Nevertheless, our pilot study shows that it is possible to convey geographical features via touch screen devices by making them “visible” through speech and vibration.

The TouchOver Map is a collaboration between Certec, the Division of Rehabilitation Engineering Research in the Department of Design Sciences, Faculty of Engineering, Lund University, and the Intelligent User Interfaces Group of the OFFIS Institute for Information Technology, Oldenburg, Germany. It was published as a Works-in-Progress paper at the 13th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’11).

The paper can be downloaded from here.

