
PocketMenu: Non-Visual Menus for Touch Screen Devices

It’s a chilly Sunday afternoon and you are out for a walk, listening to music on your MP3 player, when you want to skip to the next song. How do you do that?

A few years ago you probably didn’t even take the MP3 player out of your pocket. You just used your fingers to feel for the shape of the next button and press it.

Today, most of us no longer carry a dedicated MP3 player; we use our smartphones instead. And since input on modern smartphones happens almost entirely via large touch screen displays, you need to take the phone out of your pocket, unlock the screen, and find the button visually before you can press it.

The PocketMenu addresses this problem by providing haptic and auditory feedback that allows in-pocket input. It combines clever ideas from previous research on touch screen interaction for users with sensory and motor impairments in a novel way.

All menu items are laid out along the screen bezel, so the bezel serves as a haptic guide for the finger. Speech and vibration output additionally help to identify the items and obtain more information. Watch the video to see exactly how the interaction works.
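For illustration, here is a minimal sketch of how such a bezel-based lookup could work: touch positions along one screen edge are mapped to menu slots, a haptic tick marks each boundary crossing, and speech announces the item under the finger. The class, the `vibrate`/`speak` callbacks, and the lift-to-select rule are assumptions made for this sketch, not the paper’s actual implementation.

```python
# Sketch of a bezel-based non-visual menu (illustration only).
# vibrate() and speak() are placeholders for platform haptics / text-to-speech.

from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class PocketMenuSketch:
    items: List[str]                # menu entries laid out top-to-bottom along the bezel
    screen_height: float            # height of the touch screen in pixels
    vibrate: Callable[[], None]     # placeholder: emit a short haptic pulse
    speak: Callable[[str], None]    # placeholder: speech output
    _current: Optional[int] = None  # index of the item currently under the finger

    def on_touch_move(self, y: float) -> None:
        """Map the finger's position along the bezel to a menu item."""
        slot_height = self.screen_height / len(self.items)
        index = max(0, min(int(y // slot_height), len(self.items) - 1))
        if index != self._current:
            self._current = index
            self.vibrate()                 # haptic tick when crossing an item boundary
            self.speak(self.items[index])  # announce the item under the finger

    def on_touch_up(self) -> Optional[str]:
        """Lifting the finger selects the last announced item (one possible selection rule)."""
        if self._current is None:
            return None
        selected = self.items[self._current]
        self.speak(f"Selected {selected}")
        return selected


# Example usage with dummy output devices:
menu = PocketMenuSketch(
    items=["Play/Pause", "Next", "Previous", "Volume up", "Volume down"],
    screen_height=800,
    vibrate=lambda: print("bzzt"),
    speak=print,
)
menu.on_touch_move(120)  # finger near the top of the bezel -> "Play/Pause"
menu.on_touch_move(200)  # sliding down along the bezel -> "Next"
menu.on_touch_up()       # lift to select
```

The real interaction is richer than this (see the video and the paper), but the sketch captures the core idea of using the bezel as a physical reference frame that the finger can follow without looking.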

In a field experiment, we compared the PocketMenu concept with the state-of-the-art VoiceOver concept that ships with the iPhone. Participants had to control an MP3 player while walking down a road with the device in their pocket. The PocketMenu outperformed VoiceOver in terms of completion time, selection errors, and subjective usability.

This work will be presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which takes place in September 2012 in San Francisco. The paper is available here (pdf).
