
PocketMenu: Non-Visual Menus for Touch Screen Devices

It’s a chilly Sunday afternoon and you are out for a walk, listening to music from your MP3 player, when you want to skip to the next song. How do you do that?

A few years ago, you probably didn’t even take the MP3 player out of your pocket. You simply felt for the shape of the next-track button and pressed it.

Today, most of us no longer own dedicated MP3 players; we use our smartphones instead. And since input on modern smartphones happens almost entirely through large touch-screen displays, you have to take the phone out of your pocket, unlock the screen, and visually locate the button before you can press it.

The PocketMenu addresses this problem by providing haptic and auditory feedback that allows in-pocket input. It combines ideas from previous research on touch-screen interaction for people with sensory and motor impairments in a novel way.

All menu items are laid out along the screen bezel, so the bezel serves as a haptic guide for the finger. Additional speech and vibration output lets the user identify the items and obtain more information. Watch the video to see exactly how the interaction works.
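
To make the idea more concrete, here is a minimal Android sketch of such a bezel menu. It illustrates the general concept only and is not the actual PocketMenu implementation: the item names, vibration timing, and the assumption that lifting the finger selects the current item are invented for this example.

```java
import android.app.Activity;
import android.content.Context;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;
import android.view.MotionEvent;
import android.view.View;

/**
 * Illustrative bezel menu: items occupy equal-height segments along one screen
 * edge; sliding the finger onto a new segment triggers a vibration pulse and the
 * spoken item name, and lifting the finger selects the item it last rested on
 * (a simplifying assumption made for this sketch).
 */
public class BezelMenuTouchListener implements View.OnTouchListener {

    private final String[] items = { "Play/Pause", "Next", "Previous", "Volume up", "Volume down" };
    private final Vibrator vibrator;
    private final TextToSpeech tts;
    private int lastItem = -1;

    public BezelMenuTouchListener(Activity activity, TextToSpeech tts) {
        this.vibrator = (Vibrator) activity.getSystemService(Context.VIBRATOR_SERVICE);
        this.tts = tts;
    }

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        // Map the finger's vertical position along the bezel to a menu item index.
        int item = (int) (event.getY() / view.getHeight() * items.length);
        item = Math.max(0, Math.min(items.length - 1, item));

        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_MOVE:
                if (item != lastItem) {
                    // Entering a new item: short vibration pulse plus spoken label.
                    vibrator.vibrate(40);
                    tts.speak(items[item], TextToSpeech.QUEUE_FLUSH, null);
                    lastItem = item;
                }
                return true;
            case MotionEvent.ACTION_UP:
                onItemSelected(items[item]);   // hook the actual player command in here
                lastItem = -1;
                return true;
            default:
                return false;
        }
    }

    protected void onItemSelected(String item) {
        // Placeholder: trigger the corresponding MP3 player action.
    }
}
```

The essential point is that the mapping from finger position to item is anchored at the physical bezel, so the finger can find items by touch alone, without any visual feedback.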

In a field experiment, we compared the PocketMenu with VoiceOver, the state-of-the-art screen reader that ships with the iPhone. The participants had to control an MP3 player while walking down a road with the device in their pocket. The PocketMenu outperformed VoiceOver in terms of completion time, selection errors, and subjective usability.

This work will be presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which takes place in September 2012 in San Francisco. The paper is available here (pdf).


TouchOver Map


Touch screens and interactive maps are commonplace nowadays; we find both in modern smartphones and location-based services. In the HaptiMap project (FP7-ICT-224675), we aim at making maps and location-based services accessible. My colleagues Benjamin Poppinga, Charlotte Magnusson, Kirsten Rassmus-Gröhn, and I therefore investigated how to make maps on touch screens accessible to visually impaired users.

We developed TouchOver Map, a simple prototype for investigating the feasibility of speech and vibration feedback. It allows users to explore a map non-visually (currently the street network, to be exact). Our approach is dead simple: as long as the user touches a street, the phone vibrates and the street name is spoken.
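
The core of this behavior can be sketched in a few lines of Android code. The following listener is hypothetical and not the actual TouchOver Map implementation: it assumes the streets are available as named polylines in screen coordinates and simply keeps vibrating and announces the name while the finger stays within a small radius of a street.

```java
import android.content.Context;
import android.graphics.PointF;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;
import android.view.MotionEvent;
import android.view.View;

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/**
 * Illustrative "touch-over" listener: while the finger stays close to a street
 * polyline, the phone vibrates and the street name is spoken.
 */
public class TouchOverMapListener implements View.OnTouchListener {

    private static final float HIT_RADIUS_PX = 20f;  // how close the finger must be to a street

    // Street name -> polyline in screen coordinates (to be filled from the rendered map).
    private final Map<String, List<PointF>> streets = new LinkedHashMap<String, List<PointF>>();

    private final Vibrator vibrator;
    private final TextToSpeech tts;
    private String lastStreet = null;

    public TouchOverMapListener(Context context, TextToSpeech tts) {
        this.vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        this.tts = tts;
    }

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        String street = streetAt(event.getX(), event.getY());
        if (street == null) {
            vibrator.cancel();                               // off the street network: silence
        } else if (!street.equals(lastStreet)) {
            vibrator.vibrate(new long[] { 0, 80, 40 }, 0);   // keep vibrating while on a street
            tts.speak(street, TextToSpeech.QUEUE_FLUSH, null);
        }
        lastStreet = street;
        if (event.getAction() == MotionEvent.ACTION_UP) {
            vibrator.cancel();
            lastStreet = null;
        }
        return true;
    }

    /** Returns the name of the street under (x, y), or null if none is close enough. */
    private String streetAt(float x, float y) {
        for (Map.Entry<String, List<PointF>> entry : streets.entrySet()) {
            List<PointF> line = entry.getValue();
            for (int i = 0; i < line.size() - 1; i++) {
                if (distanceToSegment(x, y, line.get(i), line.get(i + 1)) <= HIT_RADIUS_PX) {
                    return entry.getKey();
                }
            }
        }
        return null;
    }

    /** Shortest distance from point (x, y) to the line segment a-b. */
    private static float distanceToSegment(float x, float y, PointF a, PointF b) {
        float dx = b.x - a.x, dy = b.y - a.y;
        float lengthSq = dx * dx + dy * dy;
        float t = lengthSq == 0 ? 0 : ((x - a.x) * dx + (y - a.y) * dy) / lengthSq;
        t = Math.max(0f, Math.min(1f, t));
        float px = a.x + t * dx, py = a.y + t * dy;
        return (float) Math.hypot(x - px, y - py);
    }
}
```

The real prototype works on rendered OpenStreetMap data rather than hand-filled polylines, but the hit-test-then-feedback loop is the essence of the interaction.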

To evaluate how well people can understand street layouts with TouchOver Map, we conducted a user study. Eight sighted participants explored the map while the device was covered by an empty box, so they could not see the screen.

While exploring the map, the participants were asked to reproduce it on a piece of paper. Although the results are far from perfect, they were able to reproduce the streets and their spatial relationships.

Obviously, our study has a number of limitations. There were only a few testers, and none of them was blind. TouchOver Map also displayed only streets and no other geographic features. Finally, the non-visual rendering could clearly benefit from fine-tuning and clever filtering of the features to display. Nevertheless, our pilot study shows that it is possible to convey geographical features via touch-screen devices by making them “visible” through speech and vibration.

TouchOver Map is a collaboration between Certec, the Division of Rehabilitation Engineering Research in the Department of Design Sciences, Faculty of Engineering, Lund University, and the Intelligent User Interfaces Group of the OFFIS Institute for Information Technology, Oldenburg, Germany. It was published as a Work-in-Progress at the 13th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’11).

The paper can be downloaded from here.



In Situ Field Studies Using the Android Market

Recently, researchers have started to investigate using app distribution channels, such as Apple’s App Store or Google’s Android Market, to bring the research to the users instead of bringing the users into the lab.

My colleague Niels, for example, used this approach to study how people interact with the touch screens of mobile phones. Instead of collecting touch events in a boring, repetitive task, he developed a game in which users burst bubbles by touching them. And instead of conducting the study in the sterile environment of a lab, he published the game on the Android Market for free, where it was installed and used by hundreds of thousands of users. While these users were enjoying the game, they generated millions of touch events. Unlike in traditional lab studies, this data was collected from all over the world and from many different contexts of use. The results were reported at MobileHCI ’11 and were received enthusiastically.

Since my work is on pedestrian navigation systems and on conveying navigation instructions via vibration feedback, lab studies are often not sufficient. Instead, we have to go out and conduct our experiments in the field, e.g. by having people navigate through a busy city center.

So, if we can bring lab studies “into the wild”, can we do the same with field experiments?

My colleague Benjamin and I started addressing this question in 2010. We developed a consumer-grade pedestrian navigation application called PocketNavigator and released it on the Android Market for free. Then, we developed algorithms that infer the specific usage patterns we were interested in, for example whether users follow the given navigation instructions or not. We also built a system that lets the PocketNavigator collect these usage patterns, along with relevant context parameters, and send them to one of our servers. As a side note, the collected data contains no personally identifiable information, so it does not allow us to identify, locate, or contact users.
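
To give a flavor of what this looks like, here is a hedged sketch of such anonymous usage logging. It is not the actual PocketNavigator code; the event types, field names, and server URL are invented for illustration. The key property is that each record carries usage and context data, but nothing that could be traced back to a device or user.

```java
import org.json.JSONException;
import org.json.JSONObject;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.Charset;

/**
 * Illustrative anonymous usage logging: each record carries usage and context
 * data, but no device ID, account, or anything else that identifies the user.
 */
public class UsageLogger {

    private static final String ENDPOINT = "https://example.org/log";  // hypothetical server

    /** Builds one anonymous usage record, e.g. "the user deviated from the suggested route". */
    public static JSONObject buildEvent(String type, boolean vibrationEnabled,
                                        double speedMetersPerSecond, boolean screenOn)
            throws JSONException {
        JSONObject event = new JSONObject();
        event.put("type", type);                       // e.g. "route_followed", "route_deviation"
        event.put("timestamp", System.currentTimeMillis());
        event.put("vibration_enabled", vibrationEnabled);
        event.put("speed_mps", speedMetersPerSecond);  // context: walking speed
        event.put("screen_on", screenOn);              // context: was the display active?
        // Deliberately no device ID, account name, or raw GPS trace.
        return event;
    }

    /** Sends one record to the collection server; call this off the UI thread. */
    public static void upload(JSONObject event) throws Exception {
        HttpURLConnection connection = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        connection.setDoOutput(true);
        connection.setRequestMethod("POST");
        connection.setRequestProperty("Content-Type", "application/json");
        OutputStream out = connection.getOutputStream();
        out.write(event.toString().getBytes(Charset.forName("UTF-8")));
        out.close();
        connection.getResponseCode();   // forces the request to be sent
        connection.disconnect();
    }
}
```

In a deployed system one would presumably buffer such records and upload them in batches to save battery and bandwidth; the sketch only shows the shape of the data.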

With this setup, we conducted a quasi-experiment. Since my research is about the effect of vibration feedback on navigation performance and the user’s level of distraction, we compared the usage patterns of trips where the vibration feedback was turned on with those where it was turned off. Our results show that the vibration feedback was used in 29.9 % of the trips, with no effect on navigation performance. However, we found evidence that users interacted less with the touch screen, looked at the display less often, and turned the screen off more often. Hence, we believe that users were less distracted.

The full report of this work was accepted to the prestigious ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’12) and was presented in May 2012 in Austin, Texas.

The paper can be downloaded from here.
