Non-visual Interaction with Mobile Devices

Before the proliferation of smartphones and tablets, using a computer meant sitting down at a desk in a warm, calm room. Today, we use mobile computing devices to be online and connected anywhere and anytime. Interaction with these devices therefore takes place in more demanding environments: freezing cold may impair your motor skills, loud noise may impair your hearing, and usage on the move may frequently disrupt your attention. My past research was about overcoming these limitations with appropriate user interfaces. In my PhD research, I have been focusing on providing spatial information through non-visual displays for location-based services.

Below, you will find selected research projects I have been working on as part of my PhD at the Intelligent User Interfaces group at the OFFIS Institute for Information Technology.

2012

Dude, Where’s My Car? In-Situ Evaluation of a Tactile Car Finder.

We published a Vibro-Tactile Car Finder app on Google Play. When users point the phone at the location of their car, the phone vibrates. We found that half of the users used the tactile feedback. Further, we provide evidence that users are less distracted from the environment with tactile feedback (more). Published at ACM NordiCHI ’12 (pdf) (app).
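To make the pointing idea concrete, here is a minimal Java sketch of the underlying geometry, with hypothetical names and thresholds (not the published app’s code): the phone “hits” the car when the compass heading falls within a tolerance cone around the bearing from the user to the parked position.

```java
// Minimal sketch of the pointing logic (hypothetical names and values):
// vibrate when the device heading roughly matches the bearing to the car.
public class CarFinderSketch {

    // Great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees.
    static double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dl = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dl) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2)
                 - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    // True if the phone points at the car within a tolerance cone.
    static boolean pointsAtCar(double headingDeg, double targetDeg, double toleranceDeg) {
        // Signed angular difference, normalised to [-180, 180).
        double diff = ((headingDeg - targetDeg) % 360.0 + 540.0) % 360.0 - 180.0;
        return Math.abs(diff) <= toleranceDeg;
    }

    public static void main(String[] args) {
        double carBearing = bearingDeg(53.14, 8.21, 53.15, 8.22); // user -> car
        System.out.println(pointsAtCar(30.0, carBearing, 15.0)
                ? "vibrate" : "stay silent");
    }
}
```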

Tacticycle: Supporting Exploratory Bicycle Trips.

Tacticycle is the concept of providing a minimal set of navigation cues for exploratory bicycle trips; we show that it supports tourists well on their excursions. We studied the prototype with tourists on the island of Borkum and found that the participants always felt oriented, were encouraged to explore the environment, appreciated its eyes-free use, and had a relaxed travel experience (more). Published at ACM MobileHCI ’12 (pdf) (video).

PocketMenu: Non-Visual Menus for Touch Screen Devices.

The PocketMenu enables using menus on touch screens without looking at the device. Thanks to the clever alignment of the buttons along the screen’s bezel and the vibration feedback when crossing the thin grey lines between them, the finger is “guided” along the menu items, so the menu can be browsed without taking the phone out of the pocket. The items’ functions are announced via speech to allow eyes-free use. We experimentally compared the PocketMenu with the iPhone’s VoiceOver, asking people to use both menus on the move to control an MP3 player. The results provide evidence that in this context the PocketMenu outperforms VoiceOver in terms of completion time, selection errors, and subjective usability (more). Published at ACM MobileHCI ’12 (pdf) (video).
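As an illustration of how such a bezel-aligned menu could work, here is a minimal Java sketch with hypothetical names and values (not the published implementation): touch positions along the screen edge map to items, and a vibration tick plus a spoken label fire whenever the finger crosses into a new item.

```java
// Minimal sketch of the bezel-menu idea (illustrative names and sizes):
// items are stacked along one screen edge, and a short vibration marks
// every item boundary the finger crosses.
public class PocketMenuSketch {
    private final String[] items = {"Play", "Pause", "Next", "Previous"};
    private final int screenHeightPx = 800;
    private int lastIndex = -1;

    // Called for every touch move along the bezel.
    void onTouchMove(int yPx) {
        int index = Math.min(items.length - 1,
                yPx * items.length / screenHeightPx);
        if (index != lastIndex) {
            vibrateShort();      // tactile tick at the item boundary
            speak(items[index]); // announce the item for eyes-free use
            lastIndex = index;
        }
    }

    void vibrateShort() { System.out.println("bzzt"); }
    void speak(String text) { System.out.println("TTS: " + text); }

    public static void main(String[] args) {
        PocketMenuSketch menu = new PocketMenuSketch();
        for (int y = 0; y < 800; y += 150) menu.onTouchMove(y);
    }
}
```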

PocketNavigator: Studying Tactile Navigation Systems In-Situ in the Large.

We developed a map-based pedestrian navigation system to study the value of vibro-tactile navigation instructions in the wild. The application was released on the Android Market, and by collecting anonymous usage data we found evidence that users are less distracted when they use the tactile feedback (more). Published at ACM CHI 2012 (pdf) (video).

2011

6th Senses for Everyone!

In a field experiment we compared conveying navigation instructions through different sensory modalities, namely vision, the sense of touch, and both at the same time. The results provide evidence that combining both modalities leads to more efficient navigation, while using only tactile feedback reduces the user’s distraction (more). Published at ICMI 2011 (pdf).

TouchOver Map

The TouchOver Map is a prototype that investigates the use of speech and vibration to make maps accessible on touch screen devices. Our approach is dead simple: as long as the user touches a street, the phone vibrates and the street name is spoken. In a pilot study we found that users can roughly reproduce the maps displayed by the TouchOver Map (more). Published at ACM MobileHCI 2011 (pdf) (video).
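A minimal sketch of that feedback loop might look like the following Java snippet; the street data, the crude rectangle hit-test, and the console stand-ins for vibration and text-to-speech are all illustrative simplifications, not the prototype’s code.

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the TouchOver feedback loop: while the finger is on
// a street, vibrate continuously and speak the street's name once.
public class TouchOverMapSketch {
    static class Street {
        final String name; final int x, y, w, h;
        Street(String name, int x, int y, int w, int h) {
            this.name = name; this.x = x; this.y = y; this.w = w; this.h = h;
        }
        boolean contains(int px, int py) {
            return px >= x && px < x + w && py >= y && py < y + h;
        }
    }

    final List<Street> streets = Arrays.asList(
            new Street("Main Street", 0, 100, 480, 20),
            new Street("Harbour Road", 200, 0, 20, 800));
    String lastSpoken = null;

    void onTouch(int px, int py) {
        for (Street s : streets) {
            if (s.contains(px, py)) {
                System.out.println("vibrate");     // continuous tactile cue
                if (!s.name.equals(lastSpoken)) {  // speak each street once
                    System.out.println("TTS: " + s.name);
                    lastSpoken = s.name;
                }
                return;
            }
        }
        lastSpoken = null;                         // finger left the street
    }

    public static void main(String[] args) {
        TouchOverMapSketch map = new TouchOverMapSketch();
        map.onTouch(100, 110); // on Main Street: vibrate and speak
        map.onTouch(120, 110); // still on it: vibrate, no repeated speech
        map.onTouch(50, 50);   // off-street: silence
    }
}
```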

FriendSense

FriendSense is an application of the Tactile Compass as a non-visual way of sensing your friends’ locations. We evaluated it at a large festival, where it kept groups of friends informed about each other’s locations. Probing the groups’ moods via experience sampling, we found that the ability to sense one’s friends can significantly improve the visitors’ mood and confidence (more). Published at ACM CHI 2011 (pdf).

The Tactile Compass

The Tactile Compass is the concept of cueing spatial directions and distances, such as “2 o’clock, 200 m”, in vibration patterns. The relative length of two vibration pulses tells the user the direction of a geographic location, and the pause between sets of pulses indicates its distance. In field studies we showed that the Tactile Compass can effectively cue the location of a reference place, such as a travel destination, and thereby has a positive effect on the user’s attention. Published at IFIP INTERACT 2011 (pdf).
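One possible way to turn such a direction/distance pair into a pulse pattern is sketched below in Java. The concrete durations and mappings are illustrative assumptions, not the values used in the paper: the ratio of two pulse lengths encodes direction, and the pause after the pulse pair grows with distance.

```java
// Illustrative pulse-pattern encoding along the lines the text describes.
public class TactileCompassSketch {

    // Direction in degrees (0 = ahead), distance in metres.
    // Returns {pulse1, gap, pulse2, pause} in milliseconds.
    static long[] encode(double directionDeg, double distanceM) {
        double t = ((directionDeg % 360.0) + 360.0) % 360.0 / 360.0; // 0..1
        long first  = Math.round(100 + 400 * t);  // grows with the angle
        long second = Math.round(500 - 400 * t);  // shrinks with the angle
        long pause  = Math.round(Math.min(2000, distanceM)); // caps at 2 s
        return new long[]{first, 100, second, pause};
    }

    public static void main(String[] args) {
        // "2 o'clock, 200 m" is roughly 60 degrees to the right.
        long[] pattern = encode(60.0, 200.0);
        System.out.println(java.util.Arrays.toString(pattern));
    }
}
```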

2010

PocketNavigator

The PocketNavigator is an OpenStreetMap-based map and navigation system for pedestrians. It is part of the HaptiMap project, where we develop a toolkit to make map-based applications accessible. The accessibility feature we are testing in the PocketNavigator is navigation by vibration. Published at ACM MobileHCI 2010 (pdf).

Natch: A Watch-like Display for Less Distracting Pedestrian Navigation

Natch – short for Navigation Watch – is the result of a student project: the prototype of a navigation system stripped down to the most necessary features. In a field study we showed that Natch significantly reduces the traveler’s distraction compared to a commercial navigation system while delivering similar navigation performance. Published at Mensch und Computer 2010 (pdf).

Sensing your Friends’ Locations via the Sense of Touch

We investigated presenting the spatial locations (direction and distance) of people or objects relative to the user with a vibro-tactile belt. In lab experiments we showed that people can get a rough understanding of locations via the sense of touch. Specifically, we compared different distance encodings and found that the rhythm-based encoding was the most accurate (more). Published at ACM CHI 2010 (pdf).
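The following Java sketch illustrates the two ingredients with hypothetical parameters (motor count, thresholds, and timings are assumptions, not the study’s values): a motor is selected by direction, and the rhythm, i.e. the gap between pulses, encodes distance.

```java
// Minimal sketch of the belt logic: pick the tactor closest to the
// target direction; encode distance as a rhythm (slower = farther).
public class TactileBeltSketch {
    static final int MOTORS = 8; // evenly spaced around the waist

    // Index of the motor pointing most closely toward directionDeg.
    static int motorFor(double directionDeg) {
        double d = ((directionDeg % 360.0) + 360.0) % 360.0;
        return (int) Math.round(d / (360.0 / MOTORS)) % MOTORS;
    }

    // Rhythm-based distance code: the pulse gap grows with distance.
    static long pulseGapMs(double distanceM) {
        if (distanceM < 50)  return 200;  // near: fast rhythm
        if (distanceM < 200) return 600;  // mid: moderate rhythm
        return 1200;                      // far: slow rhythm
    }

    public static void main(String[] args) {
        System.out.println("motor " + motorFor(95.0)
                + ", gap " + pulseGapMs(120.0) + " ms");
    }
}
```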

Where is my Team? Supporting Situation Awareness with Tactile Displays

The previously developed tactile location-sensing interface was tested in the 3D multiplayer game Counter-Strike. The teams were equipped with the interface in alternating order, and we used it to keep them informed about the locations of their team members. In an experiment we showed that the location information could be processed effectively and led to increased situation awareness (more). Published at ACM CHI 2010 (pdf).

2009

Tacticycle – Supporting Tourists on a Bicycle Trip

We took a common bicycle and enhanced the handlebar with two vibration motors, one at each bar end. The vibration motors were connected to a handheld device showing only interesting sights around the user in a radar-like way. If the user selected one of the sights, the vibration motors would indicate its general direction, similar to a compass. Published at ACM MobileHCI 2009 (pdf).
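A simple way to drive two handlebar motors like a compass is to pan the vibration intensity between left and right according to the target’s bearing relative to the riding direction. The Java sketch below shows one such mapping; it is an assumed illustration, not the prototype’s actual control code.

```java
// Illustrative left/right intensity panning for two handlebar motors.
public class TacticycleSketch {

    // relativeBearingDeg: 0 = straight ahead, negative = left, positive = right.
    // Returns {leftIntensity, rightIntensity}, each in 0..1.
    static double[] intensities(double relativeBearingDeg) {
        double b = Math.max(-90, Math.min(90, relativeBearingDeg)); // clamp
        double right = (b + 90) / 180.0;
        double left  = 1.0 - right;
        return new double[]{left, right};
    }

    public static void main(String[] args) {
        double[] lr = intensities(-30.0); // sight slightly to the left
        System.out.printf("left %.2f, right %.2f%n", lr[0], lr[1]);
    }
}
```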

Vibration-enhanced Paper Map Navigation

[Image: a traveler with a map and our vibro-tactile belt; the belt vibrates in the direction of the traveler’s destination.]
We used a vibro-tactile belt to support paper-map-based pedestrian navigation. By turning on the vibration motor pointing most closely toward the destination, we continuously cued its location to the navigator. In a field experiment we showed that with the tactile belt people depend less on the map, lose their orientation less often, and take shorter routes (more). Published at ACM MobileHCI 2009 (pdf).

2008

Continuous Direction Encoding with Tactile Belts

We investigated whether the apparent location effect (two touches on the skin are perceived as a single touch between them) can be applied to tactile belts to enable displaying a continuous 360° range of horizontal directions. We found that directions can be presented more accurately, at the expense of a higher cognitive workload. Published at HAID 2008 (pdf).
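The Java sketch below shows the basic interpolation idea with an illustrative linear mapping (the study may well have used a different intensity function): two neighbouring tactors are driven with complementary intensities to place a phantom sensation between them.

```java
// Illustrative phantom-sensation interpolation between adjacent tactors.
public class FunnelingSketch {
    static final int MOTORS = 8; // one tactor every 45 degrees

    // Returns {motorA, intensityA, motorB, intensityB} for a direction.
    static double[] phantom(double directionDeg) {
        double d = ((directionDeg % 360.0) + 360.0) % 360.0;
        double pos = d / (360.0 / MOTORS); // fractional motor position
        int a = (int) Math.floor(pos) % MOTORS;
        int b = (a + 1) % MOTORS;
        double frac = pos - Math.floor(pos); // 0 at motor a, 1 at motor b
        return new double[]{a, 1.0 - frac, b, frac};
    }

    public static void main(String[] args) {
        double[] p = phantom(60.0); // between motors at 45° and 90°
        System.out.printf("motor %.0f at %.2f, motor %.0f at %.2f%n",
                p[0], p[1], p[2], p[3]);
    }
}
```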

2007

Tangible UI for Auditory City Maps

We developed a tangible user interface for exploring sonified city maps. The user could move a small physical icon over a board and listen to natural non-speech sounds emitted by the map’s geographical features from the icon’s perspective. We showed that the physical nature of the interface benefited how well the user could explore the map. Published at HAID 2007 (pdf).