
Tacticycle: Supporting Exploratory Bicycle Trips

Navigation systems have become a common tool for most of us. They conveniently guide us from A to B along the fastest or shortest route. Thanks to these devices, we no longer fear getting lost when traveling through unfamiliar terrain.

However, what if you are a cyclist whose goal is an excursion rather than reaching a particular destination, and all you want is to stay oriented and perhaps learn about interesting spots nearby? In that case, using a navigation system becomes more challenging. You have to look up the addresses of interesting places and enter them as (intermediate) destinations. Sometimes the navigation system does not even know all the small paths, so you end up checking the map frequently, which is dangerous when done on the move.

The Tacticycle is a research prototype of a navigation system specifically targeted at tourists on bicycle trips. Relying on a minimal set of navigation cues, it helps riders stay oriented while supporting spontaneous navigation and exploration at the same time. It offers three novel features:

  1. First, it displays all POIs around the user explicitly on a map. A double-tap quickly selects a POI as the travel destination, so no searching for addresses is required.
  2. Second, the system relies on a tactile user interface, i.e. it provides navigation support via vibration. Thus, the rider does not have to look at the display while riding.
  3. Third, the Tacticycle does not deliver turn-by-turn instructions. Instead, the vibration feedback simply indicates the direction of the selected POI “as the crow flies”. This allows travelers to find their own route.
The “as the crow flies” direction of the selected POI is encoded in the relative vibration strength of the two actuators in the handlebars. In this picture, the POI is about 20° to the right, so the vibration in the right handlebar is a little stronger.
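
To make the encoding concrete, here is a minimal sketch of how a relative bearing could be mapped to two vibration intensities. The linear weighting and the 90° saturation angle are illustrative assumptions, not the exact mapping used in the prototype.

```python
def handlebar_intensities(bearing_to_poi, heading, max_angle=90.0):
    """Map the 'as the crow flies' direction of the selected POI to relative
    vibration intensities (0..1) for the left and right handlebar actuators.

    bearing_to_poi: compass bearing from the rider to the POI, in degrees
    heading:        current riding direction, in degrees
    max_angle:      deviation at which one side vibrates alone (assumed 90 deg)
    """
    # Signed angle between riding direction and POI direction, in (-180, 180]
    offset = (bearing_to_poi - heading + 180.0) % 360.0 - 180.0
    # Normalize to [-1, 1]; positive values mean the POI lies to the right
    x = max(-1.0, min(1.0, offset / max_angle))
    right = 0.5 * (1.0 + x)
    left = 1.0 - right
    return left, right

# Example from the caption: a POI about 20 degrees to the right
print(handlebar_intensities(20.0, 0.0))  # roughly (0.39, 0.61): right side slightly stronger
```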

In cooperation with a bike rental service, we rented the Tacticycle prototype to tourists who took it on their actual excursions. The results show that they always felt oriented and were encouraged to playfully explore the island, providing a rich yet relaxed travel experience. On the basis of these findings, we argue that providing only minimal navigation cues can very well support exploratory trips.

This work has been presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which took place in September 2012 in San Francisco. The paper is available here (pdf).


PocketMenu: Non Visual Menus for Touch Screen Devices

It’s a chilly Sunday afternoon and you are out for a walk, listening to music from your MP3 player, and you want to select the next song. How do you do that?

A few years ago, you probably didn’t even take the MP3 player out of your pocket. You just used your fingers to feel for the shape of the right button and pressed it.

Today, we no longer own dedicated MP3 players but use our smartphones instead. And since most input on modern smartphones happens via large touch screen displays, you need to take the phone out of your pocket, unlock the screen, and visually locate the button to press it.

The PocketMenu addresses this problem by providing haptic and auditory feedback that allows in-pocket input. It combines clever ideas from previous research on touch screen interaction for people with sensory and motor impairments in a novel way.

All menu items are laid out along the screen bezel. The bezel therefore serves as a haptic guide for the finger. Additional speech and vibration output allow identifying the items and obtaining more information. Watch the video to see exactly how the interaction works.
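
As a rough illustration of the concept (not the actual implementation), the sketch below stacks the items along one screen edge, announces each item with a short vibration tick and a spoken label as the finger slides across it, and treats lifting the finger as the selection gesture, which is an assumption here; the vibrate(), speak(), and activate() helpers are stand-ins for platform APIs.

```python
MENU = ["Play/Pause", "Next track", "Previous track", "Volume up", "Volume down"]

# Stand-ins for platform vibration / text-to-speech / action APIs
def vibrate(ms): print(f"[vibrate {ms} ms]")
def speak(text): print(f"[speak] {text}")
def activate(item): print(f"[activate] {item}")

def item_at(y, screen_height, items=MENU):
    """Index of the menu item under a touch at vertical position y (pixels)."""
    slot = int(y / (screen_height / len(items)))
    return min(max(slot, 0), len(items) - 1)

current = None

def on_touch_move(y, screen_height):
    """Finger slides along the bezel: tick and speak when crossing an item boundary."""
    global current
    idx = item_at(y, screen_height)
    if idx != current:
        current = idx
        vibrate(30)        # short tick marks the item boundary
        speak(MENU[idx])   # spoken label identifies the item without looking

def on_touch_up():
    """Lifting the finger selects the current item (assumed selection gesture)."""
    if current is not None:
        activate(MENU[current])
```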

In a field experiment, we compared the PocketMenu concept with the state-of-the-art VoiceOver concept that is shipped with the iPhone. The participants had to control an MP3 player while walking down a road with the device in the pocket. The PocketMenu outperformed VoiceOver in terms of completion time, selection errors, and subjective usability.

This work will be presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which takes place in September 2012 in San Francisco. The paper is available here (pdf).


A Tactile Compass for Eyes-free Pedestrian Navigation

The idea came up when I was heading back to the hotel from a conference dinner at MobileHCI 2008 in Amsterdam. I had no sense of orientation. The only guide I had was a map on my Nokia phone. Since I was not familiar with Amsterdam, the route led me right through the busy areas of the city center.

The day before, a cyclist had stolen a mobile phone right out of the hand of another conference attendee. Knowing that made me quite afraid that something similar could happen to me too. Without the phone, I would have been completely lost.

Here, serendipity hit. Since my research group was already working on tactile displays for navigation and orientation, I wondered whether it was possible to create a navigation system for mobile phones that guides by vibration alone, so the phone could be left in the pocket.

Back at OFFIS we quickly tested a few prototypes, including a hot/cold metaphor and a compass metaphor. The compass metaphor prevailed. The design encodes the direction the user should be heading (forward, left, right, backwards) in different vibration patterns. Our test participants liked that design most. Later, we tested the vibration compass design in a forest and found that it can replace navigation with a map.
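
The sketch below illustrates the compass metaphor with four distinct vibration patterns; the concrete pulse timings and the 45° sector boundaries are illustrative assumptions, not the values used in our studies.

```python
# Distinct vibration patterns (pulse durations in ms); timings are illustrative
PATTERNS = {
    "ahead": [100],            # single short pulse: keep going straight
    "right": [100, 400],       # short-long: the target is to the right
    "left":  [400, 100],       # long-short: the target is to the left
    "back":  [400, 400, 400],  # three long pulses: turn around
}

def compass_pattern(bearing_to_target, heading):
    """Pick the vibration pattern for the current deviation from the target bearing."""
    offset = (bearing_to_target - heading + 180.0) % 360.0 - 180.0
    if -45.0 <= offset <= 45.0:
        return PATTERNS["ahead"]
    if 45.0 < offset <= 135.0:
        return PATTERNS["right"]
    if -135.0 <= offset < -45.0:
        return PATTERNS["left"]
    return PATTERNS["back"]

# Example: the target lies 90 degrees to the left of the walking direction
print(compass_pattern(270.0, 0.0))  # -> [400, 100]
```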

The development and the studies were presented at the 13th IFIP TC13 Conference on Human-Computer Interaction (INTERACT) in Lisbon, Portugal, in September 2011. The article is available here.

If you own an Android phone, you can try this vibration compass by downloading our PocketNavigator navigation application for free from the Android Market.

 


Counter-Strike? Research?

Eight people in the twilight, staring at eight screens, hammering their keyboards, wielding their virtual guns … and all for research purposes?

Counter-Strike is a well-known team-based first-person shooter game. Since shooting is an essential part of the gameplay, it has a bad image in the public eye. Whenever a young gunman runs amok, Counter-Strike is the first thing many conservative politicians blame.

But games such as Counter-Strike can be fantastic research tools. Players need to process a large amount of information as quickly as possible to be successful. For user interface researchers, this comes in handy when novel interfaces have to be tested in high-cognitive-workload situations.

In our previous work we had developed a tactile user interface that lets users sense the location of other people via their sense of touch. To test this user interface, we integrated it into Counter-Strike. It gives the wearer a sense of where their teammates are at any time.

The direction of a teammate is indicated by where the vibration occurs. The distance is indicated by the number of pulses.
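
Here is a minimal sketch of that encoding, assuming a torso-worn array of eight vibration actuators; the actuator count, the distance thresholds, and the choice of more pulses for closer teammates are illustrative assumptions, not the parameters used in the study.

```python
NUM_ACTUATORS = 8  # assumed: eight actuators spaced evenly around the torso

def encode_teammate(relative_angle, distance):
    """Return (actuator index, pulse count) for one teammate.

    relative_angle: teammate direction relative to the player's view direction,
                    in degrees (0 = straight ahead, increasing clockwise)
    distance:       distance to the teammate, in game units
    """
    # Direction -> which actuator around the torso vibrates
    sector = round((relative_angle % 360.0) / (360.0 / NUM_ACTUATORS)) % NUM_ACTUATORS
    # Distance -> number of pulses (thresholds and ordering are assumptions)
    if distance < 500:
        pulses = 3
    elif distance < 1500:
        pulses = 2
    else:
        pulses = 1
    return sector, pulses

# Example: teammate slightly behind and to the right, at medium distance
print(encode_teammate(150.0, 800.0))  # -> (3, 2)
```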

 

We conducted a study in which two teams of participants competed against each other. The teams were alternately equipped with the tactile location sensing system. We found that the system increased a team’s situation awareness and performance. Despite the game’s high cognitive demands, the participants were able to interpret the tactile cues.

And the best part is … this project scored a publication at the most prestigious scientific conference on human-computer interaction: the ACM Conference on Human Factors in Computing Systems (CHI).
