
PocketMenu: Non-Visual Menus for Touch Screen Devices

It’s a chilly Sunday afternoon and you are out for a walk, listening to music from your MP3 player, and you want to select the next song. How do you do that?

A few years ago you probably didn’t even take the MP3 player out of your pocket. You just used your fingers to feel for the shape of the next button and pressed it.

Today, we don’t own dedicated MP3 players anymore but use our smartphones. And since most input on modern smartphones happens via large touch screen displays, you need to take the phone out of your pocket, unlock the screen, and visually locate the button to press it.

The PocketMenu addresses this problem by providing haptic and auditory feedback that allows in-pocket input. It combines, in a novel way, clever ideas from previous research on touch screen interaction for users with sensory and motor impairments.

All menu items are laid out along the screen bezel, so the bezel serves as a haptic guide for the finger. Additional speech and vibration output allows identifying the items and obtaining more information. Watch the video to see exactly how the interaction works.
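
To make the interaction concrete, here is a minimal Android sketch of such a bezel-guided menu (an illustration, not the actual PocketMenu code; the item names, timings, and selection-on-lift behaviour are assumptions):

    import android.content.Context;
    import android.os.Vibrator;
    import android.speech.tts.TextToSpeech;
    import android.view.MotionEvent;
    import android.view.View;

    // Sketch of a bezel-guided, non-visual menu (not the actual PocketMenu
    // implementation). Items are stacked along the bezel; sliding the finger
    // over an item vibrates and speaks its name, lifting the finger selects.
    public class BezelMenuView extends View {
        private final String[] items = {"Play/Pause", "Next", "Previous", "Volume"}; // assumed items
        private final Vibrator vibrator;
        private final TextToSpeech tts;
        private int currentItem = -1;

        public BezelMenuView(Context context, TextToSpeech tts) {
            super(context);
            this.vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
            this.tts = tts;
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                case MotionEvent.ACTION_MOVE:
                    // Map the vertical finger position along the bezel to an item slot.
                    int slot = (int) (event.getY() / (getHeight() / (float) items.length));
                    slot = Math.max(0, Math.min(items.length - 1, slot));
                    if (slot != currentItem) {
                        currentItem = slot;
                        vibrator.vibrate(40); // short tick marks the item boundary
                        tts.speak(items[slot], TextToSpeech.QUEUE_FLUSH, null);
                    }
                    return true;
                case MotionEvent.ACTION_UP:
                    if (currentItem >= 0) {
                        select(items[currentItem]); // lifting the finger selects
                    }
                    return true;
            }
            return super.onTouchEvent(event);
        }

        private void select(String item) {
            // Hypothetical hook: trigger the associated player action here.
        }
    }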

In a field experiment, we compared the PocketMenu concept with the state-of-the-art VoiceOver concept that ships with the iPhone. Participants had to control an MP3 player while walking down a road with the device in their pocket. The PocketMenu outperformed VoiceOver in terms of completion time, selection errors, and subjective usability.

This work will be presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which takes place in September 2012 in San Francisco. The paper is available here (pdf).


TouchOver Map


Touch screens and interactive maps are commonplace nowadays; we find both in modern smartphones and location-based services. In the HaptiMap project (FP7-ICT-224675), we aim at making maps and location-based services accessible. Thus, my colleagues Benjamin Poppinga, Charlotte Magnusson, Kirsten Rassmus-Gröhn, and I investigated how to make maps on touch screens accessible for visually impaired users.

We developed TouchOver Map, a simple prototype aimed at investigating the feasibility of speech and vibration feedback. It allows users to explore a map non-visually – currently the street network, to be exact. Our approach is dead simple: as long as the user touches a street, the phone vibrates and the street name is spoken.
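
For illustration, the core feedback loop can be sketched in a few lines of Android code (a simplified sketch, not our actual implementation; streetAt() stands in for a hit test against the rendered street geometry):

    import android.content.Context;
    import android.os.Vibrator;
    import android.speech.tts.TextToSpeech;
    import android.view.MotionEvent;

    // Simplified sketch of the TouchOver Map feedback loop (not the actual
    // implementation).
    public class TouchOverFeedback {
        private final Vibrator vibrator;
        private final TextToSpeech tts;
        private String lastStreet = null;

        public TouchOverFeedback(Context context, TextToSpeech tts) {
            this.vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
            this.tts = tts;
        }

        public void onTouch(MotionEvent event) {
            String street = streetAt(event.getX(), event.getY());
            if (street == null) {
                vibrator.cancel(); // finger is not on a street: silence
            } else if (!street.equals(lastStreet)) {
                // Entered a (new) street: vibrate continuously and speak its name.
                vibrator.vibrate(new long[]{0, 50, 50}, 0);
                tts.speak(street, TextToSpeech.QUEUE_FLUSH, null);
            }
            lastStreet = street;
        }

        private String streetAt(float x, float y) {
            // Hypothetical hit test: map screen coordinates to the street
            // underneath, e.g. by sampling the map data; returns null if none.
            return null;
        }
    }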

To evaluate how well people can understand street layouts with TouchOver Map, we conducted a user study. Eight sighted participants explored the map while the device was covered by an empty box.

While the participants explored the map, they were asked to reproduce it on a piece of paper. Although the results are far from perfect, the participants were able to reproduce the streets and their relationships.

Obviously, our study has a number of limitations. There were only a few testers, and none of them was blind. TouchOver Map also displayed only streets and no other geographic features. Finally, the non-visual rendering could clearly benefit from fine-tuning and clever filtering of the features to display. Nevertheless, our pilot study shows that it is possible to convey geographic features via touch screen devices by making them “visible” through speech and vibration.

The TouchOver Map is a collaboration of Certec, the Division of Rehabilitation Engineering Research in the Department of Design Sciences, Faculty of Engineering, Lund University, and the Intelligent User Interfaces Group of the OFFIS Institute for Information Technology, Oldenburg, Germany. It was published as a work-in-progress paper at the 13th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI ’11).

The paper can be downloaded from here.



In Situ Field Studies using the Android Market

Recently, researchers have started to investigate using app distribution channels, such as Apple’s App Store or Google’s Android Market, to bring the research to the users instead of bringing the users into the lab.

My colleague Niels, for example, used this approach to study how people interact with the touch screens of mobile phones. But instead of collecting touch events in a boring, repetitive task, he developed a game where users have to burst bubbles by touching them. And instead of conducting the study in the sterile environment of a lab, he published the game on the Android Market for free, so it was installed and used by hundreds of thousands of users. While these users were enjoying the game, they generated millions of touch events. And unlike in traditional lab studies, this data was collected from all over the world and from many different contexts of use. The results of this study were reported at MobileHCI ’11 and were received enthusiastically.

Since my work is on pedestrian navigation systems and conveying navigation instructions via vibration feedback, lab studies are oftentimes not sufficient. Instead, we have to go out and conduct our experiments in the field, e.g. by having people navigate through a busy city center.

So, if we can bring lab studies “into the wild” can we do the same with field experiments?

My colleague Benjamin and I started addressing this question in 2010. We developed a consumer-grade pedestrian navigation application called PocketNavigator and released it on the Android Market for free. Then we developed algorithms that allow us to infer specific usage patterns we were interested in. For example, these algorithms allow us to infer whether users follow the given navigation instructions or not. We also developed a system that allows the PocketNavigator to collect these usage patterns along with relevant context parameters and send them to one of our servers. On a side note, the collected data does not contain personally identifiable information, so it does not allow us to identify, locate, or contact users.
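
To give an idea of what such an inference can look like, here is a minimal sketch of a route-following heuristic (an illustration under assumptions, not the algorithm we actually used; the 25 m corridor width is made up):

    import java.util.List;
    import android.location.Location;

    // Sketch of a route-following heuristic (not the algorithm from the
    // paper). A GPS fix counts as "following" while it stays within a
    // corridor around the planned route.
    public class RouteFollowingDetector {
        private static final float CORRIDOR_METERS = 25f; // assumed corridor width

        public static boolean isFollowing(Location fix, List<Location> route) {
            float min = Float.MAX_VALUE;
            for (Location waypoint : route) {
                // Distance to the nearest route point; a real implementation
                // would project the fix onto the route segments instead.
                min = Math.min(min, fix.distanceTo(waypoint));
            }
            return min <= CORRIDOR_METERS;
        }
    }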

With this setup we conducted a quasi-experiment. Since my research is about the effect of vibration feedback on navigation performance and the user’s level of distraction, we compared usage patterns between situations where the vibration feedback was turned on and where it was turned off. Our results show that the vibration feedback was used in 29.9% of the trips, with no effect on navigation performance. However, we found evidence that users interacted less with the touch screen, looked at the display less often, and turned the screen off more often. Hence, we believe that users were less distracted.

The full report of this work was accepted to the prestigious ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’12) and was presented in May 2012 in Austin, Texas.

The paper can be downloaded from here.


PocketNavigator Video on Youtube

Working day and night, turning researchers into actors, and mastering the use of iMovie, we present our video on the PocketNavigator family.

The video shows three demonstrators:

  • The Tacticycle is a bicycle navigation system for tourists that uses vibrating handlebars to provide directions.
  • The PocketNavigator is an OSM-based pedestrian navigation system that uses vibration patterns to tell the user which direction to go.
  • The Virtual Observer is a research tool that allows collecting usage data (GPS tracks, images, experience sampling questions) and playing it back in order to study in-situ usage of the above (and other) applications.

The work presented here is part of the EU-funded HaptiMap research project (FP7-ICT-224675), which aims at making maps and location-based services more accessible. The PocketNavigator is one of the project’s outcomes, developed at the Intelligent User Interfaces Group of the OFFIS Institute for Information Technology, Oldenburg, Germany.

The PocketNavigator is available for free on the Android Market: https://market.android.com/details?id=org.haptimap.offis.pocketnavigator


Ambient Visualisation of Social Network Activity

Social networks, such as Facebook or Twitter, are an important factor in the communication between individuals of the so-called digital natives generation. More and more often, they are used to exchange short bursts of thoughts or comments as a means of staying connected with each other.

The instant communication enabled by those social networks has, however, created a form of peer-group pressure to constantly check for updates. For example, has an informal get-together been announced, or has somebody requested to become your friend? This emerging pressure can make people return to the computer more often than they want to. This is why we find our colleagues regularly looking for new status updates in meetings, and at parties we see more and more often that our friends cannot resist checking their Facebook accounts.

One solution to this is notifying users when something important has happened. Mobile phones, as personal, ubiquitous, and always-connected devices, lend themselves as a platform, as they are carried by the user most of the time. Thus, it is no surprise that our phones now not only notify us about incoming short messages but do the same for Twitter @mentions, Facebook messages, or friend requests. However, these notifications may go unnoticed, too. Thus, instead of checking our Facebook and Twitter accounts, we keep looking at our mobile phones for notifications.

With AmbiTweet, we investigate conveying social network status via ambient displays. We use a live wallpaper showing beautiful blue water. The wallpaper can be connected to a Twitter account and visualizes the level of activity in an ambient way: the higher the level of activity on the Twitter account, the brighter and busier the water becomes. This can be perceived even in the periphery of the field of vision, so users can become aware of important activity without having to focus their eyes on the phone.
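
The core of such a mapping can be very simple. The following sketch (illustrative only; the time window and saturation point are assumptions) computes an activity level between 0 and 1 that the wallpaper’s draw loop could translate into brightness and wave agitation:

    // Sketch of the activity-to-visuals mapping (the window size and
    // saturation point are assumed values, not AmbiTweet's actual ones).
    public class ActivityLevel {
        private static final long WINDOW_MS = 60 * 60 * 1000; // look at the last hour
        private static final int SATURATION = 30;             // tweets that map to "fully busy"

        public static float level(long[] tweetTimestamps, long now) {
            int recent = 0;
            for (long t : tweetTimestamps) {
                if (now - t <= WINDOW_MS) {
                    recent++; // count tweets inside the window
                }
            }
            // Clamp to [0, 1]; the draw loop would scale brightness with this.
            return Math.min(1f, recent / (float) SATURATION);
        }
    }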

Ambient displays, in general, have the advantage that they convey information in a continuous but unobtrusive way. They exploit the fact that the brain can process information pre-attentively, i.e. without generating apparent cognitive load. AmbiTweet therefore allows concentrating on a primary task while remaining aware of the level of activity on a social network account.


Don’t Ask Users What They Want!

As Human-Computer Interaction is receiving more and more recognition as a research field in Germany, my colleagues and I, as members of the Intelligent User Interfaces Group at OFFIS, are approached more and more often when it comes to developing novel user interfaces or finding innovative solutions for industry partners.

Usually, the idea is that we conduct interviews with (potential) end users and ask them what they want in order to come up with innovative ideas. While this may work on some occasions, I believe that this naïve approach misses a few important points. I will try to elaborate on my view in the following:

Most people cannot think outside the box

With his famous quote “If I’d asked customers what they wanted, they would have said ‘a faster horse’”, Henry Ford wanted to say that customers cannot clearly express their needs. What people actually want is to get from A to B fast and cheaply with little maintenance and overhead. Most people are not able to imagine a car if all they know are horses and carriages.

Instead, try understanding their problems

Roger L. Cauvin points out that instead of asking users what they want, it is more important to focus on their problems by asking the right questions and interpreting the answers carefully. Sometimes, he argues, it may even be better to ask no questions at all but just observe and listen to your users.

Go into detail, then envision the perfect solution

Frankie Johnson suggests doing that by going into detail and asking what people dislike about current practices (e.g. horses are time-consuming, smelly, and may at times act unpredictably). He believes that talking about people’s dislikes and then asking what a perfect carriage would look like would have resulted in the answer “a carriage without a horse”.

Serendipity & Being Prepared

In addition, I believe that serendipity – the faculty of making fortunate discoveries by accident – plays another important role. A famous example is the serendipitous discovery of penicillin by Sir Alexander Fleming, who, returning from holiday, found that bacteria cultures had been killed by a Penicillium contamination of the dishes. However, “by accident” may be misleading. Most people without the scientific background of Alexander Fleming would not have recognized the importance of that observation. Thus, I believe that it is important to “go pregnant” with a problem and keep your eyes open for things that might fit that problem.

Get out there!

Helen Walters gives a few tips on increasing the chance of serendipitous findings, including getting outside the office, building prototypes instead of talking about the idea, and exploring instead of executing. Further, it is important not to expect results too soon – a tragic insight, given that most work today is driven by deadlines and milestones.


Tactile Compass presented at ICMI ’11

This week I had the chance to attend the International Conference on Multimodal Interaction. It took place in Alicante, Spain, which even at this time of the year (mid-November) can be warm and sunny!

I presented the results from a user study of our Tactile Compass. The basic idea of the Tactile Compass is that vibration patterns tell you in which direction to go, so you can use it as a navigation system without ever having to look at the mobile device.

In this study, we asked 21 participants to follow three routes through the city centre of Oldenburg. In random order, they were equipped with the Tactile Compass, a common visual navigation system, or both. In brief, we found that

  • with the visual system, participants walked fastest, which suggests that they had to think the least;
  • with the tactile system, participants were least distracted and paid the most attention to the environment;
  • with the combination of both systems, participants made the fewest navigation errors.

For the details and our conclusions please refer to the full paper.


A Tactile Compass for Eyes-free Pedestrian Navigation

The idea came up when I was heading back to the hotel from a conference dinner at MobileHCI 2008 in Amsterdam. I had no sense of orientation; the only guide I had was a map on my Nokia phone. Since I was not familiar with Amsterdam, the route led me right through the busy areas of the city center.

The day before, a cyclist had stolen a mobile phone right out of the hand of another conference attendee. Knowing that made me quite afraid that something similar could happen to me, too. Without the phone I would have been completely lost.

Here, serendipity hit. Since my research group was already working on tactile displays for navigation and orientation, I wondered whether it was possible to create a navigation system for mobile phones that guides by vibration only, so the phone could be left in the pocket.

Back at OFFIS we quickly tested a few prototypes, including a hot/cold metaphor and a compass metaphor. The compass metaphor prevailed: the design encodes the direction the user should be heading (forward, left, right, backwards) in different vibration patterns. Our test participants liked that design most. Later, we tested the vibration compass design in a forest and found that it can replace navigation with a map.
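
A minimal version of the compass metaphor could look like the following sketch (the concrete patterns are placeholders, not the published design):

    import android.os.Vibrator;

    // Sketch of the compass metaphor (the patterns are placeholders): the
    // deviation between the user's heading and the bearing of the next
    // waypoint is quantized into four directions, each with its own pattern.
    public class TactileCompass {
        private final Vibrator vibrator;

        public TactileCompass(Vibrator vibrator) {
            this.vibrator = vibrator;
        }

        public void pulse(float headingDegrees, float bearingDegrees) {
            // Normalize the deviation to [0, 360).
            float delta = ((bearingDegrees - headingDegrees) % 360 + 360) % 360;
            long[] pattern;
            if (delta < 45 || delta >= 315) {
                pattern = new long[]{0, 100};                     // forward: one short pulse
            } else if (delta < 135) {
                pattern = new long[]{0, 100, 100, 100};           // right: two pulses
            } else if (delta < 225) {
                pattern = new long[]{0, 400};                     // behind: one long pulse
            } else {
                pattern = new long[]{0, 100, 100, 100, 100, 100}; // left: three pulses
            }
            vibrator.vibrate(pattern, -1); // play the pattern once
        }
    }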

The development and the studies were presented at the 13th IFIP TC13 Conference on Human-Computer Interaction (INTERACT) in Lisbon, Portugal, in September 2011. The article is available here.

If you own an Android phone, you can try this vibration compass by downloading our PocketNavigator navigation application for free from the Android Market.

 


HaptiMap Bike Navigation System meets Tourists on Borkum

Computer science research goes wild! Starting on July 18, 2011, our Tacticycle, a novel bike navigation system for tourists, will be demonstrated on Borkum. For one week, tourists can rent the system for their cycling trips.

Background

In previous studies we found that tourists actually do not desire strict navigation support. Often, they start with only a rough idea of where to go and plan their route on the fly, based on their surroundings and their intuition. With the Tacticycle we built a bike navigation system that aims at accommodating these usage patterns.

Tacticycle

The Tacticycle can roughly be described as a point-of-interest (POI) radar. Instead of streets and buildings, the user gets a quick overview of the direction and distance of nearby POIs. As a special feature, users can select one of these POIs. The Tacticycle then cues the direction of this POI via two vibration motors fixed to the handlebars. Instead of seeing its location on the screen, riders can feel where the POI is, which allows them to keep their eyes on the road.
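
In essence, the cueing boils down to splitting the relative bearing of the selected POI between the two motors. The following sketch illustrates this with a hypothetical motor interface (not the Tacticycle’s actual control code):

    // Sketch of the two-motor cueing (hypothetical Motor interface). The
    // relative bearing of the selected POI is split between the left and
    // right handlebar motor: straight ahead drives both, a POI to the
    // right drives only the right motor, and so on.
    public class HandlebarCue {
        public interface Motor { void setIntensity(float zeroToOne); }

        private final Motor left;
        private final Motor right;

        public HandlebarCue(Motor left, Motor right) {
            this.left = left;
            this.right = right;
        }

        public void update(float relativeBearingDegrees) {
            // Normalize to (-180, 180]: negative means the POI is to the left.
            float d = ((relativeBearingDegrees % 360) + 360) % 360;
            if (d > 180) d -= 360;
            float strength = 1f - Math.abs(d) / 180f; // weaker when the POI is behind
            left.setIntensity(d <= 0 ? strength : 0f);
            right.setIntensity(d >= 0 ? strength : 0f);
        }
    }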


Digital Helpers for the Blind

Last year, the WDR (a German TV broadcaster), together with our HaptiMap project partner GeoMobile, tested the PocketNavigator with Mr. Schmidt, who is visually impaired.

Actually, the PocketNavigator was not designed for blind users, since it only gives very coarse directions (straight, left, right, behind) via vibration patterns. However, to my surprise, Mr. Schmidt seemed to get along with it quite well. He navigated with his white cane as usual but took the directions as orientation cues.

Screenshot of the PocketNavigator’s main map view

The test also showed that the PocketNavigator is not yet ready to be used by visually impaired users. During the test, Mr. Schmidt accidentally hit the touch screen and changed the travel destination, which caused him to end up in front of a house.

Altogether, he was quite fond of the general principle and concluded that the system would not be “extra workload” when on the move.

The video is available here: Digitale Helfer für Blinde, 15.10.2010 (German only)
