Android Research

App Store Studies : How to Ask for Consent?

App Stores, such as Apple’s App Store or Google Play, provide researchers the opportunity to conduct experiments with a large number of participants. If we collect data during these experiments, it may be necessary to ask for the users’ consent beforehand. The way we ask for the users’ consent can be crucial, because nowadays people are very sensitive to data collection and potential privacy violations.

We conducted a study suggesting that a simple “Yes-No” form is the best choice for researchers.

Tested Consent Forms

We (most of the credit goes to Niels Henze for conducting the study) tested four different approaches to asking for consent to collect non-personal data. All consent forms contained the following text:
By playing this game you participate in a study that investigates the touch performance on mobile phones. While you play we measure how you touch but we DON'T transmit personalized data. By playing you actively contribute to my PhD thesis.

Checkbox Unchecked

The first tested consent form showed an unchecked checkbox next to a text reading "Send anonymous feedback". In order to participate in the study, a user had to tick the checkbox and then press the "Okay" button.

Checkbox Checked

The second consent form is the same as the previous one, except that the checkbox is pre-checked. To participate in the study, the user merely has to click the "Okay" button.

Yes/No Button

The third consent form features two buttons reading "Okay" and "Nope". To participate, the user has to click "Okay". Clicking "Nope" ends the app immediately.

Okay Button

The fourth consent form contains only a single "Okay" button. By clicking "Okay", the user participates in the study. To avoid participation, the user has to exit the app through the phone's "home" or "return" buttons.


These consent forms were integrated into a game called Poke the Rabbit! by Niels Henze. At first start, the application randomly selected one of the four consent forms. If the user agreed to participate in the study, the app transmitted the type of the consent form to a server.


We collected data from 3,934 installations. The diagram below shows the conversion rate. The conversion rate was estimated by dividing the number of participants per form by 983.5 (we assume perfect randomisation, i.e. each consent form was presented in 25% of the installations).

Conversion rate per consent form. The x-axis shows the type of consent form. The y-axis shows the estimated fraction of users that participated in the study after download.
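As a rough sketch, the estimate can be reproduced in a few lines. The per-form participant counts below are hypothetical placeholders for illustration, not the study's actual numbers:

```python
# Estimate conversion rates per consent form, assuming perfect
# randomisation across the four forms.
installations = 3934
expected_per_form = installations / 4  # = 983.5

participants = {  # hypothetical counts, NOT the study's real data
    "checkbox_unchecked": 350,
    "checkbox_checked": 820,
    "yes_no": 780,
    "okay_only": 850,
}

# Conversion rate = participants who consented / expected exposures per form.
conversion = {form: n / expected_per_form for form, n in participants.items()}

for form, rate in sorted(conversion.items(), key=lambda kv: -kv[1]):
    print(f"{form}: {rate:.1%}")
```

The division by 983.5 works because each form is assumed to have been shown in exactly a quarter of the installations.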

We were surprised by the high conversion rates. Only the consent form with the unchecked checkbox yielded a markedly lower conversion rate.

Conclusions – use Yes/No Buttons

We suggest using the consent form with Yes-No buttons. The consent form with the checked checkbox may be considered unethical, since the user may not have read the text and was never forced to consider unchecking the checkbox. The consent form with the single "Okay" button may be considered unethical, too, because users may not be aware that they can avoid data collection by using the phone's hardware buttons. The "Yes-No" form, in contrast, forces users to think about their choice and offers a clear way to opt out of the study.

Yes-No buttons are ethically safe and resulted in the second highest conversion rate.

Would you suggest otherwise? We are not at all saying that this is definite! Please share your opinion (comments or mail)!

More Information

This work has been published in the position paper App Stores – How to Ask Users for their Consent? The paper was presented at the ETHICS, LOGS and VIDEOTAPE Ethics in Large Scale Trials & User Generated Content Workshop. It took place at CHI ’11: ACM CHI Conference on Human Factors in Computing Systems, which was held in May 2011 in Vancouver, Canada.


The authors are grateful to the European Commission, which has co-funded the IP HaptiMap (FP7-ICT-224675) and the NoE INTERMEDIA (FP6-IST-038419).



Tacticycle: Supporting Exploratory Bicycle Trips

Navigation systems have become a common tool for most of us. They conveniently guide us from A to B along the fastest or shortest route. Thanks to these devices, we no longer fear getting lost when traveling through unfamiliar terrain.

However, what if you are a cyclist and your goal is an excursion rather than reaching a certain destination, and all you want is to stay oriented and possibly learn about interesting spots nearby? In that case, the use of a navigation system becomes more challenging. One has to look up the addresses of interesting points and enter them as (intermediate) destinations. Sometimes the navigation system might not even know all the small paths, so we end up checking the map frequently, which is dangerous when done on the move.

The Tacticycle is a research prototype of a navigation system that is specifically targeted at tourists on bicycle trips. Relying on a minimal set of navigation cues, it helps riders stay oriented while supporting spontaneous navigation and exploration at the same time. It offers three novel features:

  1. First, it displays all POIs around the user explicitly on a map. A double-tap quickly selects a POI as travel destination. Thus, no searching for addresses is required.
  2. Second, the system relies on a tactile user interface, i.e. it provides navigation support via vibration. Thus, the rider does not have to look at the display while driving.
  3. Third, the Tacticycle does not deliver turn-by-turn instructions. Instead, the vibration feedback just indicates the direction of the selected POI “as the crow flies”. This allows the travelers to find their own route.
The direction "as the crow flies" of the selected POI is encoded in the relative vibration strength of the two actuators in the handlebars. In this picture, the POI is about 20° to the right, so the vibration in the right handlebar is a little stronger.
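A minimal sketch of this encoding, assuming a simple linear mapping between relative bearing and actuator intensity (the actual Tacticycle mapping may well differ):

```python
def actuator_intensities(relative_bearing_deg, max_angle=90.0):
    """Map a POI's relative bearing to left/right handlebar vibration.

    relative_bearing_deg: angle to the POI; negative = left, positive = right.
    Returns (left, right) intensities in [0, 1]; both equal when the POI
    is straight ahead.
    """
    # Clamp to the angular range the two actuators can express.
    b = max(-max_angle, min(max_angle, relative_bearing_deg))
    # 0.5/0.5 when straight ahead, shifting toward the side of the POI.
    right = 0.5 + 0.5 * (b / max_angle)
    return (1.0 - right, right)
```

For the 20°-to-the-right situation in the picture, this yields a slightly stronger vibration on the right (about 0.61 vs. 0.39).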

In cooperation with a bike rental service, we rented the Tacticycle prototype to tourists who took it on their actual excursions. The results show that they always felt oriented and were encouraged to playfully explore the island, providing a rich, yet relaxed travel experience. Based on these findings, we argue that providing only minimal navigation cues can very well support exploratory trips.

This work has been presented at MobileHCI ’12, ACM SIGCHI’s International Conference on Human-Computer Interaction with Mobile Devices and Services, which took place in September 2012 in San Francisco. The paper is available here (pdf).


In Situ Field Studies using the Android Market

Recently, researchers have started to investigate using app distribution channels, such as Apple’s App Store or Google’s Android Market to bring the research to the users instead of bringing the users into the lab.

My colleague Niels, for example, used this approach to study how people interact with touch screens of mobile phones. But, instead of collecting touch events in a boring, repetitive task he developed a game where users have to burst bubbles by touching them. And instead of conducting this study in the sterile environment of a lab he published the game on the Android Market for free, so it was installed and used by hundreds of thousands of users. So, while these users were enjoying the game they generated millions of touch events. And unlike traditional lab studies, this data was collected from all over the world and many different contexts of use. The results of this study were reported at MobileHCI ’11 and were received enthusiastically.

Since my work is on pedestrian navigation systems and conveying navigation instructions via vibration feedback, lab studies are oftentimes not sufficient. Instead, we have to go out and conduct our experiments in the field, e.g. by having people navigate through a busy city center.

So, if we can bring lab studies “into the wild” can we do the same with field experiments?

My colleague Benjamin and I started addressing this question in 2010. We developed a consumer-grade pedestrian navigation application called PocketNavigator and released it on the Android Market for free. Then, we developed algorithms that allow us to infer specific usage patterns we were interested in. For example, these algorithms allow us to infer whether users follow the given navigation instructions or not. We also developed a system that allows the PocketNavigator to collect these usage patterns along with relevant context parameters and send them to one of our servers. As a side note, the collected data does not contain personally identifiable information, so it does not allow us to identify, locate, or contact users.
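One such inference can be sketched as a simple heuristic. The function, thresholds, and vertex-distance approximation below are illustrative assumptions, not the actual PocketNavigator algorithms:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def follows_route(fixes, route, threshold_m=30.0, min_fraction=0.8):
    """Heuristic: the user 'follows' the route if most GPS fixes fall
    within threshold_m of some route vertex (segment distance would be
    more precise; vertices keep the sketch short)."""
    on_route = sum(
        1 for lat, lon in fixes
        if min(haversine_m(lat, lon, rlat, rlon) for rlat, rlon in route) <= threshold_m
    )
    return on_route / len(fixes) >= min_fraction
```

A GPS trace that hugs the route polyline then counts as "following the instructions", while a trace that diverges for most of its fixes does not.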

With this setup we conducted a quasi-experiment. Since my research is about the effect of vibration feedback on navigation performance and the user's level of distraction, we compared the usage patterns of situations where the vibration feedback was turned on versus turned off. Our results show that the vibration feedback was used in 29.9% of the trips, with no effect on navigation performance. However, we found evidence that users interacted less with the touch screen, looked less often at the display, and turned off the screen more often. Hence, we believe that users were less distracted.

The full report of this work has been accepted to the prestigious ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’12) and has been presented in May 2012 in Austin, Texas.

The paper can be downloaded from here.


A Vibro-tactile “Friend Sense” for Keeping Groups Together

Being able to sense the location of people can be beneficial if you visit a crowded, noisy, and chaotic place with your friends, such as a festival. Usually, for a good night out, it is important that the group stays together. However, when everyone has different needs at different times (getting food, visiting the lavatory …), it becomes increasingly challenging to keep everyone together, which works against a joyful night out.

Thanks to GPS and mobile Internet, different solutions exist where the mobile phone of each friend communicates its GPS location to a server, which then forwards the location to all the other friends’ mobile devices. The problem with existing implementations, such as Google Latitude or Glympse, is that they use maps to communicate these locations. It is more than just inconvenient to read a map while walking through a dense crowd.

We therefore investigated whether the skin can be used to communicate the location of people and thus be turned into a "friend sense". Our solution is quite simple, as we wanted to implement it on everyday smartphones. The user can select to "follow" one of the friends. The application then calculates the relative location of this friend, such as "left-hand side". This information is then encoded into vibration patterns. By learning the meaning of the patterns, the user can understand where the friend is without even taking the device out of the pocket.
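The relative-location computation can be sketched roughly as follows. The sector boundaries and labels are illustrative assumptions, not necessarily those used in the actual app:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360

def friend_sector(my_lat, my_lon, heading_deg, fr_lat, fr_lon):
    """Coarse relative direction of the friend, suitable for a vibration pattern."""
    rel = (bearing_deg(my_lat, my_lon, fr_lat, fr_lon) - heading_deg) % 360
    if rel < 45 or rel >= 315:
        return "ahead"
    if rel < 135:
        return "right-hand side"
    if rel < 225:
        return "behind"
    return "left-hand side"
```

Each sector can then be mapped to its own vibration pattern, so the phone can stay in the pocket.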

We tested this concept at a festival with two groups of six friends each. Three members of each group could sense the others, while the other three only shared their locations. Over the night we repeatedly probed the participants' mood and the subjective level of attention they devoted to keeping the group together. For both measures, we found statistically significant differences between users and non-users of the "friend sense". The friends who were able to sense the others were more relaxed, felt more confident, and subjectively devoted less attention to keeping the group together.

More details on FriendSense and this study can be found in A Tactile Friend Sense for Keeping Groups Together, a work-in-progress published as part of the extended abstracts of the CHI '11 conference.


Turn-by-Turn Navigation – the Best Way to Guide Pedestrians?

Navigating from one place to another is an essential ability for a self-determined life. When navigating unknown terrain, e.g. when going for a hike or visiting a city as a tourist, people become increasingly dependent on navigation aids. Established aids range from signposts and maps to route descriptions. But thanks to the increasing number of GPS-enabled mobile phones, a new navigation aid is becoming increasingly common: the GPS navigation system.

In principle, these systems behave like car navigation systems. The traveller's location is displayed on a map, the route is highlighted, and turning instructions are given by symbols or speech. For cars, this way of guiding the driver has proven to be quite successful. Timely and accurate instructions are indispensable, as the driver has to follow the traffic rules. For pedestrians, however, this type of information presentation may be too rigid. Even worse, by providing too much navigation information, we neglect humans' inherent navigation abilities and may even impair them. For example, many people report that when driving a route with a navigation system, they cannot remember the route as well as they could before the era of car navigation systems.

Now just imagine a place roughly one mile away from your current location. If somebody gave you the rough cardinal direction of this place on demand, you would most likely reach it easily. We tested this kind of navigation in our research group and currently offer it in the PocketNavigator. First, the user specifies the destination by selecting it on a map. The handheld then creates vibration patterns that indicate whether the destination is ahead, to the left-hand side, or to the right-hand side.

First studies show that pedestrians can navigate effectively and efficiently with such a directional cue alone. Thus, showing the direction of a destination "as the crow flies" could be a valuable addition to turn-by-turn navigation systems for pedestrians.

For more information, see "In Fifty Metres Turn Left": Why Turn-by-turn Instructions Fail Pedestrians.
