Ubicomp 2008

Many blogs have been covering Ubicomp and, a couple of days ago, I promised to write down my own coverage. Here you go ;-)

The first day I attended the Automated Journeys workshop organized by Arianna Bassoli (who gave a talk at UCL a while back), Johanna Brewer (whose recent work has been covered here; for more, check her blog), and . The workshop’s format was not traditional. As part of the workshop, we went out and had lunch :-), and, while doing so, we observed how people in Seoul use technologies. Then, we came back and, through group discussions and hands-on design brainstorming sessions, we produced 4 envisagements that critically reflected on technological futures. It was very engaging! I hope other workshops will replicate/mutate this format. I wish I could have attended at least two of the other workshops on offer: Ubiquitous Systems Evaluation, partly organized by Chris Kray (I am indebted to him, and he knows why ;-)), and Devices that Alter Perception, partly organized by Carson Reynolds.

At Ubicomp, the speakers did not suffer from PowerPoint karaoke syndrome, and their slides were generally well-designed – less text, more images. That is largely because the Ubicomp community is made up of design-conscious (CHI) researchers. A few talks are already available on SlideShare.

Here are a few papers I personally found intriguing because of their algorithms, their evaluation, or their interesting ideas. At the end of this post, I’ll point to a few datasets that have been used and may be of interest ;-)

1. Algorithms

Navigate Like a Cabbie: Probabilistic Reasoning from Observed Context-Aware Behavior. Brian D. Ziebart showed a new way of making route predictions. He used a probabilistic model presented at AAAI, “Maximum Entropy Inverse Reinforcement Learning“. Interestingly, he showed that the model works on data that is noisy and imperfect.
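
(For those who haven’t met maximum entropy inverse reinforcement learning before, here is a tiny sketch of the core idea – mine, not the authors’ code: each candidate route gets a probability proportional to the exponential of a learned linear reward over its features, so “almost as good” routes are almost as likely. The feature names and weights below are made up for illustration.)

```python
import numpy as np

def route_probabilities(route_features, theta):
    """MaxEnt-style route scoring: P(route) is proportional to
    exp(theta . features(route)), so routes with higher learned reward
    (lower cost) are exponentially more likely, but none is ruled out."""
    rewards = np.array([np.dot(theta, f) for f in route_features])
    rewards -= rewards.max()                 # for numerical stability
    weights = np.exp(rewards)
    return weights / weights.sum()

# Hypothetical example: three candidate routes described by
# [length in km, number of turns, fraction on main roads];
# theta would be learned from observed taxi trips.
routes = [np.array([5.0, 3.0, 0.8]),
          np.array([4.2, 7.0, 0.1]),
          np.array([6.1, 2.0, 0.9])]
theta = np.array([-0.3, -0.1, 0.5])
print(route_probabilities(routes, theta))
```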

Pedestrian Localisation for Indoor Environments. Oliver Woodman proposed a way of tracking people indoors. Oliver and Robert showed how to combine a foot-mounted unit, a building model, and a particle filter to track people in a building. They experimentally showed that users can be tracked to within 1m without knowing their initial positions. Great results! It’s a paper well worth reading!
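
(The recipe is roughly: propagate a cloud of particles using the steps measured by the foot-mounted unit, and let the building model kill off any particle that walks through a wall. A minimal sketch of one update step, assuming a crosses_wall(a, b) helper that queries the building model – both the helper and the noise values are placeholders, not the authors’ implementation:)

```python
import numpy as np

def particle_filter_step(particles, weights, step_len, heading, crosses_wall):
    """One toy update: move every (x, y) particle by the measured step plus
    noise, zero the weight of particles whose move crosses a wall in the
    building model, then resample."""
    n = len(particles)
    lengths = step_len + np.random.normal(0.0, 0.05, n)    # step-length noise
    headings = heading + np.random.normal(0.0, 0.05, n)    # heading-drift noise
    moved = particles + np.column_stack((lengths * np.cos(headings),
                                         lengths * np.sin(headings)))
    for i in range(n):
        if crosses_wall(particles[i], moved[i]):            # building-model constraint
            weights[i] = 0.0
    if weights.sum() == 0:                                  # everyone hit a wall: reset
        weights = np.full(n, 1.0 / n)
    weights = weights / weights.sum()
    idx = np.random.choice(n, size=n, p=weights)            # resample
    return moved[idx], np.full(n, 1.0 / n)
```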

Discovery of Activity Patterns using Topic Models. Bernt Schiele presented a new method for recognizing a person’s activities from wearable sensors. This method adapts probabilistic topic models and has been shown to recognize daily routines without user annotation. One of Bernt’s students had an interesting poster on detecting location transitions using sensor data (pdf).
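
(If you want to play with the idea, a generic topic-model sketch – using gensim’s LDA rather than the paper’s exact adaptation – looks something like this; the “documents” are windows of discretised sensor readings turned into words, and the example words below are made up:)

```python
from gensim import corpora, models

# Each "document" is a window of discretised wearable-sensor readings
# turned into words (these example words are invented).
docs = [["sitting", "desk", "typing", "typing", "sitting"],
        ["walking", "stairs", "walking", "outdoors"],
        ["sitting", "desk", "typing", "mouse"]]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]

# Each latent topic then corresponds to a recurring mixture of activities,
# i.e. a candidate daily routine (office work, commuting, ...).
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for bow in corpus:
    print(lda.get_document_topics(bow))
```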

2. Evaluation

A couple of papers (including the great work done by Matthew Lee) used a method called Wizard of Oz evaluation. The general idea is to simulate those parts of the system (e.g., speech recognition) that require the most development effort, or to assess the suitability of your interface (see “Wizard of Oz studies – why and how” (pdf) for more).
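
(In code terms, a Wizard of Oz setup is often as simple as swapping the expensive component for a hidden human. A toy sketch, assuming a voice-controlled prototype – the function names are mine:)

```python
def recognize_speech_wizard(audio_clip):
    """Stand-in for the component under study: a hidden human operator
    (the 'wizard') listens to the clip and types what was said."""
    print(f"[wizard] please transcribe clip: {audio_clip}")
    return input("[wizard] transcript> ")

def handle_user_utterance(audio_clip):
    """The rest of the prototype calls the 'recognizer' exactly as it would
    call the real one, so participants experience a working system."""
    text = recognize_speech_wizard(audio_clip)
    if "lights" in text.lower():
        return "Turning the lights on."
    return "Sorry, I did not catch that."
```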

Flowers or a Robot Army? Encouraging Awareness & Activity with Personal, Mobile Displays by Sunny Consolvo et al. They designed a system that makes it possible for mobile users to self-monitor their physical activities, and they conducted a well-designed 3-month field experiment.

Reflecting on the Invisible: Understanding End-User Perceptions of Ubiquitous Computing (pdf). Erika Shehan Poole detailed end-user perceptions of RFID technology using an interesting qualitative method that combines structured interviews and photo elicitation exercises. Erika and her mates show that, by using this method, one is able to uncover perceptions that are often difficult for study participants to verbalize. One of her findings: many people believed that RFID can be used to remotely track the location of tagged objects, people, or animals!

3. Interesting Ideas

Bookisheet: Bendable Device for Browsing Content Using the Metaphor of Leafing Through the Pages. Trash your mouse. Jun-ichiro Watanabe presented a VERY promising interface (a book made of two thin plastic sheets and bend sensors) with which a user can easily scroll digital content such as photos. The user does so by simply bending one side of the sheet or the other.
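
(A guess at what the sensor-to-interaction mapping might look like – not the authors’ implementation: read the bend sensors, ignore small accidental bends, and turn the remaining bend into a scroll speed.)

```python
def scroll_velocity(bend, dead_zone=0.1, max_speed=20.0):
    """Map a normalised bend reading in [-1, 1] (negative = left sheet bent,
    positive = right sheet bent) to a scroll speed in items per second;
    bends inside the dead zone are treated as noise and ignored."""
    if abs(bend) < dead_zone:
        return 0.0
    sign = 1.0 if bend > 0 else -1.0
    magnitude = (abs(bend) - dead_zone) / (1.0 - dead_zone)
    return sign * magnitude * max_speed
```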

Towards the Automated Social Analysis of Situated Speech Data. To automatically understand individual and group behavior, Danny Wyatt et al. recorded the conversational dynamics of 24 people over 6 months. They did so using privacy-sensitive techniques. By using this type of study, researchers may well gain broad sociological insights.

The Potential for Location-Aware Power Management. Robert Harle showed how to dynamically optimize the energy consumption of an office. Very interesting problem-driven research!

Accessible Contextual Information for Urban Orientation. Jason Stewart presented a prototype of a location-based service with which mobile users share content (see their project’s website).

Enhanced Shopping: A Dynamic Map in a Retail Store. Alexander Meschtscherjakov presented a prototype for mobile phones that displays customer activities (e.g., customer flow) inside a shopping mall.

Spyn: Augmenting Knitting to Support Storytelling and Reflection (pdf). Daniela K. Rosner’s presentation was masterfully designed! She walked us through her experience of designing Spyn – a system for knitters to record, playback, and share information involved in the creation of their hand-knit artifacts. She showed how her system enriches the knitter’s craft.

Picture This! Film assembly using toy gestures. Cati Vaucelle (who keeps a cool blog) presented a new input device embedded in children’s toys for video composition.  As they play with the toys to act out a story, children conduct film assembly.

4. Datasets

Understanding Mobility Based on GPS Data by Yu Zheng et al. used GPS logs of 65 people over 10 months (the largest dataset in the community!) to evaluate a new way of inferring people’s motion modes from their GPS logs.
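
(To give a flavour of the problem, here is a crude, hand-rolled heuristic for labelling a GPS segment with a motion mode from speed alone – the paper itself uses a learned model over much richer features, and my thresholds are invented:)

```python
import math

def classify_segment(points):
    """Label a GPS segment (a list of at least two (lat, lon, unix_time)
    tuples) with a rough motion mode using only its speeds."""
    speeds = []
    for (lat1, lon1, t1), (lat2, lon2, t2) in zip(points, points[1:]):
        # equirectangular approximation, fine for short hops
        dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        dy = math.radians(lat2 - lat1)
        dist_m = 6371000 * math.hypot(dx, dy)
        speeds.append(dist_m / max(t2 - t1, 1e-6))
    top = max(speeds)
    if top < 2.5:          # ~9 km/h
        return "walking"
    if top < 7.0:          # ~25 km/h
        return "biking"
    return "driving/bus"
```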

Accurate Activity Recognition in a Home Setting (pdf) by Tim van Kasteren et al. used 28 days of sensor data about one person @ home and corresponding annotations of his activities (e.g., toileting, showering, etc.) to evaluate a new method for recognizing activities from sensor data.

Discovery of Activity Patterns using Topic Models by Tam Huynh et al. used 16 days of sensor data from a man who was carrying 2 wearable sensors to test their method for automatically recognizing activities (e.g., dinner, commuting, lunch, office work) from sensor data.

On Using Existing Time-Use Study Data for Ubiquitous Computing Applications by Kurt Partridge and Philippe Golle showed how to use data (e.g., people’s activities and locations) that has been collected by governments and commercial institutions to evaluate ubicomp systems.

The Potential for Location-Aware Power Management by Robert Harle tested his proposed strategies for dynamically optimizing the energy consumption of an office on location data of 40 people in a 50-room office building over 60 working days.
