We present Bzzzt, the sketching process for an application that enables a smartphone to sense its surroundings and distinguish familiar from unknown surfaces. The phone vibrates, records the echoes with its accelerometer or microphone, analyzes them, and determines whether it has felt the vibrations of this particular surface before.
Besides many positive comments, our reviews showed that we need to expand our thinking toward potential applications and contact with "real" users. We therefore conducted three workshops with participants aged between 16 and 22. Here we present the outcome of the workshops together with our own ideas.
This was our original idea for an application using Bzzzt. From our own experience we know that we "misplace" our phones in only a few specific places at home. If the phone could recognize these few places where it frequently gets lost, it could alert the user and provide detailed information about its location. A few solutions exist for finding a lost or stolen phone, for example Phone Finder (iPhone), but those apps are of little help in finding a misplaced phone at home, since they only transmit inaccurate GPS coordinates.
Most phones alert the user either by vibrating or by sounding an alarm. Imagine the phone is in vibrate-only mode and placed on a soft surface such as a couch; the vibration will hardly be noticeable. An intelligent device could sense this condition and additionally turn on the ringtone. The opposite scenario is a phone in silent mode lying on a hard surface such as a desk: a vibrating phone on a meeting-room table is clearly audible and disturbing. If the phone starts vibrating on a hard surface, it could immediately disable vibration to remain silent. When the phone is lifted from the surface, the notification settings are restored.
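To illustrate, this adaptation rule can be sketched as a simple decision function. The surface labels and return values below are our own illustrative assumptions, not part of any of the prototypes.

```python
# Hedged sketch of the surface-aware notification rule described above.
# "soft"/"hard" are hypothetical labels a surface classifier might emit.

def effective_alert(surface_class):
    """Return the alert behaviour for an incoming notification,
    given the classified surface the phone currently lies on."""
    if surface_class == "soft":
        # On a couch the vibration is hardly noticeable: also ring.
        return "vibrate+ring"
    if surface_class == "hard":
        # On a desk the vibration is loud and disturbing: stay silent.
        return "silent"
    # Unknown surface: keep the default vibrate-only behaviour.
    return "vibrate"
```

When the phone is lifted again, a real implementation would simply stop consulting this rule and fall back to the user's normal settings.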
We often perform tedious tasks such as synchronizing photos, contacts, or podcasts, depending on where we are. At home we archive our photos on a PC, while in the office the phone deactivates call forwarding from the desk phone to the mobile. For such services the phone must learn to associate a surface with a location and a set of appropriate tasks to execute.
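Such an association could be represented as a simple lookup from a recognized surface to location-specific tasks; all surface and task names below are hypothetical placeholders of our own.

```python
# Hedged sketch: mapping recognized surfaces to tasks. The names are
# invented for illustration and do not come from the prototypes.
SURFACE_TASKS = {
    "desk_home":   ["archive_photos", "sync_podcasts"],
    "desk_office": ["disable_call_forwarding"],
}

def tasks_for(surface):
    # Unrecognized surfaces trigger no tasks.
    return SURFACE_TASKS.get(surface, [])
```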
One group had the idea for a game inspired by "Tamagotchi". Instead of pressing a button to take care of the virtual pet, the player must explore their surroundings and train particular surfaces. To satisfy the pet's needs, the user has to visit a previously trained surface in a particular place, e.g., put the phone on the kitchen table when the pet is hungry.
Certain private data on the phone is intended for its owner only. So far we authenticate with PINs or swipe gestures, but phones lack biometric identification. In preliminary experiments we found that most people grip the phone in a unique way and that their hands tremble differently when the phone vibrates. This could be used to distinguish people and grant different access levels; for example, the boyfriend is allowed to use the phone but does not get to see the call list.
Following the Inspirational Bits method, we started developing three prototypes (sketches) that use different approaches to recognize familiar surfaces. We developed all of them to proof-of-concept level, i.e., to the stage where they could be tested and felt. This opened up the design space and helped us choose a direction for a prototype aimed at our overall design idea: a system capable of recognizing familiar surfaces and expressing familiarity with them.
The first sketch/prototype uses the accelerometer and was built and tested on a Samsung Galaxy S GT-I9000 running Android 2.3.3. A Support Vector Machine (SVM) is used for classification. We tested the prototype under two conditions: first, training and testing at one position on a single surface (C1); second, training and testing at multiple positions on a single surface (C2). For the first condition and three surfaces we reached the maximum success rate (SR) and balanced success rate (BSR) of 100 percent. As the number of surface categories grows, misclassification increases.
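For illustration, the pipeline of feature extraction and classification can be sketched as follows. As a much-simplified stand-in for the prototype's SVM we use a nearest-centroid classifier over two toy features; all surface names and accelerometer traces are invented.

```python
import math

# Hedged sketch of the recognition pipeline: extract features from an
# accelerometer trace, then classify. This is NOT the prototype's SVM;
# a nearest-centroid classifier stands in for it here.

def features(trace):
    """Two toy features of a trace of (x, y, z) samples:
    mean magnitude and RMS magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in trace]
    mean = sum(mags) / len(mags)
    rms = math.sqrt(sum(m * m for m in mags) / len(mags))
    return (mean, rms)

def train(samples):
    """samples: {surface: [trace, ...]} -> per-surface feature centroid."""
    centroids = {}
    for surface, traces in samples.items():
        feats = [features(t) for t in traces]
        centroids[surface] = tuple(
            sum(f[i] for f in feats) / len(feats) for i in range(2))
    return centroids

def classify(centroids, trace):
    """Return the surface whose centroid is closest to the trace."""
    f = features(trace)
    return min(centroids, key=lambda s: sum(
        (f[i] - centroids[s][i]) ** 2 for i in range(2)))
```

A hard surface reflects the vibration with higher amplitude than a soft one, which is roughly what these magnitude features pick up.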
The second sketch/prototype also uses the accelerometer and was built and tested on an iPhone 4S with iOS 5.0. It achieved a detection rate of 80 percent on a wooden table and 60 percent on cardboard. The hollower the surface sounds, the higher the error rate.
The third prototype uses the microphone of the mobile device to record the vibrations and reverberations caused by the device's vibration motor. It was implemented in Java and tested on an LG Optimus 2x running Android 2.3. It performs a frequency analysis of the recorded data but does not yet classify surfaces.
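The kind of frequency analysis involved can be sketched with a naive discrete Fourier transform over the recorded samples; this is our own illustration, not the prototype's Java code.

```python
import cmath

# Hedged sketch of a frequency analysis over microphone samples:
# a naive O(n^2) DFT, illustrative only.

def magnitude_spectrum(samples):
    """Magnitudes of the first n/2 DFT bins of a real-valued signal."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

def dominant_bin(samples):
    """Index of the strongest non-DC bin; the loudest bin hints at the
    resonance the vibration motor excites in the surface."""
    spectrum = magnitude_spectrum(samples)
    return max(range(1, len(spectrum)), key=spectrum.__getitem__)
```

A classification step could then compare such spectra against spectra of previously trained surfaces, which is exactly the part the third prototype still lacks.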
The video shows a demonstration of our sketches.