I aim to shape products, interfaces and services that mediate meaningful dialogues between people, systems and their environments within everyday life.



CASCON 2009 Wrap-Up

The CASCON conference wrapped up last week so here’s a bit of a summary of a few more interesting talks and workshops I attended.

Technology of Google Wave

Alex Nicolaou, Mobile Engineering Manager at Google, presented an interesting keynote about Google Wave. He introduced the concept and the idea behind Wave for those who were not familiar with it. He talked about some cool product features I didn’t know about, such as grammar-based spell checking that could be implemented for other languages (and perhaps even for programming languages?), uploading photos to create a shared album, access control for different parts of a Wave, and private annotations. In terms of platform, robots and gadgets can be added to and embedded in Waves for added interactivity and extended functionality.

As someone who has previewed Google Wave and was initially confused by the entire application, I posed the question: The current email platform is very simple and easy to use, but reactions so far to this new paradigm of combining chat, email and Google Docs into one have been mixed. How does Google envision widespread adoption of the application when the user experience is complex and confusing?

Alex explained that Google didn’t predefine specifics on how one can or should use Wave. Since it does so many things, many patterns of usage and behaviour will emerge. The most interesting uses may well be unexpected activities the tool was never designed for. I later found a site that lists a wide variety of possible use cases in different contexts, so it will be exciting to see what comes out of Google Wave.

Sensor-Based Support of Clinical Contexts in Hospitals

This engaging workshop was conducted by Mark Chignell, director of the Interactive Media Lab at the University of Toronto. He introduced the use of sensors as a tool for facilitating smart interactions that understand the context and situations of our environment, so that humans can work smarter, not harder. Smart interactions are significant for health care because of the criticality, complexity and richness of data within the sector. For example, using sensors to identify problematic clinical contexts can provide decision support, simplify tasks, and improve doctor/patient interactions. We had a guest scientist/physician, Dr. Jacques Lee from Sunnybrook Hospital, participate in the discussion, which was quite valuable for understanding current processes and problems and for gaining feedback on the idealized scenarios and user study evaluations presented by IML researchers.

Dr. Lee presented an interesting topic that he specializes in: sensing and preventing delirium in the emergency department (ED). Delirium is an acute brain failure that is common and preventable, yet easy to miss and potentially lethal. Approximately 30-35% of patients develop delirium as they remain immobile in the ED, but many of these patients are sent home because the condition was never detected by a doctor. Delirium can usually be detected by sensing abnormal extremes (hyperactivity or inactivity) and by testing direct cognitive tasks. Accelerometers attached to the thigh or behind the ears to sense hyperactive motion are one possible indicator. Questions of practicality and acceptance must then be considered, including visual appearance and obtrusiveness.
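To make the sensing idea concrete, here is a minimal sketch of how accelerometer windows might be classified into activity extremes. The window format, thresholds and labels are my own illustrative assumptions, not anything from the workshop or a clinical system.

```python
# Hypothetical sketch: flag abnormal activity extremes from accelerometer samples.
# Each sample is an (x, y, z) acceleration tuple; thresholds are illustrative only.

def activity_level(samples):
    """Mean acceleration magnitude over a window of (x, y, z) samples."""
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in samples]
    return sum(mags) / len(mags)

def classify(samples, low=0.02, high=1.5):
    """Return 'inactive', 'hyperactive', or 'normal' for one window."""
    level = activity_level(samples)
    if level < low:
        return "inactive"
    if level > high:
        return "hyperactive"
    return "normal"
```

A real system would of course need gravity compensation, clinically validated thresholds, and the cognitive-task testing Dr. Lee described alongside the motion signal.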

Overall the workshop delved into some interesting discussions between designers, researchers, healthcare specialists and technologists regarding the future of sensor-based technologies that can be used to improve current healthcare processes and human-computer/human-human interactions.


Technical Explorations

To implement a working model of my memory marbles as proof of concept, I looked into various possible technologies for detecting marbles and transferring/reading data. RFID and Bluetooth were the most common and appropriate technologies for an ideal implementation. I ventured out to the Hong Kong RFID Centre in Shatin to take a look at the various forms (tags and readers) and their uses in real applications. It was quite interesting, but the smallest RFID tags were still too big to be used with marbles (1.75cm diameter). Additionally, since the RFID tag and reader must be parallel to each other for proper data transfer, a tag inside a marble may end up sitting perpendicular to the reader because of the rolling nature of marbles, resulting in no tag detection.


Due to project time constraints, I have decided to simulate a working interactive model using microcontrollers and switches to communicate between the physical model and a PC. Switches installed in the physical model will be triggered by marbles as they are inserted, and signals will be sent to a keyboard interface to activate a particular keystroke. A program running on the PC will then receive the keystroke as input and output audio and visual displays.
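The PC-side logic can be sketched as a simple keystroke-to-marble dispatch. The key bindings, slot names and "play" action below are placeholders I've invented for illustration; the real program would start actual audio and visual playback.

```python
# Illustrative sketch of the PC-side program: each keystroke sent by the
# keyboard interface corresponds to a marble slot in the physical model.

MARBLE_KEYS = {
    "a": "memory-01",  # e.g. the switch in slot 1 emits the 'a' keystroke
    "s": "memory-02",
    "d": "memory-03",
}

def handle_keystroke(key):
    """Look up the marble slot for a keystroke and return the playback action."""
    memory = MARBLE_KEYS.get(key)
    if memory is None:
        return None  # a key not wired to any slot is ignored
    # In the working model this would trigger audio and visual output.
    return f"play {memory}"
```

This keeps the microcontroller side trivially simple: it only needs to emit one keystroke per switch, and all the display logic lives on the PC.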


General technical overview of the actual implementation versus the ideal implementation.



iSerendipity Interactive Lounge

During my Embedded Interaction workshop with Michael Fox, we were immersed in a group project to design an interactive environment demonstrated with a kinetic model. My group came up with the concept of an ambient space called iSerendipity:

iSerendipity is an ambient lounge that enhances mood, sociability and interactivity among people. Organic-shaped pods float amongst each other through the space and light up once a person steps on one. The pods detect the activity levels of the people on them and drift through the space, either isolating people for contemplative reflection or clustering active groups to enable chance encounters – serendipity. Pod lights are time- and context-sensitive: initial activation of a pod stimulates a glow that intensifies over time, and colour hues change according to activity levels. The exterior façade displays the harmonious movement and colour intensity of each pod as aesthetic visual information to passersby.
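The lighting behaviour described above can be sketched as two small functions: glow intensity ramping up with occupancy time, and hue shifting with activity. The ramp time, hue range and activity scale are invented for illustration and are not part of the actual model.

```python
# Rough sketch of the pod lighting behaviour: glow grows with time on the pod,
# hue shifts with activity level. All constants are illustrative assumptions.

def glow_intensity(seconds_occupied, ramp=60.0):
    """Intensity from 0.0 to 1.0, reaching full glow after `ramp` seconds."""
    return min(1.0, seconds_occupied / ramp)

def pod_hue(activity):
    """Map an activity level (0.0 calm .. 1.0 energetic) to a hue in degrees:
    calm pods sit in cool blues (240), active pods shift toward warm reds (0)."""
    activity = max(0.0, min(1.0, activity))
    return 240.0 * (1.0 - activity)
```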



Videos of iSerendipity’s interaction points in motion

See the process blog here.



I just started a new course this week called Embedded Interaction with Michael Fox, who specializes in interactive architecture. For our first exercise we’ve been asked to build a kinetic structure inspired by nature or biological systems (also known as biomimetics). We can draw examples from plant tropism, bird wings, spiderweb structure, etc. For example, Velcro was invented after the engineer George de Mestral noticed how burrs hooked onto his dog’s fur.

Interesting examples for consideration:


weeping willow

pinecones will open when warm and dry but close in cold and damp conditions


Although this project won’t involve motors or gears, our later projects will. Flying Pig is a basic but useful resource for future reference.