I aim to shape products, interfaces and services that mediate meaningful dialogues between people, systems and their environments within everyday life.

Posts tagged ‘hackathon’


Oct
06
2014

Stories on the Go

Weekend hackathons are quite an investment of time and effort, but it's so fun to come up with new ideas and build them out in a 24-hour time crunch. The organizers had a pair of Vuzix Smart Glasses lying around, so my friend and I decided to experiment with them and brainstorm some fun ideas to build.

Being an avid jogger (him) and bicyclist (me), we thought: wouldn't it be great if we could record our scenic jogs and rides without having to stop and interrupt our activity? We came up with a simple idea: wear the smart glasses on the go to record photos hands-free, using a quick nod of the head to trigger a camera capture. The photo series, recorded along with your map route, acts as a diary to help you remember the details of your outings.
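To give a flavour of the gesture trigger, here's a toy sketch of nod detection (this is illustrative, not the actual Vuzix SDK or our hackathon code): a nod shows up in the glasses' vertical accelerometer as a sharp dip followed by a rebound, and the function and thresholds below are assumptions for the sake of example.

```javascript
// Hypothetical nod detector: scan a window of pitch samples for a
// dip below dipThreshold followed by a rebound above reboundThreshold.
function detectNod(pitchSamples, dipThreshold = -0.5, reboundThreshold = 0.3) {
  let dipIndex = -1;
  for (let i = 0; i < pitchSamples.length; i++) {
    if (dipIndex === -1 && pitchSamples[i] < dipThreshold) {
      dipIndex = i;                  // head tipped down
    } else if (dipIndex !== -1 && pitchSamples[i] > reboundThreshold) {
      return true;                   // head came back up: that's a nod
    }
  }
  return false;                      // no dip-and-rebound in this window
}

console.log(detectNod([0.0, -0.2, -0.7, -0.4, 0.5, 0.1])); // true
console.log(detectNod([0.0, 0.1, -0.1, 0.05]));            // false
```

In practice a `detectNod` returning true would invoke the camera capture, so the rider never has to stop.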



May
14
2014

Arduino, D3, Illustrator, and … Bees!

Over the weekend I attended AT&T's Art+Tech hackathon, which challenged participants to combine technology, innovation, and art to raise awareness of the decline in bee populations.

I knew nothing about bees going in, but the talks from beekeepers and conservationists taught me some fascinating things and made me understand just how important bees are. Not only are bees the main pollinators of our food, they're clever at communicating exact pollen locations (distance and direction) through dance, and can actually "see" whether a flower is filled with pollen.

Seattle-Tacoma International Airport is home to the Flight Path project, which has turned scrubland into pollinator habitat for bees and, in parallel, has transformed a corner of the airport concourse into an art and educational exhibit.

In 24 hours, I teamed up with an awesome group of five, and we spent all of Friday night brainstorming ideas until we finally nailed down our concept after midnight. The next day we got down and dirty. I took on the challenge of creating the animated visualization and decided to use D3.js to make it happen. This was the perfect chance for me to learn D3… though dabbling with a new library to build a working prototype under huge time pressure could have proven disastrous. I didn't think I'd make it, but luckily I figured everything out and hooked up our Arduino counter to feed live data into the visualization. Hooray!

By the end, my team had produced a concept for a live interactive data visualization of the beekeeping operations in the airfield. The goals of our piece were to first attract, then engage and educate passengers about the plight of the bees.

ATTRACT PHASE
The main display is an animated visualization mimicking a bee hive that represents live bee activity based on data transmitted from Arduino motion detectors in the hives. Each hexagon lights up and fades out as bees enter/exit the hive. A small webcam stream from inside the hive helps connect the real-world activity to the abstracted visualization.
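As a rough sketch of how a honeycomb layout like this can be computed (this is an illustrative reconstruction, not our actual hackathon code), each hexagon's center falls on an offset grid where odd rows shift by half a cell so the cells tessellate:

```javascript
// Hypothetical honeycomb layout for pointy-top hexagons: odd rows
// are offset by half a hex width, and rows overlap vertically by a
// quarter hex so the cells interlock.
function hexCenter(row, col, radius) {
  const width = Math.sqrt(3) * radius;              // width of a pointy-top hex
  const x = col * width + (row % 2) * (width / 2);  // odd rows shift right
  const y = row * radius * 1.5;                     // vertical row spacing
  return { x, y };
}

console.log(hexCenter(0, 0, 10)); // { x: 0, y: 0 }
console.log(hexCenter(1, 0, 10)); // offset half a cell: x ≈ 8.66, y = 15
```

In a D3 view, each center would anchor a hexagon path whose opacity is animated up and faded out as the Arduino detectors report bees entering and exiting.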

INTERACT PHASE
When passersby approach the exhibit, a motion detector sensing a viewer standing in front of the screen triggers the INTERACT phase. The flashing visualization turns into animated infographics that tell a compelling story about the decline of bees, which we hope will raise awareness of the importance of bees and the positive impact of the Flight Path project on bee conservation. The final display is a live Twitter stream that collects all the "buzz" about the installation, encouraging people to spread the word about the exhibit and about the bees.

The weekend project was a ton of fun. Not only did I meet some great people and teach myself D3 in less than a day, but my team was also one of the top 3 winners of the event (a sweet bonus indeed!)

Sep
15
2013

Visualizing Your Everyday Data

The Microsoft Garage hosted the Everyday Data UX Hackathon last month, posing the challenge to create an interactive visualization of the data that surrounds us in our everyday lives.

We are always producing data; with the pervasiveness of wearables, we can collect so much information about ourselves to monitor, maintain, and improve our health. We looked at existing quantified-self tools like Fitbit, Jawbone's Up, and the Nike FuelBand, which all visualize the results of users' actions, but only as discrete bits of data that provide no mid- to long-term perspective, nor any easy way to correlate metrics to understand how to adapt or change behaviour. What if the data from those sources could be tied into dynamic, reactive visual documents that reflect, in real time, how adjusting your behaviour could directly impact your life?

We set out to build a visualization that bridges the gulf between reflection and execution. We identified three main problems to tackle in order to achieve this goal: 1) data heterogeneity (how can we combine data of multiple dimensions to make correlations?); 2) macro overview vs. micro details (how can we drill into details and navigate our data points over time?); and 3) emotional design for motivation (how can a visualization motivate people toward behavioural change?).

[Image: sketches]

We explored many visualization methods to combine different dimensions of data. The metrics we were interested in were number of steps, activity level, hours of sleep, calories burned, number of floors climbed, etc. We used radar charts to summarize a person's activities over a day, where each dimension is measured along a different axis originating from the same point. This produces a unique shape that can be overlaid on previous days' activities to compare how you're doing over time, or overlaid on a goal shape you've set. We also enabled detailed drill-downs to view a time series charting a single metric over a longer period, which can also be compared with other measures (e.g. calories consumed vs. calories burned).

[Image: day_view]
[Image: multi_views]

We also proposed an aesthetic, glanceable lock screen that shows your day's abstract shape. Live tiles on your phone can also show you a summary of your day's progress through an easy-to-digest visualization.

[Image: mobile]

Our design struck a fine balance between data art and raw data: a visualization that is not only functional in correlating your multiple activities, but also easy to understand and motivating enough to help people adjust their behaviour. For future development, we'd like to look at plugging in different data sources (currently we're only using Fitbit data), increasing interactivity, like manipulating one dimension to see its impact on another (e.g. can increasing step count improve sleep quality?), and providing smart inferences to make the data more actionable.

[Image: screenshot]

Jul
03
2013

Talking Teddy

Last weekend I attended the AT&T mobile app hackathon, my very first hackathon. Within a 24-hour period, we had to form teams and build a mobile app for education.

I teamed up with a dynamic, multidisciplinary group to develop an idea for a stuffed teddy bear that teaches kids in a tangible, interactive, and engaging way. By combining touch, visuals, sound, and voice with an understanding of the turn-taking principles necessary for natural conversation and for keeping attention, we created an immersive and entertaining experience for children.

Our proof of concept integrated a mobile device into a teddy bear. The mobile device runs an application that listens to voice input and uses natural language processing to formulate an appropriate verbal response via text-to-speech. We developed a small library of educational lessons, like recognizing colours, learning the alphabet, and listening to animal sounds, as well as little game rewards that have Teddy tell a joke or sing a song.
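A toy version of the lesson lookup might look like the sketch below (this is a hypothetical illustration, not our actual app, and the lesson lines are made up): the recognized phrase is matched against lesson keywords, and the matching reply would be handed off to text-to-speech.

```javascript
// Hypothetical lesson library: keyword -> Teddy's spoken reply.
const lessons = {
  cow: "A cow says moo!",
  alphabet: "A is for apple, B is for bear...",
  red: "Red is the colour of apples and fire trucks!",
  joke: "Why did the bear cross the road? To get a bear hug!",
};

// Match any recognized word against the lesson keywords; the
// fallback keeps the conversational turn going when nothing matches.
function teddyReply(utterance) {
  const words = utterance.toLowerCase().split(/\s+/);
  for (const word of words) {
    if (lessons[word]) return lessons[word];
  }
  return "Can you say that again?";
}

console.log(teddyReply("What does a cow say")); // "A cow says moo!"
```

Real speech input is noisier than this, of course, which is roughly where our demo hiccups came from.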

For the mobile device I designed a simple and fun UI to provide helpful visuals accompanying the audio and the ongoing conversation between teddy and child. To manage the child's learning progress and lesson plans, we created an admin UI for the parent or teacher to choose the education level and the types of lessons/games, as well as track the child's progress over time.

[Image: interface]

By the end of the 24-hour period, all teams went up to do a show-and-tell of their prototypes. We hit a couple of hiccups with the voice recognition in our demo, but it was really well received. We ended up being the grand prize winner – not too shabby for a first-time hacking experience!

[Image: talking_teddy]
