I aim to shape products, interfaces and services that mediate meaningful dialogues between people, systems and their environments within everyday life.
Over the weekend I attended AT&T’s Art+Tech hackathon, which challenged participants to combine technology, innovation, and art to raise awareness of the decline in bee populations.
I went in knowing nothing about bees, but the talks from beekeepers and conservationists taught me some fascinating things and made me appreciate just how important bees are. Not only are bees the main pollinators of our food, they’re clever at communicating exact pollen locations (distance and direction) through dance, and they can actually “see” whether a flower is filled with pollen.
Seattle-Tacoma International Airport is home to the Flight Path project, which has turned scrubland into pollinator habitat for bees and, in parallel, has transformed a corner of the airport concourse into an art and educational exhibit.
The hackathon ran for 24 hours. I teamed up with an awesome group of five, and we spent all of Friday night brainstorming ideas until we finally nailed down our concept after midnight. The next day we got down and dirty. I took on the challenge of creating the animated visualization and decided to use D3.js to make it happen. This was the perfect chance for me to learn D3, though dabbling with a new library to build a working prototype under huge time pressure might have proven disastrous. I didn’t think I’d make it, but luckily I figured everything out and hooked up our Arduino counter to pass live data into the visualization. Hooray!
By the end, my team had produced a concept for a live, interactive data visualization of the beekeeping operations on the airfield. The goal of our piece was to first attract, then engage and educate passengers about the plight of the bees.
The main display is an animated visualization mimicking a beehive, representing live bee activity based on data transmitted from Arduino motion detectors in the hives. Each hexagon lights up and fades out as bees enter or exit the hive. A small webcam stream from inside the hive helps connect the real-world activity to the abstracted visualization.
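The light-up-and-fade behavior can be sketched as a small piece of state logic (the function and variable names here are hypothetical; the actual prototype drove D3.js transitions from Arduino data):

```javascript
// Sketch of the hive display's fade logic: each hexagon cell tracks a
// brightness from 0 (dark) to 1 (fully lit). A motion event lights a cell;
// each animation frame fades all cells back toward dark.
// (Hypothetical helper names; the real piece rendered this with D3.js.)

function createHive(cellCount) {
  return new Array(cellCount).fill(0);
}

// A motion event from the Arduino lights up the matching cell.
function onBeeEvent(hive, cellIndex) {
  hive[cellIndex] = 1;
}

// Called every animation frame: all lit cells fade out gradually.
function fadeStep(hive, decay = 0.1) {
  for (let i = 0; i < hive.length; i++) {
    hive[i] = Math.max(0, hive[i] - decay);
  }
}
```

In the exhibit, a renderer would map each cell's brightness to a hexagon's fill opacity on every frame.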
When a passerby approaches the exhibit, a motion detector sensing the viewer standing in front of the screen triggers the INTERACT phase. The flashing visualization turns into animated infographics that tell a compelling story about the decline of bees, which we hope will raise awareness of the importance of bees and of the Flight Path project’s positive impact on bee conservation. The final display is a live Twitter stream that collects all the “buzz” about the installation, encouraging people to spread the word about the exhibit and about the bees.
The weekend project was a ton of fun. Not only did I meet some great people and teach myself D3 in less than a day, but my team was also one of the top 3 winners of the event (a sweet bonus indeed!).
I took apart a keyboard to figure out the combination of signals required for different keystrokes. By connecting switches to trigger a certain combination, a keystroke can then be sent to the PC as input to be processed by the computer program.
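The idea behind the hack can be sketched as a lookup: a keyboard controller sees each key as a closed connection across one row/column pair of its matrix, so an external switch wired across the same pair produces the same keystroke. The matrix layout below is entirely hypothetical; a real keyboard's wiring has to be traced by hand, as I did:

```javascript
// Sketch of a keyboard matrix: each row/column signal pair maps to a key.
// (Hypothetical layout for illustration; every keyboard model differs.)
const KEY_MATRIX = {
  "1,3": "A",
  "2,5": "Enter",
  "0,7": "Space",
};

// Closing a switch across a given row/column pair produces that keystroke,
// or null if the pair isn't wired to any key.
function keyForSwitch(row, col) {
  return KEY_MATRIX[`${row},${col}`] ?? null;
}
```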
I conducted further interviews focused on memory recording, organizing, and sharing. In one instance, my interviewee showed me all her memory devices (PDA, cellphone calendar, appointment book), but her problem was that she always forgot to consult them to check for important dates and appointments. Thus, I realized that a reminder system is another important feature for these Baby Boomers.
An interaction model of the Memory Marbles system.
I did some rapid prototyping to communicate my interaction concept and to give a better idea of the form and scale of the model/system. Marbles can be carried around in a pouch. On the memory player, once the marbles are enclosed inside the dome, their information can be read and transmitted from the dome.
Likewise, to record memories from the PC onto marbles, placing the dome over the marbles activates wireless communication between the computer and the marbles.
To evaluate our vending machine redesign, we prototyped a life-sized physical model on which to conduct user testing. We tested the prototype with three users and came up with initial results to help us improve the next design iteration.
In general, users liked the coin bucket for dropping change all at once, and especially the idea of paying for a drink with a combination of coins and an Octopus card. A touch-based screen with clear icons made it easier to make selections, and updating the screen view based on context made it easier to understand (for example, showing pictures of only the available drinks). Additionally, users enjoyed the ease of collecting the drink without needing to bend down to pick it up.
However, our interface and screen flow still didn’t easily support buying multiple drinks, or different kinds of drinks, in one transaction. We therefore rethought our interface as a single dynamic screen that provides clear feedback on user choices and actions through animations. For ease of use, the interface supports variations of actions (simple selection, drag and drop, or both), allows users to fix mistakes, and provides prompts to help users when no input has been received for a while.
Fast prototyping of a physical model allowed us to quickly test and validate our ideas while highlighting problems to solve in the next iteration. In the book The Art of Innovation, one IDEO designer talks about “build[ing] to learn.” Indeed, our prototype helped us shape an improved vending machine concept that everyone understood and that created an enjoyable user interaction and experience.