This course focuses on thoughtfully and critically embedding computational media into the physical world. We will make, tinker, and experiment with high-tech and low-tech materials. Our hands-on, materially-oriented work will be grounded in theoretical concepts from HCI (Human-Computer Interaction), design, and information studies. Our final project showcase is Tuesday, April 28, 10.30-11.45am at DAR. This course is taught by Stacey Kuznetsov, assistant professor at the School of Arts, Media, and Engineering.



Burrito Compass – The Final(?) Version

The Burrito Compass: Your Late-Night Food Finding Friend

Perhaps “compass” is a bit misleading: through trial and error, it became more like a burrito divining rod – always pointing you in the right direction. In this case, it’s in the direction of the Chipotle on University.

The key to a good joke is to commit, and that’s exactly what happened with this project. I wanted something big and bulky and obnoxious and single-use – essentially, an anti-app. I wanted something you’d laugh about when you heard it. In these regards, it was an outstanding success.

Here’s the final product:

[Photos of the finished Burrito Compass]

The original prototype was made out of cardboard (as you can see here). I wanted to make the next one out of 1/8″ birch; however, the fabrication lab was completely out, so I substituted 1/8″ styrene. C’est la vie. I did, however, apply a nice layer of stick-on wood finish to achieve maximum tackiness.

The Burrito Compass works thanks to several components, chief among them a GPS module and a continuous rotation servo (more on both in the lessons below).

You can see the code here.

Over the course of this project, I learned several important life lessons:

  • Continuous rotation servos take a value for speed, not angle
  • Be patient with GPS modules: they need a fix on several satellites before you get meaningful data, but once locked, they give you meaningful data in spades. They won’t, however, give you heading data without movement.
  • In theory, you can have up to 128 I2C devices on a single I2C port (fewer in practice, since some addresses are reserved). I only used two.
  • The GPS library and the Servo library just don’t get along.
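For anyone tempted to build their own, the heart of a burrito compass is computing the initial bearing from your current GPS fix to the target. A quick sketch of that math in C++ (the coordinates and names below are made-up placeholders, not pulled from the actual project):

```cpp
#include <cmath>

// Initial bearing (forward azimuth) from point 1 to point 2, in degrees [0, 360).
// Feed it your current GPS fix and the target's coordinates, then turn the
// arrow to (bearing - current heading).
double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
    const double RAD = 3.14159265358979323846 / 180.0;
    double phi1 = lat1 * RAD, phi2 = lat2 * RAD;
    double dLon = (lon2 - lon1) * RAD;
    double y = std::sin(dLon) * std::cos(phi2);
    double x = std::cos(phi1) * std::sin(phi2)
             - std::sin(phi1) * std::cos(phi2) * std::cos(dLon);
    double deg = std::atan2(y, x) / RAD;
    return std::fmod(deg + 360.0, 360.0);  // normalize into [0, 360)
}
```

The heading caveat above is why this matters: GPS alone tells you where you are, not which way you're facing, so you either need to be moving or add a magnetometer before the arrow means anything.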

Also, a giant thank you goes out to Erik Peters (who did some sleuthing to help me parse out the correct GPS data). Couldn’t have done it without him.

Yeast Sonification Synthesis Sonnet in D Minor

For the final project I thought it would be fun to incorporate my music background into something in the class. Over the past year I have really been getting into the idea of sonification, specifically making things musical that wouldn’t traditionally be musical. What I have created is a sound processing tool for the fermentation of sugars with yeast. The idea is that this tool lets the yeast create a composition or performance as it ferments the sugars. Going further, I liked the idea of this being something that could be used in a bakery or brewery as a way to pair a song with a specific bread or beer.

[Photos of the fermentation setup]

I started by building a few different synthesizers in Max/MSP. The next step was to record the fermentation process. Originally I intended to sense and record both the alcohol and the carbon dioxide produced by the yeast, but in the end I only used an alcohol sensor.

Since I didn’t want to keep fermenting sugar water every time I tested the sound synthesis, I first built a Max patch that records sensor data to a file, which can then be played back to simulate the fermentation as many times as needed. This also let me play with the time scale of the fermentation: I could speed up the data, effectively speeding up time. I ended up recording and using data from the fermentation of regular sugar and brown sugar. I recorded each for 3 hours and sped them up to condense them down to about 3 minutes. I also used a contact microphone to record the sound of the bubbles created by the carbon dioxide during fermentation.

After a lot of playing around with the mapping parameters in Max, I found something I liked that I also felt sonically represented the yeast. I run various calculations on the data to find both the amount of alcohol and its rate of change over time, and both are mapped to different parameters of the sound. The total alcohol content is mapped to the main frequency, or pitch, of its respective synthesizer. The regular sugar is synthesized using frequency modulation, the brown sugar using amplitude modulation, and the recording of the bubbles is fed into a granular synthesizer manipulated by the data.

Circuit Diagram:


Very simple Arduino code:

const int alcoPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(analogRead(alcoPin));  // send the raw reading to Max over serial
  delay(100);
}

Very messy Max Patch:

[Screenshot of the Max patch]

Video of project/song:

This was the first sonification project I’ve done where I thought about it compositionally, as telling a story. While working on the sounds I tried to keep in mind that I was trying to represent yeast and give it a voice, to make the process of fermentation meaningful in a new context. Giving a voice to something like yeast was quite challenging. I think using a bit of audio recording really helped glue everything together, as it brought an organic feel (after all, it is an organic process) to a very heavily digital sound.

The most rewarding part of this project was creating something I genuinely liked the sound of. On presentation day I got feedback that the sounds came out more interesting than people expected, which made me feel good about the way I sculpted the sound for the yeast. If I had more time I would try a whole list of things: I would definitely add more layers, and also get a carbon dioxide sensor to use that data as well. I also really like the idea of time stretching, where you could play different fermentation captures at different speeds or intervals. It makes me wonder whether you could get strong rhythmic content by offsetting things like that.

Interactive Chopsticks Final documentation

Concept & motivation: The motivation behind the interactive LED chopsticks was to create something fun that could also be used as a game or a learning device to teach others how to use chopsticks (since I myself didn’t know how). The chopsticks give the user feedback based on how they are used.

Process & Iteration: I don’t have my original storyboard, unfortunately, but I can honestly say the design of the project did not change too much visually from start to finish. I always wanted to use LEDs to give feedback, and the few changes I made between concept and finished product were simply about which components to use. At first I wanted an accelerometer or gyroscope to track user input, but in the end I chose capacitive sensing, since the other two options had already been done with chopsticks. The only other design idea I had was putting the LEDs on a wearable bracelet, but in the end I simply kept the lights near the breadboard.

When writing the code for the chopsticks, I used a capacitive sensing library as a base and found a library online for the NeoPixel LED ring that let me use it the way I wanted. By combining these and tweaking things on the Arduino, I was able to accomplish what I wanted. I made sure to credit the creator of the original NeoPixel code in the Google Doc with my code.

Final Design:

[Circuit diagram and photo of the chopsticks]


I copied my code into a Google Doc because it was too big to screenshot (it would have taken a lot of screenshots):

When the chopsticks aren’t being used, the light is off; when the tips touch, or a person touches a tip, the lights turn red; and when the chopsticks pick up food, the light turns green (or another color, depending on the mode).
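The feedback logic boils down to comparing the capacitive reading at the tips against a couple of thresholds. A sketch of that decision in C++ (the threshold numbers here are invented; real values have to be tuned to the environment):

```cpp
// Hypothetical thresholds -- actual values depend on the wiring, the resistor,
// and the environment, and need tuning on the real hardware.
const long TOUCH_THRESHOLD = 1000;  // tip-to-tip or skin contact reads high
const long FOOD_THRESHOLD  = 200;   // food gripped between the tips reads lower

enum Color { OFF, RED, GREEN };

// Decide the NeoPixel color from a capacitive reading at the tips.
Color tipColor(long reading) {
    if (reading >= TOUCH_THRESHOLD) return RED;    // tips touching, or a finger
    if (reading >= FOOD_THRESHOLD)  return GREEN;  // something gripped: success
    return OFF;                                    // idle
}
```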

Final Reflections: I learned to order materials sooner. A big issue with the project was simply waiting for my materials to arrive so I could test the code. The biggest challenge was getting all the wiring to work well; I ended up having to solder the wires to the NeoPixel ring before the connections became reliable. The most rewarding part was definitely getting the project to actually work (most of the time) in time for the showcase; all I had to do was tweak some values, since capacitive sensing can be a bit iffy depending on the location, it seems.

If I had more time to work on this project, I probably would have used the Adafruit Flora, a small wearable Arduino-compatible board in the spirit of the LilyPad, so that I could attach everything to a bracelet with its own power source.

Final Project – Interactive Card Mat



Materials:

  • Playing cards
  • Playing card sleeves
  • Copper tape
  • Arduino Nano
  • Alligator clips
  • 7-segment LED display
Extent of Function

The goal was to get a full playing card game working on the mat, with the cards interacting with one another. Unfortunately, I couldn’t get a stable reading between cards to implement that, but I did get the display reacting to individual cards.
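For what it’s worth, getting a bare 7-segment display to react mostly comes down to a lookup table of segment patterns. A sketch of that table in C++ (this assumes a common-cathode display with bit 0 = segment a through bit 6 = segment g; the real mapping depends on how the display is wired):

```cpp
#include <cstdint>

// Segment patterns for digits 0-9 on a 7-segment display.
// Bit layout: 0b0gfedcba (bit 0 = segment a, ..., bit 6 = segment g).
const uint8_t DIGIT_SEGMENTS[10] = {
    0b0111111,  // 0
    0b0000110,  // 1
    0b1011011,  // 2
    0b1001111,  // 3
    0b1100110,  // 4
    0b1101101,  // 5
    0b1111101,  // 6
    0b0000111,  // 7
    0b1111111,  // 8
    0b1101111,  // 9
};

// Look up the pattern for a card value; anything out of range blanks the display.
uint8_t segmentsFor(int cardValue) {
    return (cardValue >= 0 && cardValue <= 9) ? DIGIT_SEGMENTS[cardValue] : 0;
}
```

On the Arduino side, each bit of the pattern would then be written out to the pin driving that segment.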

Check the Video Below

Mountain Biking Trail Warning System

The idea behind my project was to create a warning system that illuminates a rear-facing light when you ride over a rough section of trail while mountain biking. Whenever I ride with another person, it is difficult to warn them about a technical part of the trail, so I thought of this! The device mounts on the rear seatpost so that it is visible to riders behind you. The form houses the sensor and battery, and even has a place to carry an inflator and a tool.


Since the device is not quite large enough to house the wiring, I made an example of what the system should be and how it operates. I hooked up an accelerometer and coded it so that the sensor activates a light when shaken (see video). At rest, the sensor constantly reads about 9.8 m/s² on the Z axis (gravity), so the code treats any reading above 13 m/s² or below 5 m/s² as a bump. This accounts for the up-and-down motion experienced while riding over something rough.
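The trigger condition fits in one tiny function. A sketch in C++, using the 5 and 13 m/s² thresholds from the write-up (the exact numbers would be tuned on the trail):

```cpp
// Resting Z-axis reading is just gravity, ~9.8 m/s^2. Anything well outside
// that band means the bike hit something rough, so light the warning LED.
const double Z_HIGH = 13.0;  // m/s^2
const double Z_LOW  = 5.0;   // m/s^2

bool isBump(double zAccel) {
    return zAccel > Z_HIGH || zAccel < Z_LOW;
}
```

In the loop, the light would be switched on whenever isBump() returns true, then held on briefly so the rider behind has time to see it.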

PVC was used to create the form factor. I cut, sanded, and glued varying pieces together in order to make a flush and tight fit on the seatpost.

Final Reflection:

I learned how to connect an accelerometer to the Arduino and read its X, Y, and Z axes. I learned how to fabricate a potentially useful product that would benefit many novice mountain bikers. The biggest challenges were getting the accelerometer to work properly and fabricating the device itself. If I had more time to work on this, I would construct the physical device out of something much lighter (carbon fiber) and get the sensor to actually fit inside the housing.

Refrigerator drawer sensor

Concept and Motivation
The idea for my project came from observing the habits of the people I know, as well as becoming aware of my own. I chose to work in the realm of food, looking at the design of the refrigerator and how it affects our eating decisions. Because the produce drawers are at the very bottom, people don’t reach for fruit or vegetables when looking for a snack. The low visibility also means people often forget what is in the drawers, which leads to food going bad. There are “smart refrigerator” concepts on the market, but they’re expensive and seem like overly complicated high-tech solutions to a somewhat low-tech problem. I wanted my solution to be affordable, and for the technology not to interrupt or complicate users’ day-to-day routines. The device encourages people to fill their produce drawers with auditory feedback, and it helps reduce waste by placing visual monitoring feedback on the outside of the fridge.

Process and Iteration
[Photos of the prototype]