Fish Fugue employs computer vision and the Arduino to enable a soloist to perform a duet with a goldfish. Kyoko Inagawa, Mel Kim, and I made this for Jesse Stiles’ class on Experimental Sound Synthesis at CMU. I was responsible for the concept and developed the software and hardware.  


Music is believed to be a universal language, capable of transcending cultural barriers and filling in the words that get lost in translation. Scientists have shown that music is not unique to human cultures; forms of it appear in animal cultures as well. Fish Fugue is a project that harnesses the universality of music to mediate a dialogue between a human and an animal.


A webcam, mounted on top of the tank, tracks the fish while an Arduino MEGA controls the notes played on a toy piano. As the goldfish moves to different parts of the tank, the melody changes to reflect the fish's position.

To let the Arduino control the toy piano, eleven piano keys were tied to eleven solenoids (with machine screws mounted on them) using fishing wire, and each solenoid was connected to a digital pin on the Arduino. A HIGH signal from the Arduino pulls a key down, while a LOW signal releases the solenoid so the key springs back up. Melodies are stored on the Arduino as sequences of key-down and key-up messages, and a scheduler fires each message at the right moment so the notes keep accurate timing.

The computer-vision fish-tracking system was implemented in Processing. Since the fish was tracked by its hue, a blue sheet of paper was placed beneath the tank to establish as much contrast between the fish and the background as possible. The video feed was split into four quadrants, each representing a melody to be played on the toy piano. Each time the fish moved to a different quadrant of the tank, the Processing program sent a byte to the Arduino's serial port indicating which melody should be played.