PHONOBYTE

INTERACTIVE INSTALLATION

The Humour in Errors

Originally designed for the Siggraph 2011 Art Gallery, Phonobyte was created by Alanna Kho and me as a physical analogue to the often humorous errors that arise from machine misinterpretation. Sites cataloguing the hilarity that ensues from phone text typos, along with the many videos of misinterpreted voice commands, highlight the occasional frustration and comedy that arise as people rely more and more on digital systems in their daily lives.

Our goal was to condense this scenario by creating a kind of physical game of “Telephone”, where users would stand at one of two screens. Each screen would output (via projector) a low-resolution version of whatever was directly in front of the opposite screen. A microphone would take voice input, run it through speech recognition software, and output it as synthesized speech. In this way users could try to communicate with each other, often with hilarious results. As with many art installations, we hoped Phonobyte would spark conversations about machine interpretation, but, just as importantly, do so in a fun and interactive manner.

Phonobyte Teaser Image

The Project in a Nutshell

A Closer Look

Materials were a huge consideration in creating Phonobyte. We needed to make sure that it would be easy to build, take down, and transport, so primarily lightweight components were chosen. Different plastics were tested with a variety of LEDs (for colour and intensity), and we eventually settled on a Twinwall Polycarbonate sheet behind a thinner “Lighting White” plastic material. The polycarbonate has flutes running throughout its structure, which we used to simulate the scanlines of old-school CRT monitors and TVs, while the white plastic provided a clean white sheen for the front of the structure. Because we were aiming to “de-rez” the communication experience, the aesthetic of the chosen materials matched our intentions quite well.

MAX/MSP was used to take the camera input from webcams on either side of the installation and translate it into a low-resolution video feed sent out through the projectors. Settings were built in so that the balance between what the camera was picking up and what was being projected could be adjusted for the lighting conditions. The Max patch also took in the speech recognition data from Dragon Dictate and relayed the text to the other screen via a synthesized computer voice.
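The real pipeline lived in that Max patch, but the de-rez step itself is simple to illustrate. The Python/OpenCV sketch below is only an approximation, not the original patch (the 32×24 grid and brightness gain are assumed values, not the installation’s actual settings): it grabs webcam frames, shrinks them to a coarse grid, and scales them back up with nearest-neighbour interpolation so each giant pixel reads like a block on the projected screen.

    import cv2

    # Rough approximation of the de-rez step, not the original Max/MSP patch.
    GRID = (32, 24)        # assumed low-resolution grid (width, height)
    BRIGHTNESS_GAIN = 1.0  # stand-in for the lighting-condition adjustment

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Downsample to the coarse grid, then blow it back up with
        # nearest-neighbour interpolation to keep the hard pixel edges.
        small = cv2.resize(frame, GRID, interpolation=cv2.INTER_AREA)
        big = cv2.resize(small, (frame.shape[1], frame.shape[0]),
                         interpolation=cv2.INTER_NEAREST)
        big = cv2.convertScaleAbs(big, alpha=BRIGHTNESS_GAIN)
        cv2.imshow("projector output", big)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()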

Dragon Dictate was the speech recognition software used to capture and interpret each participant’s voice. As the user spoke into the microphone, the program typed out what it thought he or she was saying into a text field in the MAX/MSP patch, which then spoke the text aloud to the other user. This voice was output through speakers on either side of the installation.
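The same speak-in, speak-out loop can be sketched without Dragon Dictate or Max. The Python example below stands in for both (the speech_recognition and pyttsx3 libraries are my substitutions, not what the installation ran): it listens on a microphone, converts speech to text, and immediately reads the recognized text back out through a synthesized voice, misinterpretations and all.

    import speech_recognition as sr  # substitute for Dragon Dictate
    import pyttsx3                   # substitute for the Max synthesized voice

    recognizer = sr.Recognizer()
    voice = pyttsx3.init()

    with sr.Microphone() as mic:
        recognizer.adjust_for_ambient_noise(mic)
        while True:
            audio = recognizer.listen(mic, phrase_time_limit=5)
            try:
                text = recognizer.recognize_google(audio)  # speech -> text
            except sr.UnknownValueError:
                continue  # nothing recognizable; wait for the next phrase
            print("heard:", text)
            voice.say(text)      # text -> synthesized speech for the far side
            voice.runAndWait()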

Phonobyte was exhibited during the 2011 Culture Days event, as well as at the Interurban Art Gallery in Vancouver. On both occasions we were pleasantly surprised by the ways people chose to interact with the installation. Some joined together to convey a low-resolution lobster man to the other screen, while others tried dancing. Some thought to converse in another language, and still others tried singing and even rapping. The resulting machine interpretations were funny, odd, and sometimes even thoughtful.

Afterwards I had the chance to speak with many of these participants who, after coming face to face with Phonobyte, were more than happy to share their own frustrations with their devices and to begin a conversation about our increasing reliance on digital methods of communication. That we could discuss this subject in a lighthearted and fun manner with so many people was proof enough that we had accomplished what we set out to do in building Phonobyte.
