Computer Science doctoral student Charles Dietrich has created a new musical instrument incorporating a laptop and a camera. The instrument will make its debut in the next Boulder Laptop Orchestra (BLOrk) concert on March 2.
BLOrk will perform works by John Cage, Ornette Coleman and Pauline Oliveros as well as original works by ensemble members. The concert will also incorporate research from CU doctoral students Charles Dietrich (Computer Science) and Chris Chronopoulos (Astrophysics).
- Experience data originally emitted by the sun, then recorded, cleaned, raised to an octave and tonality that humans can easily hear, and turned into sounds that can be played on a keyboard. CU astrophysics doctoral student Chris Chronopoulos made this portion of the concert possible.
- A 3D camera will track the movement of the hands and fingers of CU computer science doctoral student Charles Dietrich. Using Intel technology, he can change the frequency and volume of sound with a wiggle and a wave of his fingers, creating a new musical instrument.
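The octave-raising step described above can be pictured in a few lines of code. This is an illustrative sketch, not Chronopoulos's actual pipeline: it assumes the solar data arrives as frequencies in hertz and simply doubles each one until it reaches an assumed audible band, which preserves the relative intervals in the data.

```python
# Hedged sonification sketch: shift a sub-audible frequency up by whole
# octaves (doublings) until it lands in an audible band. The target band
# and input units are assumptions for illustration.
import math

AUDIBLE_LOW = 110.0  # Hz; assumed lower edge of the target band

def raise_to_audible(freq_hz):
    """Double the frequency until it reaches the audible band."""
    if freq_hz >= AUDIBLE_LOW:
        return freq_hz
    octaves = math.ceil(math.log2(AUDIBLE_LOW / freq_hz))
    return freq_hz * 2 ** octaves

# A solar oscillation with a roughly 5.5-minute period (~0.003 Hz) is far
# below human hearing; shifted up 16 octaves it becomes an audible tone.
print(raise_to_audible(0.003))
```

Because only whole-octave shifts are applied, two data frequencies an octave apart in the raw signal remain an octave apart after sonification.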
BLOrk partners with leading artists in the fields of music, visual arts and technology to showcase creative innovations in both art and technology. Among the tools that BLOrk uses in its shows: traditional acoustic instruments, iPads, laptops, SuperCollider software and hemispherical speakers that project sound in a way similar to that of acoustic instruments.
BLOrk is the ensemble-in-residence of the Atlas Institute’s Center for Media, Art and Performance and is led by College of Music faculty John Gunther and John Drumheller. This concert is made possible in part by funding from the Chancellor’s Award for Excellence in STEM Education.
Five Questions for Charles Dietrich
What is BLOrk?
The Boulder Laptop Orchestra (BLOrk) is a group in the Music Department and ATLAS that performs experimental electronic music. The group collaborates with scientists and engineers to push the boundaries of performance.
How did you get involved?
I met John Gunther, the director of BLOrK, at a STEM (Science, Technology, Engineering and Math) poster presentation on campus in the fall of 2012. He was interested in collaborating with scientists for data sonification - creating music from scientific data. I had recently completed a side project to sonify the muscle activation energies during walking. We discussed a collaboration based on my research into motion capture and gestural interfaces.
What is the musical instrument you developed?
I developed a musical instrument that is controlled by hand gestures, using a camera and software kit provided by Intel. The musician can play notes and control their pitch. I am continuing to work on the instrument. One exciting aspect of the instrument is that it lets the audience see what the musician is doing. That visual connection has been largely lost with most laptop-based computer music.
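As a rough illustration of how such an instrument can turn gestures into sound (a hypothetical sketch, not the code Dietrich built on Intel's kit), a tracked hand position normalized to [0, 1] might control pitch and volume like this:

```python
# Hypothetical gesture-to-sound mapping. Horizontal hand position controls
# pitch on a logarithmic scale (so equal hand movement covers equal musical
# intervals); vertical position controls volume. The range limits are
# illustrative assumptions.
import math

def hand_to_sound(x, y, low_hz=110.0, high_hz=1760.0):
    """Map a normalized hand position (x, y) to (frequency in Hz, volume)."""
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    octaves = math.log2(high_hz / low_hz)  # 4 octaves with these defaults
    freq = low_hz * 2 ** (x * octaves)
    volume = y  # 0.0 = silent, 1.0 = full
    return freq, volume
```

With these defaults, a hand at the left edge (x = 0.0) produces 110 Hz and the midpoint (x = 0.5) produces 440 Hz, two octaves higher.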
How has the collaboration experience been for you?
Musicians have specialized knowledge about what sounds good and a repertoire of techniques that they use to make music. I decided early on that I wanted to make an instrument with a great degree of flexibility, so that the musicians could use it as they would a traditional instrument. The challenge is to make the instrument’s control gestures intuitive and repeatable, so that the musicians can learn to perform with it.
What surprised you about working with the musicians?
I was surprised that it was so well received! The musicians were excited to get their hands on new technology, so to speak, and to have a new way to use their hands to make music. It’s quite novel to be able to gesture in the air to make music, though the Theremin, an analog musical instrument that uses antennas, lets you do that. I was also surprised that they were able to learn the instrument so quickly. I showed up for the first demo with a version with frets, to make it easier to play in-tune notes, but the musicians preferred the fretless version, where they had to rely on their ears to stay in tune.
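The fretted and fretless modes Dietrich describes can be pictured as a simple pitch quantizer (again a hypothetical sketch, not his implementation): fretted mode snaps the continuous pitch to the nearest equal-tempered semitone, while fretless mode passes the raw frequency through and leaves intonation to the player's ear.

```python
# Hypothetical "fret" quantizer for a continuous-pitch instrument.
# Fretted mode rounds to the nearest equal-tempered semitone relative
# to A4 = 440 Hz; fretless mode returns the frequency unchanged.
import math

A4 = 440.0

def play_pitch(freq_hz, fretted=True):
    if not fretted:
        return freq_hz  # fretless: the player stays in tune by ear
    semitones = round(12 * math.log2(freq_hz / A4))
    return A4 * 2 ** (semitones / 12)

print(play_pitch(450.0))         # snaps to 440.0 (A4)
print(play_pitch(450.0, False))  # 450.0, unquantized
```

The trade-off mirrors the interview: frets guarantee in-tune notes, while the fretless path allows slides, vibrato, and microtonal inflection.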