
Here are some frequently asked questions about Audiopad. If your question is not answered here, please email James Patten at jpatten@media.mit.edu.

How much does Audiopad cost?

Audiopad is not currently for sale. We are exploring commercialization, but a retail product is far enough off that we do not have concrete prices or release dates at this point.

Can I have a copy of the source code/plans for Audiopad?

We are not able to release our source code at this point, but if you want to build something similar, you might be interested in reading this paper.

Who composed the music that you perform with Audiopad?

All of the music we perform with Audiopad was composed by Ben Recht, aka localfields.

Can the performer load in his/her own audio samples?

Yes. Samples are copied into a specific directory on the Audiopad computer, and can then be used in a performance.
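In shell terms, loading your own samples amounts to something like the sketch below. The directory path is hypothetical; the real sample directory on the Audiopad machine is specific to our setup and not published.

```shell
# Hypothetical sketch: /tmp/audiopad_samples stands in for the real
# (unpublished) sample directory on the Audiopad computer.
SAMPLE_DIR=/tmp/audiopad_samples
mkdir -p "$SAMPLE_DIR"

# Copy your own audio sample into the directory so a performance can use it.
touch mysample.wav            # stand-in for a real WAV file
cp mysample.wav "$SAMPLE_DIR/"
ls "$SAMPLE_DIR"
```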

Are you available to do performances or installations?

Yes, we enjoy doing performances and installations, though as busy graduate students we can't do them very often. Feel free to send us an email and let us know what you have in mind.

Will you build a software-only, screen version?

Possibly, but one of the main points of Audiopad is the use of physical objects to embody and control music. We believe this approach provides a more usable interface for musical performance than a keyboard and mouse.

Why don't you use a laptop and a touch screen, or a tablet PC?

We believe that one of the most important aspects of the design of Audiopad is the use of multiple physical objects that move, rather than a single large touch screen with interactive buttons, sliders, etc. The key difference is that real physical objects are much easier to interact with: your hands get passive haptic feedback from them, which helps you move them without requiring a lot of visual attention. If we were to use a touch screen, it would need to be a large one that could track multiple points of contact at a time and (ideally) differentiate between fingers and other objects on its surface. One step in the right direction is the Lemur, but there is still a ways to go before something like it could be used for Audiopad. (Thanks to Bram de Jong for sending in the Lemur link.)

Is there going to be a commercial product?

We are looking at that now. We will put more information on these pages when we can.

I am a musician, and I would like to use Audiopad in my performances.
I would like to beta-test Audiopad.

Thanks for your interest, but we only have a few Audiopad prototypes, and they are used for our own performances, installations and research. That being said, we are interested in being in contact with other artists doing related work. If that's you, please send email.

What operating system/libraries/etc. do you use?

We use Linux, specifically Debian or Knoppix. Most of the code is written in Python, except for the low-level tracking code, which is in C and C++. We use the Psyco optimizer to speed up the Python code. We use OpenGL for graphics and talk to the synthesizer over MIDI.
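For readers unfamiliar with Psyco: it was a specializing just-in-time compiler for Python 2, and the usual idiom was to enable it once at startup. The sketch below shows that idiom; the sample-scaling function is a hypothetical stand-in, not code from Audiopad itself.

```python
# Typical Psyco setup: enable the JIT once at program start, but keep the
# program runnable even where Psyco is not installed.
try:
    import psyco
    psyco.full()    # compile Python functions to machine code as they run
except ImportError:
    pass            # no Psyco: everything still works, just slower

def scale_samples(samples, gain):
    """Hypothetical pure-Python hot loop: apply a gain to audio samples."""
    return [s * gain for s in samples]

print(scale_samples([0.1, 0.2, -0.5], 2.0))
```

Guarding the import like this was common practice, since Psyco only ran on certain platforms and Python versions.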