The Code of Music

Day 468 @ ITP: The Code of Music

Final project : Color Theremin V2

WHAT IT IS

The Color Theremin is a gesture-controlled musical instrument that plays and manipulates sound samples based on my hand movements, with customizable lights in an accompanying light sculpture triggered along with the sounds.

WHY I MADE IT

I have been interested in the interaction between sound and light for many years. I made this instrument to have a more expressive way to manipulate sounds, paired with an interactive light component, and to have a new and fun way to perform, create, and record music. I was also inspired by some of the exercises we did in class using granular synthesis and manipulating samples along the x/y axes.

I also think the Color Theremin could eventually work as a standalone interactive installation, but it needs more work first. I will continue iterating on the project to improve the light/sound interactions.

HOW IT WORKS

The samples are triggered by hand movements on the X, Y, and Z axes: moving the hand left/right, up/down, and forward/backward triggers the samples and also manipulates the speed and volume at which they play. The core elements of the setup are Max MSP and MadMapper, plus addressable LEDs for the light component.
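To make the axis mapping concrete, here is a minimal plain-JS sketch of the idea. The ranges, axis assignments, and function names are illustrative assumptions, not the actual Max MSP patch logic:

```javascript
// Linearly remap a value from one range to another.
function mapRange(value, inMin, inMax, outMin, outMax) {
  const t = (value - inMin) / (inMax - inMin);
  return outMin + t * (outMax - outMin);
}

// Hand position normalized to 0..1 on each axis (ranges are assumptions).
function gestureToPlayback(x, y, z) {
  return {
    sampleIndex: Math.min(3, Math.floor(x * 4)), // left/right selects one of 4 samples
    playbackRate: mapRange(y, 0, 1, 0.5, 2.0),   // up/down scales playback speed
    volume: mapRange(z, 0, 1, 0, 1),             // forward/back scales volume
  };
}
```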

THE CODE

The Color Theremin V2 was programmed entirely using Max MSP for the sound, sending MIDI control changes to MadMapper to trigger the LEDs.
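For reference, a MIDI control-change message like the ones Max sends to MadMapper has a simple three-byte shape. This is a generic sketch of building such a message; the channel and controller numbers are placeholder assumptions, not the actual mapping in my patch:

```javascript
// Build a MIDI control-change (CC) message from a normalized 0..1 value.
function toCC(value, channel, controller) {
  // Clamp into the 7-bit 0..127 CC data range.
  const data = Math.max(0, Math.min(127, Math.round(value * 127)));
  // Status byte for control change: 0xB0 plus the zero-based channel.
  return [0xb0 + (channel - 1), controller, data];
}
```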

Max Patch

MadMapper

SCREEN CAPTURE/VIDEO

Day 447 @ ITP: The Code of Music

Final project proposal

For my final project in The Code of Music, I would like to expand upon a gesture-controlled instrument that I am building by creating a new sound palette for it. For our class I would like to come up with some musical/synthesized motifs that could be manipulated with gesture. Ideally I would like to have the instrument be something that others could play with as an interactive installation, or at least a tool I could use for performance.

early caricature of a color organ in use

The instrument itself is made of a set of acrylic tubes meant to emulate pipes. The idea was to create a deconstructed “color organ” (see image on right) meets theremin. Lights inside the pipes are triggered by hand gestures in the air around it, along with sounds mapped in a similar way. This makes the composition a bit unpredictable, but with time you can get the hang of it and repeat gestures, or simply enjoy the randomness.

I would be using a combination of Max MSP and MadMapper, controlled by a Leap Motion.

sketch of instrument/interaction

instrument being built in second messy prototype phase… (it does light up, but I’ll save that part for when it’s done for class!)

This is the first time I have made music with gesture (at least with hands waving in the air). I think it could be an interesting segue from the midterm, where I started to explore introducing different sounds/patterns and showing them with associated colors. In essence, this is a way to visualize the music, as well as to trigger/spatialize the sound in a stereo field and allow room for more expressivity.

Day 432 @ ITP: The Code of Music

Techno Is A Landscape: Proposal For An Interactive Museum Exhibit
Adi, Ada, Camilla (Group project)

SOME BACKGROUND

In the video above, producer James Wiltshire calls techno the “perfect balance between human analog inspiration and the technology around it,” and goes on to say that modern techno is a “perfect balance between humans and digital technology.” In this way, modern techno seems like a perfect musical tool for interfacing, as humans, with the automated world around us.

He also references the 1980 novel The Third Wave, where the author Alvin Toffler talks about a future that is half-machine, half-human. In the novel he describes a wave of technology coming forward and changing society (Note: this was written almost 40 years ago, and what he was anticipating has since come to pass.)

According to Toffler, the second wave was the industrial revolution; the third was the incoming information wave. He foresaw new technology and a new society that would sweep away what came before.

The early Krautrock band Kraftwerk were among the first adopters of electronics as a means of creating “automatic”-sounding music made by humans with tools from the technological wave Toffler describes. Their 1981 album Computer World explores this relationship between human and machine through their brand of proto-techno music.

According to Wikipedia, however, the term techno was not used as the name of a genre until 1988, in the context of Detroit techno, which is “seen as the foundation upon which a number of sub-genres have been built….To producers such as Derrick May, the transference of spirit from the body to the machine is often a central preoccupation; essentially an expression of technological spirituality. In this manner: ‘techno dance music defeats...the alienating effect of mechanization on the modern consciousness.’”

The central rhythmic component is most often in common time (4/4), where a single measure is divided into four beats marked by a bass drum on each quarter note pulse. Each beat can be further divided, with eighth notes dividing each quarter note in half and sixteenth notes dividing each quarter note into quarters. Famously called “four-on-the-floor,” the bass drum is the automated heartbeat or metronome around which all other sounds fall into place: a backbeat played by snare or clap on the second and fourth beats of the measure, and an open hi-hat sounding every second eighth note. The tempo tends to vary between approximately 120 and 150 beats per minute (bpm), depending on the style of techno. Techno focuses more on sophisticated rhythm and repetition (cross rhythm, syncopation) than on melody or chord progression. Development is gradual: many sounds are layered and altered over time. Together, these characteristics create something immediately identifiable as techno.
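The bar described above can be sketched as step-sequencer patterns, one bar of 4/4 as 16 sixteenth-note steps (1 = hit, 0 = rest); the 120 bpm figure in the duration check is just an example tempo from the range mentioned:

```javascript
const STEPS = 16;
// Bass drum on every quarter-note pulse ("four on the floor").
const kick = Array.from({ length: STEPS }, (_, i) => (i % 4 === 0 ? 1 : 0));
// Backbeat clap/snare on beats 2 and 4.
const clap = Array.from({ length: STEPS }, (_, i) => (i === 4 || i === 12 ? 1 : 0));
// Open hi-hat on every second eighth note (the offbeats).
const hat = Array.from({ length: STEPS }, (_, i) => (i % 4 === 2 ? 1 : 0));

// Duration of one sixteenth-note step at a given tempo.
function stepSeconds(bpm) {
  return 60 / bpm / 4;
}
```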

Techno is a landscape. It’s a reaction to, and artistic statement about, the automated world we actually live in. It’s half machine & half human! Techno’s highly repetitive, mechanical rhythmic structure set against human expressivity in sampling, modulating, remixing, and looping shapes its unique art form.

PROCESS

From here, our group identified three elements of modern techno music:

  1. Repetition - “four on the floor” as basic unit

  2. Instruments being layered/omitted one by one over the course of a song

  3. Altering/modulating a sound texture gradually over time

With these elements in mind, we think the best way to explain how techno music works is to deconstruct an existing techno song into individual instrument layers as building blocks. We will have users rebuild their own version of the track by adding and subtracting layers and playing with different combinations on top of the 4/4 rhythmic structure, and give them expressive control over parameters that shape certain layers.

First, Ada built a sketch in p5 that takes individual instrument layers from the song “Solace” by Pan-Pot, loops each layer/pattern, and syncs them to the Transport timeline, so that no matter when the user turns a layer on, the layers always stay in place with one another.
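The arithmetic behind that kind of sync can be sketched in plain JS: when a user enables a layer, start it at the next bar boundary so every loop stays locked to the shared clock. (Tone.js can express this quantization natively with time values like "@1m"; this just illustrates the idea.)

```javascript
// Return the time (in seconds) of the next bar boundary at or after now.
function nextBarStart(nowSeconds, bpm, beatsPerBar = 4) {
  const barSeconds = (60 / bpm) * beatsPerBar;
  return Math.ceil(nowSeconds / barSeconds) * barSeconds;
}
```

For example, at 120 bpm a bar lasts 2 seconds, so a layer enabled at 3.1 s would be scheduled to start at 4 s, landing exactly on the downbeat with the other loops.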

We met and listened to the sounds together, triggered in sync with a four-on-the-floor bass drum, and brainstormed about how we could expand this into an educational tool. Inspired by Valentina’s presentation with the different musicians for the fantasy Blues band, we came up with the idea of presenting different options for each of the elements, to give the user in our museum a feeling of agency in choosing their own sounds and filters within the constraints of a techno infrastructure.

We discussed that one essential element of techno is patience; elements often come in one by one, introduced almost mathematically over the course of the song according to how long it has been since the last element entered. Since it is hard to teach patience, we instead decided to create an interface that discourages doing everything at once, while providing enough options for users to create something that feels like their own.

After brainstorming, Adi created the interface. It is inspired by the popular DAW Ableton Live, which many artists use to create techno tracks. Each sound (in our case: a bass drum, multiple hi-hats, clicky percussion, rim, bass, brass hooks, a pad drone, and a sequence) has a button below it to enable it, a selection of different sounds to choose from, and a visual indicator showing where the sound is in the 4/4 timeline -- all following the kick drum as the heart of the song.
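The state behind an interface like this can be sketched as a small per-track structure plus a function that places the timeline indicator within the current bar. The track names below come from the list above, but the fields, sound options, and step resolution are my assumptions, not the actual sketch's code:

```javascript
// Hypothetical per-track state: enabled flag plus a chosen sound variant.
const tracks = [
  { name: 'bass drum', enabled: false, variant: 0, options: ['kick A', 'kick B'] },
  { name: 'hi-hat',    enabled: false, variant: 0, options: ['closed', 'open'] },
  { name: 'bass',      enabled: false, variant: 0, options: ['sub', 'stab'] },
];

// Which step of the 4/4 bar the timeline indicator should highlight.
function currentStep(transportSeconds, bpm, stepsPerBar = 16) {
  const barSeconds = (60 / bpm) * 4;
  const posInBar = transportSeconds % barSeconds;
  return Math.floor((posInBar / barSeconds) * stepsPerBar);
}
```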

INSTALLATION

For a museum environment, we imagine our installation could be shown as it is now, on a screen, for users to play with on headphones or speakers, with instructions popping up from time to time prompting them to enable or modulate clips, similar to Jazz.Computer. This suggests what a proper progression of a techno song could be, but it’s up to users whether to follow the instructions. It could also be installed in a more immersive environment, such as a room with surround sound and tactile buttons triggering each sound, with visual feedback (brightly colored flashing squares, rectangles, circles, lines, etc. on the walls around the user) lined up with the instruments they have selected and where they are in the timeline.

THE SKETCH

Interact with the final sketch here

View the code here

Our presentation slides

Day 419 @ ITP: The Code of Music

Midterm Project: Tone.js Synth Interface

Sketch: https://editor.p5js.org/ivymeadows/full/SkhkB4EnX

Code: https://editor.p5js.org/ivymeadows/sketches/SkhkB4EnX

CONCEPT

I mentioned in an earlier blog post for class that I have always felt limited by how it seems necessary to own several hardware synthesizers to get a certain combination of sounds, which is cumbersome and expensive.

Software also has its own limitations of a prebuilt interface that you work within.

For these reasons, experimenting with customizing an interface and its sounds appealed to me: I wanted to see what I could come up with without restrictions (besides the restrictions of coding itself, which are real, but I did learn a lot!).

VIDEO

TECHNOLOGY
This was created by combining HTML events with CSS and P5.
All the sounds were made using Tone.js.

REFLECTION
There are still some clicks/pops and glitchy sounds happening with the Tone.js sounds, and I’m not sure why. I learned a lot about using HTML events while creating this interface and would like to explore that more. The sounds are still not where I would like them to be, and I would have liked to add more sliders to affect filters on the sounds, but I ran out of time. I do, however, feel that I have more tools now for experimenting with Tone.js, even though it took me a long time to get to this point.

For the final, I am torn between improving this sketch or working on a hardware instrument. I am leaning more towards creating a hardware instrument with customized sounds, but if I do that I would continue to work on Tone.js experiments on my own.

I would carry over some elements from this midterm, such as three tone options (with the added option to manipulate them), three arpeggiation options, and three percussion options, or something consolidated from this idea. The instrument would initially be built to create one piece, but could then be used to create other pieces in the future by swapping out the patterns/sounds.

MIDTERM PLANNING SHEET

 

Day 412 @ ITP: The Code of Music

Assignment: Sampling + Effects

Sketch: https://www.openprocessing.org/sketch/611509

*Note: Sometimes when loading the sketch I get an error saying there is a script error on line -1 (?); if I reload a few times, it works. What does it mean?

PROCESS
I spent some time wrangling this one. I would have liked to include sliders and on/off switches to change the effects (as seen on the Tone.js examples page for GrainPlayer), but I had trouble assigning the slider value to one of the GrainPlayer’s parameters; if I integrate this into an interface with other functions later, I will work on that. I ended up playing with a few simple elements of sampling and effects, combining the Tone.js GrainPlayer with the Tone.js Sampler to create a simple scene: a track I made last winter plays in the background, manipulated by mouseX and mouseY; clicking the mouse adds frogs and also reverses the audio; and hitting a, s, d, or f triggers frog sounds manipulated through the Sampler. I also got a nice effect running just the song through the GrainPlayer, though the audio has a lot of glitchy clicks.
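The mouseX/mouseY manipulation boils down to mapping screen position onto playback parameters. This is a hedged sketch of that kind of mapping; the parameter choices and ranges are guesses for illustration, not the values in my actual sketch:

```javascript
// Map mouse position to hypothetical granular-playback parameters.
function mouseToGrain(mouseX, mouseY, width, height) {
  return {
    playbackRate: 0.25 + (mouseX / width) * 1.75, // left..right: 0.25x..2x speed
    detune: (mouseY / height - 0.5) * 1200,       // top..bottom: -600..+600 cents
  };
}
```

In a p5 draw loop, the returned values would be assigned to the player's parameters each frame, so moving the mouse continuously reshapes the sound.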


RESPONSES

  1. Keezy Classic

    I found this super fun to play with. Do SP-404 samplers work like this?? I liked how holding one key down stopped the other samples. This could definitely be fun for making compositions, especially with the option to add your own samples (it seems you can do this with the built-in mic; maybe there is a way to record directly into the phone too…? Probably not).

  2. Sampulator

    This was also very intuitive and fun to play with. I liked some of the sounds and steered clear of others. At first I thought holding down a key would repeat the sample at the tempo in a time signature, but it does not. I think that would be a cool function to add, though, especially for the kicks and drum parts. I also later realized you can record separate tracks and play them back… a great tool/toy! If it offered the option to export a sample, that would make it a real musical instrument. I also just saw that you can “shop samples.” I would prefer to make my own similar version with uploaded samples, but it’s definitely inspiring.

Day 405 @ ITP: The Code of Music

Assignment: Parallel Harmony

For this assignment I decided to keep working on my interface from last week’s synthesis assignment. I am still familiarizing myself with Tone.js and am excited about the possibilities, but I’m still getting the hang of it, so not building an entirely new interface lets me experiment more with the sounds themselves.

For this week I added some new buttons with new melodies and tones to layer on top of each other, and played with Tone.MembraneSynth to add percussive sounds, as well as with changing the timings of the notes and the rests between them, though it still sounds like a bit of a mess to my ears. My next goal is to come up with more of a composition.

Here is the sketch: https://editor.p5js.org/full/rkAZb4p5X

[Note: when I first tried to take a screen recording the sketch suddenly started to sound like garbage. I finally realized it was because I was trying to use a program to record the audio separately and it was getting confused by the sketch also trying to take the audio input from the computer’s mic. So I ended up just using the computer’s built in mic to record the audio, and that worked.]

I realized the interface is loosely inspired by the Buchla Music Easel, and I’d like to model it a bit more after the way the Easel looks (without the patch cables, but with sliders and knobs/buttons to create effects), and also to create similarly robotic and playful rhythms and melodies, though maybe a bit more ambient/evolving than the slightly jerky sounds in the video below:

 

I also enjoyed doing the Ableton making-music tutorial and the Catalog piece on parsing out elements of music you like. I’ve always felt limited by pre-built synthesizers, and it is very fun to imagine creating something that does exactly what you want it to do. Feeling free to borrow elements from synthesizers that already exist when creating interfaces is also helpful.

Day 397 @ ITP: The Code of Music

Assignment: Synthesis w/ Tone.js

PROCESS
For this first real exploration into making sounds in the web browser (without pre-recorded audio samples), I ended up making a playable web synth (somewhat painfully — then joyfully) by cobbling together different types of synthesis and effects, inspired partly by projects made in our class as well as by the Tone.js examples page.

Instructions:
- Press on/off to start playing and to mute or unmute all sounds.
- Press the colored buttons to play different synth sounds. Hold down the orange/green/yellow buttons to extend the pattern. The two pink buttons turn on their patterns permanently, until I can figure out how to turn them off with a second click.
- Move the slider to manipulate the frequency of an oscillator which is controlled by the mouse.
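The slider-to-frequency idea can be sketched like this. An exponential mapping makes equal slider distances feel like equal pitch intervals; the exponential curve and the frequency range here are my assumptions, not necessarily what the sketch uses:

```javascript
// Map a normalized slider position (0..1) to an oscillator frequency in Hz.
// Exponential interpolation between minHz and maxHz (assumed A1..A5 range).
function sliderToFreq(t, minHz = 55, maxHz = 880) {
  return minHz * Math.pow(maxHz / minHz, t);
}
```

The returned value would then be assigned to the oscillator's frequency each time the slider moves.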

You can play the final sketch here. See the code here.

Some frustrations:

  • I had been using openprocessing.org up until now because overall I found it easier to use, but I couldn’t figure out how to add HTML elements or CSS behind the scenes using their editor.

  • I ideally wanted to start and stop all the individual synth patterns when hitting the buttons (allowing them to loop when enabled until stopped, and restart when hit again), which should be possible?

  • I would have liked to add some rotary sliders/circular dials to apply effects to each sound (which I think I now know how to do), as well as a way to fade out the volume rather than cutting it off abruptly, but I wasn’t able to get there before class.
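On the abrupt cut-off: the usual fix is to ramp the gain over a short interval instead of setting it to zero instantly, since the discontinuity is what causes the click. Here is a plain-JS sketch of a linear fade curve (Tone.js signal parameters expose this kind of ramp via methods like rampTo); the step count and shape are illustrative:

```javascript
// Generate steps+1 gain values fading linearly from `from` to `to`.
function fadeCurve(from, to, steps) {
  return Array.from({ length: steps + 1 }, (_, i) => from + (to - from) * (i / steps));
}
```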

Day 391 @ ITP: The Code of Music

Assignment #3: Melody Sequencer

View the sketch here

Screen shot of the melody sequencer playing a pattern.

For this project I decided to build off the 8-step drum sequencer example from class. I multiplied the number of squares by four, so there are 16 tracks and 32 steps. The result reminded me of one of my favorite toys/instruments, the Tenori-on, a grid-based sequencer which works like this sketch in its simplest mode.

I ended up using the notes C, E, G, and A in different octaves, so technically it is creating a C6 (added-sixth) chord. I like how having multiple instances of the same note gives the option to layer them on top of each other to create different timbres, as well as to build chords from the different notes. The chord is very simple, but I think it could be a nice start to a song, or a motif to enter in with. Ideally I would next add more high notes and different sounds, but I will stop here for this first experiment and keep that in mind for something more complex further down the line, with different instruments being introduced rather than only the piano sound.
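The row-to-pitch layout can be sketched as a four-note pitch set (C, E, G, A) repeated across four octaves to cover 16 rows. The base octave and the row ordering here are assumptions for illustration, not necessarily what my sketch does:

```javascript
// Pitch classes of the C6 chord used by the sequencer rows.
const PITCH_CLASSES = ['C', 'E', 'G', 'A'];

// Map a row index (0..15) to a note name, climbing an octave every 4 rows.
function rowToNote(row, baseOctave = 2) {
  const octave = baseOctave + Math.floor(row / 4);
  return PITCH_CLASSES[row % 4] + octave;
}
```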

One gripe while working on this was that the volumes of my sounds somehow changed drastically once I put them into the sketch, and at times they would glitch and sound like they were peaking, while at other times not. This can be heard slightly in the video below, where it glitches as it reaches the highest note. I also wanted more color interaction, but wasn’t able to get different colors to show up for the different notes. I also attempted adding a slider to change the bpm, and a pitch shift effect, but neither worked. In the future I will attempt tricks like this again with some help.