
Updates

Ryan Baldwin - Technical Sound Designer & Audio Engineer - Blog

[Sound Design, Game Audio, Sound Art, Interactive Computing, Field Recording]

Filtering by Tag: generative composition

Evergreen Hybrid Music II - Max Generative Composition Experiment

Ryan Baldwin

Created for Hybrid Music II at The Evergreen State College.

The idea I wanted to experiment with for this piece was generative music: setting up an algorithm to create a composition that would never be exactly the same and could go on indefinitely, while still being pleasing to the ear.

The result was a relatively simple patch drafted in Max 7 to generate MIDI data, which was sent to the Sculpture modeling synthesizer in Logic Pro. Left running, Max and Logic would spit out pseudorandom tones forever.

To trim the concept down into something more appropriate for playback in a class setting, Logic was set to record the MIDI data created by Max for several minutes. This was done twice, with slight variations to the parameters in the Max patch, to provide notes for two Sculpture synth voices on separate tracks. This material was then layered with some original field recordings from around the area (a bubbling stream, and frogs at night) and edited in a more traditional multitrack arrangement to form a definite composition.

The next step for this idea is to build the Max patch into a more standalone application, using samples and/or synthesis for audio rather than relying on having a DAW running.

The version of the Max patch used to generate MIDI data for this piece

Some technical details

The core of the Max patch is a couple of octaves of notes from the Mixolydian scale, represented as MIDI values. Once the toggle at the top is activated, it starts the [metro] objects, which act as pulsing timers, outputting "bangs" (instructions for action) at various intervals. The timing of the main [metro] object is randomized by the other [metro] and [drunk] objects to its right.
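In plain code, the note pool and the wandering timer might look something like this Python sketch. The root note and the exact ranges are my own assumptions; the post doesn't specify them, only that the pool is a couple of octaves of a Mixolydian scale and that [drunk] nudges the [metro] interval around:

```python
import random

# Two octaves of a Mixolydian scale as MIDI note numbers.
MIXOLYDIAN_DEGREES = [0, 2, 4, 5, 7, 9, 10]  # intervals above the root
ROOT = 55  # G3 -- an assumption; the post doesn't name the root

SCALE = [ROOT + octave * 12 + degree
         for octave in range(2)
         for degree in MIXOLYDIAN_DEGREES]

def drunk(current, step, low, high):
    """Random walk like Max's [drunk]: move up to +/-step, clamped to a range."""
    nxt = current + random.randint(-step, step)
    return max(low, min(high, nxt))

# The main [metro] interval (in ms) wanders instead of staying fixed.
interval = 500
for _ in range(5):
    interval = drunk(interval, 100, 100, 1000)
    print(interval)  # each "bang" would fire after roughly this many ms
```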

Every time a "bang" is emitted by [metro], the [urn] object spits out a random number within the range of all possible MIDI note values (0-127). By design, [urn] only picks each number in its set of values once. After all values have been chosen, [urn] will reset and continue choosing random numbers.
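The [urn] behavior described above is random selection without replacement: every value in the set is drawn exactly once before the pool refills. A minimal Python equivalent (class and method names are mine, not Max's):

```python
import random

class Urn:
    """Mimics Max's [urn]: random picks with no repeats, refilling when empty."""
    def __init__(self, size=128):  # MIDI note values 0-127
        self.size = size
        self._refill()

    def _refill(self):
        self.remaining = list(range(self.size))
        random.shuffle(self.remaining)

    def bang(self):
        if not self.remaining:  # every value has been chosen: reset and continue
            self._refill()
        return self.remaining.pop()

urn = Urn()
picks = [urn.bang() for _ in range(128)]
# One full cycle covers each MIDI value exactly once, in random order.
assert sorted(picks) == list(range(128))
```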

The various [sel] objects act as filters, only allowing numbers within the specified scale through. The numbers that make it through are used as the MIDI notes to be played. Each time a note is chosen, a randomized velocity (how hard the note is played) and note length are selected as well. All of this is then sent out as MIDI data for instruments to respond to (in this case, Logic Pro's Sculpture modeling synthesizer, run through Waves Enigma).
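The filter-then-decorate step above can be sketched as follows. The allowed-pitch set, velocity range, and duration choices are illustrative guesses, not values from the patch:

```python
import random

# Pitches the [sel] objects would let through (hypothetical G Mixolydian
# set over two octaves; the post doesn't list the exact notes).
ALLOWED = {55, 57, 59, 60, 62, 64, 65,
           67, 69, 71, 72, 74, 76, 77}

def make_note(candidate):
    """Filter a candidate pitch; if it passes, attach velocity and length."""
    if candidate not in ALLOWED:
        return None  # out-of-scale numbers are dropped, as by [sel]
    return {
        "pitch": candidate,
        "velocity": random.randint(40, 110),        # how hard it's played
        "duration_ms": random.choice([250, 500, 1000, 2000]),
    }

# Feed random 0-127 candidates (as [urn] would) and keep the survivors.
events = [note for note in
          (make_note(random.randint(0, 127)) for _ in range(200))
          if note is not None]
```

Each surviving event dictionary corresponds to one MIDI note message the patch would send downstream to the synthesizer.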