The Aleator is a MIDI-generating VST plugin written in C# (.NET). What follows is a rough overview of how the plugin works. It assumes the reader has some knowledge of music theory and is familiar with general application development concepts.
Before you can really understand how the Aleator plugin works, there are a few related topics you need to be familiar with. As stated above, the plugin is in fact a MIDI generator, so it's important to know what MIDI is and how it works. You can read a brief summary here, and find a wealth of general MIDI information by looking around on the MIDI Manufacturers Association site.
A basic knowledge of the VST interface will help as well. VST stands for Virtual Studio Technology. It was created by Steinberg to facilitate communication between synthesizers, plugins and audio editors. The VST Wiki is a great starting point, and reading it will definitely flatten the learning curve for understanding how our plugin was constructed.
The Aleator (like any other VST plugin) runs within a digital audio workstation (DAW). There are a ton of VST-compatible DAWs available, but we use Reaper. The Aleator runs inside the DAW, alongside whatever VST synths or drum machines are being used to actually generate audio. The user maps the MIDI output from the Aleator to those instruments however they choose. When the Aleator starts, MIDI begins to flow to the instruments, causing them to generate sound. This process is represented at the top of this diagram.
If you read the Streams page, you know that the Aleator reads data that defines the basic musical constructs it iterates through as it runs. When the plugin is started, all of that data gets deserialized into memory and the plugin starts to render MIDI and send it up to the host. That process is really involved - on the Techniques page, we will touch upon a lot of the concepts used. Here, we will just try to cover some of the basics.
The composition data consists of three XML documents - Composition, Progression and Melody. The composition document contains nodes that serve as a roadmap, laying out the order in which the progressions should be played and how many times each should be repeated. They also set the initial mode of the movement and describe any modulations that occur within it:
<Composition CompositionID="5" Name="Q.E.D." Mode="Aeolian">
  <Progression SequenceID="1" Multiplier="1">18</Progression>
  <Progression SequenceID="2" Multiplier="4" Modulation="RelativeMajor">19</Progression>
  <Progression SequenceID="3" Multiplier="6">20</Progression>
</Composition>
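When the plugin starts, documents like the one above get deserialized into memory. A minimal sketch of that step, using the standard .NET XmlSerializer - the class and property names here are hypothetical stand-ins, not the Aleator's actual types:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical POCOs mirroring the composition document shown above.
[XmlRoot("Composition")]
public class Composition
{
    [XmlAttribute] public int CompositionID { get; set; }
    [XmlAttribute] public string Name { get; set; }
    [XmlAttribute] public string Mode { get; set; }
    [XmlElement("Progression")] public ProgressionRef[] Progressions { get; set; }
}

public class ProgressionRef
{
    [XmlAttribute] public int SequenceID { get; set; }
    [XmlAttribute] public int Multiplier { get; set; }
    [XmlAttribute] public string Modulation { get; set; }   // null when absent
    [XmlText] public string ProgressionID { get; set; }     // the element's text content
}

public static class CompositionLoader
{
    public static Composition Load(string xml)
    {
        var serializer = new XmlSerializer(typeof(Composition));
        using (var reader = new StringReader(xml))
            return (Composition)serializer.Deserialize(reader);
    }
}
```

The attribute mappings mean the XML can stay human-editable while the runtime works against strongly typed objects.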
The progression data contains sequences of intervals coupled with beat (quarter note) designations that spell out how long each interval is to be held. We can't reference specific chords since the exact key signature is determined randomly at runtime. At design time, the intervals are really the only way for us to control the chords that will eventually be played:
<Progression CompositionID="5" ProgressionID="19">
  <Degree Beats="4">ii</Degree>
  <Degree Beats="4">IV</Degree>
  <Degree Beats="4">V</Degree>
  <Degree Beats="4">IV</Degree>
</Progression>
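To make the interval idea concrete, here is a sketch of how a scale degree could be resolved to actual MIDI notes once a key has been chosen at runtime. This is illustrative, not the plugin's real API; for brevity it hardcodes the major scale, whereas the plugin would derive the step table from the movement's mode. Note that the triad's quality (major or minor) falls out of the diatonic scale positions, so the numeral's case can be ignored:

```csharp
using System;

public static class DegreeResolver
{
    // Semitone offsets of the seven diatonic degrees in the major (Ionian) scale.
    static readonly int[] MajorSteps = { 0, 2, 4, 5, 7, 9, 11 };
    static readonly string[] Numerals = { "i", "ii", "iii", "iv", "v", "vi", "vii" };

    // Build the diatonic triad (root, third, fifth) on the given degree,
    // relative to a tonic MIDI note picked randomly when the session starts.
    public static int[] Triad(int tonicMidiNote, string degree)
    {
        int index = Array.IndexOf(Numerals, degree.ToLowerInvariant());
        if (index < 0) throw new ArgumentException("Unknown degree: " + degree);
        return new[]
        {
            tonicMidiNote + MajorSteps[index],
            tonicMidiNote + MajorSteps[(index + 2) % 7] + (index + 2 >= 7 ? 12 : 0),
            tonicMidiNote + MajorSteps[(index + 4) % 7] + (index + 4 >= 7 ? 12 : 0)
        };
    }
}
```

With a tonic of middle C (MIDI note 60), "ii" resolves to D-F-A (62, 65, 69) - a D minor triad - while "V" resolves to G-B-D (67, 71, 74).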
Finally, there's the Melody data, which is linked to the progression XML but isn't subject to the chords in the sequence. Melody XML contains sets of scale degrees as well (in integer notation), but these represent the notes that are available to be played over a given time span. Instead of being given a written melody, the computer is offered a selection of notes it can pick from at various points in each movement, and is allowed to make those decisions as it runs:
<Melody CompositionID="5" MelodyID="2" SequenceID="2" ProgressionID="19">
  <Cluster ClusterID="1" Beats="4" Notes="5,1,2" />
</Melody>
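A minimal sketch of that runtime decision, assuming a simple uniform random pick from the cluster's note pool (the type names and the use of System.Random are assumptions for illustration; the plugin's actual selection logic may weight or constrain the choice):

```csharp
using System;

// A cluster as described by the Melody XML: a pool of scale degrees
// available for a given span of beats.
public class Cluster
{
    public int Beats { get; set; }
    public int[] Notes { get; set; }   // scale degrees in integer notation, e.g. 5, 1, 2
}

public static class MelodyPicker
{
    static readonly Random Rng = new Random();

    // Pick one of the degrees the cluster offers for this time span.
    public static int PickDegree(Cluster cluster)
    {
        return cluster.Notes[Rng.Next(cluster.Notes.Length)];
    }
}
```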
The session is the C# class that is at the center of everything and really does all of the work in generating MIDI events and passing them up to the host. I'll write some pseudocode in the Techniques blog, but suffice it to say that when all of the data is read, lists of noteOn and corresponding noteOff MIDI events are created for each progression. These events represent the placement and duration of notes on the virtual staff. A MIDI clock is started within the application, and the session subscribes to that clock's tick event. When the number of ticks that have occurred aligns with a noteOn event's location on the staff, the session sends that event to the host program (the DAW). That basically amounts to a key being pressed on a synthesizer. When the number of ticks aligns with the corresponding noteOff event, the session class sends an event up to the DAW letting it know it's time to turn that note off. That's really it in a nutshell.
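The scheduling loop described above can be sketched as follows. The event and session types here are simplified stand-ins for the plugin's real classes, and the host callback is a hypothetical delegate standing in for the actual VST send mechanism:

```csharp
using System;
using System.Collections.Generic;

// One scheduled MIDI event on the virtual staff.
public class MidiEvent
{
    public long Tick;       // position on the staff, in clock ticks
    public byte Status;     // 0x90 = noteOn, 0x80 = noteOff (channel 1)
    public byte Note;       // MIDI note number
    public byte Velocity;
}

public class Session
{
    readonly Queue<MidiEvent> _pending;
    readonly Action<MidiEvent> _sendToHost;   // callback that hands events up to the DAW

    public Session(IEnumerable<MidiEvent> events, Action<MidiEvent> sendToHost)
    {
        // The event lists are assumed to be pre-sorted by tick when they are built.
        _pending = new Queue<MidiEvent>(events);
        _sendToHost = sendToHost;
    }

    // Handler subscribed to the MIDI clock's tick event: fire every event
    // whose position has been reached.
    public void OnClockTick(long currentTick)
    {
        while (_pending.Count > 0 && _pending.Peek().Tick <= currentTick)
            _sendToHost(_pending.Dequeue());
    }
}
```

A noteOn followed some ticks later by its matching noteOff is what gives each note its duration from the DAW's point of view.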
Like all of our projects, the Aleator is hosted on GitHub. Get in touch if you're interested in gaining access and/or contributing.