Moments for Norm (Monochromatic)

for Norman Adams

Max Project (for Mac)

Note that the above requires Max 8 and MaxScore (which will install the required Java package).

The musebots without Norm: they will play, albeit reluctantly, alone. For this recording, their output was fed back into the system to provide *some* input…

Moments for Norm is an updated version of Moments Monochromatic (2017), a generative system which creates its own structure that audio-generating musebots – RezNoizBots housed in the MultiBOT – explore. The information generated for the musebots – such as available pitches, spectrum, and volume – is also presented to a live performer, in this case cellist Norman Adams. Live performance information (i.e. pitches) is sent to the musebots, which adjust their own performance based upon the live input. Rather than being a reactive system, however, the live performer is viewed by the musebots as just another Bot, and treated accordingly (i.e. often ignored, sometimes followed, usually affecting other musebot decisions).

The audio is pink noise sent through resonant filters, resulting in an airy sound. Each RezNoizBot can play up to three pitches at a time, and there can be up to ten RezNoizBots. These musebots play sustained single pitches rather than melodic lines, and thus the musical result is a shifting set of harmonies, punctuated by periods of silence.

About the individual musebots:

OrchestratorBOT – Determines timbral content for a work. Text files are used to specify parameter information for the RezNoizBots, including tempo, duration (which can be overwritten by the ParamBOT), # of instances, and “personality” parameters for each RezNoizBot: impatience, persistence, vitality, consistency, compliance, repose, and style.

Individual ensembles can be created as text files and placed into the Ensembles folder. This folder needs to be manually dragged into the OrchestratorBOT: once you have dragged it onto the dropfile object (the outlined square), select one of the prepared ensembles in the umenu object. It is possible to create your own ensembles; the text files should be in the following format:

tempo <beats per minute> duration <in seconds>
@instances <number of instances>
@RezNoizBot-1/impatience <value between 0.0 and 1.0>
@RezNoizBot-1/persistence <value between 0.0 and 1.0>
@RezNoizBot-1/vitality <value between 0.0 and 1.0>
@RezNoizBot-1/consistency <value between 0.0 and 1.0>
@RezNoizBot-1/compliance <value between 0.0 and 1.0>
@RezNoizBot-1/repose <value between 0.0 and 1.0>
@RezNoizBot-1/style <value between 0.0 and 1.0>

(each instance should have data for every parameter)
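As an illustration, a hypothetical two-instance ensemble file – all values invented – could look like this:

tempo 60 duration 300
@instances 2
@RezNoizBot-1/impatience 0.2
@RezNoizBot-1/persistence 0.5
@RezNoizBot-1/vitality 0.7
@RezNoizBot-1/consistency 0.4
@RezNoizBot-1/compliance 0.9
@RezNoizBot-1/repose 0.1
@RezNoizBot-1/style 0.5
@RezNoizBot-2/impatience 0.8
@RezNoizBot-2/persistence 0.3
@RezNoizBot-2/vitality 0.4
@RezNoizBot-2/consistency 0.6
@RezNoizBot-2/compliance 0.5
@RezNoizBot-2/repose 0.7
@RezNoizBot-2/style 0.2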

ParamBOT – Generates the overall structure for a work. Select whether the duration of the work is taken from the OrchestratorBOT’s file or generated within the selected duration range. The final duration is presented in minutes and seconds, along with the number of sections generated. Parameters are generated for each section – either consistent within the section (a horizontal line) or dynamic (a diagonal line). Spectral targets, taken from a database of analysed ambient works, are also selected. During performance, progress is shown through the current section (gray bar) as well as across the sections.
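The per-section parameter scheme can be sketched in Python as a choice, for each section, between a constant value (horizontal line) and a start/end ramp (diagonal line). This is a loose illustration only – the function name, the 50/50 split, and the value ranges are invented, not the ParamBOT's actual logic:

```python
import random

def section_envelopes(n_sections):
    """For each section, pick either a consistent value (horizontal line)
    or a dynamic start/end pair (diagonal line) for one parameter."""
    envelopes = []
    for _ in range(n_sections):
        if random.random() < 0.5:
            value = random.random()
            envelopes.append((value, value))  # consistent within the section
        else:
            # dynamic: the parameter ramps from one value to another
            envelopes.append((random.random(), random.random()))
    return envelopes
```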

PCsetBOT – Generates pitch sets for a work. Pitch sets are generated for each section based upon the section’s valence (which can be considered pleasantness or complexity). Each set is then transposed to have the maximum number of common tones with the previous set.
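The common-tone transposition step can be sketched in Python – a minimal illustration of the technique named above, not the PCsetBOT's actual code:

```python
def best_transposition(prev_set, new_set):
    """Transpose new_set (pitch classes 0-11) so that it shares the
    maximum number of common tones with prev_set."""
    best_t, best_common = 0, -1
    for t in range(12):  # try all twelve transpositions
        common = len({(p + t) % 12 for p in new_set} & set(prev_set))
        if common > best_common:
            best_t, best_common = t, common
    return sorted((p + best_t) % 12 for p in new_set)
```

For example, `best_transposition([0, 4, 7], [2, 6, 9])` transposes the D major triad up ten semitones so that all three tones are shared with the previous C major triad.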

LivePerformerBOT – Displays the available pitches for the live performer – the same as those utilised by the musebots – as well as active pitches (those currently played by the musebots) and spectral bands for all RezNoizBots. Audio input and input level for the live performer are set here.

The requested volume and arousal for the section, as generated by the ParamBOT, are displayed at bottom left as black lines; the yellow bars are current analyses of the live performer’s volume and activity level (arousal).

MultiBOT – The audio-generating musebots. Individual musebots have parameter settings assigned from the OrchestratorBOT: these determine how the bot reacts to its environment:
Impatience determines how quickly a bot is willing to start playing. Bots with low impatience will play sooner;
Persistence determines how long a bot will play. Low persistence bots will play shorter notes;
Vitality determines how much energy a bot has across a section. Every time a bot plays, it loses some energy. Bots with higher vitality will tend to play more often within a section;
Consistency determines how often a bot will change what it is doing. Bots with low consistency will change pitches more frequently;
Compliance determines how closely a bot will follow the score. Bots with low compliance will play faster (or slower) than the score, more (or less) often, with more (or less) complexity, and with higher (or lower) volume than requested;
Repose determines whether a bot is willing to sit out entire sections. Bots with high repose will do so more often than bots with low repose.

Individual RezNoizBots (the number of which is determined by the text file loaded in the OrchestratorBOT) display the sections in which they will be active, the available pitches (red) and active pitches (yellow), as well as their current active frequency bands. RezNoizBots can play when their playcounter rises above their impatience level, and will continue to play until their playcounter passes their persistence level (at which point they become inactive). RezNoizBots can play up to three pitches at a time, based upon their vitality; the number of active voices is displayed in yellow, while the duration of a note is displayed in red (moving from left to right), as is the delay until the next possible note (moving back from right to left).
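The playcounter mechanism can be modelled loosely in Python. Everything here is assumed for illustration – the class name, the tick increment, and the counter-reset-on-transition behaviour are invented; only the two thresholds (impatience to start, persistence to stop) come from the description above:

```python
class RezNoizBotSketch:
    """Minimal model: a bot starts playing once its playcounter exceeds
    its impatience threshold, then stops once the counter (restarted on
    activation) passes its persistence level."""

    def __init__(self, impatience, persistence):
        self.impatience = impatience    # 0.0-1.0; low = starts sooner
        self.persistence = persistence  # 0.0-1.0; low = shorter notes
        self.playcounter = 0.0
        self.active = False

    def tick(self, increment=0.25):
        self.playcounter += increment
        if not self.active and self.playcounter > self.impatience:
            self.active = True
            self.playcounter = 0.0      # restart counting toward persistence
        elif self.active and self.playcounter > self.persistence:
            self.active = False
            self.playcounter = 0.0      # inactive, waiting to re-trigger
        return self.active
```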

ListenBOT – Listens to the audio output of the system, including the live performer, and compares it to the generated score. The spectral difference (black sliders) between the two – the request (blue) vs. the actual (orange) – is sent back to the RezNoizBots, which can adjust their output to make up the difference.
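The spectral feedback can be illustrated with a toy calculation; the band count and values are invented, and the real ListenBOT presumably operates on full analysed spectra:

```python
def spectral_difference(requested, actual):
    """Per-band difference between the requested and analysed spectra.
    Positive values mean a band is under-represented, so bots may add
    energy there; negative values mean it is over-represented."""
    return [round(r - a, 6) for r, a in zip(requested, actual)]
```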

microConductor – The time generator. Turning it on starts a work; turning it off clears all parameters.

Getting Started

  1. Double click (or open from inside Max 8) the file “Moments4Norm.maxproj”; you should get lots of messages (but no errors) in the Max window having to do with Java being loaded.
  2. In the LivePerformerBOT:
    1. click on the microphone to turn Max/MSP on;
    2. select the proper audio input and output;
    3. raise the input slider beside the audio input menu;
    4. click the “tracking” button beside the microphone icon;
    5. audio input should make the “live input” object register a pitch.
  3. In the ListenBOT:
    1. make sure “analysis on” is, in fact, on;
    2. audio input should cause orange faders to register.
  4. In the OrchestratorBOT:
    1. select a pre-made ensemble (I suggest 05_ensemble);
    2. click the green “launch” button;
      1. this will cause new RezNoizBots to load in the MultiBOT, which can take some time…
    3. when all RezNoizBots have loaded, ParamBOT should display a varying set of parameters.
  5. In the microConductor:
    1. start the timer (click the “off” button).
    2. Bots won’t necessarily play right away… try starting to play on your own.

In performance

  • select pitches to play from the “available pitches” in the LivePerformerBOT;
  • ParamBOT displays the current section, and gives an indication for section changes;
  • PCsetBOT displays the current mode on a keyboard;
  • microConductor displays the current time;
  • ParamBOT displays the actual duration of the generated composition;
  • individual RezNoizBots can become active – their active lights can turn on – without actually playing: “Active” indicates that they can play, but they will only play if they meet a series of complex criteria at that instant.