Creating a web-based pattern library that works with Ableton

[_new world] is an object-driven library built in JavaScript. It uses real-time MIDI data to display audio-reactive visuals in the browser.

To break this down a little...

Using the webmidi library, the app listens for MIDI triggers passed from an active Ableton Live session. This MIDI data is received by a localhost session, where attributes such as channel, note velocity, and note length are used to compose the active modules and on-screen animations.
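The attributes above come straight out of the raw MIDI bytes. As a minimal sketch (the project itself uses the webmidi library, but the same data is available from the browser's native Web MIDI API; the `parseMidiMessage` name is hypothetical):

```javascript
// Decode a raw MIDI message into the attributes the modules care about.
// The status byte encodes the message type (high nibble) and the channel
// (low nibble); the two data bytes carry note number and velocity.
function parseMidiMessage([status, note, velocity]) {
  const type = status & 0xf0; // 0x90 = note-on, 0x80 = note-off
  return {
    channel: (status & 0x0f) + 1, // 1-indexed, as Ableton displays channels
    note,
    velocity,
    isNoteOn: type === 0x90 && velocity > 0,
  };
}

// In the browser, messages arrive via the Web MIDI API:
// navigator.requestMIDIAccess().then((access) => {
//   for (const input of access.inputs.values()) {
//     input.onmidimessage = (msg) => handle(parseMidiMessage(msg.data));
//   }
// });
```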

Why browser-based, in JavaScript?

You might see this approach as unorthodox (and you wouldn't be wrong). Most people in this field go straight to TouchDesigner, Blender, Processing, or even Python, but hear me out... The browser today is not what it used to be, and as a web developer, I've seen client-side development come a long way in just the past decade. Thanks to code splitting, ES6+ conventions, the HTML canvas, and, in my case, even libraries such as Three.js, D3, and p5.js, it's easier than ever to get performance-friendly animations on the client.

Building an audio-reactive library in JavaScript: some pros and cons.

Pros:

  • Multiple technologies — Modules in this project can take many forms and are not limited to a specific library. Isolated components can feature anything from Three.js, p5.js, and Blender models to canvas-based particle systems, or even just good old CSS.
  • Access to the web — Within this project we can easily pull in datasets or APIs from the open web, which in turn can serve as data sources. If I ever need to query, let's say, some live asteroid data from NASA, I can do so directly from a module.

Cons:

  • Performance — Compared to something like Blender or TouchDesigner, the browser scores slightly lower for things like painting, rendering, and even MIDI latency. With that said, these limitations have forced me to think more deeply about module optimisation.
What kinds of modules are being used here?

Each module is an independent, object-based file within the repo. Modules have a lifecycle with methods that control how they behave: things like animation styles, position, colour handlers, and z-index. Below are some example modules built so far.

### WaveFractal

Animated wave fractal using P5.js

| Method          | Props         |
| :-------------- | :------------ |
| .changeSpeed(); | speed: Int    |
| .changeColor(); | color: String |

### Gltf

Renders any .gltf file exported from Blender via Three.js

| Method          | Props         |
| :-------------- | :------------ |
| .handleModel(); | color: String |
| .rotateRand();  | null: null    |

### Text

Parses text onto the screen, with terminal animations.

| Method        | Props         |
| :------------ | :------------ |
| .setColor();  | color: String |
| .parseText(); | null: null    |

Doing this as a pattern library means that over time, I can refine and expand the modules being used. These modules can be used in combination, triggered by any variation of instruments or velocities.
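Concretely, a module might look something like this. This is a hypothetical sketch: the method names are borrowed from the WaveFractal table above, but the internals are assumed, not taken from the actual repo:

```javascript
// A minimal module following the lifecycle described above: each module
// owns its own state and exposes methods the composer can call.
class WaveFractal {
  constructor() {
    this.speed = 1;
    this.color = "#ffffff";
    this.zIndex = 0;
  }

  changeSpeed(speed) {
    this.speed = speed; // consumed by the draw loop each frame
    return this; // returning `this` allows chained calls
  }

  changeColor(color) {
    this.color = color;
    return this;
  }
}
```

Returning `this` from each method is a small design choice that lets a composer chain configuration calls, e.g. `module.changeSpeed(3).changeColor("#ff0000")`.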

So how do I pick and choose which modules are triggered in a given session? With all these modules to pick from, how do I know what will appear, and how it will be triggered?

```js
channelMacros: [
    { channel: 1, instrument: 'ch1' },
    { channel: 2, instrument: 'ch2' },
    { channel: 3, instrument: 'ch3' },
    { channel: 4, instrument: 'ch4' },
    { channel: 5, instrument: 'ch5' },
    { channel: 6, instrument: 'ch6' },
    { channel: 7, instrument: 'ch7' },
    { channel: 8, instrument: 'ch8' },
    { channel: 15, instrument: 'section' },
    { channel: 16, instrument: 'velocity' }
]
```
Using the array above, Ableton channels are mapped to named macros that are later used in the composer.
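Resolving an incoming note's channel to its macro name is then a simple lookup. A sketch assuming the array above (`resolveInstrument` is a hypothetical helper, not from the actual repo; the array is abbreviated here):

```javascript
// Abbreviated copy of the channel-to-macro mapping shown above.
const channelMacros = [
  { channel: 1, instrument: "ch1" },
  { channel: 2, instrument: "ch2" },
  { channel: 15, instrument: "section" },
  { channel: 16, instrument: "velocity" },
];

// Given a parsed MIDI channel, find which named macro (if any) it maps to.
function resolveInstrument(channel) {
  const macro = channelMacros.find((m) => m.channel === channel);
  return macro ? macro.instrument : null;
}
```

Unmapped channels return `null`, so stray MIDI traffic can simply be ignored.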

The composer is a simple object file which listens for changes in the Ableton session and decides which modules are loaded, and what functionality is triggered via MIDI.

The composer is my fundamental logic handler for an audio-visual performance, and it keeps the system scalable as the number of configured modules grows.

Basic composer example:

```js
const waveFractal = {
  constructors: {
    1: (module) => {
      // fires once on 'section 1'
    },
  },
  reactors: {
    1: {
      // within 'section 1'
      ch1: (module) => {
        // fires on each 'ch1' MIDI note
      },
    },
  },
};

export const composer = { waveFractal };
```

constructors contain functions that are fired at the beginning of each section. A track in Ableton might be split into multiple sections: Introduction, Drop, Breakdown, etc. In a constructor, you would typically initialise modules with methods like module.setColor().

reactors contain functions that fire each time an instrument's MIDI note is received (within a specific section). ch1 might be your 'kick drum' layer, so this function fires on every kick you hear. Reactors typically handle anything audio-reactive.
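Putting constructors and reactors together, the dispatch logic might look like this. Since the repo isn't public yet, this is a hypothetical sketch of how a composer entry could be driven (`createDispatcher` and its event shape are my own names, not the project's):

```javascript
// Drive one composer entry from parsed MIDI events. A 'section' event
// advances the current section and fires that section's constructor once;
// any other instrument fires the matching reactor for the current section.
function createDispatcher(entry, module) {
  let section = null;
  return function dispatch({ instrument, sectionNumber }) {
    if (instrument === "section") {
      section = sectionNumber;
      const ctor = entry.constructors[section];
      if (ctor) ctor(module); // fires once per section change
      return;
    }
    const reactor = entry.reactors[section]?.[instrument];
    if (reactor) reactor(module); // fires on every matching MIDI note
  };
}
```

This mirrors the composer example above: one constructor call when section 1 starts, then a reactor call for every 'ch1' note inside that section.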

Work in progress notes:

Both the pattern library and this blog piece are under development. While the repo is not yet public, I hope to open-source it later in the year at version 1.0.

If you made it this far, thanks for exploring my latest project.

Daniel Aagentah 2023