[_new world]
A/V through programming, research, and imagined worlds.
Sat Nov 02 2024
This project serves as a proof of concept for working within the limitations of web technologies in the world of audio-visual and new media art.
Using WebGL, programming, and music software, I intend to create imagined worlds that reflect the many things that have inspired me for a lifetime.
This project is titled [_new world].
Concept
[_new world] is both the A/V that you see on-screen, and the underlying software that hosts it. The project is a bespoke, MIDI-reactive pattern-library containing:
- A functional dashboard for pre-configuring visual environments.
- A performance window to demonstrate these environments in real-time.
Using Electron's ipcRenderer, [_new world] listens for MIDI events passed from an active Ableton Live session. This data is parsed locally, with attributes such as channel, note velocity, and note length used to compose active modules and on-screen animations.
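As a sketch of how such an event might be parsed on the renderer side, the pure parsing step could look like this. The payload shape and the `parseMidiEvent` helper are assumptions for illustration, not the project's actual code; in practice this would be wired up via Electron's `ipcRenderer.on(channel, handler)`.

```javascript
// Hypothetical parser for MIDI data received over IPC. Only the pure
// parsing step is shown; the Electron wiring is omitted.
function parseMidiEvent(payload) {
  const [note, velocity, lengthMs] = payload;
  return {
    note,                   // e.g. "G8"
    velocity,               // 0–127
    lengthMs,               // note length in milliseconds
    accent: velocity > 100, // derived attribute for driving animations
  };
}

// Example: a loud, short kick-drum hit
const kick = parseMidiEvent(["G8", 112, 120]);
```

Deriving attributes like `accent` at parse time keeps the visual modules themselves free of MIDI-specific logic.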
Functional Overview
Within the project exist isolated modules.
Each module exists as a Class file, with a respective lifecycle of methods controlling how it behaves.
An example of three unique Classes:
### WaveFractal
Animated wave fractal using P5.js
| Method | Props |
| :-------------- | :------------ |
| .changeSpeed(); | speed: Int |
| .changeColor(); | color: String |
### Gltf
Renders any Blender export .gltf file via Three.js
| Method | Props |
| :-------------- | :------------ |
| .handleModel(); | color: String |
| .rotateRand(); | null: null |
### Text
Parses text onto the screen, with terminal animations.
| Method | Props |
| :------------ | :------------ |
| .setColor(); | color: String |
| .parseText(); | null: null |
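A minimal sketch of what one of these Class files might look like, assuming the method signatures from the WaveFractal table above (the internal state and constructor shape here are illustrative, not the project's actual code):

```javascript
// Illustrative module shape: each module owns its own state and exposes
// lifecycle-style methods that incoming MIDI events can trigger.
class WaveFractalSketch {
  constructor() {
    this.speed = 1;        // animation speed multiplier
    this.color = "#ffffff"; // current stroke colour
  }

  changeSpeed(speed) {
    this.speed = speed; // speed: Int, per the table above
  }

  changeColor(color) {
    this.color = color; // color: String, e.g. a hex value
  }
}

const wave = new WaveFractalSketch();
wave.changeSpeed(4);
wave.changeColor("#00ff88");
```

Keeping all mutable state on the instance means a module can be torn down and re-instantiated between scenes without leaking into other modules.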
Triggering animations
In Ableton Live, unique MIDI notes are mapped to a set of channels via two plugins from the LiveGrabber set: SingleNoteGrabber maps each note to a channel, and GrabberSender then sends the data into a localhost environment.
In the codebase, each mapped channel is assigned a listener.
const noteToChannelMap = {
"G8": "ch1",
"F#8": "ch2",
"F8": "ch3",
"E8": "ch4",
"D#8": "ch5",
"D8": "ch6",
"C#8": "ch7",
"C8": "ch8",
};
// Initialize OSC server (assumes the node-osc package, whose Server
// constructor takes (port, host) as shown here)
const osc = require("node-osc");
const oscServer = new osc.Server(1337, "127.0.0.1");
oscServer.on("message", async (msg) => {
const midiNote = msg[1];
const channel = noteToChannelMap[midiNote];
if (channel) {
console.log(`MIDI Note: ${midiNote}, Channel: ${channel}`);
try {
const moduleName = `module_${channel}`;
const module = await import(`./newModules/${moduleName}.js`);
const ModuleClass = module.default;
// Additional functionality if needed for the ModuleClass
} catch (error) {
console.error(`Error loading module for channel ${channel}:`, error);
}
} else {
console.warn(`Note ${midiNote} is not mapped to a channel.`);
}
});
For the software to understand which modules and methods have been configured to trigger, there exists userData, a JSON file compiled directly from a User Dashboard (more on the dashboard to follow).
{
"WaveFractal": { // Module name
"methods": {
"ch1": [ // Fires on 'ch1', AKA 'G8' from Ableton
{
"name": "randomizeShape",
"options": null
},
{
"name": "shakeAnimation",
"options": null
}
]
}
}
}
The methods property defines the specific methods triggered by a given MIDI note. For instance, ch1 could represent a 'kick drum' layer: each time the kick drum is triggered, the assigned functions will run.
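Putting the two together, the dispatch step might look something like the following. This is a sketch: `runChannel` and the instance registry are assumptions, not the project's actual code.

```javascript
// Hypothetical dispatcher: given the parsed userData config and a set of
// live module instances, fire every method configured for a channel.
function runChannel(userData, instances, channel) {
  const fired = [];
  for (const [moduleName, config] of Object.entries(userData)) {
    const methods = (config.methods && config.methods[channel]) || [];
    for (const { name, options } of methods) {
      const instance = instances[moduleName];
      if (instance && typeof instance[name] === "function") {
        instance[name](options); // e.g. randomizeShape(null)
        fired.push(`${moduleName}.${name}`);
      }
    }
  }
  return fired;
}

// Example with a stub standing in for the WaveFractal module
const calls = [];
const stub = {
  randomizeShape: () => calls.push("randomizeShape"),
  shakeAnimation: () => calls.push("shakeAnimation"),
};
const userData = {
  WaveFractal: {
    methods: {
      ch1: [
        { name: "randomizeShape", options: null },
        { name: "shakeAnimation", options: null },
      ],
    },
  },
};
const fired = runChannel(userData, { WaveFractal: stub }, "ch1");
```

Guarding on `typeof instance[name] === "function"` means a stale userData entry fails quietly rather than crashing mid-performance.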
User Dashboard
[_new world] features a custom-built User Dashboard (built in React.js and Tailwind) for piecing together animation sequences between modules. Ultimately, the UI here exists to mutate userData.json.
Note: A previous version of this project required me to manually update the userData.json for an entire performance, which became quickly unsustainable.
The dashboard allows for dragging a .mid file directly from Ableton, automatically parsing MIDI notes that can then be assigned to specific methods.
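Reading the .mid file itself is a job for a parsing library, but one small step can be sketched: converting raw MIDI note numbers into names like G8. This assumes Ableton Live's octave convention, where note 0 is C-2 and the highest note, 127, is G8; `midiNoteName` is an illustrative helper, not the project's code.

```javascript
// Convert a raw MIDI note number (0–127) into a note name, using
// Ableton Live's octave convention (note 0 = "C-2", note 127 = "G8").
const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

function midiNoteName(noteNumber) {
  const octave = Math.floor(noteNumber / 12) - 2;
  return `${NOTE_NAMES[noteNumber % 12]}${octave}`;
}

console.log(midiNoteName(127)); // "G8"
console.log(midiNoteName(120)); // "C8"
```

Note that other tools use C3 or C5 for middle C, so the octave offset is the one convention worth pinning down when matching names against the noteToChannelMap.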
Why not proprietary?
Many artists in this field gravitate towards software like TouchDesigner, Blender, and MaxMSP. However, WebGL has evolved significantly, and as a programmer, I've witnessed remarkable advancements in client-side development over the past decade.
Pros and cons
Pros:
- Multiple technologies — Modules in this project can take many forms and are not limited to a specific library. Isolated components can feature anything from Three.js, p5.js, Blender models, and canvas-based particle systems to standard HTML + CSS.
- Access to the web — The project allows easy access to datasets and APIs from the open web, serving as a data source.
Cons:
- Performance — Compared to something like Blender or TouchDesigner, Electron scores slightly lower for painting and rendering. These limitations have forced a focus on module optimisation.
Work within limits
A philosophy I subscribe to, and one that rarely disappoints. Robert Henke discusses this in relation to musical pursuit.
Having pre-defined limits allows an artist to be more creative with what they have. In this case, the limitation is JavaScript itself.
As an abstract, high-level language, it operates with a level of dynamism distinct from lower-level languages like C++. In JavaScript, there’s less focus on memory management, direct hardware interaction, or strict type enforcement.
Understanding the performance issues this language poses forces me to consider new concepts in animation sequencing, module loading, and the transitional parts of a display.
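One such concept is memoising module loads, so a dynamic import only resolves once per module however often its channel fires. The cache shape below is an assumption, sketched with a stub importer standing in for the real `import()` call.

```javascript
// Hypothetical module cache: each dynamic import is kicked off once and
// the resulting Promise is reused, so repeated MIDI triggers never
// re-resolve the same file from disk.
function createModuleLoader(importer) {
  const cache = new Map();
  return function load(moduleName) {
    if (!cache.has(moduleName)) {
      cache.set(moduleName, importer(moduleName));
    }
    return cache.get(moduleName); // always the same Promise
  };
}

// Example with a stub importer standing in for import(`./newModules/...`)
let diskReads = 0;
const load = createModuleLoader(async (name) => {
  diskReads++;
  return { default: class {} };
});
load("module_ch1");
load("module_ch1"); // second call is served from the cache
console.log(diskReads); // 1
```

Caching the Promise rather than the resolved module also means two triggers arriving in the same frame share one in-flight load instead of racing.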
Influence
Seeking the old
I believe the name for this aesthetic is Cassette Futurism; a concept that unknowingly captured me from an early age. There's something in the ability to represent future worlds through old technology that I find inescapably fascinating.
Sci-fi UI & HUD displays
A more futuristic design pattern, with some subtle overlap. Whilst not tied to older technology such as CRT displays, these visuals present on-screen data and information in a similar fashion.
With [_new world], I hope to align loosely with these underlying graphic-styles.
Data Points
This project offers an opportunity to repurpose information. I advocate for the overlap between practical science and artistic representation; organisations like NASA, Our World in Data, and archive.org host numerous datasets that lend themselves to creative reinterpretation, transforming raw information into compelling artistic narratives.
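Records from NASA's open Meteorite Landings dataset, for example, arrive with numeric fields encoded as strings, so a small normalising pass makes them usable in a sketch. `normalizeMeteor` is an illustrative helper, not the project's code.

```javascript
// Hypothetical normaliser for one record of NASA's Meteorite Landings
// dataset: numeric fields arrive as strings and are coerced here.
function normalizeMeteor(record) {
  return {
    name: record.name,
    mass: Number(record.mass) || 0,         // grams: "21" -> 21
    year: Number(record.year?.slice(0, 4)), // "1880-01-..." -> 1880
    coordinates: record.geolocation?.coordinates ?? null, // [lon, lat]
  };
}

const aachen = normalizeMeteor({
  name: "Aachen",
  mass: "21",
  year: "1880-01-01T00:00:00.000",
  geolocation: { type: "Point", coordinates: [6.08333, 50.775] },
});
console.log(aachen.mass, aachen.year); // 21 1880
```

Normalising once at load time keeps the per-frame draw loop free of string parsing.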
[
{
"name": "Aachen",
"id": "1",
"nametype": "Valid",
"recclass": "L5",
"mass": "21",
"fall": "Fell",
"year": "1880-01-01T00:00:00.000",
"reclat": "50.775000",
"reclong": "6.083330",
"geolocation": { "type": "Point", "coordinates": [6.08333, 50.775] }
},
{
"name": "Aarhus",
"id": "2",
"nametype": "Valid",
"recclass": "H6",
"mass": "720",
"fall": "Fell",
"year": "1951-01-01T00:00:00.000",
"reclat": "56.183330",
"reclong": "10.233330",
"geolocation": { "type": "Point", "coordinates": [10.23333, 56.18333] }
},
...
]
Used in a p5.js environment, this data drives the animation on this very webpage (right-hand side).
if (p.width > 0 && p.height > 0) {
meteors.forEach((meteor, index) => {
meteor.maxY = Math.min(meteor.maxY + meteor.animationSpeed, p.height);
let distortionMagnitude = Math.min(meteor?.mass / 10 || 1, maxDistortion);
let highestDistortion = 0, peakX = centerX, peakY = 0;
p.stroke(255 - index * 50);
p.noFill();
p.beginShape();
for (let y = 0; y < meteor.maxY; y += 5) {
let distortion = (p.noise(noiseOffsetX + y * 0.01, noiseOffsetY + index) - 0.5) * 2 * distortionMagnitude;
let x = centerX - distortion;
p.vertex(x, y);
if (Math.abs(distortion) > highestDistortion) {
highestDistortion = Math.abs(distortion);
peakX = x; peakY = y;
}
}
p.endShape();
if (!meteor.textAppeared && p.millis() - meteor.startTime >= meteor.textDelay) {
meteor.textAppeared = true;
}
if (meteor.textAppeared && meteor.geolocation?.coordinates) {
p.fill(255 - index * 50);
p.text(`${meteor.geolocation.coordinates[0]}, ${meteor.geolocation.coordinates[1]}`, peakX - 15, peakY);
p.noFill();
}
});
noiseOffsetX += 0.003;
noiseOffsetY += 0.003;
}
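The coordinates rendered above could equally drive position rather than labels. Mapping a longitude/latitude pair onto canvas space is a one-liner per axis; the sketch below is independent of p5 and is an illustration, not the project's code.

```javascript
// Map a [longitude, latitude] pair onto canvas pixel space.
// Longitude spans -180..180 (x axis); latitude spans 90..-90 on the
// y axis, flipped so north sits at the top of the canvas.
function geoToCanvas([lon, lat], width, height) {
  const x = ((lon + 180) / 360) * width;
  const y = ((90 - lat) / 180) * height;
  return [x, y];
}

console.log(geoToCanvas([0, 0], 800, 400));            // [400, 200] — mid-canvas
console.log(geoToCanvas([6.08333, 50.775], 800, 400)); // Aachen, upper half
```

An equirectangular mapping like this distorts near the poles, but for abstract visuals that trade-off is usually acceptable.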
Showcasing
Taking a cheap projector from Amazon out into the city—or even inside an apartment—to test modules in various aspect ratios has been a fun way to better understand how this might work in an exhibition setting.
Like many in this space, I look forward to the performance aspect of the work. Ultimately, [_new world] is designed to piece together not just concept modules, but entire audio-visual sets.
Closing words
As the project grows, I aim to continue sharing, exploring, and documenting the process to give the work a sense of purpose and to offer insight for others.
Thank you for reading.
[d.a]