Author Topic: Wanted to share my recent project which combines Ableton with JavaScript


Offline daniel.aagentah (Topic starter)

  • Newbie
  • Posts: 2
  • Country: gb
I’ve created an audio-visual setup within an Electron application that caters to both exhibition settings and live performances. This setup is capable of processing a variety of data inputs, including OSC and MIDI, and utilizes a range of technologies such as P5.js, Three.js, D3.js, TouchDesigner embeds, and WebGL for visualizations. Here’s how I've structured it:

Input Handling
Based on the performance context, I switch between different data inputs:

Exhibition Setting: I use OSC data to drive the visuals.
Live Performance Setting: I prefer MIDI data to trigger changes in the visuals.
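
Since both inputs ultimately drive the same visuals, one way to sketch this is a pair of adapters that normalize OSC and MIDI messages into a common event shape. The message shapes and field names below are my assumptions for illustration, not the actual schema; the library wiring (e.g. an OSC or MIDI listener) is omitted.

```javascript
// Sketch: normalize OSC and MIDI input into one event shape the visual
// modules can consume. Message shapes here are illustrative assumptions.
function fromOsc(msg) {
  // e.g. msg = { address: "/visual/trigger", args: [0.8] }
  return { source: "osc", trigger: msg.address, intensity: msg.args[0] ?? 1 };
}

function fromMidi(msg) {
  // e.g. msg = { note: 60, velocity: 127 } — velocity scaled to 0..1
  return { source: "midi", trigger: `note/${msg.note}`, intensity: msg.velocity / 127 };
}
```

Downstream code then only has to handle one event format regardless of the performance context.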

Dashboard and Module Configuration
Through my personal dashboard, I select which visual modules to display, along with configurations such as layout, color schemes, and which modules should be primary, secondary, or tertiary.
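
A dashboard selection like that might serialize to something like the following — the module names, roles, and styling values are hypothetical stand-ins, not taken from the actual app:

```javascript
// Hypothetical shape for a dashboard selection: layout, color scheme,
// and which modules are primary/secondary/tertiary.
const performanceConfig = {
  layout: "16:9",
  colorScheme: "mono",
  modules: [
    { file: "waveform.js", role: "primary" },
    { file: "particles.js", role: "secondary" },
    { file: "grid.js", role: "tertiary" },
  ],
};

// Helper: look up the module assigned a given role, or null if unset.
function moduleForRole(config, role) {
  return config.modules.find((m) => m.role === role) ?? null;
}
```

Keeping the whole selection in one plain object makes it easy to send across windows and to persist between sessions.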

Inter-Process Communication (IPC)
My Electron app includes multiple windows, such as a dashboard and a projector window. I communicate settings and state changes from the dashboard to the projector window using Electron's IPC mechanism, ensuring updates stay synchronized in real time.
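
The pattern looks roughly like this. In the real app the dashboard side would call `projectorWindow.webContents.send(...)` from the main process and the projector window would subscribe via `ipcRenderer.on(...)`; here a minimal stand-in event bus takes their place so the sketch runs outside Electron. Channel and field names are illustrative.

```javascript
// Stand-in for Electron IPC: in Electron, the sender would be
// projectorWindow.webContents.send("settings-update", payload) and the
// receiver ipcRenderer.on("settings-update", (event, payload) => ...).
class Bus {
  constructor() { this.handlers = {}; }
  on(channel, fn) { (this.handlers[channel] ??= []).push(fn); }
  send(channel, payload) { (this.handlers[channel] ?? []).forEach((fn) => fn(payload)); }
}

const ipc = new Bus();
const projectorState = {};

// Projector window: merge incoming settings into local state.
ipc.on("settings-update", (settings) => Object.assign(projectorState, settings));

// Dashboard: push a change.
ipc.send("settings-update", { colorScheme: "mono" });
```

The key point is one-way flow: the dashboard owns the settings and the projector window only applies what it receives.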

Visual Modules and Data Processing
Upon receiving OSC or MIDI data, the projector window decides which methods to activate on the selected visual modules. These modules are dynamically loaded from separate files and can utilize various web technologies for the visualizations.
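
One plausible shape for that dispatch step — routing an incoming event to whichever loaded modules implement a handler for it — is sketched below. The method name `onTrigger` and the module objects are hypothetical; the actual modules are loaded from separate files.

```javascript
// Sketch: dispatch an incoming event to every loaded module that
// implements a handler for it. Modules without the handler are skipped.
const loadedModules = [
  { name: "waveform", onTrigger(e) { this.last = e.trigger; } },
  { name: "particles" /* no onTrigger handler */ },
];

function dispatch(modules, event) {
  for (const mod of modules) {
    if (typeof mod.onTrigger === "function") mod.onTrigger(event);
  }
}

dispatch(loadedModules, { trigger: "/visual/trigger", intensity: 0.8 });
```

Checking for the handler at dispatch time lets each module opt into only the events it cares about.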

Real-Time Interactivity
From the dashboard, I have the flexibility to swap modules, add overlays, and adjust visual parameters on the fly, enabling a highly dynamic and interactive performance experience.

Technical Implementation Highlights
Module Loading and Setup: I dynamically load modules by scanning a directory for .js files. Both primary and secondary visual components are initialized from these modules.
Throttling and Input Processing: I process inputs from OSC or MIDI with throttled functions to balance the execution frequency of visual triggers, which helps maintain performance without losing responsiveness.
Dynamic Layout and Styling: My visuals can adapt to different aspect ratios and background styles based on user selections, providing versatility for various display environments.
Real-Time Control and Feedback: I've built the system to support real-time module switching, layout adjustments, and visual effects tuning from the dashboard, with feedback mechanisms from the projector window.
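
The throttling point above can be sketched as a small leading-edge throttle: run the wrapped function at most once per interval and drop calls in between, which is the kind of guard that keeps a burst of OSC/MIDI messages from flooding the visual triggers. This is a generic sketch, not the app's exact implementation.

```javascript
// Minimal leading-edge throttle: invoke fn at most once per waitMs,
// silently dropping calls that arrive inside the window.
function throttle(fn, waitMs) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= waitMs) {
      last = now;
      fn(...args);
    }
  };
}

let count = 0;
const throttled = throttle(() => { count += 1; }, 50);
throttled(); // runs immediately
throttled(); // dropped: still inside the 50 ms window
```

A trailing-edge variant (queue the last dropped call and fire it when the window closes) trades a little latency for never missing the final value.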


Instead of uploading big videos here, I thought it'd make more sense to link my Instagram: https://www.instagram.com/daniel.aagentah/
 

