MIDI in Scope: Hands-on real-time AI video control

MIDI in Scope enables hands-on control of real-time AI-generated video. Map any knob, fader, or pad to generation parameters with a learn-based system. Save mappings, switch profiles, sync to tempo via MIDI clock.

Time flies, and we are already on Day 4 of Launch Week.

On Monday we started with Spout and Syphon for local video sharing. Tuesday brought NDI for network video. Yesterday, OSC gave us precision parameter control over the network. Today, the control gets physical.

MIDI is the protocol that connects hardware controllers to software. If you have a MIDI controller sitting on your desk, whether it's a drum pad, a keyboard, or a rack of faders, you can now map it directly to Scope's generation parameters. Turn a knob to adjust the noise scale. Tap a pad to switch prompts. Push a fader to blend between styles. The mapping is yours to define.

MIDI still matters

Yesterday, we talked about OSC and how it gives you high-resolution, network-native control. MIDI works differently: it's less about float precision or network routing and more about putting your hands on something physical and making adjustments.

MIDI has been the standard for hardware control in music and performance for decades. The ecosystem is massive. There are thousands of controllers, from simple knob boxes to complex performance surfaces, and they all speak the same language. When you pick up a MIDI controller, you already know how it works.

The resolution is lower than OSC (128 steps per parameter instead of continuous floats), but for live performance, that's often exactly enough. In most situations, you're not trying to dial in a value to four decimal places. Instead, you're reaching for a fader during a set and pushing it until the visual feels right.
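To make that concrete, here's roughly what the 7-bit conversion looks like. This is a generic TypeScript sketch, not Scope's actual code; the function name and the noise-scale range are illustrative.

```typescript
// Map a 7-bit MIDI CC value (0-127) onto a float parameter range.
function ccToParam(ccValue: number, min: number, max: number): number {
  return min + (ccValue / 127) * (max - min);
}

// A fader at CC value 64 driving a hypothetical 0.0-1.0 noise scale:
const noiseScale = ccToParam(64, 0.0, 1.0); // ≈ 0.504, roughly mid-range
```

Over a range like that, 128 steps means increments of under 1%, which is finer than most hands can move a fader mid-set anyway.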

What you can map

Scope uses the Web MIDI API with a learn-based mapping system. You don't need to manually configure CC numbers or channels. You click a parameter in Scope, move a knob on your controller, and the mapping is made.
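If you're curious what a learn-based system looks like under the hood, the core idea fits in a few lines of Web MIDI code. This is a minimal sketch, not Scope's implementation; applyParam and the mapping-key format are stand-ins.

```typescript
// Minimal MIDI-learn sketch: arm a parameter, then bind the next CC that moves.
let armedParam: string | null = null;          // parameter waiting to be mapped
const mappings = new Map<string, string>();    // "channel:cc" -> parameter name

function applyParam(name: string, value: number) {
  console.log(`set ${name} to ${value.toFixed(3)}`); // stand-in for Scope's setter
}

const access = await navigator.requestMIDIAccess(); // module context: top-level await
for (const input of access.inputs.values()) {
  input.onmidimessage = (e) => {
    const data = e.data;
    if (!data || (data[0] & 0xf0) !== 0xb0) return;  // only Control Change messages
    const key = `${data[0] & 0x0f}:${data[1]}`;      // channel + CC number
    if (armedParam) {
      mappings.set(key, armedParam);                 // learn: bind this control
      armedParam = null;
    } else if (mappings.has(key)) {
      applyParam(mappings.get(key)!, data[2] / 127); // normal operation
    }
  };
}

armedParam = "noiseScale"; // "click a parameter" = arm it; the next knob moved binds it
```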

Currently, four mapping types are available (see the dispatch sketch after the list):

  • Continuous - faders and knobs controlling numeric parameters (noise scale, VACE strength, transition steps)
  • Toggle - buttons that flip boolean parameters on and off (pause generation, enable noise controller)
  • Trigger - one-shot actions like resetting the cache or switching prompts
  • Enum cycle - rotate through a list of values with each press (interpolation methods, input modes)
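Here's how that dispatch might look in code. The four kind names mirror the list above; the Mapping shape and state handling are illustrative assumptions, not Scope's schema.

```typescript
// One handler, four mapping behaviors. `value` is the raw MIDI data byte (0-127).
type Mapping =
  | { kind: "continuous"; param: string; min: number; max: number }
  | { kind: "toggle"; param: string }
  | { kind: "trigger"; action: () => void }
  | { kind: "enumCycle"; param: string; values: string[]; index: number };

function handle(m: Mapping, value: number, state: Record<string, unknown>) {
  switch (m.kind) {
    case "continuous":        // fader/knob -> scaled numeric parameter
      state[m.param] = m.min + (value / 127) * (m.max - m.min);
      break;
    case "toggle":            // button press flips a boolean
      if (value > 0) state[m.param] = !state[m.param];
      break;
    case "trigger":           // one-shot action on press, e.g. reset the cache
      if (value > 0) m.action();
      break;
    case "enumCycle":         // each press advances through the value list
      if (value > 0) {
        m.index = (m.index + 1) % m.values.length;
        state[m.param] = m.values[m.index];
      }
      break;
  }
}
```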

Beyond parameter control, you can also map MIDI to specific actions: switch between prompts, load presets, add or remove denoising steps, or toggle pause. These action triggers make MIDI especially useful during live performance when you need quick access to common operations without touching the screen.

Scope also supports MIDI clock for tempo sync. If your DAW or hardware is sending MIDI clock, Scope can derive BPM and beat phase from it, keeping visual transitions locked to the music. You can quantize parameter changes to beats or bars, and set lookahead for anticipating transitions.
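MIDI clock is simple at the wire level: the sender emits a 0xF8 status byte 24 times per quarter note, and BPM falls out of the spacing between pulses. A rough sketch of the derivation (Scope may smooth or quantize differently):

```typescript
const PPQN = 24;                        // MIDI clock pulses per quarter note
let lastTickMs = 0;
let tickCount = 0;
let bpm = 0;

// Call this whenever a 0xF8 clock byte arrives on a MIDI input.
function onClockTick(nowMs: number) {
  if (lastTickMs > 0) {
    const intervalMs = nowMs - lastTickMs;
    bpm = 60000 / (intervalMs * PPQN);  // 24 pulses = one beat
  }
  lastTickMs = nowMs;
  tickCount = (tickCount + 1) % PPQN;
  const beatPhase = tickCount / PPQN;   // 0.0 on the beat, rising toward 1.0
  // beatPhase is what lets quantized parameter changes land on the beat or bar.
}
```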

What this looks like in practice

A VJ with a MIDI controller is the most straightforward setup. Map your faders to noise scale, denoising strength, and VACE context. Map your pads to prompt switching. You now have a physical instrument for performing with AI video. Scope's mapping UI shows visual feedback (a colored dot on each mapped control), so you always know what's connected to what.

DAW integration through Ableton or any MIDI-capable host opens up timeline-based control. Send MIDI CC automation from your DAW to Scope, and your visual transformations follow the music arrangement. Verse gets one look, chorus gets another, breakdown strips it back. The DAW becomes your visual sequencer.

Multi-app setups are where you'll need virtual routing. MIDI typically locks a device to a single application, so if you want the same controller to talk to Scope and Resolume, you route it through a virtual MIDI bus. On macOS, the built-in IAC Driver handles this (enable it in Audio MIDI Setup). On Windows, loopMIDI by Tobias Erichsen does the same job. Set up your virtual ports, point your apps at them, and your controller fans out to everything.
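A quick way to confirm the virtual port is actually visible is to enumerate inputs with the Web MIDI API. Port names vary by setup; "IAC Driver Bus 1" is just the macOS default.

```typescript
// List every MIDI input the browser can see (module context: top-level await).
const access = await navigator.requestMIDIAccess();
for (const input of access.inputs.values()) {
  console.log(input.name); // expect e.g. "IAC Driver Bus 1" or your loopMIDI port
}
```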

Gallery installations can use MIDI to give visitors physical interaction with AI video. A simple panel with a few knobs or buttons, mapped to Scope parameters, turns the generation into something people can touch. No screen interface needed.

Saving and switching mappings

Once you've built a mapping that works for a show or project, Scope saves it as a profile. You can switch between profiles for different sets or venues without remapping everything from scratch. The mappings persist in your browser's local storage, so they survive restarts.
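Conceptually, that persistence is just serialized mappings under a profile key. A sketch of the idea, assuming a simplified JSON-serializable mapping shape (the key prefix and schema are hypothetical, not Scope's actual storage format):

```typescript
// One profile = a named list of control-to-parameter bindings in localStorage.
type StoredMapping = { control: string; param: string; kind: string };

function saveProfile(name: string, mappings: StoredMapping[]): void {
  localStorage.setItem(`midi-profile:${name}`, JSON.stringify(mappings));
}

function loadProfile(name: string): StoredMapping[] | null {
  const raw = localStorage.getItem(`midi-profile:${name}`);
  return raw ? (JSON.parse(raw) as StoredMapping[]) : null;
}

// Switching venues is then just a different load:
// const mappings = loadProfile("club-set") ?? [];
```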

You can view all active mappings through the mappings modal (the list icon next to the MIDI toggle), which shows each controller input, its target parameter, mapping type, and range. Delete individual mappings or clear everything and start fresh.

Getting started

  1. Connect your MIDI controller to your computer
  2. In Scope, toggle MIDI Input on in the settings panel
  3. Select your device from the Device dropdown
  4. Click Edit Mapping to enter mapping mode
  5. Click any parameter control in Scope (a slider, toggle, or button)
  6. Move a knob or press a pad on your controller
  7. The mapping is made - click Save Mapping when you're done

If your controller isn't showing up, make sure no other application has exclusive access to it. On macOS, check that the IAC Driver is enabled if you need multi-app routing.

Full setup details, including virtual routing guides for macOS and Windows, are in the MIDI docs.

What's next

Tomorrow, we close out Launch Week and showcase how digital meets physical.

If you've ever wanted your real-time AI visuals and your venue's lighting to move as one, Friday is for you.