NDI in Scope: Your real-time AI video, across the network and beyond
Scope now sends and receives real-time AI video over your network via NDI - connecting to Resolume, OBS, TouchDesigner, and any NDI-compatible app on any machine.
Yesterday, we kicked off our first-ever Launch Week with Spout and Syphon to share video between apps on the same machine. Today, your video leaves that machine entirely.
NDI (Network Device Interface) is the protocol that professional broadcast studios, live production rigs, and creative tech venues already run on. It sends real-time video, audio, and metadata over your local network to any NDI-compatible application on any machine, on any operating system.
If you've worked in live production or creative tech, you've almost certainly run into it before. We built NDI support into Scope because once your video can travel across the network, the kinds of setups you can build change completely.
Why this matters
Spout and Syphon are incredibly fast, but they only work between applications on the same computer. The moment your workflow spans two machines - say a GPU workstation running Scope and a separate performance machine running Resolume, or even a cloud GPU on something like Runpod (via Bridge) - you need a protocol that works over the network.
NDI handles that, and it brings a few things along for the ride that make it more than just "video over IP."
For one, it works across every major platform. Spout only runs on Windows, Syphon only on macOS, but NDI works on all three - Windows, macOS, and Linux. If your installation uses different platforms for different roles, or if your team works across operating systems, NDI connects them all through a single protocol.
It also bundles audio and metadata with your video into a single stream. A single NDI connection from Scope can carry the generated video, synchronized audio, and arbitrary metadata, such as generation parameters or frame timing. You don't need to set up separate routing for each.
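NDI metadata travels as XML strings alongside the frames. As a minimal sketch of what packing generation parameters into that channel could look like (the tag and attribute names here are invented for illustration, not Scope's actual schema):

```python
import xml.etree.ElementTree as ET

def build_metadata(prompt: str, frame_index: int, seed: int) -> str:
    """Pack hypothetical generation parameters into an NDI-style XML string."""
    root = ET.Element("scope_frame")  # tag/attribute names are illustrative only
    root.set("frame", str(frame_index))
    root.set("seed", str(seed))
    ET.SubElement(root, "prompt").text = prompt
    return ET.tostring(root, encoding="unicode")

def parse_metadata(xml_str: str) -> dict:
    """Recover the parameters on the receiving side."""
    root = ET.fromstring(xml_str)
    return {
        "frame": int(root.get("frame")),
        "seed": int(root.get("seed")),
        "prompt": root.findtext("prompt"),
    }

meta = build_metadata("neon jungle at dusk", frame_index=1042, seed=7)
print(parse_metadata(meta))
```

Because the metadata rides in the same stream as the video, a receiver that wants the prompt or frame timing never has to correlate a second connection.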
And it handles discovery automatically. When Scope starts sending NDI, every NDI-capable application on your network receives it immediately. There are no IP addresses to type in and no ports to forward. Sources just show up in a dropdown, ready to connect.
In practice
If you're running a multi-machine studio, NDI connects the pieces. Scope runs on your GPU workstation doing the AI inference, Resolume runs on a separate machine handling the final mix and output, and NDI links them over a standard Ethernet cable. No capture cards, no custom hardware needed.
If you're building a multi-room installation, a single Scope instance can feed video to displays across an entire venue over your existing network. When you add a new machine, it discovers the NDI source automatically. You can scale up without rewiring anything.
Streaming with OBS is another common case: dedicate your Scope machine entirely to AI generation while your streaming rig pulls the NDI feed through the DistroAV plugin. The GPU-heavy inference stays separate from encoding and streaming, and each machine focuses on what it does best.
If your team works across platforms, NDI gives everyone a common language. Your macOS artist, your Windows GPU rig, and your Linux render node all see the same NDI sources without needing any platform-specific bridges.
Spout, Syphon, and NDI working together
These protocols complement each other rather than compete. When two apps are on the same machine, Spout and Syphon are the right choice because they share GPU textures directly with almost zero overhead. When you need to go cross-machine, cross-platform, or you want audio and metadata bundled with your video, NDI takes over. It adds a small amount of latency since frames get encoded for network transport, but in exchange, you get reach, flexibility, and automatic discovery across your entire setup.
Between yesterday's Spout and Syphon and today's NDI, Scope's video transport layer is now complete. Your video goes wherever you need it, at whatever speed the situation calls for.
Getting started
First, install the NDI SDK/Tools for your platform (Windows, macOS, or Linux) and restart Scope. The NDI options appear automatically once the SDK is detected.
To receive NDI in Scope:
- Under Input & Controls, set Input Mode to Video
- Set Video Source to NDI
- Pick your source from the dropdown - Scope discovers available NDI sources on your network automatically
Scope matches the source's native resolution on connect, so there's no stretching or compression to deal with.
To send from Scope via NDI:
- Open the Outputs panel
- Toggle NDI Output on
- Set a sender name (default is "Scope") - this is what other apps on the network will see
Any NDI-compatible application on your network can then receive Scope's output.
Full setup details in the NDI docs.
Community project: Dynamic time-coded buffer for live VJing
Community member and Scope user OpticMysticVJ (Will) built something that shows exactly why NDI matters in a real performance setup. His project is a time-coded buffer system for live VJing that solves one of the hardest problems in real-time AI video: keeping visuals locked to the beat even when AI generation times are inconsistent.
The system works by encoding beat position and frame sequence as a barcode stamped onto each frame before it goes through AI processing. On the output side, it decodes that data and releases frames quantized to the beat grid rather than the moment they arrive. The result is BPM-matched visuals regardless of how long inference actually takes.
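To make the two halves of that idea concrete, here is a minimal sketch (not Will's actual code, and with a toy bit-strip standing in for a real barcode): a frame index is encoded as a row of 0/1 "pixels" that survives the AI pass, and each decoded frame's release time is snapped forward to the next beat-grid boundary.

```python
import math

def encode_barcode(frame_index: int, bits: int = 16) -> list[int]:
    """Encode a frame index as a row of 0/1 'pixels' stamped onto the frame."""
    return [(frame_index >> i) & 1 for i in reversed(range(bits))]

def decode_barcode(row: list[int]) -> int:
    """Read the bit strip back off the processed frame."""
    value = 0
    for bit in row:
        value = (value << 1) | bit
    return value

def next_beat_time(arrival_s: float, bpm: float, grid_start_s: float = 0.0) -> float:
    """Return the first beat-grid timestamp at or after a frame's arrival time."""
    beat = 60.0 / bpm  # seconds per beat
    beats_elapsed = (arrival_s - grid_start_s) / beat
    return grid_start_s + math.ceil(beats_elapsed) * beat

# Frames arrive with jittery inference latency; releases land on the beat grid.
bpm = 120.0                          # one beat every 0.5 s
arrivals = [0.62, 1.05, 1.49, 2.31]  # seconds; uneven AI generation times
print([next_beat_time(t, bpm) for t in arrivals])
```

When two frames quantize to the same beat, a real buffer would keep the freshest one; the point is that playback timing comes from the grid, not from when inference happens to finish.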
What makes this relevant to NDI is the multi-machine, multi-app nature of the pipeline. Will's setup involves Scope running VACE for the AI generation, Resolume handling the final VJ mix, Ableton Link and Pioneer CDJ integration for beat sync, and a local vision model (Qwen 2.5 VL) monitoring both the main output and preview channels to generate contextually appropriate prompts automatically. NDI is what ties these pieces together across the network, letting each component run on the hardware that suits it best while keeping everything in sync.
"I'd been experimenting with different ways to integrate Scope into my VJing workflow, and I realized it would be great if I could scrub through the AI output the way DJs scrub through music on a CDJ. Nothing like that existed, so I built it. It captures Scope's NDI output into a real-time buffer that you can scratch, loop, and cue frame-by-frame as it's being generated."
He even built a system where DJs can embed creative directives in track metadata comment fields on their CDJs. When a track loads, those directives feed into the prompt logic automatically, with no coordination needed between the DJ and the VJ.
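The directive format itself is up to the DJ and VJ to agree on; as an illustrative sketch (the "scope:" prefix and semicolon separators below are invented, not Will's actual convention), pulling directives out of a comment field could be as simple as:

```python
def extract_directives(comment: str, prefix: str = "scope:") -> list[str]:
    """Pull VJ directives out of a track's metadata comment field.

    The 'scope:' prefix and semicolon separators are assumptions made
    for this sketch, not a real convention from the project.
    """
    comment = comment.strip()
    if not comment.lower().startswith(prefix):
        return []  # ordinary comment, no directives embedded
    body = comment[len(prefix):]
    return [d.strip() for d in body.split(";") if d.strip()]

print(extract_directives("scope: neon jungle; slow morphs; no faces"))
```

Each loaded track then carries its own creative brief, and the prompt logic picks it up with no live coordination between booth and VJ table.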
Check out the full project on Daydream.
Check out Will's demo video
Follow OpticMysticVJ: Website | Instagram | Twitch
What's next
We've got the video flowing, both locally and across the network. Tomorrow, we shift from moving video to controlling it - with a level of precision and flexibility that opens up entirely new ways to perform with Scope.
Links
- Launch Week 01 - Follow along all week
- Day 1: Spout and Syphon - Yesterday's post
- NDI documentation - Full setup guide
- Download Scope - Get the latest version
- NDI Tools - Download the NDI SDK for your platform
- GitHub - Star the repo and check out the source
- Discord - Join the community