Spout and Syphon in Scope: Zero latency on your machine
This week we're running our first Launch Week, and it's all about integrations. Over the next five days, we'll be walking through how Scope connects with the creative tools you already use, from GPU texture sharing and network video to control protocols that let you perform with AI video in real time. Each day covers a different integration, with real workflows and use cases to show what becomes possible when Scope fits into your existing pipeline.
We're starting with the integration closest to the metal - Spout and Syphon.
If you haven't come across them before, Spout and Syphon are video-sharing frameworks that let applications on the same computer share GPU textures directly, without encoding or compressing anything. Spout runs on Windows, Syphon runs on macOS, and both do the same thing: move video between apps at GPU speed. If you're working in creative tech, chances are your tools already support one of them.
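Conceptually, both frameworks work the same way: an application announces a named sender, and any other application on the machine can look that name up and read the shared texture directly. Here's a rough pure-Python sketch of that publish-and-discover model - the class names and the dict-based registry are illustrative only, not the real Spout or Syphon APIs:

```python
# Conceptual model only: real Spout/Syphon share actual GPU textures via
# OS-level handles (DirectX shared textures on Windows, IOSurface on macOS).
# A plain dict stands in for the system-wide sender registry.

registry = {}  # sender name -> latest shared "texture"

class Sender:
    def __init__(self, name):
        self.name = name
        registry[name] = None  # announce the sender by name

    def publish(self, texture):
        # No copy, no encode: receivers see the same object.
        registry[self.name] = texture

class Receiver:
    def __init__(self, name):
        self.name = name

    def latest(self):
        return registry.get(self.name)

# "Scope" publishes a frame; a mixer picks it up by name.
scope_out = Sender("Scope")
frame = {"width": 1920, "height": 1080, "format": "RGBA"}
scope_out.publish(frame)

mixer_in = Receiver("Scope")
assert mixer_in.latest() is frame  # same texture, zero copies
```

The key property the sketch captures is that the receiver gets a reference to the sender's texture, not a copy of its pixels - that's what makes the real frameworks effectively free.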
We built both into Scope because the question we kept hearing from VJs, creative technologists, and installation artists was always the same: how do I get Scope's output into the rest of my pipeline without adding latency or losing quality? Spout and Syphon are the most direct answer to that.
Why this matters
Most ways of moving video between applications involve some compromise. Virtual cameras compress your frames. Screen capture eats CPU and drops your alpha channel. These are fine for specific jobs, but when two applications are sitting on the same machine and you need the output of one to flow into the other at full quality, with transparency intact, you want something that skips all those steps entirely.
That's what Spout and Syphon do. The texture never leaves the GPU. There's no readback to CPU, no encoding, no decoding, no re-upload. The frame Scope renders is the frame your next application receives, and the whole thing happens in under a millisecond.
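To put rough numbers on what the GPU-to-GPU path avoids, here's a back-of-the-envelope calculation (the resolution and frame rate are illustrative) of the data a CPU-readback path would have to move every second:

```python
# Illustrative numbers: 1080p RGBA at 60 fps.
width, height = 1920, 1080
bytes_per_pixel = 4          # 8-bit RGBA
fps = 60

frame_bytes = width * height * bytes_per_pixel
per_second = frame_bytes * fps

print(f"{frame_bytes / 1e6:.1f} MB per frame")   # ~8.3 MB
print(f"{per_second / 1e6:.0f} MB/s readback")   # ~498 MB/s
```

And a copy-based path typically pays that cost twice - once downloading to the CPU, once re-uploading in the next application - before any encoding or decoding even starts. Texture sharing skips all of it.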
What this looks like in practice
If you're a VJ running Resolume on Windows, Spout lets you pipe Scope's AI-transformed video straight into your Resolume composition as a live source layer. You can mix it alongside your pre-rendered clips, apply effects, and send it to the projector. The AI output shows up in your composition the same frame it's generated.
If you're a macOS artist working with MadMapper or VDMX, Syphon gives you the same workflow. Feed Scope's output into your projection mapping tool or your VJ mixing environment with no intermediate step and no quality loss. Your generative AI visuals become just another source in your pipeline.
If you're building in TouchDesigner, the workflow gets especially interesting. You can send a particle system or generative visual from TouchDesigner into Scope via Spout, let the AI transform it, and receive the result back into your TouchDesigner network for further compositing. A full round trip, all at GPU speed.
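That round trip is easy to picture as two named streams: one carrying TouchDesigner's output to Scope, one carrying Scope's transformed output back. The sketch below models it in plain Python - the channel names, frame dicts, and transform function are all stand-ins, not real TouchDesigner or Scope APIs:

```python
# Conceptual sketch of the round trip, not real TouchDesigner or Scope APIs.
# Two named channels model the two Spout streams: TD -> Scope and Scope -> TD.

channels = {}  # sender name -> latest frame

def send(name, frame):
    channels[name] = frame  # publish by name, no copy

def receive(name):
    return channels.get(name)

def scope_ai_transform(frame):
    # Stand-in for Scope's AI model: tag the frame as transformed.
    return dict(frame, transformed=True)

# 1. TouchDesigner publishes its generative visual.
send("TD Out", {"source": "particles", "transformed": False})

# 2. Scope receives it, transforms it, and publishes the result.
send("Scope Out", scope_ai_transform(receive("TD Out")))

# 3. TouchDesigner receives the AI-processed frame for compositing.
result = receive("Scope Out")
assert result["transformed"] and result["source"] == "particles"
```

Because both legs are texture shares rather than copies, adding the return trip doesn't add an encode/decode step - it's the same zero-copy handoff, twice.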
If you're working in Unity or Unreal, KlakSpout and other Spout plugins let you bring real-time AI video into your 3D scenes as live textures. Interactive installations, game environments, immersive experiences - Scope becomes a texture source that responds to whatever you feed it.
The foundation for everything this week
We're starting with Spout and Syphon because they're the simplest integration we're shipping this week, and that's the point. They're the foundation that everything else builds on. Every creative tool you already use probably supports one of them - TouchDesigner, Resolume, Unity, Blender, OBS, MadMapper, VDMX, and the list keeps going.
Scope ships with Spout and Syphon support out of the box - the libraries are bundled as part of the install, so there's nothing extra to set up on your end. No separate SDK, no network configuration, no new protocol to learn. You enable Spout or Syphon in Scope, and your other applications see it immediately.
Instead of being a standalone destination where you generate video, Scope becomes a bridge between your creative tools and real-time AI. And Spout and Syphon are what make that bridge feel invisible.
Get started
1. Open Scope and go to Settings
2. Enable Spout Sender (Windows) or Syphon Sender (macOS)
3. In your receiving application, look for the Scope source - it appears automatically
4. To receive video in Scope instead, set Input Mode to Video and select Spout Receiver or Syphon
Full setup details in the Spout docs and Syphon docs.
What's next
Tomorrow, we go bigger. Same machine, different machines, different operating systems - your video goes wherever you need it.