Build a Real-Time Video Effects Plugin
In this tutorial you will create scope-vfx - a Scope plugin that applies GPU-accelerated visual effects to any video input. You will ship two effects (chromatic aberration and VHS/retro CRT) and set the project up so adding more effects later is as simple as dropping in a new file. By the end you will have a working plugin installed in Scope with live UI controls, and you will understand the full plugin development workflow. Watch the full 13-minute build walkthrough.

scope-vfx source code - the complete plugin built in this tutorial
What is a Scope plugin?
Daydream Scope is an open-source tool for running real-time interactive AI video pipelines. It supports models like StreamDiffusion V2, LongLive, and Krea Realtime - and its plugin system lets anyone add new pipelines without touching the core codebase.

A plugin is a Python package that registers one or more pipelines. A pipeline is a class that:

- Declares a configuration schema (what parameters appear in the UI)
- Accepts video frames and/or text prompts as input
- Returns processed video frames as output
Prerequisites
- Python 3.12 or newer
- uv package manager
- Daydream Scope installed and running (desktop app or CLI)
- Basic Python and PyTorch knowledge
Scaffold the project
Create the directory structure
Create a new directory with the following layout: the plugin entry point lives in __init__.py, the configuration schema in schema.py, the pipeline logic in pipeline.py, and each effect gets its own file under effects/.

Configure pyproject.toml
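A minimal pyproject.toml for this layout might look like the following. The package and entry-point names mirror the project described above, but the version and build backend are illustrative:

```toml
[project]
name = "scope-vfx"
version = "0.1.0"
requires-python = ">=3.12"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# The entry point Scope scans for when discovering plugins.
[project.entry-points."scope"]
scope-vfx = "scope_vfx"
```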
The key section is [project.entry-points."scope"]. This is how Scope discovers your plugin - it scans all installed packages for entry points in the "scope" group and loads whatever module they point to. No configuration files, no manual registration - just a standard Python entry point.

There are no dependencies listed: Scope's environment already includes PyTorch, Pydantic, and everything else this plugin needs. Only add entries to the dependencies list under [project] if your plugin uses third-party packages that Scope does not already provide.

Register the plugin hook
Create src/scope_vfx/__init__.py:
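A sketch of the entry-point module. The import path for hookimpl and the pipeline class name are assumptions - check Scope's plugin docs for the canonical locations:

```python
# src/scope_vfx/__init__.py - sketch. `hookimpl` is a pluggy HookimplMarker
# that Scope exposes; the import path below is an assumption.
from scope.plugins import hookimpl  # assumed import path


@hookimpl
def register_pipelines(register):
    # Scope calls this at startup and passes a `register` callback.
    # Call it once per pipeline class you want to make available.
    from .pipeline import VFXPipeline
    register(VFXPipeline)
```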
The @hookimpl decorator (from pluggy) marks this function as a hook implementation. When Scope loads your plugin, it calls register_pipelines() and passes a register callback. You call it once per pipeline class you want to make available.
Define the configuration schema
Create src/scope_vfx/schema.py:
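A sketch of what the schema might look like. The base class, helper names (ModeDefaults, ui_field_config), and field ranges follow the explanation in this section, but the import paths, class names, and the exact way ui_field_config() attaches to a field are assumptions:

```python
# src/scope_vfx/schema.py - a sketch; check Scope's plugin reference for the
# real base class and helper import paths.
from pydantic import Field
from scope.pipelines import BasePipelineConfig, ModeDefaults, ui_field_config  # assumed

class VFXConfig(BasePipelineConfig):
    pipeline_id = "vfx-pack"
    pipeline_name = "VFX Pack"
    pipeline_description = "GPU-accelerated chromatic aberration and VHS effects"

    modes = {"video": ModeDefaults(default=True)}  # requires video input
    supports_prompts = False  # hide the prompt input

    # Chromatic aberration - orders 1-3 keep these grouped in the panel
    chromatic_enabled: bool = Field(
        True, json_schema_extra=ui_field_config(order=1))  # attachment assumed
    chromatic_intensity: float = Field(
        0.3, ge=0.0, le=1.0, json_schema_extra=ui_field_config(order=2))
    chromatic_angle: float = Field(
        0.0, ge=0.0, le=360.0, json_schema_extra=ui_field_config(order=3))

    # VHS / CRT - orders 10-14 leave room for future effects in between
    vhs_enabled: bool = Field(
        False, json_schema_extra=ui_field_config(order=10))
    scan_line_count: int = Field(
        240, ge=50, le=600, json_schema_extra=ui_field_config(order=11))
    scan_line_intensity: float = Field(
        0.3, ge=0.0, le=1.0, json_schema_extra=ui_field_config(order=12))
```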
Understanding the schema
Pipeline metadata - pipeline_id, pipeline_name, and pipeline_description are class variables that tell Scope how to display your pipeline in the UI.
modes = {"video": ModeDefaults(default=True)} - This declares that the pipeline requires video input (camera feed or video file). It will not appear in text-to-video mode. For a text-only pipeline (one that generates frames from nothing), you would use "text" instead.
supports_prompts = False - These effects do not use text prompts, so the prompt input is hidden.
Each field becomes a UI control. Scope’s frontend reads the JSON Schema that Pydantic generates from this class and automatically renders the right widget:
| Field type | UI widget |
|---|---|
| bool | Toggle switch |
| float with ge/le | Slider |
| int with ge/le | Slider |
| enum | Dropdown |
The ui_field_config() helper sets display order, labels, and other UI hints. The order values control the vertical position in the settings panel - we use 1-3 for the chromatic params and 10-14 for the VHS params to keep them grouped, with room for future effects in between.
All parameters here are runtime parameters (the default). They are editable while the pipeline is streaming - move a slider and see the result instantly. If you need a parameter that requires a restart (like model selection), add is_load_param=True to its ui_field_config().

Build the first effect - Chromatic Aberration
Create src/scope_vfx/effects/chromatic.py:
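A sketch of the effect, assuming frames arrive as a (T, H, W, C) float tensor in [0, 1]. The function name and signature are illustrative rather than Scope-mandated:

```python
# src/scope_vfx/effects/chromatic.py - sketch
import math

import torch


def apply_chromatic_aberration(
    frames: torch.Tensor, intensity: float = 0.3, angle: float = 0.0
) -> torch.Tensor:
    """Shift the red and blue channels in opposite directions."""
    # Map 0..1 intensity onto a 0..20 pixel displacement.
    shift = int(intensity * 20)
    if shift == 0:
        return frames
    dx = int(round(math.cos(math.radians(angle)) * shift))
    dy = int(round(math.sin(math.radians(angle)) * shift))

    out = frames.clone()
    # torch.roll wraps pixels that fall off one edge back onto the other.
    out[..., 0] = torch.roll(frames[..., 0], shifts=(dy, dx), dims=(1, 2))
    out[..., 2] = torch.roll(frames[..., 2], shifts=(-dy, -dx), dims=(1, 2))
    return out
```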
torch.roll() does the heavy lifting - it shifts a tensor along specified dimensions, wrapping pixels that fall off one edge back onto the other. Since this operates on GPU tensors, it runs in microseconds even at high resolutions.
The intensity parameter maps to a 0-20 pixel displacement range, and angle controls the direction. At intensity 0.3 (the default), you get about 6 pixels of shift - enough to notice without being overwhelming.
Build the second effect - VHS / Retro CRT
Create src/scope_vfx/effects/vhs.py:
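A sketch of the three-part effect described below - scan lines, noise, and tracking wobble. Parameter names, defaults, and the wobble frequency are illustrative; frames are assumed to be (T, H, W, C) in [0, 1]:

```python
# src/scope_vfx/effects/vhs.py - sketch
import torch
import torch.nn.functional as F


def apply_vhs(
    frames: torch.Tensor,
    scan_line_count: int = 240,
    scan_line_intensity: float = 0.3,
    noise_strength: float = 0.2,
    tracking_strength: float = 0.1,
) -> torch.Tensor:
    T, H, W, C = frames.shape
    dev, dt = frames.device, frames.dtype

    # 1. Scan lines: a sine pattern over rows, darkened by intensity.
    ys = torch.arange(H, device=dev, dtype=dt)
    bands = 0.5 * (1 + torch.sin(ys / H * scan_line_count * torch.pi))
    out = frames * (1.0 - scan_line_intensity * bands).view(1, H, 1, 1)

    # 2. Analog noise: conservative Gaussian grain (the 0.15 multiplier
    # keeps the image from being obliterated at maximum strength).
    out = out + torch.randn_like(out) * noise_strength * 0.15

    # 3. Tracking wobble: a per-row horizontal shift following a sine curve,
    # applied in a single grid_sample kernel instead of a per-row loop.
    xs = torch.linspace(-1, 1, W, device=dev, dtype=dt)
    ys_n = torch.linspace(-1, 1, H, device=dev, dtype=dt)
    gy, gx = torch.meshgrid(ys_n, xs, indexing="ij")
    wobble = torch.sin(gy * 12.0) * tracking_strength * 0.05
    grid = torch.stack((gx + wobble, gy), dim=-1).unsqueeze(0).expand(T, H, W, 2)
    out = F.grid_sample(
        out.permute(0, 3, 1, 2),  # grid_sample expects NCHW input
        grid, mode="bilinear", padding_mode="border", align_corners=True,
    ).permute(0, 2, 3, 1)

    return out.clamp(0.0, 1.0)
```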
The scan_line_count parameter controls how many lines you see, and scan_line_intensity controls how dark they are.
Analog noise adds random Gaussian noise to simulate the grain you would see on a VHS tape. The multiplier is kept conservative (noise * 0.15) so even at maximum the image is not obliterated.
Tracking distortion is the most visually interesting part. It shifts each row of pixels horizontally by a different amount, following a sine curve. This creates the classic “wobbly VHS” look where the image drifts sideways. We use torch.nn.functional.grid_sample() instead of a per-row loop - this is the GPU-friendly way to apply spatially-varying displacements. It runs a single kernel on the GPU regardless of image resolution.
Wire it all together
Create src/scope_vfx/pipeline.py:
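A minimal sketch of the pipeline class. The base class, the exact contracts of get_config_class() and prepare(), and the config class name are assumptions per Scope's plugin reference; the tensor handling in __call__() follows the explanation below:

```python
# src/scope_vfx/pipeline.py - sketch; in the real plugin this implements
# Scope's pipeline protocol (base class omitted here as an assumption).
import torch


class VFXPipeline:
    @classmethod
    def get_config_class(cls):
        from .schema import VFXConfig  # hypothetical schema class name
        return VFXConfig

    def __init__(self, device="cpu", **kwargs):
        # **kwargs swallows any extra load-time params Scope passes.
        self.device = torch.device(device)

    def prepare(self, **kwargs):
        # Buffer exactly 1 frame per call - our effects are per-frame.
        # The return format here is an assumption.
        return {"num_frames": 1}

    def __call__(self, **kwargs):
        # kwargs["video"]: list of (1, H, W, C) tensors in [0, 255].
        frames = torch.cat(kwargs["video"], dim=0).to(self.device)
        frames = frames.float() / 255.0  # normalise to [0, 1]
        # Run each enabled effect in sequence here, e.g.:
        #   frames = apply_chromatic_aberration(frames, intensity, angle)
        #   frames = apply_vhs(frames, ...)
        return frames  # (T, H, W, C) in [0, 1]
```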
Understanding the pipeline class
get_config_class() tells Scope which schema to use for this pipeline.
__init__() receives load-time parameters. We only need the device. The **kwargs catch-all is important - Scope may pass additional parameters that we do not use.
prepare() tells Scope’s frame processor how many input frames to buffer before calling __call__(). We need exactly 1 frame since our effects are per-frame (no temporal dependencies).
__call__() is where the action happens. It extracts the video frames from kwargs (a list of tensors, each (1, H, W, C) in [0, 255] range), stacks and normalises them to [0, 1], runs each enabled effect in sequence, and returns the result in the required [0, 1] THWC format.
Finally, create effects/__init__.py to re-export the effect functions for clean imports:
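The re-export module is tiny; the function names below are the illustrative ones used in this tutorial's sketches:

```python
# src/scope_vfx/effects/__init__.py
from .chromatic import apply_chromatic_aberration
from .vhs import apply_vhs
```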
Install and test
Install the plugin
If you are using the desktop app, click Browse, select the scope-vfx folder, and click Install. Scope will install the plugin and restart the server. If you are running the server directly, enter the full path to the plugin directory or a Git URL instead.

Select the pipeline
After restart, select VFX Pack from the pipeline selector. Connect a camera or video source and you should see your feed with chromatic aberration applied.
Development workflow
When you are iterating on effects, the cycle is:

- Edit the effect code
- Click Reload next to the plugin in Settings
- Scope restarts and picks up your changes
Use it as a post-processor
So far we have been running VFX Pack as a main pipeline, meaning it processes raw camera or video input directly. But what if you want to apply these effects on top of AI-generated video? For example, run LongLive to generate video from a prompt and then add chromatic aberration and VHS effects on top of that output. That is what post-processors are for. In Scope, every pipeline sits in a chain. To offer VFX Pack as a post-processor, add UsageType to your import and set usage in your config class:
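The wiring might look like this - the import path and the enum member names are assumptions, so check Scope's pipeline reference for the real ones:

```python
from scope.pipelines import UsageType  # assumed import path

class VFXConfig(BasePipelineConfig):
    ...
    # allow this pipeline in both the main and post-processor slots
    usage = [UsageType.MAIN, UsageType.POSTPROCESSOR]  # member names assumed
```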
The __call__() method receives the exact same tensor format either way. The only difference is who produced those frames - the webcam, or the AI model.
As of writing, Scope’s UI renders parameter sliders for the main pipeline but not yet for pre/post-processors. Your effects will still apply with whatever values are set, but you will not see the sliders when VFX Pack is in the post-processor slot. A workaround: select VFX Pack as the main pipeline first, adjust your sliders, then switch back to your generative model with VFX Pack as post-processor. The values persist.
Adding more effects
The architecture makes extending trivial. Here is how you would add a pixelation / mosaic effect:
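One plausible implementation of such an effect, using average pooling followed by nearest-neighbour upsampling - the function name, signature, and default block size are illustrative:

```python
# src/scope_vfx/effects/pixelate.py - sketch
import torch
import torch.nn.functional as F


def apply_pixelate(frames: torch.Tensor, block_size: int = 8) -> torch.Tensor:
    """Replace each block_size x block_size block with its average colour."""
    if block_size <= 1:
        return frames
    T, H, W, C = frames.shape
    nchw = frames.permute(0, 3, 1, 2)  # pooling ops expect NCHW
    small = F.avg_pool2d(nchw, kernel_size=block_size, ceil_mode=True)
    big = F.interpolate(small, size=(H, W), mode="nearest")
    return big.permute(0, 2, 3, 1)
```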
Same pattern every time: a standalone function, some schema fields, and a few lines in the effect chain.
What’s next
If this tutorial has inspired you, here are some effects you could add to your own VFX Pack using the exact same pattern:

- Glitch blocks - random rectangular displacement for a digital corruption look
- Film grain - more realistic than simple noise, with luminance-dependent grain
- Vignette - darken the edges for a cinematic frame
- Color grading - lift/gamma/gain per channel for full color control
- Kaleidoscope - radial symmetry for trippy visuals
- Edge glow - Sobel edge detection with additive glow
Each one is a standalone function in effects/, some schema fields, and a few lines in the effect chain. The plugin grows but the architecture stays simple.
AI-assisted plugin development
We have prepared a set of Claude Code skills and detailed instructions that let you scaffold an entire Scope plugin through an interactive AI-assisted workflow. A dedicated video tutorial showcasing this approach is coming soon.

scope-vfx source code
Browse the complete plugin source code, including the Claude Code skill in .claude/skills/.

See Also
Using Plugins
Install, manage, update, and uninstall plugins in Scope
Developing Plugins
Full reference for plugin project setup, schemas, and pipeline types
Plugin Architecture
Technical deep-dive into how the plugin system works under the hood
Pipeline Architecture
Configuration schemas, artifacts, UI rendering, and the pipeline lifecycle