## Overview
This guide walks you through sending video input to our StreamDiffusion pipeline. You will learn how to adjust parameters to create a variety of visual effects, use live streaming and audio interactivity features, generate real-time visuals, and view the resulting output video. By the end, you will have an effect that transforms a user into an anime character via their webcam.

## API Auth
The use of the API key is currently subsidized for a limited time, and we will provide an update on pricing in the future.
## Creating Your First App
Building on top of our StreamDiffusion pipeline consists of three parts:

- Creating a `Stream` object (backend)
- Sending in video and playing the output (frontend)
- Setting StreamDiffusion parameters
## Using the SDKs
The easiest way to integrate Daydream is with our SDKs. Here’s a full-stack example using the TypeScript SDK (backend) and the Browser SDK (frontend).

### Backend: Create a Stream
Install the TypeScript SDK:

### Frontend: Broadcast & Play
Install the Browser SDK, then broadcast your camera and play the processed output with the `useBroadcast` and `usePlayer` hooks.
## Alternative: Using cURL + OBS
If you prefer to use cURL and OBS instead of the SDKs, here’s how.

### 1. Create a Stream
Available models:
- `stabilityai/sdxl-turbo` - SDXL, high quality (recommended)
- `stabilityai/sd-turbo` - SD2.1, fastest
- `Lykon/dreamshaper-8` - SD1.5, great for stylized effects
- `prompthero/openjourney-v4` - SD1.5, artistic style
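For illustration, the create-stream request body can be built and sent like this. The field names (`model_id`, `params`, `prompt`) and the endpoint URL in the comment are assumptions, not the confirmed schema — check the API Reference for the exact shape:

```typescript
// Sketch only: model_id/params/prompt field names are assumptions;
// see the API Reference for the real request schema.
const MODELS = [
  "stabilityai/sdxl-turbo",
  "stabilityai/sd-turbo",
  "Lykon/dreamshaper-8",
  "prompthero/openjourney-v4",
] as const;

function buildCreateStreamBody(
  modelId: (typeof MODELS)[number],
  prompt: string,
): string {
  // Only the model choice and an initial prompt are sent here;
  // other parameters can be updated later.
  return JSON.stringify({
    model_id: modelId,
    params: { prompt },
  });
}

// The body would then be POSTed with your API key, for example:
// fetch("https://api.daydream.live/v1/streams", {   // URL is an assumption
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${API_KEY}`,
//     "Content-Type": "application/json",
//   },
//   body: buildCreateStreamBody("stabilityai/sdxl-turbo", "anime character"),
// });
```

The response includes the `whip_url` used for ingest and the `output_playback_id` used for playback in the next step.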
### 2. Stream with OBS
- Install OBS
- Go to Settings → Stream
- Set Service to `WHIP` and paste the `whip_url`
- Add a video source and click Start Streaming
- Watch at: `https://lvpr.tv/?v=<output_playback_id>`
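The watch URL in the last step is assembled from the `output_playback_id` returned when the stream was created. A minimal helper:

```typescript
// Builds the https://lvpr.tv playback URL from a stream's output_playback_id.
function playbackUrl(outputPlaybackId: string): string {
  return `https://lvpr.tv/?v=${encodeURIComponent(outputPlaybackId)}`;
}
```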

## Update Parameters
Change the prompt or other settings in real time:

You only need to include the parameters you want to change.
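Because only the parameters you include are changed, an update body can be as small as a single key. A sketch of building such a partial body (parameter names like `prompt` are assumptions — see the Parameters Reference for the full list):

```typescript
// Build an update body containing only the parameters being changed.
// Undefined entries are dropped, so untouched settings keep their values.
function buildUpdateBody(changes: Record<string, unknown>): string {
  const filtered = Object.fromEntries(
    Object.entries(changes).filter(([, value]) => value !== undefined),
  );
  return JSON.stringify(filtered);
}
```

For example, `buildUpdateBody({ prompt: "cyberpunk city", guidance_scale: undefined })` produces `{"prompt":"cyberpunk city"}`, leaving every other parameter unchanged.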
## Add ControlNets
ControlNets preserve structure from your input video:

### Available ControlNets
SDXL Models (`stabilityai/sdxl-turbo`):

- `xinsir/controlnet-depth-sdxl-1.0` - Depth guidance
- `xinsir/controlnet-canny-sdxl-1.0` - Edge detection
- `xinsir/controlnet-tile-sdxl-1.0` - Texture preservation
SD1.5 Models (`Lykon/dreamshaper-8`, `prompthero/openjourney-v4`):

- `lllyasviel/control_v11f1p_sd15_depth` - Depth
- `lllyasviel/control_v11f1e_sd15_tile` - Tile
- `lllyasviel/control_v11p_sd15_canny` - Canny edges
SD2.1 Models (`stabilityai/sd-turbo`):

- `thibaud/controlnet-sd21-depth-diffusers` - Depth
- `thibaud/controlnet-sd21-canny-diffusers` - Canny edges
- `thibaud/controlnet-sd21-openpose-diffusers` - Body poses
- `thibaud/controlnet-sd21-hed-diffusers` - Soft edges
- `thibaud/controlnet-sd21-color-diffusers` - Color composition
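Each ControlNet only works with its matching base-model family, so it helps to validate the pairing before sending a request. A sketch using the IDs listed above (the lookup table itself is just a convenience, not part of the API):

```typescript
// The ControlNets listed above, grouped by the base model they support.
const SD15_CONTROLNETS = [
  "lllyasviel/control_v11f1p_sd15_depth",
  "lllyasviel/control_v11f1e_sd15_tile",
  "lllyasviel/control_v11p_sd15_canny",
];

const CONTROLNETS_BY_MODEL: Record<string, string[]> = {
  "stabilityai/sdxl-turbo": [
    "xinsir/controlnet-depth-sdxl-1.0",
    "xinsir/controlnet-canny-sdxl-1.0",
    "xinsir/controlnet-tile-sdxl-1.0",
  ],
  "Lykon/dreamshaper-8": SD15_CONTROLNETS,
  "prompthero/openjourney-v4": SD15_CONTROLNETS,
  "stabilityai/sd-turbo": [
    "thibaud/controlnet-sd21-depth-diffusers",
    "thibaud/controlnet-sd21-canny-diffusers",
    "thibaud/controlnet-sd21-openpose-diffusers",
    "thibaud/controlnet-sd21-hed-diffusers",
    "thibaud/controlnet-sd21-color-diffusers",
  ],
};

// True when the ControlNet belongs to the chosen model's family.
function isCompatible(modelId: string, controlNetId: string): boolean {
  return CONTROLNETS_BY_MODEL[modelId]?.includes(controlNetId) ?? false;
}
```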
## What’s Next?
- TypeScript SDK - Full server-side API
- Browser SDK - WebRTC broadcasting
- TouchDesigner Plugin - For VJ and creative apps
- OBS Plugin - Add AI to OBS streams
- Parameters Reference - All available parameters
- API Reference - Full API documentation