Learn how to get a video stream up and running and apply AI effects to it.
Pass your API key in an `Authorization` header on every request:
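As a sketch, creating a stream might look like the following. The `POST /beta/streams` endpoint and the Bearer scheme are assumptions inferred from the prompts endpoint used later in this guide; check the API reference for the exact route:

```shell
# Create a stream (endpoint and auth scheme assumed for illustration).
# The response contains the Stream object with whip_url and output_playback_id.
curl -X POST "https://api.daydream.live/beta/streams" \
  -H "Authorization: Bearer $DAYDREAM_API_KEY" \
  -H "Content-Type: application/json"
```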
1. Copy the `whip_url` from the `Stream` object in the Create Stream response.
2. In your streaming software's stream settings, select WHIP as the Service and paste the `whip_url` as the Server. Leave the Bearer Token blank and save the settings.
3. In the Sources section, add a video source for the stream (e.g., Video Capture Device).
4. In the Controls section, select Start Streaming to start the stream.
5. Copy the `output_playback_id` from the Create Stream response and open: `https://lvpr.tv/?v=<your output_playback_id>`
You can also query the playback endpoint directly:

```shell
curl "https://livepeer.studio/api/playback/<your playback id>"
```
Send a `POST` request to `https://api.daydream.live/beta/streams/<YOUR_STREAM_ID>/prompts` with the full JSON body below to control the effect.
| Parameter | Description |
|---|---|
| `prompt` | Guides the model toward a desired visual style or subject. |
| `negative_prompt` | Tells the model what not to produce (e.g., discourages low-quality, flat, or blurry results). |
| `num_inference_steps` | Higher values improve quality at the cost of speed/FPS. |
| `seed` | Ensures reproducibility across runs. Change it to introduce variation. |
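Putting the parameters above together, an update request might look like the following. The prompt text and numeric values are placeholders, and the body may accept additional fields beyond the four documented here:

```shell
# Update the live effect with example values (placeholders, not defaults)
curl -X POST "https://api.daydream.live/beta/streams/<YOUR_STREAM_ID>/prompts" \
  -H "Authorization: Bearer $DAYDREAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "an impressionist oil painting",
    "negative_prompt": "low quality, flat, blurry",
    "num_inference_steps": 30,
    "seed": 42
  }'
```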
Avoid toggling the `enabled` field to turn ControlNets on or off, as it currently triggers a pipeline reload. Instead, set `conditioning_scale` to `0` to effectively disable a ControlNet, or raise it above `0` to enable its influence.

`conditioning_scale`:
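As a sketch, disabling a ControlNet by scale rather than by `enabled` might look like the following. The `controlnets` array shape is an assumption for illustration; only the `enabled` and `conditioning_scale` fields are taken from the text above:

```shell
# Leave "enabled" untouched to avoid a pipeline reload; zero the scale instead
# (controlnets array shape assumed for illustration)
curl -X POST "https://api.daydream.live/beta/streams/<YOUR_STREAM_ID>/prompts" \
  -H "Authorization: Bearer $DAYDREAM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "an impressionist oil painting",
    "controlnets": [
      { "enabled": true, "conditioning_scale": 0 }
    ]
  }'
```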