Documentation Index
Fetch the complete documentation index at: https://docs.cyberwave.com/llms.txt
Use this file to discover all available pages before exploring further.
What are Workflows?
Workflows in Cyberwave let you create automated sequences of robot operations. Connect nodes visually to build complex behaviors without writing procedural code. Workflows can execute in two environments:
- Cloud — schedule, webhook, manual, event, MQTT, and email triggers run as Celery tasks on Cyberwave infrastructure.
- Edge — the camera_frame trigger generates a Python worker that runs ML inference directly on the device. Raw video never leaves the edge.
Workflow Components
Nodes
Nodes are the building blocks of workflows. Each node performs a specific action:
Trigger Nodes
Start the workflow: manual, schedule, webhook, event, MQTT, email, or camera_frame (edge-local)
Call Model Nodes
Run ML inference — cloud VLM/LLM or edge-local object detection (YOLO, etc.)
Twin Nodes
Control digital twin position, rotation, and state
Joint Nodes
Set individual joint positions or run trajectories
Condition Nodes
Branch based on sensor data, twin state, or model output. Includes time-based gates like timed_condition for “must persist for N seconds” semantics.
Spatial Filter
Polygon zones in normalized image coordinates — keep only detections inside the zone. Pairs with timed_condition for zone-based intrusion alerts.
Delay Nodes
Add timing between operations
Connections
Connections define the execution flow between nodes:
- Sequential: Execute nodes one after another
- Parallel: Execute multiple nodes simultaneously
- Conditional: Branch based on conditions
Invalid connections (for example, camera_frame triggers can only connect to call_model nodes) are blocked before saving.
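The save-time check described above can be sketched as a small rule table. The data shapes, node-type strings, and function name below are illustrative assumptions, not the actual backend implementation — only the camera_frame → call_model rule comes from this page.

```python
# Illustrative sketch of a save-time connection check. Only the rule
# that camera_frame triggers may connect solely to call_model nodes is
# documented; everything else here is an assumption.
ALLOWED_TARGETS = {
    "camera_frame": {"call_model"},  # edge triggers feed inference directly
}

def invalid_connections(nodes: dict, connections: list) -> list:
    """Return (source, target) pairs that would be blocked before saving."""
    bad = []
    for src, dst in connections:
        allowed = ALLOWED_TARGETS.get(nodes[src])
        if allowed is not None and nodes[dst] not in allowed:
            bad.append((src, dst))
    return bad

nodes = {"t1": "camera_frame", "m1": "call_model", "d1": "delay"}
# camera_frame -> call_model is fine; camera_frame -> delay is blocked
bad = invalid_connections(nodes, [("t1", "m1"), ("t1", "d1")])
```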
Inspector: Wired vs Available inputs
When you select a node, the inspector splits its Inputs and Outputs into a Wired group (what the node is actually consuming or feeding) and an Available group (everything else, collapsed by default once anything is wired). An input counts as wired when a connection lands on it explicitly, when a node-level edge satisfies the schema’s sole required input, when a constant or upstream reference is set in the input’s editor, or when the schema declares the input is implicitly satisfied by an upstream node type (e.g. annotate/anonymize consume the upstream frame from a call_model automatically on edge camera-frame chains). An output counts as wired when a downstream node references it explicitly — either via reference-mode mapping ({ source_node_uuid, source_output }) or via an expression like {node-name.frame_index}. For nodes whose every input is optional (call_model is the canonical case), a plain canvas-drawn edge to or from the node also lights up its inputs/outputs in the wired group — the wire itself is the signal that the node is in the pipeline.
Canvas card I/O strip
Each node card on the canvas shows a compact in:/out: strip listing the inputs and outputs that are currently wired, not the full schema. A freshly added node with no connections shows no strip; once you wire it up the strip starts listing the ports actually in use. Drag-drawn edges between two nodes (which don’t pin a specific port) credit the source node’s outputs and — for nodes with no required inputs, like call_model — also the target’s inputs, so the strip never goes silent on a node you’ve clearly connected. The strip caps each row at a few names and collapses the rest into a +N badge; hover the row to see every wired port with its type and a required marker (sorted required-first). The full schema view (with both wired and available ports) lives in the inspector — open the node to discover everything else it can consume or produce.
Missing-configuration footer
Nodes whose required inputs or parameters aren’t set yet show an amber “Configure …” footer on the canvas card (e.g. Configure twin on a camera_frame trigger, Add Python code on a fresh code node, Select a model on a call_model without a chosen LLM/VLM, or 3 settings required on a send_email node missing recipient / subject / body). Click the footer to jump to the inspector — the same fields are highlighted there with a Required pill so the next step is obvious. The cue is a warning, not an error: the workflow still saves and structurally validates; it just can’t execute until the items listed are filled in.
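A footer count like “3 settings required” can be derived by comparing a node’s required schema fields against what is actually set. The schema shape and function below are a hedged sketch, not the dashboard’s real code; only the send_email example fields come from this page.

```python
# Hedged sketch of how a "Configure ..." footer count could be derived:
# compare required schema fields against the parameters actually set.
# The schema format here is an assumption for illustration.
def missing_required(schema: dict, params: dict) -> list:
    return [
        name for name, spec in schema.items()
        if spec.get("required") and params.get(name) in (None, "")
    ]

email_schema = {
    "recipient": {"required": True},
    "subject": {"required": True},
    "body": {"required": True},
    "cc": {"required": False},
}
# Only the recipient is set, so two required settings remain.
missing = missing_required(email_schema, {"recipient": "ops@example.com"})
```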
Trigger Types
| Trigger | Where it runs | Description |
|---|---|---|
| Manual | Cloud | User clicks “Run” in the dashboard or calls the SDK |
| Schedule | Cloud | Cron or interval timer |
| Webhook | Cloud | HTTP POST to a generated URL |
| Event | Cloud | Business event matching conditions |
| MQTT | Cloud | Message on a subscribed MQTT topic |
| Email | Cloud | Incoming email |
| Camera Frame | Edge | Every camera frame — ML inference on-device, only events sent to cloud |
Edge Workflow Execution
When a workflow uses a camera_frame trigger connected to a call_model node with an edge-compatible model, the backend generates a Python worker file (wf_<uuid8>.py) via WorkerCodegen. The edge device pulls this file on boot and periodically, writes it to its workers directory, and the worker runtime activates the @cw.on_frame hook.
Schedule-triggered run_on_edge workflows use the same worker delivery path.
The generated module registers @cw.on_schedule(...); the worker runtime
evaluates the cron locally with croniter and calls the generated run(...)
entrypoint when due.
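As a hedged illustration of the shape such a generated module takes: the _Runtime stub, handler body, and twin UUID below are assumptions made so the snippet runs on its own — only the @cw.on_frame hook and the run(...) entrypoint name come from this page.

```python
# Minimal sketch of the registration pattern a generated wf_<uuid8>.py
# follows. The real module imports the cyberwave edge runtime; here a
# tiny stand-in registry (_Runtime) is defined purely for illustration.
class _Runtime:
    def __init__(self):
        self.frame_hooks = {}

    def on_frame(self, twin_uuid):
        """Register a per-twin frame handler, mirroring @cw.on_frame."""
        def register(fn):
            self.frame_hooks[twin_uuid] = fn
            return fn
        return register

cw = _Runtime()

@cw.on_frame("twin-1234")
def run(frame):
    # In a real worker, edge-local ML inference happens here.
    return {"twin": "twin-1234", "detections": []}

# The worker runtime dispatches each camera frame to its twin's handler:
result = cw.frame_hooks["twin-1234"]({"pixels": b""})
```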
The call_model node supports configurable event emission via emit_event:
- emit_mode: always (every detection), on_enter (new classes only), on_change (count changes)
- cooldown_seconds: minimum delay between consecutive event publications (default 5s)
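The three modes plus the cooldown can be modeled with a small stateful gate. The class name, call shape, and internal bookkeeping below are illustrative assumptions; only the mode names, their documented semantics, and the 5-second default cooldown come from this page.

```python
# Hedged sketch of emit_event semantics (emit_mode + cooldown_seconds).
class EmitGate:
    def __init__(self, emit_mode="always", cooldown_seconds=5.0):
        self.emit_mode = emit_mode
        self.cooldown = cooldown_seconds
        self.last_emit_at = None
        self.seen_classes = set()   # for on_enter
        self.last_count = None      # for on_change

    def should_emit(self, classes, now):
        # The cooldown applies to every mode: suppress publications that
        # arrive too soon after the previous one.
        if self.last_emit_at is not None and now - self.last_emit_at < self.cooldown:
            return False
        if self.emit_mode == "always":
            emit = bool(classes)
        elif self.emit_mode == "on_enter":
            emit = bool(set(classes) - self.seen_classes)
        elif self.emit_mode == "on_change":
            emit = len(classes) != self.last_count
        else:
            raise ValueError(self.emit_mode)
        self.seen_classes |= set(classes)
        self.last_count = len(classes)
        if emit:
            self.last_emit_at = now
        return emit

gate = EmitGate(emit_mode="on_enter", cooldown_seconds=5)
a = gate.should_emit(["person"], now=0.0)   # new class -> emit
b = gate.should_emit(["person"], now=1.0)   # within cooldown -> suppressed
c = gate.should_emit(["person"], now=10.0)  # class already seen -> no emit
```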
Multi-twin perception workflows
A single perception workflow can drive multiple twins by adding more than one camera_frame trigger, each pinned to a different twin. The compiler emits one @cw.on_frame(<twin_uuid>, …) handler per trigger and ships the same wf_<uuid8>.py to every involved twin’s edge — each handler only fires for frames from its own twin’s camera, so co-located edges never collide.
The /api/v1/workflows/{uuid}/compile endpoint returns the full set of referenced twins as twin_uuids (sorted). For backward compatibility, the legacy twin_uuid field is set to the only twin for single-twin workflows and to null for multi-twin workflows.
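An illustrative shape for that compile response, with placeholder values (only the twin_uuids and twin_uuid fields are described above):

```json
{
  "twin_uuids": ["twin-aaaa", "twin-bbbb"],
  "twin_uuid": null
}
```

For a single-twin workflow, twin_uuids would hold one entry and twin_uuid would carry that same value instead of null.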
Navigation workflows (those containing a twin_control / Move Twin node) remain single-twin: the compiled worker is scoped to one client.twin(...) handle. Activate one workflow per twin if you need to drive multiple robots, or set run_on_edge=false to run the workflow as a cloud workflow that can address several twins from a single process.
See Edge Workers for the full lifecycle, eject pattern, and generated worker format.
For a privacy-preserving end-to-end recipe combining camera_frame, call_model, anonymize, spatial_filter, timed_condition, and send_alert, see the Zone-based intrusion detection tutorial.
Creating a Workflow
- Navigate to Workflows in the dashboard
- Click Create Workflow — set a name, optional slug, and visibility
- Drag nodes from the palette to the canvas
- Connect nodes by dragging from output to input ports
- Configure each node’s parameters
- Click Activate
Activating a workflow sends a sync_workflows MQTT command to every twin the workflow references, so the edge picks up the new wf_*.py within seconds. Running cyberwave workflow sync or waiting for the periodic reconcile still works and is the fallback if the MQTT broker is unreachable.
Executing Workflows
Workflows can be triggered by:
- Schedule: Run at specific times (cron)
- Events: Run when sensor data matches conditions
- API: Trigger from external systems
- Camera Frame: Run on every frame at the edge device
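Triggering from an external system means POSTing to the workflow’s generated webhook URL. The endpoint path, token, and helper below are hypothetical, assembled only to show the request shape; the real URL is the one the dashboard generates for your webhook trigger.

```python
# Hedged sketch: assembling a webhook-trigger request. The /hooks/<token>
# path is an illustrative placeholder, not the documented API route.
import json

def build_trigger_request(base_url: str, webhook_token: str, payload: dict):
    """Assemble the POST target URL and JSON body for a webhook trigger."""
    url = f"{base_url}/hooks/{webhook_token}"
    body = json.dumps(payload)
    return url, body

url, body = build_trigger_request(
    "https://api.cyberwave.com", "abc123", {"source": "external-system"}
)
# A real caller would then POST it, e.g.:
#   requests.post(url, data=body, headers={"Content-Type": "application/json"})
```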
Best Practices
Keep workflows focused
Create separate workflows for distinct operations rather than one large
workflow. This makes debugging and maintenance easier.
Add error handling
Include condition nodes to handle failure cases gracefully. Consider what
should happen if a joint can’t reach its target.
Use meaningful names
Name nodes and workflows descriptively. “Alert on person in zone A” is
better than “Node 1”.
Use emit modes for edge workflows
Use on_enter for alert-style use cases (person entering a zone) and on_change for occupancy tracking. Set cooldown_seconds to avoid event floods.
Next Steps
API Reference
Full workflow API documentation
Edge Workers
Generated workers, eject pattern, and custom workers