Documentation Index
Fetch the complete documentation index at: https://docs.cyberwave.com/llms.txt
Use this file to discover all available pages before exploring further.
What are Workflows?
Workflows in Cyberwave let you create automated sequences of robot operations. Connect nodes visually to build complex behaviors without writing procedural code. Workflows run either in the cloud (as Celery tasks, for manual, schedule, webhook, event, MQTT, and email triggers) or on the edge device (for camera_frame triggers, which run ML inference locally without sending video to the cloud).
Workflow Components
Nodes
Nodes are the building blocks of workflows. Each node performs a specific action. Nodes are organised into a hybrid robotics + automation taxonomy. The same nine categories show up in the editor palette and on GET /api/v1/workflows/config (node_categories); backend, API, and UI all read from the same WorkflowNodeCategory enum (cyberwave-backend/src/lib/node_categories.py). Categories with no nodes today (Perception) are reserved and only show up once we ship nodes for them.
Sources & Triggers
Bring events and raw data into the workflow: manual, schedule, webhook, event, MQTT, email, camera_frame, alert, plus sensor data_source nodes
Perception
Convert raw signals into reliable observations; reserved for object trackers, sensor fusion, IMU filtering, and ASR (no nodes ship today)
Transform & Routing
Reshape, convert, route, multiplex, or fan out data: json_parser, annotate, anonymize
State & Memory
Store and retrieve memory over time: create_asset, edit_asset, update_attachment, video_tagger
Intelligence
Use models for interpretation, reasoning, or prediction: call_model (cloud VLM/LLM and edge ML)
Decision & Control Flow
Choose what happens next: conditional, loop; FSM / Behavior Tree / Rule Engine on the roadmap
Actuation
Execute physical or twin-side actions: twin_control (Move Twin); future joint / gripper / navigation primitives
Integration
Talk to external systems and run user-supplied logic: http_request, send_email, code (edge-only)
Observability & Safety
Guard, validate, observe, and alert: send_alert; validators, watchdogs, e-stop guards, and anomaly detectors are on the roadmap
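As a hedged sketch, the nine categories and the node types this page lists for them can be mirrored in a plain mapping. The node and category names come from this page; the dict layout itself is illustrative, not the backend's actual data structure.

```python
# Illustrative mirror of the workflow node taxonomy described above.
# Category and node names are from this page; the structure is hypothetical.
NODE_CATEGORIES = {
    "Sources & Triggers": ["manual", "schedule", "webhook", "event", "mqtt",
                           "email", "camera_frame", "alert", "data_source"],
    "Perception": [],  # reserved; no nodes ship today
    "Transform & Routing": ["json_parser", "annotate", "anonymize"],
    "State & Memory": ["create_asset", "edit_asset",
                       "update_attachment", "video_tagger"],
    "Intelligence": ["call_model"],
    "Decision & Control Flow": ["conditional", "loop"],
    "Actuation": ["twin_control"],
    "Integration": ["http_request", "send_email", "code"],
    "Observability & Safety": ["send_alert"],
}
```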
Connections
Connections define the execution flow between nodes:
- Sequential: Execute nodes one after another
- Parallel: Execute multiple nodes simultaneously
- Conditional: Branch based on conditions
The editor validates connections as you draw them; invalid links (for example, camera_frame triggers can only connect to call_model nodes) are blocked.
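The connection rule above can be sketched as a small predicate. Only the camera_frame-to-call_model restriction comes from this page; the function shape is an assumption, not the editor's actual validation code.

```python
# Hypothetical sketch of the connection rule described above:
# camera_frame triggers may only feed call_model nodes.
ALLOWED_TARGETS = {"camera_frame": {"call_model"}}

def connection_is_valid(source_type: str, target_type: str) -> bool:
    """Return True when a source node may connect to a target node."""
    allowed = ALLOWED_TARGETS.get(source_type)
    return allowed is None or target_type in allowed
```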
Creating a Workflow
Workflows now have an editable slug that is unique within a workspace. You can keep the generated slug or customize it for stable SDK and automation references. Workflows can also be marked public from the workflow creation and editing UI.
Using the Dashboard
- Navigate to Workflows in the dashboard
- Click Create Workflow — set a name, optional slug, and visibility
- Drag nodes from the palette to the canvas
- Connect nodes by dragging from output to input ports
- Configure each node’s parameters
- Click Activate
The bottom-right toolbar of the workflow canvas now includes a multi-select tool. Toggle it on, drag a marquee across the canvas (or shift-click individual nodes) to pick a group, then drag any selected node to shift the whole group, or press Delete / use the floating Delete button to remove them in bulk. Press Esc to leave multi-select mode.
The workflow editor supports undo (Ctrl/Cmd + Z) and redo (Ctrl/Cmd + Shift + Z, or Ctrl + Y), with up to 50 steps per session. It covers node creation, deletion (single and bulk), node moves (single and group), connection wiring, node renames and notes, and workflow-level changes (name, description, slug, visibility, activation). Toolbar buttons next to the Hide/Show Nodes button mirror the same actions.
Workflows created from the dedicated Workflows page are general workflows. From an environment, Cyberwave can also create mission workflows that stay bound to that environment and reuse the same workflow editor/runtime with a mission-specific profile layered on top.
Mission workflows now expose a Move Twin node that targets an environment waypoint. The node waits for navigation completion before continuing and can optionally dwell at the waypoint for a configured number of seconds.
Using the CLI
Pass --base-url / -u to override the API URL. When a UUID argument is omitted, an interactive arrow-key selector is shown.
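As an illustrative session only: `cyberwave workflow sync` and the `--base-url` flag appear elsewhere on this page, but the other subcommand names below are assumptions, not documented commands.

```shell
# Illustrative CLI usage; subcommand names other than `workflow sync`
# are assumptions based on this page's description.
cyberwave workflow list --base-url https://api.cyberwave.com

# Omitting the workflow UUID opens an interactive arrow-key selector.
cyberwave workflow run

# Push regenerated edge workers to the device.
cyberwave workflow sync
```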
Using the SDK
Executing Workflows
Manual Execution
Triggered Execution
Workflows can be triggered by:
- Schedule: Run at specific times (cron)
- Events: Run when sensor data matches conditions
- API: Trigger from external systems via REST or MCP
- Camera Frame: Run on every camera frame at the edge device
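The trigger kinds above might be configured along these lines; only the trigger types come from this page, and the field names (cron, condition) are assumptions.

```python
# Illustrative trigger configurations for the mechanisms listed above.
# Field names are assumptions; only the trigger kinds are documented.
TRIGGERS = [
    {"type": "schedule", "cron": "0 8 * * 1-5"},        # weekdays at 08:00
    {"type": "event", "condition": "temperature > 75"}, # sensor condition
    {"type": "webhook"},                                # REST / MCP call-in
    {"type": "camera_frame"},                           # runs on the edge
]
```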
Edge Workflows (Camera Frame)
The camera_frame trigger generates a Python worker (wf_<uuid8>.py) that runs ML inference directly on the edge device. The call_model node's emit_event parameter controls event emission:
- emit_mode: always, on_enter (new classes only), or on_change (count changes)
- cooldown_seconds: minimum delay between events (default 5s)
Deploy updated workers with cyberwave workflow sync, or wait for the automatic periodic sync.
When a call_model node sits downstream of a camera_frame trigger, the model picker greys out models whose output isn't compatible with the edge perception chain. The gate rejects (in priority order): cloud-only deployments; models that don't accept image input; models tagged classification (single-label classifiers can't drive bbox / mask filters); and models whose output_family is text/action/mesh/image. Models with output_family = json or unset are accepted; most edge YOLO seeds resolve to json today and are the canonical use case. Greyed cards stay visible with a tooltip explaining why; toggle Allow incompatible models in the dialog header to override.
The same gate runs at compile time. cyberwave workflow sync (and any other path that triggers worker codegen) fails fast with a named error if a call_model references an incompatible model, so chains are no longer silently dropped in the generated worker. The override toggle in the picker only relaxes the UI affordance; persisting an incompatible selection (or importing a workflow that already has one) is still rejected at compile time. See Edge Workers for the full generated worker lifecycle and format.
Monitoring Executions
Track workflow execution status and results: each execution record includes started_at, finished_at, and error_message fields.
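One way to read those three fields, sketched under the assumption that an execution record is a plain mapping (only started_at, finished_at, and error_message are documented here):

```python
# Illustrative helper for interpreting an execution record's status fields.
# Any record shape beyond the three documented fields is assumed.
def execution_state(record: dict) -> str:
    if record.get("error_message"):
        return "failed"
    if record.get("finished_at"):
        return "succeeded"
    if record.get("started_at"):
        return "running"
    return "pending"
```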
Example: Edge Detection Workflow
A camera_frame workflow that runs YOLO on the edge and emits alerts:
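A hedged sketch of what that definition might look like, wiring a camera_frame trigger into call_model (YOLO) and then send_alert. The node types and the emit_mode / cooldown_seconds parameters come from this page; the ids, parameter keys, and overall JSON shape are assumptions.

```python
# Hedged sketch of the edge detection workflow described above.
# Node types are documented; the definition's shape is hypothetical.
workflow = {
    "name": "Edge person detection",
    "nodes": [
        {"id": "trigger", "type": "camera_frame"},
        {"id": "detect", "type": "call_model",
         "params": {"model": "yolo",
                    "emit_mode": "on_enter",      # fire on new classes only
                    "cooldown_seconds": 5}},      # default documented above
        {"id": "alert", "type": "send_alert",
         "params": {"message": "Person detected in zone A"}},
    ],
    "connections": [
        {"from": "trigger", "to": "detect"},  # camera_frame -> call_model
        {"from": "detect", "to": "alert"},
    ],
}
```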
Best Practices
Keep workflows focused
Create separate workflows for distinct operations rather than one large workflow. This makes debugging and maintenance easier.
Add error handling
Include condition nodes to handle failure cases gracefully. Consider what should happen if a joint can’t reach its target.
Use meaningful names
Name nodes and workflows descriptively. “Alert on person in zone A” is better than “Node 1”.
Use emit modes for edge workflows
Use on_enter for alert-style use cases (person entering a zone) and on_change for occupancy tracking. Set cooldown_seconds to avoid event floods.
Next Steps
API Reference
Full workflow API documentation
Edge Workers
Generated workers, eject pattern, and custom workers