## What is a compatible driver?
A compatible driver creates the connection between a hardware device's own API and the Cyberwave platform. It is responsible for translating the device's native protocol into the twin model that the rest of the platform understands. Drivers can run anywhere: on a dedicated edge device, directly on the robot hardware, in the cloud, or on a developer laptop. In production, they typically run on edge hardware co-located with the device they control.

Each driver is packaged as a Docker container image. Edge Core pulls and runs that image, injecting the environment variables described below. This means you can develop and test your driver locally using the same image that will run in production.

## Quickstart: scaffold with the Claude skill
The fastest way to get started is the Cyberwave Driver skill for Claude Code. It asks you a few questions about your hardware and scaffolds a complete, production-ready driver project, including the Dockerfile, local dev setup, and a working twin connection. Install the skill:

## Quickstart: use the SDK
The fastest way to write a compatible driver is to use one of the official SDKs:

- Python SDK: `cyberwave-sdk`
- C++ SDK: see the C++ SDK docs
## Environment variables
When Edge Core starts a driver container it injects the following environment variables. You can develop your driver assuming these are always set to valid values; there is no need to handle the case where they are absent.

| Variable | Description |
|---|---|
| `CYBERWAVE_TWIN_UUID` | UUID of the twin instance this driver manages |
| `CYBERWAVE_API_KEY` | API key scoped to this driver for authenticating platform calls |
| `CYBERWAVE_TWIN_JSON_FILE` | Absolute path to a writable JSON file containing the twin's current state (see Twin JSON file) |
| `CYBERWAVE_CHILD_TWIN_UUIDS` | (optional) Comma-separated UUIDs of child twins (e.g. cameras) attached to this driver |
`CYBERWAVE_CHILD_TWIN_UUIDS` is set when child twins are attached to the driver twin. Drivers can use this to coordinate child devices (for example, multiple cameras) without additional configuration.
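As a sketch, a Python driver can read these variables at startup using only the standard library (the variable names are exactly those listed above):

```python
import os

def load_driver_env():
    """Read the variables Edge Core injects into every driver container."""
    env = {
        # Guaranteed to be present, so direct indexing is safe.
        "twin_uuid": os.environ["CYBERWAVE_TWIN_UUID"],
        "api_key": os.environ["CYBERWAVE_API_KEY"],
        "twin_json_file": os.environ["CYBERWAVE_TWIN_JSON_FILE"],
    }
    # Child twin UUIDs are optional; absent or empty means no children attached.
    raw = os.environ.get("CYBERWAVE_CHILD_TWIN_UUIDS", "")
    env["child_twin_uuids"] = [u.strip() for u in raw.split(",") if u.strip()]
    return env
```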
### Restart behavior tuning
The following optional variables let you override Edge Core's restart defaults:

| Variable | Default | Description |
|---|---|---|
| `CYBERWAVE_DRIVER_RESTART_LOOP_THRESHOLD` | 4 | Number of restarts before the driver is marked as flapping |
| `CYBERWAVE_DRIVER_RESTART_LOOP_WINDOW_SECONDS` | 60 | Time window (seconds) used to count restarts |
| `CYBERWAVE_DRIVER_TROUBLESHOOTING_URL` | `https://docs.cyberwave.com` | URL surfaced in platform alerts for operator guidance |
## Driver failure handling
Drivers must exit with a non-zero code when they cannot access required hardware (for example, a missing `/dev/video*` device or a disconnected peripheral). This allows Edge Core to detect startup failures and trigger its restart logic.
Edge Core raises the following alerts:
- `driver_start_failure`: raised when a driver container cannot reach a stable running state.
- `driver_restart_loop`: raised when a driver exceeds the restart threshold within the window. The container is stopped and marked as flapping.
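A minimal sketch of this check in Python (the device path is illustrative):

```python
import os
import sys

def require_device(path="/dev/video0"):
    """Exit non-zero if a required device node is missing, so Edge Core
    can detect the startup failure and apply its restart logic."""
    if not os.path.exists(path):
        print(f"fatal: required device {path} not found", file=sys.stderr)
        sys.exit(1)
```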
## Twin JSON file
`CYBERWAVE_TWIN_JSON_FILE` points to a JSON file on disk that contains the digital twin instance (including its metadata) and the associated catalog twin data, matching the `TwinSchema` and `AssetSchema` API schemas.
Drivers may read and modify this file. Edge Core syncs any changes back to the backend when connectivity is available.
## Runtime configuration
Drivers should treat `metadata["edge_configs"]` as the source of truth for per-device runtime configuration, and `metadata["edge_fingerprint"]` as the edge identity (not duplicated inside `edge_configs`).
Read `edge_configs` from `CYBERWAVE_TWIN_JSON_FILE` at startup to obtain per-device settings without hardcoding them in the image.
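For example (a sketch; only the `metadata` keys named above are assumed, and the full file follows the TwinSchema shape):

```python
import json
import os

def load_edge_configs():
    """Return (edge_configs, edge_fingerprint) from the twin JSON file."""
    path = os.environ["CYBERWAVE_TWIN_JSON_FILE"]
    with open(path, encoding="utf-8") as f:
        twin = json.load(f)
    metadata = twin.get("metadata", {})
    # edge_configs: per-device runtime settings; edge_fingerprint: edge identity.
    return metadata.get("edge_configs", {}), metadata.get("edge_fingerprint")
```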
## Sensor data output
If your driver produces sensor data (video frames, depth maps, audio, joint states, etc.), publish it to the edge data bus so worker containers and ML models can consume it locally with zero network overhead. There are two options: the Zenoh data bus (recommended) and the filesystem convention (a fallback for constrained environments). Both use the same channel names, so a driver can switch between them by changing one environment variable.

### Option A: Zenoh data bus (recommended)
The Zenoh data bus provides zero-copy shared memory between driver and worker containers. Data is consumed directly by worker hooks and `cw.data.latest()`.
#### Key expression convention
| Segment | Value | Example |
|---|---|---|
| `cw` | Fixed prefix | `cw` |
| `twin_uuid` | UUID of the twin | `a1b2c3d4-...` |
| `data` | Fixed namespace | `data` |
| `channel` | Canonical channel name | `frames/default` |
`DataBus` handles key composition automatically via `CYBERWAVE_TWIN_UUID`.
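The composed key can be sketched as an illustrative helper (the SDK's `DataBus` does this for you):

```python
def data_key(twin_uuid, channel):
    """Compose the Zenoh key expression: cw/<twin_uuid>/data/<channel>."""
    return f"cw/{twin_uuid}/data/{channel}"
```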
#### Canonical channels
| Channel | Encoding | Pattern | Wire payload |
|---|---|---|---|
| `frames/default` | `numpy/ndarray` | Stream | SDK header + raw BGR/RGB uint8 |
| `depth/default` | `numpy/ndarray` | Stream | SDK header + raw uint16 depth (mm) |
| `joint_states` | `application/json` | Latest value | `{ts, names, positions, velocities?, efforts?, source_type}` |
| `position` | `application/json` | Latest value | `{ts, x, y, z, qx?, qy?, qz?, qw?}` |
| `audio/default` | `numpy/ndarray` | Stream | SDK header + float32 PCM |
| `pointcloud/default` | `numpy/ndarray` | Stream | SDK header + Nx3 float32 |
| `imu` | `application/json` | Stream | `{ts, accel: {x,y,z}, gyro: {x,y,z}}` |
| `battery` | `application/json` | Latest value | `{ts, voltage_v, current_a, charge_pct}` |
| `telemetry` | `application/json` | Latest value | Free-form `{ts, ...}` |
#### Python SDK example
`CYBERWAVE_TWIN_UUID` is read automatically from the environment. `CYBERWAVE_DATA_BACKEND` selects the transport (`zenoh` or `filesystem`).
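As a sketch, this builds a payload shaped per the canonical channel table above; the `cw.data.publish` call in the comment is the SDK entry point named earlier, so treat its exact signature as an assumption:

```python
import os
import time

# Select the transport before the SDK is initialized (Edge Core normally
# sets this for you; shown here for local development).
os.environ.setdefault("CYBERWAVE_DATA_BACKEND", "zenoh")

# A joint_states payload matching the canonical channel table.
joint_states = {
    "ts": time.time(),
    "names": ["shoulder", "elbow"],
    "positions": [0.12, -0.54],
    "source_type": "driver",
}

# With the SDK installed, publishing is one call (exact signature may differ):
#   cw.data.publish("joint_states", joint_states)
```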
#### Wire format reference (for native language publishers)
For C++, Rust, or any language that needs to publish without the Python SDK, the header carries:

- `content_type`: `"numpy/ndarray"` | `"application/json"` | `"application/octet-stream"`
- `shape`: `[H, W, C]` (for ndarray; omit for JSON/bytes)
- `dtype`: `"uint8"` | `"uint16"` | `"float32"`, etc. (for ndarray; omit for JSON/bytes)
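The byte-level layout is defined by the Data Wire Format guide; purely as an illustration of the header fields above, here is a hypothetical length-prefixed encoding (the 4-byte prefix and JSON header layout are assumptions, not the platform's actual contract):

```python
import json
import struct

def encode_payload(body, content_type, shape=None, dtype=None):
    """Hypothetical encoding: 4-byte little-endian header length, a JSON
    header, then the raw body. See the Data Wire Format guide for the
    real on-wire contract."""
    header = {"content_type": content_type}
    if shape is not None:
        header["shape"] = list(shape)  # e.g. [H, W, C] for ndarray payloads
    if dtype is not None:
        header["dtype"] = dtype        # e.g. "uint8", "uint16", "float32"
    hbytes = json.dumps(header).encode("utf-8")
    return struct.pack("<I", len(hbytes)) + hbytes + body
```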
#### C++ native publish example
A minimal `zenoh-cpp` snippet that publishes frames with the correct header:
`DataBus.subscribe()` automatically decodes this payload; no adapter code is needed.
### Option B: Filesystem convention (fallback)
The filesystem convention is the fallback for environments where `eclipse-zenoh` cannot be installed. For most drivers, use `cw.data.publish()` (the Zenoh data bus) instead: it provides zero-copy shared memory and is consumed directly by worker hooks. Both conventions use the same channel names.

`CYBERWAVE_EDGE_CONFIG_DIR` is always set by Edge Core (it defaults to `/app/.cyberwave`).
#### Ring buffer (for stream data)
- Write `.npy` files to numbered slots: `{slot:06d}.npy`
- Slot index = `write_count % buffer_size` (default: 120)
- Atomic writes: write to `{slot}.npy.tmp`, then `rename()` to `{slot}.npy`
- Update `meta.json` after each write
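The steps above can be sketched with the standard library only; `array_bytes` stands in for a serialized `.npy` buffer (e.g. from `numpy.save`):

```python
import json
import os
import time

def write_slot(buffer_dir, write_count, array_bytes, buffer_size=120):
    """Atomically write one ring-buffer slot and refresh meta.json."""
    slot = write_count % buffer_size
    path = os.path.join(buffer_dir, f"{slot:06d}.npy")
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        f.write(array_bytes)
    os.rename(tmp, path)  # atomic within a single filesystem
    meta = {"write_count": write_count + 1,
            "buffer_size": buffer_size,
            "timestamp": time.time()}
    meta_tmp = os.path.join(buffer_dir, "meta.json.tmp")
    with open(meta_tmp, "w", encoding="utf-8") as f:
        json.dump(meta, f)
    os.rename(meta_tmp, os.path.join(buffer_dir, "meta.json"))
    return path
```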
#### Latest value (for state data)
- Write a single JSON file: `latest.json`
- Atomic writes: write to `latest.json.tmp`, then `rename()`
- Include a `timestamp` field

Workers read the same `.npy` files and JSON from the same paths.
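A matching sketch for the latest-value convention:

```python
import json
import os
import time

def write_latest(channel_dir, payload):
    """Atomically replace latest.json, stamping the write time."""
    payload = {**payload, "timestamp": time.time()}
    path = os.path.join(channel_dir, "latest.json")
    tmp = path + ".tmp"
    with open(tmp, "w", encoding="utf-8") as f:
        json.dump(payload, f)
    os.rename(tmp, path)  # readers never observe a partially written file
    return path
```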
## MQTT topics and payloads
If you publish data over MQTT directly (rather than through the SDK's `cw.data.publish`), see the MQTT API Reference for the complete list of topics and payload schemas supported by the platform. That page covers:
- Twin transform: position, rotation, scale
- Joint state updates (single-joint, flat multi-joint, and aggregated formats)
- Navigation commands and status reporting
- Locomotion commands (`move_forward`, `turn_left`, etc.)
- Telemetry lifecycle events (`connected`, `telemetry_start`, `telemetry_end`)
- Sensor data: depth frames, point clouds, metrics
- Edge health reporting
- WebRTC signalling
- Health check ping/pong
## Migrating from MQTT-only drivers
If your driver currently publishes sensor data over MQTT, you can add Zenoh publishing without removing the MQTT path. The two paths serve different consumers:

- MQTT → cloud backend (telemetry, frontend, workflows)
- Zenoh → local worker containers (zero-copy inference, fusion)
### Step 1: Set `CYBERWAVE_DATA_BACKEND`
Ensure `CYBERWAVE_DATA_BACKEND=zenoh` is set in the driver container. Edge Core sets this automatically for managed drivers. For manual testing:
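For example, in a local shell session (the value mirrors the variable described above):

```shell
export CYBERWAVE_DATA_BACKEND=zenoh   # or "filesystem" on constrained hosts
```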
### Step 2: Add `cw.data.publish` alongside the MQTT call
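One way to keep both paths in step is a small wrapper around your existing publish calls (a sketch; `mqtt_publish` and `bus_publish` are placeholders for your MQTT client's publish method and the SDK's `cw.data.publish`):

```python
def publish_dual(mqtt_publish, bus_publish, channel, payload):
    """Send one sample down both paths: MQTT for the cloud backend,
    the edge data bus for local worker containers."""
    mqtt_publish(channel, payload)  # existing cloud path, unchanged
    bus_publish(channel, payload)   # new local path (zero-copy consumers)
```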
### Step 3: Verify with a subscriber
### Controlling which paths are active
Set `CYBERWAVE_PUBLISH_MODE` to choose:
| Value | Effect |
|---|---|
| `dual` | Both MQTT and Zenoh publish (default) |
| `zenoh_only` | Only Zenoh (local-only drivers) |
| `mqtt_only` | Only MQTT (legacy mode) |
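The table above can be expressed as a small helper (a sketch, standard library only):

```python
import os

def active_paths():
    """Which publish paths are enabled under CYBERWAVE_PUBLISH_MODE."""
    mode = os.environ.get("CYBERWAVE_PUBLISH_MODE", "dual")  # default: dual
    return {
        "mqtt": mode in ("dual", "mqtt_only"),
        "zenoh": mode in ("dual", "zenoh_only"),
    }
```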
## Licensing your driver
You own your driver code. There are two common paths:

- Open source: publish your driver as a public repository on GitHub under the Apache 2.0 license. This is our recommended default and makes it easier for the community to contribute to and reuse your work.
- Closed source: keep your driver proprietary. In this case, we recommend obfuscating your code before distributing the image and including a clear license file that reflects your distribution terms. Interested in writing a closed-source driver? Reach out to us.
## Example drivers
The following open-source drivers are good starting points and reference implementations:

- Camera driver: `cyberwave-os/cyberwave-edge-camera-driver` integrates a USB/RTSP camera as a Cyberwave twin.
- SO-101 arm driver: `cyberwave-os/cyberwave-edge-so101` integrates the SO-101 robotic arm.
## Advanced topics
Once you have a working driver, these guides cover the platform features your driver can leverage:

- **Edge Workers**: hook-based worker modules for on-device ML inference and event-driven processing.
- **Data Wire Format**: SDK header encoding, key expressions, and the on-wire contract for edge data channels.
- **Data Fusion Primitives**: time-aware sensor fusion with interpolated point reads and time-window queries.
- **Synchronized Multi-Channel Hooks**: an approximate time synchronizer that fires when samples from all listed channels arrive within tolerance.
- **Record & Replay**: capture live edge data to disk and replay it for deterministic debugging.
- **MQTT API Reference**: the complete list of MQTT topics and payload schemas, covering telemetry, commands, navigation, joint states, and more.