
What is Live Teleoperation?

Live Teleoperation lets you control a physical robot in real time through its digital twin in the Cyberwave dashboard. Operator inputs — keyboard, gamepad, or SDK commands — are sent to the robot via Cyberwave’s low-latency MQTT and WebRTC infrastructure, while live video and telemetry stream back to the browser.
Teleoperation requires a physical robot connected through Cyberwave Edge. The Edge Core bridges your hardware to the cloud in real time.

How It Works

1. Operator Input

Send commands from the dashboard UI, a keyboard/gamepad controller, or the Python SDK using cw.affect("live").

2. Cloud Relay

Cyberwave routes commands through MQTT to the Edge Core running on the robot’s host machine.

3. Robot Execution

The Edge Core translates commands into hardware-level actions and streams sensor data back.
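The operator-input side of the steps above can be sketched in Python. This is a minimal, hypothetical sketch: the `key_to_joint_delta` helper and the key bindings are illustrative choices, not part of the Cyberwave SDK; the SDK calls that would actually dispatch the command are shown in a comment.

```python
# Hypothetical key-binding table for step 1 (operator input).
# Maps a pressed key to a (joint_name, delta_degrees) pair; these
# bindings are illustrative, not a Cyberwave default.
KEY_BINDINGS = {
    "a": ("shoulder_joint", +5),
    "d": ("shoulder_joint", -5),
    "w": ("elbow_joint", +5),
    "s": ("elbow_joint", -5),
}

def key_to_joint_delta(key: str):
    """Return the (joint, delta) bound to a key, or None if unbound."""
    return KEY_BINDINGS.get(key.lower())

# In a live session the command would then be relayed over MQTT by the
# SDK (steps 2 and 3), e.g.:
#   cw.affect("live")
#   joint, delta = key_to_joint_delta("a")
#   robot.joints.set(joint, current_angle + delta, degrees=True)
```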

Supported Control Methods

  • Keyboard controller: assign a keyboard controller to a twin in the dashboard, then move joints or navigate with key bindings
  • Gamepad: use a USB or Bluetooth gamepad for more intuitive control of mobile robots and arms
  • Leader arm: teleoperate a follower arm using a physical leader arm (e.g. an SO101 leader/follower setup)
  • Python SDK: send commands programmatically with cw.affect("live"); see the SDK docs
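For gamepad control, raw stick axes are noisy around center, so teleoperation code commonly applies a deadzone and rescales the remaining range before issuing velocity commands. A minimal sketch of that pattern; the function name, deadzone threshold, and velocity scale are illustrative assumptions, not Cyberwave API:

```python
def axis_to_velocity(axis: float, deadzone: float = 0.1, max_vel: float = 1.0) -> float:
    """Map a gamepad axis reading in [-1, 1] to a velocity command.

    Readings inside the deadzone return 0.0 so a resting stick does not
    drift the robot; the rest of the range is rescaled so the output
    still spans the full [-max_vel, max_vel].
    """
    if abs(axis) < deadzone:
        return 0.0
    sign = 1.0 if axis > 0 else -1.0
    scaled = (abs(axis) - deadzone) / (1.0 - deadzone)
    return sign * min(scaled, 1.0) * max_vel
```

The resulting velocity would then be forwarded to the robot through whatever live-mode command the twin exposes.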

Switching Between Simulation and Live

The same environment supports both modes. In Simulation mode, your actions affect the digital twin only. Switch to Live mode to drive the physical robot. From the dashboard: use the mode toggle in the environment header. From the SDK:
from cyberwave import Cyberwave

cw = Cyberwave()

cw.affect("simulation")
robot = cw.twin("the-robot-studio/so101")
robot.joints.set("shoulder_joint", 45, degrees=True)  # moves the digital twin

cw.affect("live")
robot.joints.set("shoulder_joint", 45, degrees=True)  # moves the physical robot

Live Video Streaming

During teleoperation, live camera feeds from the robot are streamed via WebRTC directly into the dashboard. You can view one or multiple camera feeds alongside the 3D digital twin view. Supported camera types:
  • Standard cameras (USB/CSI via OpenCV)
  • Intel RealSense (RGB + depth)
See the Python SDK video streaming docs for details on setting up camera streams from edge devices.

Prerequisites

1. Digital Twin configured: create a digital twin for your robot in an environment.
2. Edge Core installed: install and configure Cyberwave Edge on the machine connected to the robot hardware.
3. Hardware paired: pair the physical robot with the digital twin via the CLI; see the Quick Start.

Latency

A small delay between operator input and robot response is expected in any teleoperation system. See Live Teleoperation Latency for details on what to expect and why.
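You can get a rough feel for this delay by timing a command round trip yourself. A minimal sketch, assuming a synchronous `send_command` callable that blocks until the command is acknowledged; in practice it would wrap an SDK call such as `robot.joints.set`, but here it is a stand-in, not a Cyberwave API:

```python
import time

def measure_rtt_ms(send_command, trials: int = 10) -> float:
    """Average round-trip time in milliseconds over several trials.

    `send_command` is any callable that returns once the command is
    acknowledged; it stands in for a real SDK call in this sketch.
    """
    total = 0.0
    for _ in range(trials):
        start = time.monotonic()
        send_command()
        total += time.monotonic() - start
    return (total / trials) * 1000.0
```

Using a monotonic clock avoids skew from wall-clock adjustments during the measurement.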

Video Walkthroughs

Go2 Digital to Physical

End-to-end: catalog to physical robot deployment with a Unitree Go2

Rover AI Inspection

Autonomous rover mission with AI-driven analysis in simulation

Next Steps

Python SDK

Full SDK reference for live robot control

Digital Twins

Configure twin capabilities and sensors

Workflows

Automate operations with visual workflows