What are SO101 Robot Arms?
SO101 is an open-source, 6-degree-of-freedom (6-DOF) robotic arm set designed for desk-based use. It is commonly built from 3D-printed parts and off-the-shelf servos, making it low-cost and highly customizable. These robot arms expose developers to real robotic hardware without the cost or complexity of industrial systems. The SO101 arm set is often deployed in a dual-arm (leader–follower) configuration, but this setup is optional: users can also operate a single follower arm independently.
Physical Components
- 6-DOF Articulated Arm: A compact 6-DOF robotic manipulator designed for close-range, desk-based operations.
- Servo-Driven Joints: Uses position-controlled servo motors to enable real-time joint movement.
- Lightweight, Open-Source Hardware: Built from lightweight, open-source hardware components that are easy to modify and extend.
- Simple End-Effector (Gripper): Includes a gripper suitable for basic manipulation tasks.
- Leader-Follower Physical Setup (Optional): Supports a dual-arm configuration useful for teleoperation and imitation learning:
- Leader arm (manually actuated): Joint positions are sampled.
- Follower arm (actively controlled): Mirrors the leader’s joint trajectories in real time.
- USB Control from a Computer / SBC: The SO101 can be directly controlled from a laptop or single-board computer (SBC, such as a Raspberry Pi) over USB or serial communication. This allows developers to control the robot without industrial controllers or specialized hardware.
Set up the SO101
Before configuring the software, you need to physically connect your SO101 hardware components.
Connect the Hardware Components
Follow this sequence to set up your physical hardware:
Step 1: Power the Robot Arms
Connect both the leader and follower arms to their power supplies:
- Locate the power input on each arm’s controller board.
- Connect the appropriate power supply to each arm.
- Verify voltage: ensure the voltage matches your motor specifications:
  - Common configurations: 6V or 12V (depends on your motors).
  - Check your SO101 build documentation for the correct voltage.
Step 2: Connect Arms to Computer
Each arm needs a USB connection to communicate with your computer:
- Leader arm:
  - Plug one end of a USB-C cable into the leader arm’s controller.
  - Plug the other end into your computer (laptop, Raspberry Pi, or SBC).
- Follower arm:
  - Plug one end of a USB-C cable into the follower arm’s controller.
  - Plug the other end into your computer.
Each arm appears as a separate serial device. You’ll identify their specific ports in the software setup steps.
Step 3: Connect the Camera
If you’re using an external camera for dataset recording:
- Connect your USB camera or IP camera to the computer.
- Verify the camera is detected by your system.
Use Cyberwave with SO101
The SO101 robot arm set provides a low-cost and efficient way to get started with robotic manipulation. Using Cyberwave with an SO101 arm set enables the following capabilities:
- Quick onboarding: Onboard an SO101 arm from the Cyberwave catalog, automatically create its digital twin, and begin interacting with it in just a few clicks, with no manual hardware configuration required.
- Teleoperation: Teleoperate the SO101 using Cyberwave’s SDK, enabling real-time control of the follower arm through a physical leader arm with joint-level mirroring.
- Remote operation: Operate the SO101 without a leader arm by sending control commands directly from Cyberwave via the browser, SDK, or APIs.
- Controller policies: Assign external controller policies such as keyboard input, scripted controllers, or vision-language-action (VLA) models using a standardized control interface.
- Create and export datasets: Record SO101 operations, including video feeds and control actions, and automatically structure them into episodic datasets for training and evaluation.
- Train and deploy models: Train machine learning models using collected datasets and deploy them directly as controller policies within Cyberwave.
- Simulation and real-world execution: Test trained models in a browser-based 3D simulated environment using the SO101 digital twin, then deploy the same models to the physical robot without changing the logic.
Get Started with SO101
Goals
This guide helps you:
- Set up an SO101 arm and an external camera in a real environment and replicate the same setup in Cyberwave.
- Configure teleoperation and remote operation to control the follower arm using a leader arm and Cyberwave data.
- Create datasets for specific tasks and use them to train and deploy ML models.
- Use deployed ML models as controller policies to control the follower arm directly from Cyberwave.
Prerequisites
Before you begin this quick start guide, ensure you have the following hardware, software, and credentials:
- SO101 robot arm set (leader and follower arms) (Contact us if you want access to this hardware)
- External camera (USB or IP camera) to record video feeds for datasets
- Computer or single-board computer (SBC, e.g., Raspberry Pi)
- USB or serial connection to the SO101 devices
Set Up Teleoperation
Step 1: Install the SDK
We use the cyberwave-edge-python-so101 SDK to handle teleoperation and remote control of the SO101.
Open your terminal and install the SDK.
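A typical installation, assuming the SDK is published as a Python package named cyberwave-edge-python-so101 (confirm the exact name and any extra steps against the official docs), might look like:

```shell
# Assumed package name; verify against the official installation instructions
python3 -m venv .venv
source .venv/bin/activate
pip install cyberwave-edge-python-so101
```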
Step 2: Configure Environment Credentials
Before connecting to Cyberwave, you need to configure your environment with the necessary authentication credentials. This step secures your connection and identifies your workspace. In your local terminal, navigate to the SDK directory (if not already there) and create your environment configuration file:
- Navigate to the Cyberwave Dashboard.
- Go to Settings → API Keys.
- Generate a new API Token.
- Copy the generated token.
- Add your token to the .env file:
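The contents of the .env file depend on the variable names the SDK expects; the key below is a placeholder, not a confirmed name:

```shell
# .env (hypothetical variable name; check the SDK's sample .env)
CYBERWAVE_API_TOKEN=paste-your-generated-token-here
```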

Step 3: Set Up the Cyberwave Environment
Now that your local environment is configured, you need to create a corresponding digital environment in Cyberwave that mirrors your physical setup.- Log in to Cyberwave.
- Create a new Project and Environment.
- In your environment, click Add Scene Object to create a new twin.
- Browse the Catalog and select SO101.
- Click Add to Environment.
- Click Add Scene Object again.
- Browse the Catalog and select Standard Camera.
- Click Add to Environment.
Your Cyberwave environment now replicates your real-world physical setup with both digital twins configured and ready to connect.
Step 4: Find Device Ports
When you connect your SO101 robot arm(s) to your computer via USB, each device appears as a serial port on your system. You need to identify the correct port name to communicate with each arm. Understanding Serial Ports: Your computer may have multiple USB/serial devices connected at any time. Each device appears as a port with a name like:
- macOS/Linux: /dev/tty.usbmodem123 or /dev/ttyUSB0
- Windows: COM3 or COM4
Every SO101 arm appears as a separate serial device. If you’re using both a leader and follower arm, each will have its own unique port that you must identify separately.
Run the SDK’s port-detection tool:
- The tool scans for available serial ports
- It may prompt you to plug or unplug the SO101 arm
- When the device is detected, it confirms which port appeared or disappeared
- The tool displays the detected port name
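The detection logic behind those steps can be sketched in a few lines. The helper below is illustrative only (it is not part of the SDK); it shows how comparing a port scan taken before and after you plug in an arm isolates that arm's port:

```python
def detect_changed_port(ports_before, ports_after):
    """Return the single port that appeared (or disappeared) between two scans."""
    appeared = set(ports_after) - set(ports_before)
    disappeared = set(ports_before) - set(ports_after)
    changed = appeared or disappeared  # plug event or unplug event
    if len(changed) != 1:
        raise RuntimeError("Expected exactly one port to change; re-run the scan.")
    return changed.pop()

# Example: the leader arm was plugged in between the two scans.
before = ["/dev/tty.Bluetooth", "/dev/ttyUSB0"]
after = ["/dev/tty.Bluetooth", "/dev/ttyUSB0", "/dev/tty.usbmodem123"]
print(detect_changed_port(before, after))  # → /dev/tty.usbmodem123
```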
Step 5: Verify Device Connection
Now that you’ve identified the serial ports for your SO101 arm(s), it’s time to verify that your computer can successfully communicate with the devices. This step ensures the hardware connection is working properly before proceeding to teleoperation. Test the Connection: The SDK includes a diagnostic tool that reads live data from your SO101 arm. It queries the device and displays its current state, confirming that communication is working correctly. Run it against the port you identified in Step 4 (e.g. /dev/tty.usbmodem123); the readout includes:
- Joint angles for all 6 degrees of freedom
- Device status (e.g., connection state, errors)
- Sensor values (e.g., position feedback)
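If you want to sanity-check a readout programmatically, a small parser helps. The output format and joint names below are assumptions for illustration only; your SDK version defines the real layout:

```python
def parse_joint_dump(text):
    """Parse 'name: value' lines into {joint_name: angle} (format is hypothetical)."""
    joints = {}
    for line in text.strip().splitlines():
        name, _, value = line.partition(":")
        joints[name.strip()] = float(value)
    return joints

# Example dump with placeholder joint names for the 6 degrees of freedom.
sample = """
shoulder_pan: 1.5
shoulder_lift: -12.0
elbow_flex: 45.2
wrist_flex: 0.0
wrist_roll: 90.0
gripper: 10.3
"""
state = parse_joint_dump(sample)
assert len(state) == 6  # all six degrees of freedom reported
```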
Step 6: Calibrate the Devices
Calibration is a required step before using an SO101 arm for teleoperation or control. It ensures that the software correctly understands the physical state of the robot and can accurately map commands to hardware movements. Calibration defines:- The zero (reference) position of each joint
- The valid movement range for each joint
- The mapping between the physical arm and the software model
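To make that mapping concrete, here is an illustrative conversion from raw servo ticks to joint degrees. The function and the 4096-ticks-per-revolution figure are assumptions for illustration; the SDK stores the real calibration data under the IDs you choose (for example, leader1 and follower1):

```python
def ticks_to_degrees(raw, zero_tick, ticks_per_rev=4096):
    """Convert a raw servo position to degrees relative to the calibrated zero."""
    return (raw - zero_tick) * 360.0 / ticks_per_rev

# A servo sitting at its calibrated zero reads 0 degrees...
assert ticks_to_degrees(2048, zero_tick=2048) == 0.0
# ...and a quarter turn of a 4096-tick servo reads 90 degrees.
assert ticks_to_degrees(3072, zero_tick=2048) == 90.0
```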
For the leader arm: replace /dev/tty.usbmodem123 with your actual leader arm port from Step 4 and run the SDK’s leader calibration command. The command:
- Registers the device as a leader arm
- Stores its calibration under the ID leader1
- Prepares the arm to be moved manually by a human
For the follower arm: replace /dev/tty.usbmodem456 with your actual follower arm port from Step 4 and run the follower calibration command. The command:
- Registers the device as a follower arm
- Stores its calibration under the ID follower1
- Prepares the arm to receive and execute control commands
To verify the calibration, you may be asked to:
- Move one or more joints to specific positions
- Hold the arm steady for a short period
- Confirm that joints are aligned correctly
Step 7: Set Up Teleoperation
Teleoperation enables you to control the follower arm using the leader arm in real time. The leader arm captures your manual movements, and the follower arm mirrors those movements instantly, creating an intuitive way to control the robot. How Teleoperation Works: The teleoperation system creates a synchronized connection between:
- Physical leader arm → captures human-guided joint movements
- Physical follower arm → executes the movements in real-time
- Digital twin → receives telemetry data from both arms for monitoring and recording
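The loop the teleoperation command runs can be sketched as follows. The read/write helpers here are hypothetical stand-ins, simulated so the sketch runs without hardware; the real SDK command wires up the actual devices for you:

```python
def teleop_step(read_leader, write_follower, log):
    """One cycle: sample the leader's joints and mirror them to the follower."""
    joints = read_leader()      # e.g. {"elbow_flex": 45.2, ...}
    write_follower(joints)      # follower mirrors the leader in real time
    log.append(joints)          # telemetry for the digital twin / recording

# Simulated devices: three leader joint-angle frames played back in order.
leader_frames = [{"elbow_flex": a} for a in (0.0, 10.0, 20.0)]
follower_state, telemetry = [], []
for frame in leader_frames:
    teleop_step(lambda f=frame: f, follower_state.append, telemetry)

assert follower_state == leader_frames  # follower mirrored every frame
```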
Replace the following placeholders:
- YOUR_SO101_TWIN_UUID: The Twin UUID you copied in Step 3 for the SO101 robot
- YOUR_CAMERA_TWIN_UUID: The Twin UUID you copied in Step 3 for the Standard Camera
- /dev/tty.usbmodem123: Your actual leader arm port from Step 4
- /dev/tty.usbmodem456: Your actual follower arm port from Step 4
| Parameter | Description |
|---|---|
| --twin-uuid | Digital twin ID for the SO101 robot arm in Cyberwave |
| --leader-port | Serial port for the leader arm (input device) |
| --follower-port | Serial port for the follower arm (output device) |
| --camera-uuid | Digital twin ID for the camera to stream visual data |
| --fps | Frames per second for telemetry updates (default: 30) |
The --fps parameter controls how frequently joint data is sent to Cyberwave. Higher values provide smoother visualization but require more bandwidth; 30 FPS is recommended for most use cases.
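To put rough numbers on that trade-off, here is a back-of-the-envelope sketch; the 500-byte frame size is an assumption for illustration, not a measured value:

```python
def frame_interval(fps):
    """Seconds between telemetry frames at a given FPS."""
    return 1.0 / fps

def bytes_per_second(fps, frame_bytes):
    """Approximate telemetry bandwidth for a fixed frame size."""
    return fps * frame_bytes

# At --fps 30 a frame goes out roughly every 33 ms...
assert round(frame_interval(30), 4) == 0.0333
# ...and hypothetical ~500-byte JSON frames would cost about 15 kB/s.
assert bytes_per_second(30, 500) == 15000
```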
To test teleoperation:
- Gently move one joint on the leader arm.
- Observe the corresponding joint moving on the follower arm.
- Check the digital twin in the Cyberwave dashboard; it should mirror the movements.
Create and Export Datasets
Once teleoperation is set up and working, you can create datasets by recording episodes of the robot performing specific tasks. These datasets can later be used to train machine learning models for autonomous operation. A dataset consists of multiple episodes, each an individual recording of the robot completing a task. Each episode captures:
- Joint positions and movements over time
- Camera video feed showing the task execution
- Timing and sequence data
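One way to picture an episode's synchronized streams is the sketch below. This is an illustrative model only, not Cyberwave's actual export schema:

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    """Illustrative container for one task demonstration."""
    task: str
    joint_frames: list = field(default_factory=list)   # joint positions over time
    video_frames: list = field(default_factory=list)   # camera frames
    timestamps: list = field(default_factory=list)     # timing / sequence data

    def add_frame(self, t, joints, image):
        # Append all three streams together so they stay synchronized.
        self.timestamps.append(t)
        self.joint_frames.append(joints)
        self.video_frames.append(image)

ep = Episode(task="pick and place")
ep.add_frame(0.0, {"gripper": 0.0}, b"\x00")
ep.add_frame(1.0 / 30, {"gripper": 10.0}, b"\x01")
assert len(ep.timestamps) == len(ep.joint_frames) == len(ep.video_frames) == 2
```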
Step 1: Record Episodes
Recording episodes captures the manual operations performed through teleoperation. Each recording can contain multiple task demonstrations that you’ll later trim into episodes. Start Recording in Live Mode:
- Navigate to your Cyberwave environment in the dashboard
- Switch to Live Mode in the environment viewer
- Turn on the camera:
- Locate the camera icon in the upper-right corner
- Click the Turn On icon to activate the camera feed
- Click Start Recording to begin capturing data

Make sure teleoperation is running (from Step 7) before you start recording. The recording captures both the arm movements and camera feed simultaneously.
While recording, for each demonstration:
- Position the robot at the starting configuration
- Execute the task smoothly using the leader arm
- Complete the task fully (e.g., pick up object → move → place in box)
- Repeat the same task multiple times to create variety in the dataset
Example: Pick and Place Task
Goal: Train the SO101 to pick up an object and drop it inside a box.
Recording process:
- Start with the gripper open near the object
- Move the leader arm to position the follower over the object
- Close the gripper to pick up the object
- Move to the box location
- Open the gripper to release the object
- Return to starting position
- Repeat 10-15 times with slight variations
- Click Stop Recording in the Cyberwave interface
- The recording will be saved and ready for processing
Step 2: Export Dataset
After recording, you’ll trim the raw recording into discrete episodes and export them as a structured dataset. Create Episodes from Recording:
- Open the recorded session in your Cyberwave environment.
- Review the timeline: You’ll see the full recording with video and telemetry data.
- Trim episodes:
- Identify the start and end of each successful task demonstration
- Use the trim tool to isolate each episode
- Remove any failed attempts, pauses, or unwanted sections
- Label episodes (optional): add descriptive names for organization.
Each episode should contain one complete task execution from start to finish. Keep episodes focused and remove any unnecessary setup or reset time between demonstrations.
- Review all episodes to ensure quality.
- Select the episodes you want to include in the final dataset.
- Check the box next to each desired episode
- Deselect any that have errors or poor quality
- Click Create Dataset.

- Navigate to the Manage Datasets tab in Cyberwave
- View all your created datasets.
- Access dataset details:
- Number of episodes
- Duration
- Download datasets for local training or use them directly in Cyberwave for model training.

- Go to File → Export → Export Datasets.
- Select the specific dataset you want to export.
- Click on Export.

Dataset Created: Your dataset is now ready for training machine learning models. Each episode contains synchronized robot movements and camera footage that can teach autonomous behaviors.
Train and Deploy an ML Model
With your dataset created, you can now train a machine learning model to autonomously replicate the behaviors you demonstrated. Once trained, the model can be deployed as a controller policy that directly controls the SO101 robot.
Step 1: Train a Model
Training transforms your recorded demonstrations into a model that can predict and execute similar actions autonomously. Configure training parameters:
- Workspace: Select your workspace from the dropdown.
- ML Model: Choose the appropriate ML model.
- Dataset: Select the dataset you created earlier.
- Advanced Settings:
  - Data Augmentation: use the slider to select the augmentation level:
    - 0: No augmentation
    - 1: Low augmentation (recommended for most cases)
    - 2: Medium augmentation (for more robust generalization)
Data augmentation adds variations to your training data (like slight position changes or lighting differences) to help the model generalize better to new situations.
Training Stop Policy: Choose one of two stopping strategies:
- Save best model until iterations (recommended for beginners)
- Set the number of iterations (max: 5000)
- Training continues until reaching the specified iterations
- The best-performing model checkpoint is saved
- Stop when validation loss is under threshold (for faster training)
- Set the validation loss threshold (default: 0.01)
- Set max iterations (max: 5000)
- Training stops early when validation loss reaches the threshold
- May be faster since training stops when a valid model is found
- Click Start Training to begin.
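The two stopping strategies can be sketched as a single loop over a synthetic validation-loss curve. This is illustrative only, not Cyberwave's training code:

```python
def train(losses, max_iters=5000, loss_threshold=None):
    """Return (iterations_run, best_loss).

    With loss_threshold=None, runs to max_iters and keeps the best checkpoint
    ("save best model until iterations"). With a threshold set, stops early
    once validation loss drops under it ("stop when validation loss is under
    threshold").
    """
    best = float("inf")
    for i, loss in enumerate(losses[:max_iters], start=1):
        best = min(best, loss)  # "save best model" checkpointing
        if loss_threshold is not None and loss < loss_threshold:
            return i, best      # early stop: a valid model was found
    return min(len(losses), max_iters), best

curve = [0.5, 0.1, 0.02, 0.009, 0.008]
# Policy 1: run all iterations, keep the best checkpoint.
assert train(curve) == (5, 0.008)
# Policy 2: stop early once validation loss < 0.01 (the default threshold).
assert train(curve, loss_threshold=0.01) == (4, 0.009)
```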
Step 2: Deploy a Model
Once training completes successfully, deploy the model to make it available as a controller policy. Create a Deployment:
- Navigate to AI → Deployments.
- Click Start New Deployment.
- Select your trained model from the list of completed trainings.
- Select the target twins to deploy the model to.
- Click Deploy.
Model Deployed: Your trained model is now available as a controller policy and ready to control the robot autonomously.
Step 3: Set Up Remote Operation
Remote operation allows you to control the SO101 follower arm directly from Cyberwave without using a physical leader arm. This is useful for testing, calibration, or when you only have a single follower arm. How Remote Operation Works: Unlike teleoperation (which uses a leader arm to control the follower), remote operation:
- Connects only the follower arm to Cyberwave
- Allows control through the Cyberwave dashboard in your browser
- Enables manual joint control or scripted movements
- Streams real-time feedback to the digital twin
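Conceptually, each remote command is a small structured message addressed to the follower's digital twin. The payload shape below is an assumption for illustration; the SDK and APIs define the real schema:

```python
import json

def joint_command(twin_uuid, joints):
    """Build a JSON joint command for a digital twin (hypothetical schema)."""
    return json.dumps({
        "twin_uuid": twin_uuid,   # which twin (and thus which follower) to drive
        "action": "set_joints",   # illustrative action name
        "joints": joints,         # target joint angles
    })

msg = joint_command("YOUR_SO101_TWIN_UUID", {"elbow_flex": 30.0})
decoded = json.loads(msg)
assert decoded["action"] == "set_joints"
assert decoded["joints"]["elbow_flex"] == 30.0
```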
Replace the following placeholders:
- /dev/tty.usbmodem456: Your follower arm port from Step 4
- YOUR_SO101_TWIN_UUID: The SO101 Twin UUID from Step 3
- YOUR_CYBERWAVE_TOKEN: Your API token from Step 2 (or use the environment variable)
Step 4: Use the Model as a Controller Policy
Now use your trained model to autonomously control the physical SO101 robot. Assign the Controller Policy:
- In your environment, switch to Edit Mode.
- Click Assign Controller Policy from the right side view.
- Select your deployed model from the dropdown.
- Click Save Configuration.
- The model now appears as a controller policy in the right side view.
- Switch to Live View.
- You’ll see an option to enter a prompt.
- Type your instruction (e.g., “Pick up the object and place it in the box”).
- The model executes the resulting actions on the SO101 in your real environment setup.
Autonomous Control Active: Your SO101 is now controlled by AI using natural language prompts!