2025 · Archived

Yahboom X3 Plus

Exploration and development with the Yahboom ROS robot platform. Development hit a brick wall and has been halted.

Robotics · ROS · Hardware

The Initial Premise

We initially selected the Yahboom ROSMASTER X3 Plus for its appeal as a comprehensive, out-of-the-box hardware suite. Equipped with a 2D LiDAR, depth cameras, a robotic arm, omnidirectional mecanum wheels, and an onboard Jetson, it was slated to be our primary rig for real-world data collection.

LeRobot Integration Strategy

Our core objective was to integrate this platform with the LeRobot ecosystem to train Vision-Language-Action (VLA) models. To achieve this, we needed a seamless teleoperation pipeline capable of synchronously recording camera feeds and joint states. Had it succeeded, the X3 Plus would have served as a highly cost-effective solution for aggregating imitation learning datasets.
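The core of that pipeline is aligning two streams that tick at different rates: camera frames (e.g. ~30 Hz) and joint-state telemetry (e.g. ~100 Hz). Below is a minimal sketch of the kind of timestamp-based pairing such a recorder needs; it is illustrative only, not LeRobot's actual API, and the names (`Sample`, `align_streams`) and rates are assumptions.

```python
import bisect
from dataclasses import dataclass

@dataclass
class Sample:
    t: float          # timestamp in seconds
    payload: object   # camera frame or joint-state vector

def align_streams(frames, states, tol=0.02):
    """Pair each camera frame with the nearest joint-state reading.

    frames, states: lists of Sample, sorted by timestamp.
    tol: maximum allowed time skew in seconds; frames with no
         state reading within tol are dropped rather than mislabeled.
    Returns a list of (frame, state) pairs.
    """
    state_ts = [s.t for s in states]
    pairs = []
    for f in frames:
        i = bisect.bisect_left(state_ts, f.t)
        best = None
        # candidates: the state just before and just after the frame time
        for j in (i - 1, i):
            if 0 <= j < len(states):
                if best is None or abs(states[j].t - f.t) < abs(best.t - f.t):
                    best = states[j]
        if best is not None and abs(best.t - f.t) <= tol:
            pairs.append((f, best))
    return pairs

# toy demo: 30 Hz camera against 100 Hz joint telemetry
frames = [Sample(k / 30.0, f"img{k}") for k in range(3)]
states = [Sample(k / 100.0, [0.0, k * 0.01]) for k in range(12)]
episode = align_streams(frames, states)
```

The tolerance check matters in practice: silently pairing a frame with a stale joint reading corrupts the action labels far more subtly than dropping the frame outright.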

Why We Killed the Project

Ultimately, the hardware could not deliver the precision our research demanded. We encountered three critical bottlenecks that forced a complete pivot away from the Yahboom platform:

  • Erratic Motor Control: The native actuation was unacceptably jerky. The hardware fundamentally lacked the smooth, continuous control profiles required for fine manipulation and precise physical interactions.
  • Noisy Telemetry: Feedback from the joint encoders and motor controllers was heavily degraded, leaving the sensorimotor data too noisy and inaccurate to be viable for model training.
  • Teleoperation Bottlenecks: Attempting to simultaneously pilot a mecanum-drive chassis and a multi-DOF arm using a standard keyboard was highly impractical. The control interface could not capture the nuanced, high-fidelity human movements necessary for generating quality imitation learning data.
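The keyboard bottleneck is easy to make concrete: each key maps to a fixed velocity step, so the commanded base twist is a bang-bang signal that jumps between a few discrete values, never the smooth, graded trajectories imitation learning needs. The sketch below uses hypothetical key bindings and step sizes for illustration.

```python
# Hypothetical bindings: each key contributes a fixed velocity step
# (vx m/s, vy m/s for mecanum strafing, wz rad/s for yaw).
KEY_TO_TWIST = {
    "w": (0.2, 0.0, 0.0),
    "s": (-0.2, 0.0, 0.0),
    "a": (0.0, 0.2, 0.0),
    "d": (0.0, -0.2, 0.0),
    "q": (0.0, 0.0, 0.5),
    "e": (0.0, 0.0, -0.5),
}

def keys_to_twist(pressed):
    """Sum the fixed-step contributions of the currently pressed keys.

    The output is inherently discrete: each axis can only take values
    like -0.2, 0.0, or +0.2 m/s. There is no way to command, say,
    0.05 m/s, so recorded demonstrations are step functions, not the
    smooth velocity profiles fine manipulation requires.
    """
    vx = vy = wz = 0.0
    for k in pressed:
        dx, dy, dw = KEY_TO_TWIST.get(k, (0.0, 0.0, 0.0))
        vx += dx
        vy += dy
        wz += dw
    return (vx, vy, wz)
```

And this only covers the chassis; driving a multi-DOF arm at the same time would need a second bank of keys, which is exactly where the approach collapsed.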