Grigorij-Dudnik/RoboCrew


Create an LLM agent for your robot. Connect movement tools, VLA policies, and sensor scans in just a few lines of code.

RoboCrew demo

RoboCrew agent cleaning up a table.

🚀 Quick Start

Run on your robot:

pip install robocrew

Start the GUI app with:

robocrew-gui

✨ Features

Schema

  • 🚗 Movement - Pre-built wheel controls for mobile robots
  • 🦾 Manipulation - VLA models as tools for arm control
  • 👁️ Vision - Camera feed with image augmentation for better spatial understanding
  • 🎤 Voice - Wake-word-activated voice commands and TTS responses
  • 🗺️ LiDAR - Top-down mapping with a LiDAR sensor
  • 🧠 Intelligence - Multi-agent control provides autonomy in decision making

🎨 Supported Robots

  • ✅ XLeRobot - Full support for all features
  • 🥝 LeKiwi - Use XLeRobot code (compatible platform)
  • 🚙 Earth Rover mini plus - Full support
  • 🔜 More robot platforms coming soon! Request your platform →

🎯 How It Works

How It Works Diagram

The RoboCrew Intelligence Loop:

  1. 👂 Input - Voice commands, text tasks, or autonomous operation
  2. 🧠 LLM Processing - The LLM analyzes the task and environment...
  3. 🛠️ Tool Selection - ...and chooses the appropriate tools (move, turn, grab an apple, etc.)
  4. 🤖 Robot Actions - Wheels and arms execute the commands
  5. 📹 Visual Feedback - Cameras capture the results with an augmented overlay
  6. 🔄 Repeat - The LLM evaluates the results and adjusts its strategy
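The six steps above can be sketched as a plain sense-think-act loop. This is only an illustration of the control flow, not the RoboCrew implementation: every name below (intelligence_loop, llm_decide, capture_frame, etc.) is hypothetical.

```python
# Hypothetical sketch of the intelligence loop described above.
# None of these names come from the RoboCrew API; they only show the flow.

def intelligence_loop(get_input, llm_decide, tools, capture_frame, max_steps=5):
    """Run input -> LLM -> tool -> action -> feedback until the LLM says 'done'."""
    task = get_input()            # 1. voice command, text task, or autonomous trigger
    frame = None
    for _ in range(max_steps):
        decision = llm_decide(task, frame)   # 2. LLM analyzes the task and environment
        if decision == "done":
            return "done"
        tools[decision]()                    # 3-4. the chosen tool drives the robot
        frame = capture_frame()              # 5. camera feedback feeds the next round
    return "max_steps_reached"               # 6. loop repeats until done or budget spent
```

In the real library the LLM call, tool schema, and camera feed are handled for you by LLMAgent; the sketch just makes the loop's shape explicit.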

📱 Scripts to Use

To gain full control over RoboCrew features, you can create your own script. The simplest example:

from robocrew.core.camera import RobotCamera
from robocrew.core.LLMAgent import LLMAgent
from robocrew.robots.XLeRobot.tools import create_move_forward, create_turn_right, create_turn_left
from robocrew.robots.XLeRobot.servo_controls import ServoControler

# 📷 Set up the main camera
main_camera = RobotCamera("/dev/camera_center")  # your camera's USB port, e.g. /dev/video0

# 🎛️ Set up the servo controller
right_arm_wheel_usb = "/dev/arm_right"  # your right arm's USB port, e.g. /dev/ttyACM1
servo_controler = ServoControler(right_arm_wheel_usb=right_arm_wheel_usb)

# 🛠️ Set up tools
move_forward = create_move_forward(servo_controler)
turn_left = create_turn_left(servo_controler)
turn_right = create_turn_right(servo_controler)

# 🤖 Initialize the agent
agent = LLMAgent(
    model="google_genai:gemini-3-flash-preview",
    tools=[move_forward, turn_left, turn_right],
    main_camera=main_camera,
    servo_controler=servo_controler,
)

# 🎯 Give it a task and go!
agent.task = "Approach a human."
agent.go()
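The create_* factories above each close over the servo controller and return a callable tool. If you want to follow the same pattern for your own hardware, a custom tool might look like the sketch below. This is an assumption about the tool contract, not documented RoboCrew API: create_beep, the beep() method, and the returned string are all invented for illustration.

```python
# Hypothetical custom tool factory, mirroring the create_move_forward(...)
# pattern above. The controller's beep() method is an assumption.

def create_beep(servo_controler):
    def beep():
        """Emit a short beep to get a human's attention."""  # description the LLM can read
        servo_controler.beep()
        return "beeped"                                       # result reported back to the agent
    return beep
```

A tool built this way would then be passed in the same tools=[...] list as the pre-built ones.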

🎤 Enable Listening and Speaking

Use voice to tell the robot what to do.

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/audio/

💻 Code example: examples/2_xlerobot_listening_and_speaking.py


🦾 Add a VLA Policy as a Tool

Let's make our robot manipulate objects with its arms!

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/vla-as-tools/

💻 Code example: examples/3_xlerobot_arm_manipulation.py


🧠 Increase Intelligence with Multiagent Communication

One agent plans the mission, another controls the robot.

📖 Docs: https://grigorij-dudnik.github.io/RoboCrew-docs/guides/examples/multiagent/

💻 Code example: examples/4_xlerobot_multiagent_cooperation.py
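The planner/controller split can be sketched in a few lines: one role decomposes the mission into subtasks, the other executes each subtask with its tools. This is a generic illustration of the pattern, not the RoboCrew multiagent API; plan_mission and run_mission are invented names, and a real planner would be an LLM call rather than a string split.

```python
# Hypothetical sketch of planner/controller cooperation. In RoboCrew the
# planner would be an LLM agent; here a trivial stand-in splits the mission.

def plan_mission(mission):
    """Stand-in planner: break a mission into ordered subtasks."""
    return [step.strip() for step in mission.split(",")]

def run_mission(mission, execute_subtask):
    """Controller side: execute each planned subtask in order."""
    results = []
    for subtask in plan_mission(mission):
        results.append(execute_subtask(subtask))  # the robot-control agent acts here
    return results
```

The value of the split is that the planner never touches hardware and the controller never reasons about the whole mission, which keeps each agent's context small.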


💬 Community & Support

❤️ Special thanks to all contributors and early adopters!
