Connect AI models like Claude & GPT with robots using MCP and ROS.
The ROS MCP Client is a reference implementation of a Model Context Protocol (MCP) client, designed to connect directly with ros-mcp-server.
Instead of relying on a desktop LLM client, it acts as a bridge that integrates an LLM directly, enabling natural-language interaction with any ROS or ROS2 robot.
ros-mcp-client implements the LLM side of the MCP protocol.
It can:
- Connect to a ros-mcp-server over MCP (stdio or HTTP).
- Send natural-language queries or structured requests to the robot, without needing to integrate it with a desktop LLM client.
- Stream back feedback, sensor data, or responses from the server.
- Integrate with a local LLM (Gemini, Ollama, Nvidia NeMo).
In short, it lets you run an MCP-compatible client that speaks to robots via the MCP interface — useful for testing, local reasoning, or autonomous AI controllers.
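For orientation, here is a minimal sketch of that flow using the official `mcp` Python SDK. The server launch command and the `get_topics` tool name are illustrative assumptions, not names guaranteed by this repository.

```python
# Minimal MCP client sketch (assumes the `mcp` Python SDK; the server
# command and the "get_topics" tool name are illustrative placeholders).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch ros-mcp-server as a subprocess and talk to it over stdio.
    server = StdioServerParameters(command="uv", args=["run", "ros-mcp-server"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call one of them; the name and arguments are hypothetical.
            result = await session.call_tool("get_topics", arguments={})
            print(result.content)

asyncio.run(main())
```

The same `ClientSession` works over an HTTP transport as well; only the connection setup changes.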
- Implements the MCP client specification: plug-and-play with the ROS MCP server.
- ROS-aware LLM interface: specialized prompts and handlers for robotics tasks.
- Supports bidirectional streaming: send commands, receive real-time topic feedback.
- LLM integration ready: use Gemini, Anthropic, or Ollama APIs as reasoning engines.
- Offline-capable: works entirely within local or LAN environments.
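To make the reasoning-engine idea concrete, the sketch below shows the kind of loop such a client can run. `ask_llm` is a hypothetical placeholder for whichever model API (Gemini, Anthropic, Ollama) you wire in; it is not a function shipped by this repository.

```python
# Sketch of an LLM-in-the-loop controller. `ask_llm` is a placeholder
# for your model API; this is not the repo's gemini_client.py.
import json

from mcp import ClientSession

async def ask_llm(prompt: str) -> dict:
    """Hypothetical call into Gemini/Anthropic/Ollama.

    Expected to return {"answer": "..."} or {"tool": "<name>", "arguments": {...}}.
    """
    raise NotImplementedError

async def handle_query(session: ClientSession, user_query: str) -> str:
    # Describe the server's tools to the model so it can pick one.
    tools = await session.list_tools()
    tool_specs = [
        {"name": t.name, "description": t.description, "schema": t.inputSchema}
        for t in tools.tools
    ]

    decision = await ask_llm(
        f"Tools available: {json.dumps(tool_specs)}\nUser: {user_query}"
    )

    if "tool" in decision:
        # Execute the robot-side tool and let the model phrase the result.
        result = await session.call_tool(decision["tool"], decision["arguments"])
        followup = await ask_llm(
            f"Tool result: {result.content}\nSummarize this for the user."
        )
        return followup["answer"]

    return decision["answer"]
```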
The MCP client is version-agnostic (ROS1 or ROS2).
- ROS or ROS2 running with rosbridge
- Active ros-mcp-server instance
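Before going further, you can confirm that rosbridge is reachable with a quick WebSocket round trip. The sketch below assumes the `websockets` package, the default rosbridge port 9090, and a placeholder robot address.

```python
# Quick rosbridge reachability check (assumes the `websockets` package;
# 9090 is the rosbridge default port, ROBOT_IP is a placeholder).
import asyncio
import json

import websockets

ROBOT_IP = "192.168.1.10"  # replace with your robot's address

async def check_rosbridge() -> None:
    async with websockets.connect(f"ws://{ROBOT_IP}:9090") as ws:
        # Ask rosapi for the topic list as a harmless round trip.
        await ws.send(json.dumps({"op": "call_service", "service": "/rosapi/topics"}))
        print(json.loads(await ws.recv()))

asyncio.run(check_rosbridge())
```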
- Clone the repository

  ```bash
  git clone https://github.com/robotmcp/ros-mcp-client.git
  cd ros-mcp-client
  ```

- Install dependencies

  ```bash
  uv sync
  # or pip install -e .
  ```

Follow the setup guide for the Gemini Live client:
- Gemini Live Client - Google Gemini integration
Start rosbridge on the target robot:

```bash
ros2 launch rosbridge_server rosbridge_websocket_launch.xml
```
```
ros-mcp-client/
├── clients/
│   ├── gemini_live/               # Full-featured Gemini client
│   │   ├── gemini_client.py       # Main client script
│   │   ├── mcp.json               # MCP server configuration
│   │   ├── setup_gemini_client.sh # Automated setup
│   │   └── README.md              # Detailed setup guide
├── config/                        # Shared configuration
├── scripts/                       # Utility scripts
├── pyproject.toml                 # Python dependencies
└── README.md                      # This file
```

The project includes a comprehensive LLM client implementation:
- Full-featured Google Gemini integration
- Text-only mode optimized for WSL
- Real-time interaction with ROS robots
- Automated setup with setup_gemini_client.sh
```bash
# Try the Gemini Live client
cd clients/gemini_live
./setup_gemini_client.sh
uv run gemini_client.py
```
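The Gemini client ships with an mcp.json holding its MCP server configuration (see the project structure above). The exact schema of that file is not reproduced here, so the sketch below assumes the common mcpServers layout used by desktop MCP clients; adapt the keys to the file actually in clients/gemini_live/.

```python
# Sketch of turning an mcp.json entry into stdio launch parameters
# (assumes a "mcpServers" layout; the real clients/gemini_live/mcp.json
# may use a different schema).
import json

from mcp import StdioServerParameters

def load_server_params(path: str, name: str = "ros-mcp-server") -> StdioServerParameters:
    with open(path) as f:
        config = json.load(f)
    entry = config["mcpServers"][name]  # assumed top-level key
    return StdioServerParameters(
        command=entry["command"],
        args=entry.get("args", []),
        env=entry.get("env"),
    )

params = load_server_params("clients/gemini_live/mcp.json")
```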
We welcome community PRs with new client implementations and integrations!
We love contributions of all kinds:
- Bug fixes and documentation updates
- New features (e.g., Action support, permissions)
- Additional examples and tutorials
Check out the contributing guidelines and see issues tagged "good first issue" to get started.
This project is licensed under the Apache License 2.0.