
Connect AI models like Claude & GPT with robots using MCP and ROS.


robotmcp/robot-mcp-client

 
 


The ROS MCP Client is a reference implementation of a Model Context Protocol (MCP) client, designed to connect directly with ros-mcp-server.

Instead of using a Desktop LLM client, it acts as a bridge that integrates an LLM, enabling natural-language interaction with any ROS or ROS2 robot.

🧠 What It Does

robot-mcp-client implements the LLM side of the MCP protocol.

It can:

  • Connect to a ros-mcp-server over MCP (stdio or HTTP).
  • Send natural-language queries or structured requests to the robot, without needing a desktop LLM client.
  • Stream back feedback, sensor data, or responses from the server.
  • Integrate with multiple LLM providers, including OpenAI, Anthropic, Gemini, and Groq (Llama, GPT-OSS).

In short, it lets you run an MCP-compatible client that speaks to robots via the MCP interface — useful for testing, local reasoning, or autonomous AI controllers.

*(Demo video: ROS MCP Client Demo)*


⚙️ Key Features of the ROS MCP Client

  • Implements MCP client specification — plug-and-play with the ROS MCP server.

  • ROS-aware LLM interface — specialized prompts and handlers for robotics tasks.

  • Supports bidirectional streaming — send commands, receive real-time topic feedback.

  • LLM integration ready — use Gemini, Anthropic, or Ollama APIs as reasoning engines.

  • Offline-capable — works entirely within local or LAN environments.


🛠 Getting Started

The MCP client is version-agnostic (ROS1 or ROS2).

Prerequisites

Installation

  1. Clone the repository

     ```shell
     git clone https://github.com/robotmcp/robot-mcp-client.git
     cd robot-mcp-client
     ```

  2. Install dependencies

     ```shell
     uv sync   # or: pip install -e .
     ```

  3. Minimal setup for a custom MCP client

     ```shell
     ./setup.sh
     # Then edit .env and set:
     # - ROS_MCP_SERVER_PATH=/absolute/path/to/ros-mcp-server
     # - LLM_PROVIDER=gpt-oss|openai|anthropic|ollama
     ```

  4. Run the base client

     ```shell
     uv run clients/baseclient.py
     ```

  5. Start rosbridge on the target robot

     ```shell
     ros2 launch rosbridge_server rosbridge_websocket_launch.xml
     ```
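The server reaches the robot through rosbridge, which listens on port 9090 by default. A quick way to check the robot side is up before starting the client — a stdlib sketch, not part of this repo; host and port are whatever your rosbridge launch file uses:

```python
import socket

def rosbridge_reachable(host: str = "localhost", port: int = 9090) -> bool:
    """Return True if something accepts TCP connections on the rosbridge port."""
    try:
        with socket.create_connection((host, port), timeout=2.0):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

if __name__ == "__main__":
    print("rosbridge up:", rosbridge_reachable())
```

This only confirms the port is open, not that rosbridge is healthy, but it catches the most common setup mistake (launching the client before the robot side).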

📁 Project Structure

```
robot-mcp-client/
├── .github/
│   └── workflows/
│       └── test-setup.yml    # CI for cross-platform setup
├── clients/
│   ├── baseclient.py         # Multi-LLM MCP client (LangGraph)
│   ├── llm_store.py          # LLM provider configuration
│   └── gemini_live/          # Gemini Live client
│       ├── gemini_client.py  # Main client script
│       ├── mcp.json          # MCP server configuration
│       ├── setup_gemini_client.sh  # Automated setup
│       └── README.md         # Detailed setup guide
├── .env                      # Environment config (not tracked)
├── setup.sh                  # Cross-platform setup script
├── pyproject.toml            # Python dependencies
└── README.md                 # This file
```

📚 Available Clients

The project includes multiple LLM client implementations:

🤖 Base MCP Client (clients/baseclient.py)

Multi-provider LLM client with support for:

  • OpenAI: GPT-4.1
  • Anthropic: Claude Sonnet 4.5
  • Google Gemini: Gemini 2.5 Flash Lite
  • Groq (open-source models):
    • Llama 4 Scout 17B
    • Llama 3.1 8B Instant
    • Llama 3.3 70B Versatile
    • OpenAI GPT-OSS 120B

Configuration: Set LLM_PROVIDER in .env (see setup.sh).

Supported Models

| Provider | Keyword | Parameters |
|---|---|---|
| **Proprietary Models** | | |
| Google Gemini 2.5 Flash Lite | gemini | - |
| OpenAI GPT 4.1 | openai | - |
| Anthropic Claude Sonnet 4.5 | claude | - |
| **Groq (Open Source)** | | |
| Llama 4 Scout | llama-scout | 109B |
| Llama 3.1 8B Instant | llama-8b | 8B |
| Llama 3.3 70B Versatile | llama | 70B |
| OpenAI GPT OSS | gpt-oss | 120B |

Usage: Set LLM_PROVIDER=<keyword> in your .env file (e.g., LLM_PROVIDER=llama-scout).
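Putting the two settings together, a minimal .env might look like this (the server path is a placeholder for wherever you cloned ros-mcp-server; your chosen provider's API key also goes in .env — see setup.sh for the exact variable name):

```shell
ROS_MCP_SERVER_PATH=/absolute/path/to/ros-mcp-server
LLM_PROVIDER=llama-scout
```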

🎤 Gemini Live Client (clients/gemini_live/)

  • Full-featured Google Gemini integration
  • Text-only mode optimized for WSL
  • Real-time interaction with ROS robots
  • Automated setup with setup_gemini_client.sh

🚀 Quick Start

```shell
# Multi-provider base client
./setup.sh
# Edit .env: set ROS_MCP_SERVER_PATH and LLM_PROVIDER
uv run clients/baseclient.py

# Gemini Live client
cd clients/gemini_live
./setup_gemini_client.sh
uv run gemini_client.py
```

We welcome community PRs with new client implementations and integrations!


🤝 Contributing

We love contributions of all kinds:

  • Bug fixes and documentation updates
  • New features (e.g., Action support, permissions)
  • Additional examples and tutorials

Check out the contributing guidelines and see issues tagged good first issue to get started.


📜 License

This project is licensed under the Apache License 2.0.

