🦀 Prevents outdated Rust code suggestions from AI assistants. This MCP server fetches current crate docs, uses embeddings/LLMs, and provides accurate context via a tool call.

Govcraft/rust-docs-mcp-server


License: MIT

Like this project? Please star the repository on GitHub to show your support and stay updated!

Motivation

Modern AI-powered coding assistants (like Cursor, Cline, Roo Code, etc.) excel at understanding code structure and syntax but often struggle with the specifics of rapidly evolving libraries and frameworks, especially in ecosystems like Rust where crates are updated frequently. Their training data cutoff means they may lack knowledge of the latest APIs, leading to incorrect or outdated code suggestions.

This MCP server addresses this challenge by providing a focused, up-to-date knowledge source for a specific Rust crate. By running an instance of this server for a crate (e.g., serde, tokio, reqwest), you give your LLM coding assistant a tool (query_rust_docs) it can use before writing code related to that crate.

When instructed to use this tool, the LLM can ask specific questions about the crate's API or usage and receive answers derived directly from the current documentation. This significantly improves the accuracy and relevance of the generated code, reducing the need for manual correction and speeding up development.

Multiple instances of this server can be run concurrently, allowing the LLM assistant to access documentation for several different crates during a coding session.

This server fetches the documentation for a specified Rust crate, generates embeddings for the content, and provides an MCP tool to answer questions about the crate based on the documentation context.

Features

  • Targeted Documentation: Focuses on a single Rust crate per server instance.
  • Feature Support: Allows specifying required crate features for documentation generation.
  • Semantic Search: Uses OpenAI's text-embedding-3-small model to find the most relevant documentation sections for a given question.
  • LLM Summarization: Leverages OpenAI's gpt-4o-mini-2024-07-18 model to generate concise answers based only on the retrieved documentation context.
  • Caching: Caches generated documentation content and embeddings in the user's XDG data directory (~/.local/share/rustdocs-mcp-server/ or similar), keyed by crate, version, and requested features, to speed up subsequent launches.
  • MCP Integration: Runs as a standard MCP server over stdio, exposing tools and resources.

Prerequisites

  • OpenAI API Key: Needed for generating embeddings and summarizing answers. The server expects this key to be available in the OPENAI_API_KEY environment variable. (The server also requires network access to download crate dependencies and interact with the OpenAI API.)

Installation

The recommended way to install is to download the pre-compiled binary for your operating system from the GitHub Releases page.

  1. Go to the Releases page.
  2. Download the appropriate archive (.zip for Windows, .tar.gz for Linux/macOS) for your system.
  3. Extract the rustdocs_mcp_server (or rustdocs_mcp_server.exe) binary.
  4. Place the binary in a directory included in your system's PATH environment variable (e.g., /usr/local/bin, ~/bin).

Building from Source (Alternative)

If you prefer to build from source, you will need the Rust toolchain installed.

  1. Clone the repository:
    git clone https://github.com/Govcraft/rust-docs-mcp-server.git
    cd rust-docs-mcp-server
  2. Build the server:
    cargo build --release

Usage

Important Note for New Crates:

When using the server with a crate for the first time (or with a new version/feature set), it needs to download the documentation and generate embeddings. This process can take some time, especially for crates with extensive documentation, and requires an active internet connection and an OpenAI API key.

It is recommended to run the server once directly from your command line for any new crate configuration before adding it to your AI coding assistant (like Roo Code, Cursor, etc.). This allows the initial embedding generation and caching to complete. Once you see the server startup messages indicating it's ready (e.g., "MCP Server listening on stdio"), you can shut it down (Ctrl+C). Subsequent launches, including those initiated by your coding assistant, will use the cached data and start much faster.

Running the Server

The server is launched from the command line and requires the Package ID Specification for the target crate. This specification follows the format used by Cargo (e.g., crate_name, crate_name@version_req). For the full specification details, see man cargo-pkgid or the Cargo documentation.

Optionally, you can specify required crate features using the -F or --features flag, followed by a comma-separated list of features. This is necessary for crates that require specific features to be enabled for cargo doc to succeed (e.g., crates requiring a runtime feature, like async-stripe).

    # Set the API key (replace with your actual key)
    export OPENAI_API_KEY="sk-..."

    # Example: Run server for the latest 1.x version of serde
    rustdocs_mcp_server "serde@^1.0"

    # Example: Run server for a specific version of reqwest
    rustdocs_mcp_server "reqwest@0.12.0"

    # Example: Run server for the latest version of tokio
    rustdocs_mcp_server tokio

    # Example: Run server for async-stripe, enabling a required runtime feature
    rustdocs_mcp_server "async-stripe@0.40" -F runtime-tokio-hyper-rustls

    # Example: Run server for another crate with multiple features
    rustdocs_mcp_server "some-crate@1.2" --features feat1,feat2

On the first run for a specific crate version and feature set, the server will:

  1. Download the crate documentation using cargo doc (with the specified features).
  2. Parse the HTML documentation.
  3. Generate embeddings for the documentation content using the OpenAI API (this may take some time and incur costs, though typically only fractions of a US penny for most crates; even a large crate like async-stripe, with over 5,000 documentation pages, cost only $0.18 USD for embedding generation during testing).
  4. Cache the documentation content and embeddings so that the cost isn't incurred again.
  5. Start the MCP server.

Subsequent runs for the same crate version and feature set will load the data from the cache, making startup much faster.
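To make the one-time cost above concrete, here is a back-of-the-envelope estimate, assuming text-embedding-3-small's list price of $0.02 per 1M tokens (check OpenAI's pricing page for current rates; the helper function is hypothetical, not part of the server's API). Note that the $0.18 figure reported above corresponds to roughly 9M tokens at that rate.

```rust
// Hypothetical helper: estimate embedding cost from a token count,
// assuming text-embedding-3-small at $0.02 per 1M tokens.
const PRICE_PER_MILLION_TOKENS_USD: f64 = 0.02;

fn estimated_embedding_cost_usd(total_tokens: u64) -> f64 {
    total_tokens as f64 / 1_000_000.0 * PRICE_PER_MILLION_TOKENS_USD
}

fn main() {
    // A small crate: ~200k tokens of documentation text.
    println!("small crate: ${:.4}", estimated_embedding_cost_usd(200_000));
    // A large crate on the order of async-stripe: ~9M tokens is about $0.18.
    println!("large crate: ${:.2}", estimated_embedding_cost_usd(9_000_000));
}
```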

MCP Interaction

The server communicates using the Model Context Protocol over standard input/output (stdio). It exposes the following:

  • Tool: query_rust_docs

    • Description: Query documentation for the specific Rust crate the server was started for, using semantic search and LLM summarization.
    • Input Schema:

      {
        "type": "object",
        "properties": {
          "question": {
            "type": "string",
            "description": "The specific question about the crate's API or usage."
          }
        },
        "required": ["question"]
      }

    • Output: A text response containing the answer generated by the LLM based on the relevant documentation context, prefixed with From <crate_name> docs:.
    • Example MCP Call:

      {
        "jsonrpc": "2.0",
        "method": "callTool",
        "params": {
          "tool_name": "query_rust_docs",
          "arguments": {
            "question": "How do I make a simple GET request with reqwest?"
          }
        },
        "id": 1
      }
  • Resource: crate://<crate_name>

    • Description: Provides the name of the Rust crate this server instance is configured for.
    • URI: crate://<crate_name> (e.g., crate://serde, crate://reqwest)
    • Content: Plain text containing the crate name.
  • Logging: The server sends informational logs (startup messages, query processing steps) back to the MCP client via logging/message notifications.

Example Client Configuration (Roo Code)

You can configure MCP clients like Roo Code to run multiple instances of this server, each targeting a different crate. Here's an example snippet for Roo Code's mcp_settings.json file, configuring servers for reqwest and async-stripe (note the added features argument for async-stripe):

    {
      "mcpServers": {
        "rust-docs-reqwest": {
          "command": "/path/to/your/rustdocs_mcp_server",
          "args": ["reqwest@0.12"],
          "env": { "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE" },
          "disabled": false,
          "alwaysAllow": []
        },
        "rust-docs-async-stripe": {
          "command": "rustdocs_mcp_server",
          "args": ["async-stripe@0.40", "-F", "runtime-tokio-hyper-rustls"],
          "env": { "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE" },
          "disabled": false,
          "alwaysAllow": []
        }
      }
    }

Note:

  • Replace /path/to/your/rustdocs_mcp_server with the actual path to the compiled binary on your system if it isn't in your PATH.
  • Replace YOUR_OPENAI_API_KEY_HERE with your actual OpenAI API key.
  • The keys (rust-docs-reqwest, rust-docs-async-stripe) are arbitrary names you choose to identify the server instances within Roo Code.

Example Client Configuration (Claude Desktop)

For Claude Desktop users, you can configure the server in the MCP settings. Here's an example configuring servers for serde and async-stripe:

    {
      "mcpServers": {
        "rust-docs-serde": {
          "command": "/path/to/your/rustdocs_mcp_server",
          "args": ["serde@^1.0"]
        },
        "rust-docs-async-stripe-rt": {
          "command": "rustdocs_mcp_server",
          "args": ["async-stripe@0.40", "-F", "runtime-tokio-hyper-rustls"]
        }
      }
    }

Note:

  • Ensure rustdocs_mcp_server is in your system's PATH or provide the full path (e.g., /path/to/your/rustdocs_mcp_server).
  • The keys (rust-docs-serde, rust-docs-async-stripe-rt) are arbitrary names you choose to identify the server instances.
  • Remember to set the OPENAI_API_KEY environment variable where Claude Desktop can access it (this might be system-wide or via how you launch Claude Desktop). Claude Desktop's MCP configuration might not directly support setting environment variables per-server like Roo Code does.
  • The example shows how to add the -F argument for crates like async-stripe that require specific features.

Caching

  • Location: Cached documentation and embeddings are stored in the XDG data directory, typically under ~/.local/share/rustdocs-mcp-server/<crate_name>/<sanitized_version_req>/<features_hash>/embeddings.bin. The sanitized_version_req is derived from the version requirement, and features_hash is a hash representing the specific combination of features requested at startup. This ensures different feature sets are cached separately.
  • Format: Data is cached using bincode serialization.
  • Regeneration: If the cache file is missing, corrupted, or cannot be decoded, the server will automatically regenerate the documentation and embeddings.
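The cache-key scheme above can be sketched as follows. This is a minimal illustration of the idea, not the server's actual sanitization or hashing code (which may differ); sorting the features before hashing makes the key independent of the order they were passed on the command line.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::path::PathBuf;

// Sketch: derive a per-crate, per-version, per-feature-set cache path.
// The server's real scheme may use a different sanitizer and hash.
fn cache_path(data_dir: &str, crate_name: &str, version_req: &str, features: &[&str]) -> PathBuf {
    // Replace characters that are awkward in directory names, e.g. '^' in "^1.0".
    let sanitized_version: String = version_req
        .chars()
        .map(|c| if c.is_alphanumeric() || c == '.' { c } else { '_' })
        .collect();

    // Sort so that "feat1,feat2" and "feat2,feat1" share one cache entry.
    let mut sorted = features.to_vec();
    sorted.sort_unstable();
    let mut hasher = DefaultHasher::new();
    sorted.hash(&mut hasher);
    let features_hash = format!("{:016x}", hasher.finish());

    [data_dir, crate_name, &sanitized_version, &features_hash, "embeddings.bin"]
        .iter()
        .collect()
}

fn main() {
    let a = cache_path("/tmp/rustdocs", "serde", "^1.0", &[]);
    let b = cache_path("/tmp/rustdocs", "async-stripe", "0.40", &["runtime-tokio-hyper-rustls"]);
    println!("{}", a.display());
    println!("{}", b.display());
    // Same features in a different order resolve to the same directory.
    let c = cache_path("/tmp/rustdocs", "x", "1", &["feat1", "feat2"]);
    let d = cache_path("/tmp/rustdocs", "x", "1", &["feat2", "feat1"]);
    assert_eq!(c, d);
}
```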

How it Works

  1. Initialization: Parses the crate specification and optional features from the command line using clap.
  2. Cache Check: Looks for a pre-existing cache file for the specific crate, version requirement, and feature set.
  3. Documentation Generation (if cache miss):
    • Creates a temporary Rust project depending only on the target crate, enabling the specified features in its Cargo.toml.
    • Runs cargo doc using the cargo library API to generate HTML documentation in the temporary directory.
    • Dynamically locates the correct output directory within target/doc by searching for the subdirectory containing index.html.
  4. Content Extraction (if cache miss):
    • Walks the generated HTML files within the located documentation directory.
    • Uses the scraper crate to parse each HTML file and extract text content from the main content area (<section>).
  5. Embedding Generation (if cache miss):
    • Uses the async-openai crate and tiktoken-rs to generate embeddings for each extracted document chunk using the text-embedding-3-small model.
    • Calculates the estimated cost based on the number of tokens processed.
  6. Caching (if cache miss): Saves the extracted document content and their corresponding embeddings to the cache file (path includes the features hash) using bincode.
  7. Server Startup: Initializes the RustDocsServer with the loaded/generated documents and embeddings.
  8. MCP Serving: Starts the MCP server using rmcp over stdio.
  9. Query Handling (query_rust_docs tool):
    • Generates an embedding for the user's question.
    • Calculates the cosine similarity between the question embedding and all cached document embeddings.
    • Identifies the document chunk with the highest similarity.
    • Sends the user's question and the content of the best-matching document chunk to the gpt-4o-mini-2024-07-18 model via the OpenAI API.
    • The LLM is prompted to answer the question based only on the provided context.
    • Returns the LLM's response to the MCP client.
  • The retrieval step in query handling (step 9) is plain cosine-similarity ranking. A minimal self-contained sketch, using tiny stand-in vectors instead of real OpenAI embeddings (the chunk names and dimensions here are illustrative only):
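The retrieval step in query handling above is plain cosine-similarity ranking over the cached embeddings. A minimal self-contained sketch, using tiny stand-in vectors instead of real OpenAI embeddings (the chunk names and dimensions here are illustrative only):

```rust
// Cosine similarity between two embedding vectors: dot product over the
// product of the vector norms. Returns 0.0 for a zero-length vector.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 { 0.0 } else { dot / (norm_a * norm_b) }
}

// Pick the cached chunk whose embedding is most similar to the question's.
fn best_match<'a>(question: &[f32], chunks: &'a [(&'a str, Vec<f32>)]) -> &'a str {
    chunks
        .iter()
        .max_by(|(_, x), (_, y)| {
            cosine_similarity(question, x)
                .partial_cmp(&cosine_similarity(question, y))
                .unwrap()
        })
        .map(|(name, _)| *name)
        .unwrap()
}

fn main() {
    // Stand-in 3-dimensional "embeddings" for two documentation chunks.
    let chunks = vec![
        ("client::get", vec![0.9, 0.1, 0.0]),
        ("serde::Deserialize", vec![0.0, 0.2, 0.9]),
    ];
    let question = vec![1.0, 0.0, 0.1];
    println!("best chunk: {}", best_match(&question, &chunks));
}
```

The real server does the same ranking with 1536-dimensional text-embedding-3-small vectors and then hands only the winning chunk to the summarization model.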

License

This project is licensed under the MIT License.

Copyright (c) 2025 Govcraft

Sponsor

Govcraft is a one-person shop—no corporate backing, no investors, just me building useful tools. If this project helps you, sponsoring keeps the work going.

Sponsor on GitHub
