
ReadmeAI Logo

Designed for simplicity, customization, and developer productivity.

GitHub Actions · Test Coverage · PyPI Version · Total Downloads · MIT License


Important

Explore the Official Documentation for a complete list of features, customization options, and examples.


Introduction

ReadmeAI is a developer tool that automatically generates README files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.
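For example, a basic run against a public repository looks like this (the flags are covered in detail under Usage below):

❯ readmeai --repository https://github.com/eli64s/readme-ai --api openai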

Why Use ReadmeAI?

This project aims to streamline the process of creating and maintaining documentation across all technical disciplines and experience levels. The core principles include:

  • 🔵 Automate: Generate detailed and structured README files with a single command.
  • ⚫️ Customize: Select from a variety of templates, styles, badges, and much more.
  • 🟣 Flexible: Switch between OpenAI, Ollama, Anthropic, and Gemini anytime.
  • 🟠 Language Agnostic: Compatible with a wide range of languages and frameworks.
  • 🟡 Best Practices: Ensure clean and consistent documentation across all projects.
  • 🟢 Smart Filtering: Intelligent file analysis with customizable .readmeaiignore patterns (see the sketch after this list).
  • ⛔️ Offline Mode: Create README files offline, without using an LLM API service.
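As a quick illustration of Smart Filtering, here is a minimal .readmeaiignore sketch; the gitignore-style glob syntax shown is an assumption, so check the Official Documentation for the exact pattern format:

    # Hypothetical .readmeaiignore (syntax assumed to mirror .gitignore)
    node_modules/
    dist/
    *.min.js
    tests/fixtures/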

Demo

Run from your terminal:

cli-demo.mov


Features

Customize Your README

Let's begin by exploring various customization options and styles supported by ReadmeAI:

Header Styles
Classic Header

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
           --logo custom \
           --badge-color FF4B4B \
           --badge-style flat-square \
           --header-style classic
Modern Header

CLI Command:

$ readmeai --repository https://github.com/olliefr/docker-gs-ping \
           --badge-color 00ADD8 \
           --badge-style for-the-badge \
           --header-style modern \
           --navigation-style roman
Compact Header

CLI Command:

$ readmeai --repository https://github.com/rumaan/file.io-Android-Client \
           --badge-style plastic \
           --badge-color blueviolet \
           --logo PURPLE \
           --header-style COMPACT \
           --navigation-style NUMBER \
           --emojis solar

Banner Styles

Console Header

CLI Command:

$ readmeai --repository https://github.com/emcf/thepipe \
           --badge-style flat-square \
           --badge-color 8a2be2 \
           --header-style console \
           --navigation-style accordion \
           --emojis water
SVG Banner

CLI Command:

$ readmeai --repository https://github.com/FerrariDG/async-ml-inference \
           --badge-style plastic \
           --badge-color 43a047 \
           --header-style BANNER

And More!

Project Overview

CLI Command:

$ readmeai --repository 'https://github.com/eli64s/readme-ai-streamlit' \
           --badge-style FLAT-SQUARE \
           --badge-color E92063 \
           --header-style COMPACT \
           --navigation-style ACCORDION \
           --emojis RAINBOW \
           --logo ICE
Custom Logo

CLI Command:

$ readmeai --repository https://github.com/jwills/buenavista \
           --align LEFT \
           --badge-style FLAT-SQUARE \
           --logo CUSTOM

Generated Sections & Content

꩜ Expand to view more!

Project Introduction

  • This section captures your project's essence and value proposition.
  • The prompt template used to generate this section can be viewed here.

Features Table

  • Detailed feature breakdown and technical capabilities.
  • The prompt template used to generate this section can be viewed here.

Project Structure

  • Visual representation of your project's directory structure.
  • The tree is generated using pure Python and embedded in a code block.

Project Index

  • Summarizes key modules of the project, which are also used as context for the downstream prompts (see prompts.toml).

Getting Started Guides

  • Dependencies and system requirements are extracted from the codebase during preprocessing.
  • The parsers handle most of the heavy lifting here.

Installation, Usage, & Testing

  • Setup instructions and usage guides are automatically created based on data extracted from the codebase.

Community & Support

  • Development roadmap, contribution guidelines, license information, and community resources.
  • A return button is also included for easy navigation.

Contribution Guides

  • Instructions for contributing to the project, including resource links and a basic contribution guide.
  • Graph of contributors is also included for open-source projects.


Getting Started

Prerequisites

ReadmeAI requires Python 3.9 or higher, and one of the following installation methods:

| Requirement | Details |
| --- | --- |
| Python ≥ 3.9 | Core runtime |
| Installation method (choose one) | |
| pip | Default Python package manager |
| pipx | Isolated environment installer |
| uv | High-performance package manager |
| docker | Containerized environment |

Supported Repository Platforms

To generate a README file, provide the source repository. ReadmeAI supports these platforms:

| Platform | Details |
| --- | --- |
| File System | Local repository access |
| GitHub | Industry-standard hosting |
| GitLab | Full DevOps integration |
| Bitbucket | Atlassian ecosystem |

Supported LLM API Services

ReadmeAI is model agnostic, with support for the following LLM API services:

| Provider | Best For | Details |
| --- | --- | --- |
| OpenAI | General use | Industry-leading models |
| Anthropic | Advanced tasks | Claude language models |
| Google Gemini | Multimodal AI | Latest Google technology |
| Ollama | Open source | No API key needed |
| Offline Mode | Local operation | No internet required |

Installation

ReadmeAI is available on PyPI as readmeai and can be installed as follows:

 Pip

Install with pip (recommended for most users):

❯ pip install -U readmeai

 Pipx

With pipx, readmeai will be installed in an isolated environment:

❯ pipx install readmeai

 Uv

The fastest way to install readmeai is with uv:

❯ uv tool install readmeai

 Docker

To run readmeai in a containerized environment, pull the latest image from Docker Hub:

❯ docker pull zeroxeli/readme-ai:latest

 From source

Click to build readmeai from source
  1. Clone the repository:

    ❯ git clone https://github.com/eli64s/readme-ai
  2. Navigate to the project directory:

    ❯ cd readme-ai
  3. Install dependencies:

    ❯ pip install -r setup/requirements.txt

Alternatively, use the setup script to install dependencies:

 Bash
  1. Run the setup script:

    ❯ bash setup/setup.sh

Or, use poetry to build and install project dependencies:

 Poetry
  1. Install dependencies with poetry:

    ❯ poetry install

Additional Optional Dependencies

Important

To use the Anthropic and Google Gemini clients, extra dependencies are required. Install the package with the following extras:

  • Anthropic:

    ❯ pip install"readmeai[anthropic]"
  • Google Gemini:

    ❯ pip install"readmeai[google-generativeai]"
  • Install Multiple Clients:

    ❯ pip install"readmeai[anthropic,google-generativeai]"

Usage

Set your API key

When running readmeai with a third-party service, you must provide a valid API key. For example, the OpenAI client is set as follows:

export OPENAI_API_KEY=<your_api_key>

# For Windows users:
set OPENAI_API_KEY=<your_api_key>
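As a generic shell convenience (not specific to readmeai), you can persist the key in your shell profile so it is set for every session; a minimal sketch assuming bash and ~/.bashrc:

❯ echo 'export OPENAI_API_KEY=<your_api_key>' >> ~/.bashrc
❯ source ~/.bashrc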
Click to view environment variables for Ollama, Anthropic, and Google Gemini
Ollama

Refer to the Ollama documentation for more information on setting up the Ollama server.

To start, follow these steps:

  1. Pull your model of choice from the Ollama repository:

    ❯ ollama pull llama3.2:latest
  2. Start the Ollama server and set the OLLAMA_HOST environment variable:

    export OLLAMA_HOST=127.0.0.1 && ollama serve
Anthropic
  1. Export your Anthropic API key:

    export ANTHROPIC_API_KEY=<your_api_key>
Google Gemini
  1. Export your Google Gemini API key:

    export GOOGLE_API_KEY=<your_api_key>

Using the CLI

Running with an LLM API service

Below is the minimal command required to run readmeai using the OpenAI client:

❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai

Important

The default model is gpt-3.5-turbo, offering the best balance between cost and performance. When using any model from the gpt-4 series and up, please monitor your costs and usage to avoid unexpected charges.

ReadmeAI can easily switch between API providers and models. We can run the same command as above with the Anthropic client:

❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai

And finally, with the Google Gemini client:

❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
Running with local models

We can also run readmeai with free and open-source locally hosted models using Ollama:

❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
Running on a local codebase

To generate a README file from a local codebase, simply provide the full path to the project:

❯ readmeai --repository /users/username/projects/myproject --api openai

Adding more customization options:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --navigation-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --logo LLM \
           --emojis solar
Running in offline mode

ReadmeAI supports offline mode, allowing you to generate README files without using an LLM API service.

❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai

 Docker

Run the readmeai CLI in a Docker container:

❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    --repository https://github.com/eli64s/readme-ai \
    --api openai

 Streamlit

Try readme-ai directly in your browser on Streamlit Cloud, no installation required.

See the readme-ai-streamlit repository on GitHub for more details about the application.

Warning

The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.

 From source

Click to run readmeai from source
 Bash

If you installed the project from source with the bash script, run the following command:

  1. Activate the virtual environment:

    ❯ conda activate readmeai
  2. Run the CLI:

    ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
 Poetry
  1. Activate the virtual environment:

    ❯ poetry shell
  2. Run the CLI:

    ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai


Testing

The pytest and nox frameworks are used for development and testing.

Install the dependencies with uv:

❯ uv pip install --dev --group test --all-extras

Run the unit test suite using Pytest:

❯ make test

Using nox, test the app against Python versions 3.9, 3.10, 3.11, and 3.12:

❯ make test-nox

Tip

Nox is an automation tool for testing applications in multiple environments. This helps ensure your project is compatible across Python versions and environments.
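If you prefer to call nox directly rather than through make (an assumption about your workflow, not a documented target of this project), standard nox usage lets you restrict the run to a single interpreter:

❯ nox -p 3.12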


Configuration

Customize your README generation with a variety of supported options and style settings, such as:

| Option | Description | Default |
| --- | --- | --- |
| --align | Text alignment in header | center |
| --api | LLM API service provider | offline |
| --badge-color | Badge color name or hex code | 0080ff |
| --badge-style | Badge icon style type | flat |
| --header-style | Header template style | classic |
| --navigation-style | Table of contents style | bullet |
| --emojis | Emoji theme packs prefixed to section titles | None |
| --logo | Project logo image | blue |
| --logo-size | Logo image size | 30% |
| --model | Specific LLM model to use | gpt-3.5-turbo |
| --output | Output filename | readme-ai.md |
| --repository | Repository URL or local directory path | None |
| --temperature | Creativity level for content generation | 0.1 |
| --tree-max-depth | Maximum depth of the directory tree structure | 2 |
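Any of the defaults above can be overridden per run; here is a sketch combining a few of the documented flags, with illustrative values:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
           --badge-style for-the-badge \
           --header-style modern \
           --logo-size 25% \
           --temperature 0.5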

Run the following command to view all available options:

❯ readmeai --help

Visit the Official Documentation for a complete guide on configuring and customizing README files.


Example Gallery

This gallery showcases a diverse collection of README examples generated across various programming languages, frameworks, and project types.

| Tech | README | Repository | Project Description |
| --- | --- | --- | --- |
| Python | README-Python.md | readmeai | ReadmeAI's core project |
| Apache Flink | README-Flink.md | pyflink-poc | PyFlink proof of concept |
| Streamlit | README-Streamlit.md | readmeai-streamlit | Web application interface |
| Vercel & NPM | README-Vercel.md | github-readme-quotes | Deployment showcase |
| Go & Docker | README-DockerGo.md | docker-gs-ping | Containerized Golang app |
| FastAPI & Redis | README-FastAPI.md | async-ml-inference | ML inference service |
| Java | README-Java.md | minimal-todo | Minimalist To-Do app |
| PostgreSQL & DuckDB | README-PostgreSQL.md | buenavista | Database proxy server |
| Kotlin | README-Kotlin.md | android-client | Mobile client application |
| Offline Mode | README-Offline.md | litellm | Offline functionality demo |

Community Contribution

Share Your README Files

We invite developers to share their generated README files in our Show & Tell discussion category. Your contributions help:

  • Showcase diverse documentation styles
  • Provide real-world examples
  • Help improve the ReadmeAI tool

Find additional README examples in our examples directory on GitHub.


Roadmap

  • Release readmeai 1.0.0 with robust documentation creation and maintenance capabilities.
  • Extend template support for various project types and programming languages.
  • Develop a VS Code extension to generate README files directly in the editor.
  • Develop GitHub Actions to automate documentation updates.
  • Add badge packs to provide additional badge styles and options.
    • Code coverage, CI/CD status, project version, and more.

Contributing

Contributions are welcome! Please read the Contributing Guide to get started.


Acknowledgments

A big shoutout to the projects below for their awesome work and open-source contributions:

shields.io · simpleicons.org · tandpfun/skill-icons · astrit/css.gg · Ileriayo/markdown-badges

🎗 License

Copyright © 2023-2025 readme-ai.
Released under the MIT license.


