
GPT wrapper for git — generate commit messages with an LLM in 1 sec — works best with Claude 3.5 — supports local models too


di-sukharev/opencommit


OpenCommit logo

OpenCommit

Auto-generate meaningful commits in a second

Killing lame commits with AI 🤯🔫

🪩 Winner of GitHub 2023 hackathon 🪩


OpenCommit example

All the commits in this repo are authored by OpenCommit; look at the commits to see how OpenCommit works. Emojis and long commit descriptions are configurable, as is basically everything else.

Setup OpenCommit as a CLI tool

You can use OpenCommit by simply running it via the CLI like this: oco. Two seconds and your staged changes are committed with a meaningful message.

  1. Install OpenCommit globally to use in any repository:

    npm install -g opencommit

  2. Get your API key from OpenAI or another supported LLM provider (we support them all). Make sure that you add your OpenAI payment details to your account, so the API works.

  3. Set the key in the OpenCommit config:

    oco config set OCO_API_KEY=<your_api_key>

    Your API key is stored locally in the ~/.opencommit config file.
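Since the config lives in a plain local file, you can inspect it directly. A minimal sketch below writes to a temp file instead of the real ~/.opencommit; the exact KEY=value on-disk format is an assumption for illustration:

```shell
# Illustration only: model the config as simple KEY=value lines,
# using a temp file instead of the real ~/.opencommit.
CONFIG="$(mktemp)"
printf 'OCO_API_KEY=sk-example123\n' >> "$CONFIG"

# Reading a value back is then a plain grep:
grep '^OCO_API_KEY=' "$CONFIG"

rm -f "$CONFIG"
```

Keep in mind the key is stored in plain text, so treat the file like any other credential file.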

Usage

You can call OpenCommit with the oco command to generate a commit message for your staged changes:

git add <files...>
oco

Running git add is optional; oco will do it for you.

Running locally with Ollama

You can also run it with a local model through Ollama:

  • install and start Ollama
  • run ollama run mistral (do this only once, to pull the model)
  • run (in your project directory):

git add <files...>
oco config set OCO_AI_PROVIDER='ollama' OCO_MODEL='llama3:8b'

The default model is mistral.

If you have Ollama set up in Docker or on another machine with GPUs (not locally), you can change the default endpoint URL.

You can do so by setting the OCO_API_URL environment variable as follows:

oco config set OCO_API_URL='http://192.168.1.10:11434/api/chat'

where 192.168.1.10 is an example of the endpoint URL where you have Ollama set up.
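The value is simply Ollama's chat endpoint on the target host. A small sketch of building it, where the host is a made-up example and 11434 is Ollama's default port:

```shell
# Hypothetical values: replace OLLAMA_HOST with the machine actually running Ollama.
OLLAMA_HOST="192.168.1.10"
OLLAMA_PORT="11434"   # Ollama's default port

# Assemble the chat endpoint URL that OCO_API_URL expects.
OCO_API_URL="http://${OLLAMA_HOST}:${OLLAMA_PORT}/api/chat"
echo "$OCO_API_URL"
# → http://192.168.1.10:11434/api/chat
```

You would then pass the assembled value to oco config set OCO_API_URL=... as shown above.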

Flags

There are multiple optional flags that can be used with the oco command:

Use Full GitMoji Specification

Link to the GitMoji specification: https://gitmoji.dev/

This flag can only be used if the OCO_EMOJI configuration item is set to true. It allows users to use all emojis in the GitMoji specification. By default, the full GitMoji specification is set to false, which only includes 10 emojis (🐛✨📝🚀✅♻️⬆️🔧🌐💡).

This is to limit the number of tokens sent in each request. However, if you would like to use the full GitMoji specification, you can use the --fgm flag:

oco --fgm

Skip Commit Confirmation

This flag allows users to automatically commit the changes without having to manually confirm the commit message. This is useful for users who want to streamline the commit process and avoid additional steps. To use this flag, you can run the following command:

oco --yes

Configuration

Local per repo configuration

Create a .env file and add OpenCommit config variables there like this:

...
OCO_AI_PROVIDER=<openai (default), anthropic, azure, ollama, gemini, flowise, deepseek>
OCO_API_KEY=<your OpenAI API token> // or other LLM provider API token
OCO_API_URL=<may be used to set proxy path to OpenAI api>
OCO_TOKENS_MAX_INPUT=<max model token limit (default: 4096)>
OCO_TOKENS_MAX_OUTPUT=<max response tokens (default: 500)>
OCO_DESCRIPTION=<postface a message with ~3 sentences description of the changes>
OCO_EMOJI=<boolean, add GitMoji>
OCO_MODEL=<either 'gpt-4o', 'gpt-4', 'gpt-4-turbo', 'gpt-3.5-turbo' (default), 'gpt-3.5-turbo-0125', 'gpt-4-1106-preview', 'gpt-4-turbo-preview', 'gpt-4-0125-preview', or any Anthropic or Ollama model, or basically any string, but it should be a valid model name>
OCO_LANGUAGE=<locale, scroll to the bottom to see options>
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=<message template placeholder, default: '$msg'>
OCO_PROMPT_MODULE=<either conventional-commit or @commitlint, default: conventional-commit>
OCO_ONE_LINE_COMMIT=<one line commit message, default: false>

Global configs are the same as local configs, but they are stored in the global ~/.opencommit config file and set with the oco config set command, e.g. oco config set OCO_MODEL=gpt-4o.

Global config for all repos

Local config still takes priority over global config, but you may set OCO_MODEL and OCO_LOCALE globally and set local configs for OCO_EMOJI and OCO_DESCRIPTION per repo, which is more convenient.

Simply set any of the variables above like this:

oco config set OCO_MODEL=gpt-4o-mini

Configure GitMoji to preface a message:

oco config set OCO_EMOJI=true

To remove preface emojis:

oco config set OCO_EMOJI=false

Other config options behave the same way.

Output WHY the changes were done (WIP)

You can set the OCO_WHY config to true to have OpenCommit output a short description of WHY the changes were done after the commit message. Default is false.

To make this accurate, we must store what files do in some kind of an index or embedding and perform a lookup (kind of a RAG) for an accurate git commit message. If you feel like building this, comment on ticket #398 and let's go from there together.

oco config set OCO_WHY=true

Switch to GPT-4 or other models

By default, OpenCommit uses the gpt-4o-mini model.

You may switch to gpt-4o, which performs better but costs more 🤠

oco config set OCO_MODEL=gpt-4o

or, as a cheaper option:

oco config set OCO_MODEL=gpt-3.5-turbo

Switch to other LLM providers with a custom URL

By default OpenCommit uses OpenAI.

You can switch to Azure OpenAI Service, Flowise, or Ollama.

oco config set OCO_AI_PROVIDER=azure OCO_API_KEY=<your_azure_api_key> OCO_API_URL=<your_azure_endpoint>
oco config set OCO_AI_PROVIDER=flowise OCO_API_KEY=<your_flowise_api_key> OCO_API_URL=<your_flowise_endpoint>
oco config set OCO_AI_PROVIDER=ollama OCO_API_KEY=<your_ollama_api_key> OCO_API_URL=<your_ollama_endpoint>

Locale configuration

To globally specify the language used to generate commit messages:

# de, German, Deutsch
oco config set OCO_LANGUAGE=de
oco config set OCO_LANGUAGE=German
oco config set OCO_LANGUAGE=Deutsch

# fr, French, française
oco config set OCO_LANGUAGE=fr
oco config set OCO_LANGUAGE=French
oco config set OCO_LANGUAGE=française

The default language setting is English. All available languages are currently listed in the i18n folder.

Push to git (gonna be deprecated)

A prompt for pushing to git is on by default, but if you would like to turn it off, just use:

oco config set OCO_GITPUSH=false

and it will exit right after the commit is confirmed, without asking if you would like to push to the remote.

Switch to @commitlint

OpenCommit allows you to choose the prompt module used to generate commit messages. By default, OpenCommit uses its conventional-commit message generator. However, you can switch to the @commitlint prompt module if you prefer. This option lets you generate commit messages that respect your local commitlint configuration.

You can set this option by running the following command:

oco config set OCO_PROMPT_MODULE=<module>

Replace <module> with either conventional-commit or @commitlint.

Example:

To switch to the @commitlint prompt module, run:

oco config set OCO_PROMPT_MODULE=@commitlint

To switch back to the default conventional-commit message generator, run:

oco config set OCO_PROMPT_MODULE=conventional-commit

Integrating with @commitlint

The integration between @commitlint and OpenCommit happens automatically the first time OpenCommit is run with OCO_PROMPT_MODULE set to @commitlint. However, if you need to force-set or reset the configuration for @commitlint, you can run the following command:

oco commitlint force

To view the generated configuration for @commitlint, you can use this command:

oco commitlint get

This allows you to ensure that the configuration is set up as desired.

Additionally, the integration creates a file named .opencommit-commitlint in your project directory which contains the prompts used for the local @commitlint configuration. You can modify this file to fine-tune the example commit message generated by the LLM, making adjustments based on your preferences or project guidelines. If the local @commitlint configuration changes, this file will be updated the next time OpenCommit is run.

This offers you greater control over the generated commit messages, allowing for customization that aligns with your project's conventions.

Git flags

The opencommit or oco commands can be used in place of the git commit -m "${generatedMessage}" command. This means that any regular flags used with the git commit command will also be applied when using opencommit or oco.

oco --no-verify

is translated to:

git commit -m "${generatedMessage}" --no-verify

To include a message in the generated message, you can utilize the template function, for instance:

oco '#205: $msg'

opencommit examines placeholders in the parameters, allowing you to append additional information before and after the placeholders, such as the relevant Issue or Pull Request. Similarly, you have the option to customize the OCO_MESSAGE_TEMPLATE_PLACEHOLDER configuration item, for example, simplifying it to $m.

Message Template Placeholder Config

Overview

The OCO_MESSAGE_TEMPLATE_PLACEHOLDER feature in the opencommit tool allows users to embed a custom message within the generated commit message using a template function. This configuration enhances the flexibility and customizability of commit messages, making it easier for users to include relevant information directly within their commits.

Implementation Details

In our codebase, the implementation of this feature can be found in the following segment:

commitMessage = messageTemplate.replace(config.OCO_MESSAGE_TEMPLATE_PLACEHOLDER, commitMessage);

This line is responsible for replacing the placeholder in the messageTemplate with the actual commitMessage.
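The substitution can also be sketched in shell to see what that line does. The template, placeholder, and generated message below are made-up example values, and sed stands in for the tool's actual JavaScript .replace():

```shell
# Model the placeholder substitution with sed (example values only).
TEMPLATE='#205: $msg'                     # what you pass to oco
GENERATED='fix(api): handle null token'   # what the LLM might return

# Replace the literal placeholder $msg with the generated message.
COMMIT="$(printf '%s\n' "$TEMPLATE" | sed 's/\$msg/'"$GENERATED"'/')"
echo "$COMMIT"
# → #205: fix(api): handle null token
```

Anything around the placeholder in the template survives the substitution, which is how the issue number ends up in the final commit message.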

Usage

For instance, using the command oco '$msg #205', users can leverage this feature. The provided code represents the backend mechanics of such commands, ensuring that the placeholder is replaced with the appropriate commit message.

Committing with the Message

Once users have generated their desired commit message, they can proceed to commit using the generated message. By understanding the feature's full potential and its implementation details, users can confidently use the generated messages for their commits.

Ignore files

You can prevent files from being sent to OpenAI by creating a .opencommitignore file. For example:

path/to/large-asset.zip
**/*.jpg

This helps prevent opencommit from uploading artifacts and large files.

By default, opencommit ignores files matching: *-lock.* and *.lock
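The default lock-file patterns behave much like ordinary shell globs. A sketch of how they classify files, using made-up file names (the tool's actual matcher may differ in edge cases):

```shell
# Classify example paths against the default ignore patterns
# using the shell's own glob matching in a case statement.
for f in package-lock.json yarn.lock src/index.ts; do
  case "$f" in
    *-lock.*|*.lock) echo "ignored: $f" ;;
    *)               echo "sent: $f" ;;
  esac
done
# → ignored: package-lock.json
# → ignored: yarn.lock
# → sent: src/index.ts
```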

Git hook (KILLER FEATURE)

You can set OpenCommit as a Git prepare-commit-msg hook. The hook integrates with your IDE Source Control and allows you to edit the message before committing.

To set the hook:

oco hook set

To unset the hook:

oco hook unset

To use the hook:

git add <files...>
git commit

Or follow the process of your IDE Source Control feature: when it calls the git commit command, OpenCommit will integrate into the flow.

Setup OpenCommit as a GitHub Action (BETA) 🔥

OpenCommit is now available as a GitHub Action which automatically improves all new commit messages when you push to remote!

This is great if you want to make sure all commits in all of your repository branches are meaningful and not lame like fix1 or done2.

Create a file .github/workflows/opencommit.yml with the contents below:

name: 'OpenCommit Action'

on:
  push:
    # this list of branches is often enough,
    # but you may still ignore other public branches
    branches-ignore: [main, master, dev, development, release]

jobs:
  opencommit:
    timeout-minutes: 10
    name: OpenCommit
    runs-on: ubuntu-latest
    permissions: write-all
    steps:
      - name: Setup Node.js Environment
        uses: actions/setup-node@v2
        with:
          node-version: '16'
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - uses: di-sukharev/opencommit@github-action-v1.0.4
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        env:
          # set openAI api key in repo actions secrets,
          # for openAI keys go to: https://platform.openai.com/account/api-keys
          # for repo secret go to: <your_repo_url>/settings/secrets/actions
          OCO_API_KEY: ${{ secrets.OCO_API_KEY }}
          # customization
          OCO_TOKENS_MAX_INPUT: 4096
          OCO_TOKENS_MAX_OUTPUT: 500
          OCO_OPENAI_BASE_PATH: ''
          OCO_DESCRIPTION: false
          OCO_EMOJI: false
          OCO_MODEL: gpt-4o
          OCO_LANGUAGE: en
          OCO_PROMPT_MODULE: conventional-commit

That is it. Now when you push to any branch in your repo, all NEW commits are improved by your never-tired AI.

Make sure you exclude public collaboration branches (main, dev, etc.) in branches-ignore, so OpenCommit does not rebase commits there while improving the messages.

Interactive rebase (rebase -i) changes commits' SHA, so the commit history in remote becomes different from your local branch history. This is okay if you work on the branch alone, but may be inconvenient for other collaborators.

Payments

You pay for your requests to the OpenAI API on your own.

OpenCommit stores your key locally.

By default OpenCommit uses the gpt-3.5-turbo model; it should not exceed $0.10 per casual working day.

You may switch to gpt-4; it's better, but more expensive.

