Local Codebase Assistant: your data stays local and the inference is customizable!


PicoCode - Local Codebase Assistant

🤖 Note: This project was fully generated and developed using GitHub Copilot.

Screenshots

Web UI

PyCharm (IntelliJ Plugin)


Are you looking for a simple way to ask questions about your codebase using the inference provider you want, without being locked into a specific service? This tool is a way to achieve exactly that!

Overview

Check the blog post about the project, which has more info!

  • Production-ready RAG backend with per-project persistent storage
  • PyCharm/IDE integration via REST API (see REST_API.md)
  • Per-project databases: Each project gets isolated SQLite database
  • Indexes files, computes embeddings using an OpenAI-compatible embedding endpoint
  • Stores vector embeddings in SQLite using sqlite-vector for fast semantic search
  • Analysis runs asynchronously (FastAPI BackgroundTasks) so the UI remains responsive
  • Minimal web UI for starting analysis and asking questions (semantic search + coding model)
  • Health check and monitoring endpoints for production deployment
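To make the pipeline above concrete, here is a minimal, self-contained sketch of the retrieval step. This is not PicoCode's actual code: the real backend calls an OpenAI-compatible embedding endpoint and stores vectors with sqlite-vector, while this toy version fakes the embeddings to show how ranking by cosine similarity works.

```python
import math

# Toy stand-in for the embedding endpoint. In the real backend, each text
# would be POSTed to an OpenAI-compatible /v1/embeddings API.
FAKE_EMBEDDINGS = {
    "def connect_db():": [0.9, 0.1, 0.0],
    "class UserModel:": [0.1, 0.9, 0.0],
    "open database connection": [0.8, 0.2, 0.1],  # the user's question
}

def cosine(a, b):
    """Cosine similarity, the usual ranking metric for vector search."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query, corpus):
    """Rank corpus snippets by similarity to the query embedding."""
    qv = FAKE_EMBEDDINGS[query]
    return sorted(corpus, key=lambda doc: cosine(FAKE_EMBEDDINGS[doc], qv),
                  reverse=True)

docs = ["def connect_db():", "class UserModel:"]
print(semantic_search("open database connection", docs)[0])  # def connect_db():
```

The retrieved snippets are then passed as context to the coding model, which is what "semantic search + coding model" in the bullet above refers to.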

PyCharm Plugin

A full-featured PyCharm/IntelliJ IDEA plugin is available:

  • Download: Get the latest plugin from Releases
  • Per-Project Indexing: Automatically indexes current project
  • Secure API Keys: Stores credentials in IDE password safe
  • Real-time Responses: Streams answers from your coding model
  • File Navigation: Click retrieved files to open in editor
  • Progress Indicators: Visual feedback during indexing

See ide-plugins/README.md for building and installation instructions.

Prerequisites

  • Python 3.8+ (3.11+ recommended for the built-in tomllib)
  • Git (optional, if you clone the repo)
  • If you use Astral uv, install/configure uv according to the official docs: https://docs.astral.sh/uv/
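The tomllib note above can be illustrated with a short sketch: Python 3.11+ ships a TOML parser in the standard library, while older versions would need the third-party tomli package as a drop-in substitute.

```python
# On Python >= 3.11, tomllib is in the stdlib; on 3.8-3.10 you would
# instead `pip install tomli` and import it under the same name.
try:
    import tomllib
except ModuleNotFoundError:
    import tomli as tomllib  # third-party fallback for older Pythons

# Parse a tiny pyproject-style snippet (illustrative content only).
pyproject = tomllib.loads("""
[project]
name = "picocode-demo"
requires-python = ">=3.8"
""")
print(pyproject["project"]["name"])  # picocode-demo
```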

Installation and run commands

First step: create your .env (copy .env.example -> .env and edit)
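As an illustration of the kind of settings the .env carries, here is a sketch. The variable names below are assumptions made for illustration only; check .env.example in the repo for the actual keys.

```
# Illustrative .env sketch (key names are assumptions, not the real ones;
# see .env.example for the actual keys)

# OpenAI-compatible embedding endpoint used during indexing
EMBEDDING_BASE_URL=http://localhost:11434/v1
EMBEDDING_API_KEY=changeme

# Coding-model endpoint used to answer questions
CODING_MODEL_BASE_URL=http://localhost:11434/v1
CODING_MODEL_API_KEY=changeme
```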

Astral uv

  • Follow Astral uv installation instructions first:https://docs.astral.sh/uv/
  • Typical flow (after uv is installed and you are in the project directory):

```
uv pip install -r pyproject.toml
uv run python ./main.py
```

Notes:

  • The exact uv subcommands depend on the uv version/configuration. Check the Astral uv docs for the exact syntax for your uv CLI release. The analyzer only needs a Python executable in the venv to run `python -m pip list --format=json`; uv typically provides or creates that venv.
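The dependency scan the note above refers to boils down to running pip in the project venv and parsing its JSON output. A minimal sketch (run here against the current interpreter rather than a separate venv, which is an assumption for demonstration purposes):

```python
import json
import subprocess
import sys

# Run `python -m pip list --format=json` and parse the result.
# `pip list --format=json` emits a JSON array of {"name": ..., "version": ...}.
out = subprocess.run(
    [sys.executable, "-m", "pip", "list", "--format=json"],
    capture_output=True, text=True, check=True,
).stdout

packages = {pkg["name"]: pkg["version"] for pkg in json.loads(out)}
print(f"{len(packages)} packages installed")
```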

Using plain virtualenv / pip (fallback)

  • Create a virtual environment and install the dependencies listed in pyproject.toml with your preferred tool.

```
# create venv
python -m venv .venv
# activate (UNIX)
source .venv/bin/activate
# activate (Windows PowerShell)
.venv\Scripts\Activate.ps1
# install the project and its dependencies from pyproject.toml
python -m pip install .
# run the server
python ./main.py
```

Using Poetry

```
poetry install
poetry run python main.py
```
