🚀 A powerful Flutter-based AI chat application that lets you run LLMs directly on your mobile device or connect to local model servers. Features offline model execution, Ollama/LLMStudio integration, and a beautiful modern UI. Privacy-focused, cross-platform, and fully open source.
PocketLLM is a cross-platform assistant that pairs a Flutter application with a FastAPI backend to deliver secure, low-latency access to large language models. Users can connect their own provider accounts, browse real-time catalogues, import models, and chat across mobile and desktop targets with a shared experience.
PocketLLM focuses on three pillars:
- Unified catalogue – aggregate models from OpenAI, Groq, OpenRouter, and ImageRouter using official SDKs with per-user API keys.
- Bring your own keys – users activate providers securely; secrets are encrypted at rest and never fall back to environment credentials.
- Consistent chat experience – Flutter renders the same responsive interface on Android, iOS, macOS, Windows, Linux, and the web.
The backend exposes REST APIs that the Flutter client consumes. A Supabase instance stores provider configurations, encrypted secrets, and chat history.
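As a rough illustration of the data the backend persists, a provider-configuration row could be mirrored by a Python dataclass like the ones kept alongside the Supabase tables. The field names here are hypothetical, not the actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProviderConfig:
    """Illustrative mirror of a per-user provider configuration row.

    The real backend stores the API key only as ciphertext; plaintext
    secrets never reach the database.
    """
    user_id: str
    provider: str              # e.g. "openai", "groq", "openrouter"
    encrypted_api_key: str     # Fernet ciphertext, never plaintext
    base_url: Optional[str] = None   # optional provider base URL override
    is_active: bool = False          # set once the key validates
```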
| Area | Highlights |
|---|---|
| Model management | Dynamic `/v1/models` endpoint returns live catalogues with helpful status messaging when keys are missing or filters remove all results. Users can import, favourite, and set defaults. |
| Provider operations | Granular activation flows validate API keys with official SDKs, support base URL overrides, and expose a status dashboard. |
| Chat experience | Streaming responses, Markdown rendering, inline code blocks, and token accounting. |
| Security | Secrets encrypted using Fernet + project key, strict error messages when configuration is incomplete, and no environment fallback for user operations. |
| Observability | Structured logging across services and catalogue caching with per-provider metrics. |
| Onboarding & referrals | Invite-gated signup, `/v1/waitlist` applications, backend-validated invite codes, and a Flutter referral center for sharing codes and tracking rewards. |
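The Fernet-based secret handling can be sketched as below, using the `ENCRYPTION_KEY` project key that the backend setup configures. The helper names are illustrative, not the actual implementation:

```python
import os
from cryptography.fernet import Fernet

def get_cipher() -> Fernet:
    # ENCRYPTION_KEY must be a urlsafe-base64-encoded 32-byte Fernet key.
    # Note: no environment fallback to shared credentials for user secrets;
    # missing configuration is a hard error.
    key = os.environ.get("ENCRYPTION_KEY")
    if key is None:
        raise RuntimeError("ENCRYPTION_KEY is not configured")
    return Fernet(key)

def encrypt_secret(plaintext: str) -> str:
    """Encrypt a user-supplied API key before it is stored at rest."""
    return get_cipher().encrypt(plaintext.encode()).decode()

def decrypt_secret(ciphertext: str) -> str:
    """Decrypt a stored secret just-in-time for a provider call."""
    return get_cipher().decrypt(ciphertext.encode()).decode()
```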
```
PocketLLM
├── lib/                  # Flutter client (Riverpod, GoRouter, Material 3)
│   ├── component/        # Shared widgets and UI primitives
│   ├── pages/            # Screens including Library, API Keys, Chat
│   └── services/         # State management, API bridges, secure storage
├── pocketllm-backend/    # FastAPI application
│   ├── app/api/          # Versioned routes (/v1)
│   ├── app/services/     # Provider catalogue, auth, jobs, models
│   ├── app/utils/        # Crypto helpers, security utilities
│   └── database/         # Dataclasses mirroring Supabase tables
└── docs/                 # Operational guides and API references
```

| Component | Requirement |
|---|---|
| Flutter | 3.19.6 (see AGENTS.md setup script) |
| Dart | Included with Flutter SDK |
| Python | 3.11+ for FastAPI backend |
| Node / pnpm | Optional for tooling around Supabase migrations |
| Supabase | Service-role key and project URL configured in `.env` |
```bash
cd pocketllm-backend
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env  # configure Supabase credentials and ENCRYPTION_KEY
uvicorn main:app --reload
```
Key endpoint: `GET /v1/models`

```http
GET /v1/models
Authorization: Bearer <JWT>
```

```json
{
  "models": [
    {
      "provider": "openai",
      "id": "gpt-4o",
      "name": "GPT-4 Omni",
      "metadata": { "owned_by": "openai" }
    }
  ],
  "message": null,
  "configured_providers": ["openai"],
  "missing_providers": ["groq", "openrouter", "imagerouter"]
}
```
When no API keys are stored, the endpoint responds with an empty `models` array and a descriptive `message`, enabling the Flutter UI to prompt users to add credentials.
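A minimal sketch of how such a payload might be assembled is shown below. The function and message text are illustrative assumptions, not the actual backend code; only the response shape follows the example above:

```python
from typing import Optional

SUPPORTED_PROVIDERS = ["openai", "groq", "openrouter", "imagerouter"]

def build_models_response(stored_keys: dict[str, str],
                          catalogues: dict[str, list[dict]]) -> dict:
    """Assemble a /v1/models-style payload: list models only for providers
    the user has configured, and explain why the list may be empty."""
    configured = [p for p in SUPPORTED_PROVIDERS if p in stored_keys]
    missing = [p for p in SUPPORTED_PROVIDERS if p not in stored_keys]
    models = [m for p in configured for m in catalogues.get(p, [])]

    message: Optional[str] = None
    if not configured:
        # Descriptive message lets the UI prompt the user to add credentials.
        message = "No provider API keys configured. Add a key to load models."

    return {
        "models": models,
        "message": message,
        "configured_providers": configured,
        "missing_providers": missing,
    }
```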
```bash
cd ..
flutter pub get
flutter run  # chooses a connected device or emulator
```
The API Keys page surfaces provider status, preview masks, and validation results. The Model Library consumes the unified `/v1/models` response and displays grouped catalogues with filtering options.
| Layer | Command |
|---|---|
| Flutter | `flutter analyze && flutter test` |
| Backend | `cd pocketllm-backend && pytest` |
Note: Some integration suites stub external SDKs; install the `openai`, `groq`, and `openrouter` packages locally for full coverage.
- `docs/api-documentation.md` – REST endpoints and schemas.
- `docs/backend-guide.md` – Environment variables, Supabase integration, and deployment playbooks.
- `docs/groq-guide.md` – Official SDK usage for catalogue, chat, audio, and reasoning APIs.
- `docs/frontend_cleanup_tasks.md` – Outstanding UI refinements.
Contributions are welcome! Please review CONTRIBUTING.md and ensure:
- New features include unit or widget tests.
- Backend changes run through `pytest` with optional SDKs installed.
- Documentation and changelogs reflect API or workflow updates.
- Secrets and API keys are never committed.
PocketLLM is released under the MIT License.
Have questions or ideas? Open an issue or join the discussion — we’d love to hear how you are using PocketLLM.