kastorcode/ollama-gui-reactjs
Frontend for the Ollama LLM, built with React.js and Flux architecture.
👨‍💻 Developed by Matheus Ramalho de Oliveira
🏗️ Brazilian Software Engineer
✉️ kastorcode@gmail.com
🦫 LinkedIn • Instagram
This application is a frontend for the LLM (large language model) runner Ollama. Ollama is a tool that makes it easy to run large language models, such as Meta's Llama, locally.
- Craco
- Flux Architecture (see the store sketch after this list)
- React.js
- React Hooks Global State
- React Router
- React Transition Group
- Styled Components
- TypeScript
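The Flux Architecture and React Hooks Global State entries suggest a central store updated through dispatched actions. The snippet below is a minimal, hypothetical sketch of how such a store could be wired with the react-hooks-global-state package; the state shape, action names, and exports are assumptions for illustration, not the repository's actual code.

```ts
// Hypothetical Flux-style store sketch using react-hooks-global-state.
// The state shape and action names are illustrative, not the app's real store.
import { createStore } from 'react-hooks-global-state';

type Message = { role: 'user' | 'assistant'; content: string };
type State = { messages: Message[] };
type Action =
  | { type: 'ADD_MESSAGE'; payload: Message }
  | { type: 'CLEAR_CHAT' };

const initialState: State = { messages: [] };

// Reducer: the single place where state transitions happen.
const reducer = (state: State = initialState, action: Action): State => {
  switch (action.type) {
    case 'ADD_MESSAGE':
      return { ...state, messages: [...state.messages, action.payload] };
    case 'CLEAR_CHAT':
      return { ...state, messages: [] };
    default:
      return state;
  }
};

// Components read state via useGlobalState('messages') and update it by
// dispatching actions: the unidirectional data flow that defines Flux.
export const { dispatch, useGlobalState } = createStore(reducer, initialState);
```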
- You need to have the Ollama server installed on your machine, or configure the app to use an external URL (see the configuration sketch after this list);
- Make a clone of this repository;
- Open the project folder in a terminal;
- Run `yarn` to install dependencies;
- Run `yarn start` to launch at `http://localhost:3000`.
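The repository does not spell out how the external URL is configured, so the following is only an assumed approach: Create React App (which Craco wraps) exposes environment variables prefixed with `REACT_APP_` at build time, and a hypothetical `REACT_APP_OLLAMA_URL` variable could select the server address.

```ts
// Hypothetical config sketch; REACT_APP_OLLAMA_URL is an assumed variable name.
// .env (not committed): REACT_APP_OLLAMA_URL=http://localhost:11434

// Fall back to Ollama's default local port when the variable is unset.
export const OLLAMA_URL =
  process.env.REACT_APP_OLLAMA_URL ?? 'http://localhost:11434';
```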
- You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
- By default, the app uses the llama3 model; you can install it with the command: `ollama run llama3`;
- If you have a local server, run it with the following command to allow CORS requests from the hosted page: `export OLLAMA_ORIGINS=https://*.github.io && ollama serve`;
- Access it at: kastorcode.github.io/ollama-gui-reactjs (a request sketch follows this list).
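For context on what the frontend sends to the server, the sketch below calls Ollama's documented REST endpoint (POST `/api/generate`, default port 11434) with the llama3 model. The helper name and base URL constant are illustrative assumptions; the app's actual request code may differ.

```ts
// Hypothetical request sketch against Ollama's /api/generate endpoint.
const OLLAMA_URL = 'http://localhost:11434'; // or an external URL

interface GenerateResponse {
  model: string;
  response: string;
  done: boolean;
}

async function generate (prompt: string, model = 'llama3'): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // stream: false returns one JSON object instead of a streamed response.
    body: JSON.stringify({ model, prompt, stream: false })
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data: GenerateResponse = await res.json();
  return data.response;
}

// Usage: generate('Why is the sky blue?').then(console.log);
```

When the page is served from GitHub Pages, the OLLAMA_ORIGINS setting above is what allows the browser to make this cross-origin request.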
<kastor.code/>