kastorcode/ollama-gui-reactjs

Frontend for the Ollama LLM server, built with React.js and Flux architecture.


👨‍💻 Developed by Matheus Ramalho de Oliveira
🏗️ Brazilian Software Engineer
✉️ kastorcode@gmail.com
🦫 LinkedIn · Instagram


This application is a frontend for Ollama, a tool that makes it easy to run large language models (LLMs), such as Meta's Llama, on your own machine.
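For context, a frontend like this ultimately talks to the Ollama server's HTTP API. As a minimal sketch (assuming the default local server on port 11434 and the llama3 model mentioned later in this README; this is illustrative, not necessarily the app's exact code), a single non-streaming generation request looks like this:

```sh
# Illustrative request against the Ollama HTTP API on its default local address.
# The model name and prompt are placeholders.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Explain the Flux architecture in one sentence.",
  "stream": false
}'
```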


Screenshots


Technologies

Craco
Flux Architecture
React.js
React Hooks Global State
React Router
React Transition Group
Styled Components
TypeScript


Installation and execution

  1. You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
  2. Make a clone of this repository;
  3. Open the project folder in a terminal;
  4. Run yarn to install dependencies;
  5. Run yarn start to launch at http://localhost:3000 (these commands are collected in the sketch below).
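In a terminal, the steps above boil down to roughly the following (assuming git and yarn are installed, and that the repository URL follows the usual github.com/owner/repo pattern):

```sh
# Clone the project, install dependencies and start the development server.
git clone https://github.com/kastorcode/ollama-gui-reactjs.git
cd ollama-gui-reactjs
yarn        # install dependencies
yarn start  # serves the app at http://localhost:3000
```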

Running from GitHub Pages

  1. You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
  2. By default, the app uses the llama3 model; you can install it with the command ollama run llama3;
  3. If you are running the server locally, start it with the following command, which allows CORS requests from GitHub Pages: export OLLAMA_ORIGINS=https://*.github.io && ollama serve (see the block after this list);
  4. Access it at kastorcode.github.io/ollama-gui-reactjs.
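Collected from the steps above, the local server setup for the GitHub Pages build looks like this:

```sh
# Install the default llama3 model (step 2).
ollama run llama3
# Start the Ollama server allowing CORS requests from *.github.io (step 3).
export OLLAMA_ORIGINS=https://*.github.io && ollama serve
```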

<kastor.code/>

