A web UI for Ollama written in Java using Spring Boot, the Vaadin framework, and Ollama4j.
The goal of the project is to give Ollama users coming from a Java and Spring background a fully functional web UI.
This project focuses on the raw capabilities of interacting with various models running on Ollama servers.
```mermaid
flowchart LR
    owui[Ollama4j Web UI]
    o4j[Ollama4j]
    o[Ollama Server]
    owui -->|uses| o4j
    o4j -->|Communicates with| o;
    m[Models]
    subgraph Ollama Deployment
        direction TB
        o -->|Manages| m
    end
```
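Under the hood, every interaction with the models goes through Ollama4j. As a rough illustration of that layer, here is a minimal, standalone sketch; it assumes the `io.github.ollama4j` package layout and an Ollama server reachable at `http://localhost:11434`, and it is not code taken from this project:

```java
import io.github.ollama4j.OllamaAPI;

public class OllamaSmokeTest {
    public static void main(String[] args) throws Exception {
        // Point Ollama4j at the same host address the web UI uses (OLLAMA_HOST_ADDR).
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        ollamaAPI.setRequestTimeoutSeconds(120);

        // Check that the server is reachable, then list the models it manages.
        System.out.println("Ollama reachable: " + ollamaAPI.ping());
        ollamaAPI.listModels().forEach(System.out::println);
    }
}
```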
If you already have an Ollama service running, the easiest way to get started is to use Docker and point it to the host address of Ollama. Find the image tags here.
Run the Docker container by issuing this in your terminal:
```shell
docker run -it \
  -p 9090:8080 \
  -e OLLAMA_HOST_ADDR='http://192.168.10.1:11434' \
  amithkoujalgi/ollama4j-web-ui
```
If you want to start the Ollama service and the Ollama Web UI as Docker containers, create a file called `docker-compose.yaml` and add the following contents to it:
```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ~/ollama:/root/.ollama
    shm_size: 512mb
  ollama4j-web-ui:
    image: amithkoujalgi/ollama4j-web-ui
    ports:
      - "9090:8080"
    environment:
      OLLAMA_HOST_ADDR: 'http://ollama:11434'
```
Then open up your terminal and type in:
```shell
docker-compose -f /path/to/your/docker-compose.yaml up
```
You can then access the Ollama4j Web UI at http://localhost:9090.
Download the latest version from the releases page.
Or, you could download it via the command line. Just make sure to specify the version you want to download.
```shell
VERSION=0.0.1; wget https://github.com/ollama4j/ollama4j-web-ui/releases/download/$VERSION/ollama4j-web-ui-$VERSION.jar
```
Set the environment variables:

```shell
export SERVER_PORT=8080
export OLLAMA_HOST_ADDR=http://localhost:11434
```
Or, if you want to override the base config file, create a file called `application.properties` and add the following configuration. Update the values of `server.port` and `ollama.url` according to your needs.
```properties
server.port=8080
logging.level.org.atmosphere=warn
spring.mustache.check-template-location=false
spring.servlet.multipart.max-file-size=50MB
spring.servlet.multipart.max-request-size=50MB
vaadin.launch-browser=true
vaadin.whitelisted-packages=com.vaadin,org.vaadin,dev.hilla,io.github.ollama4j
ollama.url=http://localhost:11434
ollama.request-timeout-seconds=120
```
Then run the app:

```shell
java -jar ollama4j-web-ui-$VERSION.jar \
  --spring.config.location=/path/to/your/application.properties
```
Then open http://localhost:8080 in your browser to access the Ollama4j Web UI.
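For orientation, the `ollama.*` entries above are plain Spring Boot properties, so they can be bound with `@ConfigurationProperties`. The class below is a hypothetical sketch of such a binding, shown only for illustration; it is not necessarily how this project wires its configuration:

```java
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;

// Hypothetical example: binds ollama.url and ollama.request-timeout-seconds
// from application.properties (relaxed binding maps the kebab-case key to camelCase).
@Configuration
@ConfigurationProperties(prefix = "ollama")
public class OllamaProperties {

    /** Base URL of the Ollama server, e.g. http://localhost:11434 */
    private String url;

    /** How long to wait for a response from Ollama before giving up. */
    private long requestTimeoutSeconds = 120;

    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }

    public long getRequestTimeoutSeconds() { return requestTimeoutSeconds; }
    public void setRequestTimeoutSeconds(long requestTimeoutSeconds) {
        this.requestTimeoutSeconds = requestTimeoutSeconds;
    }
}
```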




- Show errors on the UI. For example, `io.github.ollama4j.exceptions.OllamaBaseException: model "llama3" not found, try pulling it first` (see the sketch after this list).
- Settings pane for configuring default params such as `top-p`, `top-k`, etc.
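As a rough idea of how the error-reporting item could look in a Vaadin view, here is a hypothetical sketch. The view, its wiring, and all names are illustrative and assume Vaadin's `Notification` API plus an `OllamaAPI` client; none of it is taken from this project:

```java
import com.vaadin.flow.component.html.Span;
import com.vaadin.flow.component.notification.Notification;
import com.vaadin.flow.component.orderedlayout.VerticalLayout;
import io.github.ollama4j.OllamaAPI;

// Hypothetical view: surfaces Ollama errors on the UI instead of only logging them.
public class ModelListView extends VerticalLayout {

    public ModelListView() {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434");
        try {
            ollamaAPI.listModels().forEach(model -> add(new Span(model.toString())));
        } catch (Exception e) {
            // Show messages such as 'model "llama3" not found, try pulling it first'
            // to the user as a notification rather than swallowing them.
            Notification.show(e.getMessage(), 5000, Notification.Position.BOTTOM_CENTER);
        }
    }
}
```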
Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.
The project is inspired by the awesome ollama4j-ui project by @AgentSchmecker.
The nomenclature has been adopted from the incredible Ollama project.
Thanks to the amazing contributors