Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Ollama Web is a web-based interface for interacting with models served by Ollama. This README will guide you through setting up and running the project on both Windows and Linux systems.
Below are some snapshots of the Ollama Web interface:
Before running the project, you need to download and set up Ollama on your system.
- Visit the Ollama download page for Windows.
- Download the Windows installer and run it.
- Follow the installation instructions to complete the process.
Open your terminal and run the following command to install Ollama:
curl -fsSL https://ollama.com/install.sh | bash
After installation, verify that Ollama is installed successfully by running:
ollama --version
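Checking the version only confirms that the Ollama CLI is installed. If you also want to confirm that the Ollama server is reachable (it listens on http://localhost:11434 by default), the short Python sketch below queries the API and lists any installed models. It is only an illustrative check, not part of the project:

```python
# Minimal connectivity check for the local Ollama server (illustrative only).
# Assumes the default API address http://localhost:11434; adjust if yours differs.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # endpoint that lists installed models

try:
    with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
        models = json.load(resp).get("models", [])
    print("Ollama is running. Installed models:")
    for model in models:
        print(" -", model.get("name"))
except OSError as exc:
    print("Could not reach the Ollama server:", exc)
```

If no models are listed, pull one first (for example with ollama pull followed by a model name) before using the web interface.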
To get the code, clone the ollama-web repository from GitHub:
git clone https://github.com/shekharP1536/ollama-web.git
cd ollama-web
Once Ollama is installed, you need to install the necessary Python packages to run the project.
Create a virtual environment (optional but recommended) to keep dependencies isolated:
python3 -m venv venv
Activate the virtual environment:
On Windows:
.\venv\Scripts\activate
On Linux/Mac:
source venv/bin/activate
Install the required Python packages:
pip install -r requirements.txt
After cloning the repository, navigate to the project directory and run the index.py script to start the web application:
python index.py
Once the application is running, open your web browser and go to:
http://localhost:5000
This will open the Ollama Web interface, and you can start using it locally!
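For context, the web application works by relaying requests from the browser to the local Ollama API. The repository's index.py contains the actual implementation; the sketch below is only a hypothetical illustration of that pattern, assuming Flask and requests are available and Ollama is serving on its default port:

```python
# Hypothetical illustration of a web server relaying prompts to Ollama.
# This is NOT the repository's index.py; it only sketches the general pattern.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
OLLAMA_GENERATE_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

@app.route("/ask", methods=["POST"])  # example route name, not from the repo
def ask():
    body = request.get_json(force=True)
    payload = {
        "model": body.get("model", "llama3"),  # example model name
        "prompt": body.get("prompt", ""),
        "stream": False,  # request a single JSON reply instead of a stream
    }
    reply = requests.post(OLLAMA_GENERATE_URL, json=payload, timeout=300)
    reply.raise_for_status()
    return jsonify(answer=reply.json().get("response", ""))

if __name__ == "__main__":
    app.run(port=5000)
```

The real application builds chat history, model management, voice features, and the other capabilities listed above on top of this basic request/response loop.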
- Ensure that you've followed all the installation steps for your operating system (Windows or Linux).
- Verify that Python and all required dependencies are installed correctly by running pip install -r requirements.txt.
- If you encounter any issues with Ollama, try restarting your system or checking the Ollama documentation for more help.
Enjoy using Ollama Web! If you have any questions or need assistance, feel free to open an issue on the GitHub repository.
If you have any questions or need assistance, feel free to reach out:
- 📧 Email: mywork1536@gmail.com
- 🔗 LinkedIn: Shekhar