A full-stack AI chatbot using local (Ollama) & cloud (OpenRouter) LLMs. Built with .NET 9 API & Angular 20 UI. Easily run models like PHI-3, Mistral, Gemma, Llama3 locally or online.
AIChatBot is a local and online AI-powered chatbot built using open-source language models. This project supports both locally hosted models (via Ollama) and cloud-based models (via OpenRouter). It demonstrates the integration of AI with a .NET 8 API and an Angular 20 frontend.
Transform your chatbot experience with Knowledge-Based AI! Upload your documents and get contextually aware responses based on your own content.
Key Highlights:
- 📄 Multi-format Document Support: Upload .txt, .md, and .pdf files
- 🔍 Intelligent Document Search: Advanced retrieval algorithms find relevant content
- 💬 Context-Aware Responses: AI answers based on your uploaded documents
- 📊 Source Attribution: See exactly which documents informed each response
- 🗂️ Document Management: Easy upload, view, and delete capabilities
- 👤 User-Specific Collections: Each user maintains their own document library
Experience AI that truly understands YOUR content!
Watch the AIChatBot in action on YouTube:
📺 AIChatBot Demo
To run this project locally, ensure the following:
- Windows 10/11 with WSL support
- Ubuntu 20.04.6 LTS installed via the Microsoft Store
- .NET 8 SDK
- Node.js (v20+)
- Angular CLI (npm install -g @angular/cli)
- Ollama installed in Ubuntu for running local AI models
- Optional: an account on https://openrouter.ai
- Go to Microsoft Store → search for Ubuntu 20.04.6 LTS → Install.
- Open Ubuntu and create your UNIX user account.
curl -fsSL https://ollama.com/install.sh | sh
To pull and run the desired models:
Pull models:
ollama pull phi3:latest
ollama pull mistral:latest
ollama pull gemma:2b
ollama pull llama3:latest

Run models:
ollama run phi3
ollama run mistral
ollama run gemma:2b
ollama run llama3

List all pulled models:
ollama list

Stop a running model:
ollama stop phi3

View running models:
ps aux | grep ollama
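The .NET backend reaches these local models through Ollama's HTTP API, which listens on http://localhost:11434 by default. The snippet below is a minimal sketch of such a call, not the repository's actual ChatService code; the model name and prompt are just examples.

```csharp
// Minimal sketch of calling the local Ollama HTTP API from .NET.
// Illustrative only; not the repository's actual ChatService code.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;

var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

// stream = false asks Ollama for a single JSON object instead of a token stream.
var request = new { model = "phi3", prompt = "Explain RAG in one sentence.", stream = false };

var response = await http.PostAsJsonAsync("/api/generate", request);
response.EnsureSuccessStatusCode();

using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
```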
From Ubuntu terminal:
shutdown now
Or simply close the terminal window if you don’t need a full shutdown.
Go to https://openrouter.ai and sign up.
Navigate to API Keys in your profile and generate an API key.
Set this key as an environment variable in your API project:
export OPENROUTER_API_KEY=your_key_here
Models used:
google/gemma-3-27b-it:free
deepseek/deepseek-chat-v3-0324:free
API requests are routed via OpenRouter using this key, supporting seamless AI chat.
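For reference, a request through OpenRouter's OpenAI-compatible chat completions endpoint could look roughly like the sketch below from the .NET side. It is illustrative only, not the repository's actual service code; the model and message are just examples.

```csharp
// Minimal sketch: send a chat request to OpenRouter using the key from OPENROUTER_API_KEY.
// Illustrative only; the repository's actual service code may differ.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json;

var apiKey = Environment.GetEnvironmentVariable("OPENROUTER_API_KEY")
             ?? throw new InvalidOperationException("OPENROUTER_API_KEY is not set.");

var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

var body = new
{
    model = "google/gemma-3-27b-it:free",
    messages = new[] { new { role = "user", content = "Hello from AIChatBot!" } }
};

var response = await http.PostAsJsonAsync("https://openrouter.ai/api/v1/chat/completions", body);
response.EnsureSuccessStatusCode();

using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement
    .GetProperty("choices")[0]
    .GetProperty("message")
    .GetProperty("content")
    .GetString());
```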
- Navigate to AIChatBot.API/
- Run the following commands:
  dotnet restore
  dotnet build
  dotnet run
- Ensure the appsettings.json file includes your OpenRouter key (ApiKey=YOUR_KEY_HERE); see the configuration sketch after this list.
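The exact configuration shape in this repository's appsettings.json may differ; the following is a generic ASP.NET Core sketch of reading such a key at startup and attaching it to a named HttpClient.

```csharp
// Generic ASP.NET Core pattern for reading the OpenRouter key at startup.
// The key name "ApiKey" and the named client are assumptions for illustration.
var builder = WebApplication.CreateBuilder(args);

// Reads "ApiKey" from appsettings.json, appsettings.{Environment}.json,
// environment variables, etc. (later sources override earlier ones).
var openRouterKey = builder.Configuration["ApiKey"];

builder.Services.AddHttpClient("openrouter", client =>
{
    client.BaseAddress = new Uri("https://openrouter.ai/api/v1/");
    client.DefaultRequestHeaders.Authorization =
        new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", openRouterKey);
});

var app = builder.Build();
app.Run();
```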
- Navigate to AIChatBot.UI/
- Run:
  npm install
  ng serve
- Access the chatbot UI at http://localhost:4200/
The AIChatBot includes advanced RAG capabilities that allow AI models to answer questions based on your uploaded documents. This feature significantly enhances the AI's ability to provide contextually relevant and accurate responses.
The RAG system consists of several key components:
Document Processing Pipeline:
- File upload handling for multiple formats
- Text extraction from PDF, TXT, and MD files
- Content chunking for efficient retrieval
- In-memory indexing with similarity search
Retrieval System:
- Semantic search across document chunks
- Top-K retrieval (configurable, default: 3 chunks)
- Relevance scoring and ranking
- Source attribution and metadata tracking
Generation Enhancement:
- Context-aware prompt construction
- Integration with all supported AI models
- Source citation in responses
- Fallback to general knowledge when needed
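To make these components concrete, here is a small, hypothetical sketch of the retrieval and prompt-construction step. The type and member names (RetrievedChunk, IRagStore.Search, RagPromptBuilder) are illustrative; the repository's actual IRagStore and RagChatService may be shaped differently.

```csharp
// Hypothetical shapes for the retrieval + prompt-construction step described above.
using System.Collections.Generic;
using System.Linq;

public record RetrievedChunk(string DocumentName, string Text, double Score);

public interface IRagStore
{
    // Top-K retrieval: return the K most relevant chunks for a user's query (default 3).
    IReadOnlyList<RetrievedChunk> Search(string userId, string query, int topK = 3);
}

public static class RagPromptBuilder
{
    // Context-aware prompt construction: prepend the retrieved chunks, tagged with
    // their source document names, to the user's question before calling the model.
    public static string Build(string question, IReadOnlyList<RetrievedChunk> chunks)
    {
        if (chunks.Count == 0)
            return question; // fallback to general knowledge when nothing relevant is found

        var context = string.Join("\n\n",
            chunks.Select(c => $"[Source: {c.DocumentName}]\n{c.Text}"));

        return "Answer using only the context below and cite the sources you used.\n\n" +
               $"Context:\n{context}\n\nQuestion: {question}";
    }
}
```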
The RAG system supports various AI models with different levels of effectiveness:
Model Category | Models | RAG Performance |
---|---|---|
Best RAG Support | GPT-3.5 Turbo, Gemini Flash 2.0 (Unlimited), Gemini Flash 2.0 (Limited) | ⭐⭐⭐⭐⭐ |
Good RAG Support | DeepSeek v3, Gemma 3 27B, LLaMA 3 | ⭐⭐⭐⭐ |
Basic RAG Support | PHI-3, Mistral 7B | ⭐⭐⭐ |
Upload Documents:
Supported formats:
- Plain text files (.txt)
- Markdown files (.md)
- PDF documents (.pdf)
Document Management:
- View all uploaded documents
- Delete individual documents
- User-specific document collections
- Automatic indexing upon upload
RAG-Enhanced Chat:
- Select "Knowledge-Based (RAG)" mode
- Ask questions about your uploaded content
- Receive responses with source attribution
- Contextual answers based on document content
Currently, the RAG system uses in-memory storage (InMemoryRagStore), which provides:
- Fast retrieval performance
- Simple deployment setup
- Automatic cleanup on application restart
- User-isolated document collections
Note: For production deployments, consider implementing persistent storage solutions.
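As a rough mental model, the in-memory store can be pictured as a per-user collection of chunks with a simple relevance score. The sketch below is illustrative only (it scores by keyword overlap); the repository's actual InMemoryRagStore may index and score differently.

```csharp
// Illustrative in-memory store with user-isolated collections and keyword-overlap scoring.
// Not the repository's actual InMemoryRagStore implementation.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

public sealed class SimpleInMemoryRagStore
{
    // User isolation: userId -> list of (documentName, chunkText).
    private readonly ConcurrentDictionary<string, List<(string Doc, string Chunk)>> _chunks = new();

    public void Add(string userId, string documentName, IEnumerable<string> chunks)
    {
        var list = _chunks.GetOrAdd(userId, _ => new List<(string Doc, string Chunk)>());
        lock (list) list.AddRange(chunks.Select(c => (documentName, c)));
    }

    public IReadOnlyList<(string Doc, string Chunk, int Score)> Search(string userId, string query, int topK = 3)
    {
        if (!_chunks.TryGetValue(userId, out var list))
            return Array.Empty<(string Doc, string Chunk, int Score)>();

        var terms = query.Split(' ', StringSplitOptions.RemoveEmptyEntries);
        lock (list)
        {
            // Score each chunk by how many query terms it contains, keep the top K.
            return list
                .Select(x => (x.Doc, x.Chunk,
                    Score: terms.Count(t => x.Chunk.Contains(t, StringComparison.OrdinalIgnoreCase))))
                .Where(x => x.Score > 0)
                .OrderByDescending(x => x.Score)
                .Take(topK)
                .ToList();
        }
    }
}
```

Because everything lives in process memory, documents disappear on restart, which matches the automatic-cleanup behaviour noted above.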
Model | Type | Source | Access | RAG Support |
---|---|---|---|---|
PHI-3:latest | Local | Ollama | ollama run | ⭐⭐⭐ |
Mistral:latest | Local | Ollama | ollama run | ⭐⭐⭐ |
Gemma:2b | Local | Ollama | ollama run | ⭐⭐⭐ |
Llama3:latest | Local | Ollama | ollama run | ⭐⭐⭐⭐ |
google/gemma-3-27b-it:free | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐ |
deepseek/deepseek-chat-v3-0324 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐ |
google/gemini-2.0-flash-exp | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
openai/gpt-3.5-turbo-0613 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
google/gemini-2.0-flash-001 | Online | OpenRouter.ai | API Key | ⭐⭐⭐⭐⭐ |
AIChatBot/
│
├── AIChatBot.API/                      # .NET 8 API for chatbot
│   ├── Controllers/                    # API endpoints
│   │   └── DocumentsController.cs      # RAG document upload/management
│   ├── Services/                       # Business logic services
│   │   ├── RagChatService.cs           # RAG-enabled chat functionality
│   │   ├── InMemoryRagStore.cs         # Document indexing and search
│   │   ├── AgentService.cs             # AI tool integration
│   │   └── ChatService.cs              # Standard chat functionality
│   ├── Interfaces/                     # Service contracts
│   │   └── IRagStore.cs                # RAG storage interface
│   └── Migrations/                     # Database schema updates for RAG
├── AIChatBot.UI/                       # Angular 20 UI frontend
│   └── src/app/components/
│       ├── document-upload/            # RAG document upload component
│       ├── chat/                       # Main chat interface
│       └── model-selector/             # AI model and mode selection
└── README.md                           # Project documentation
The AIChatBot supports three advanced operation modes beyond simple chat:
In this mode, the AI can recognize specific tasks in user prompts and use internal tools (functions) to perform actions. Integrated tools include:
Tool Function | Description | Example Prompt |
---|---|---|
CreateFile | Creates a text file with given content | "Create a file called report.txt with the text Hello world." |
FetchWebData | Fetches the HTML/content of a public URL | "Fetch the content of https://example.com" |
SendEmail | Simulates sending an email (console-logged) | "Send an email to john@example.com with subject Hello." |
These functions are executed server-side in .NET, with input parsed from natural-language prompts.
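The snippet below sketches what these server-side tool functions could look like in C#. It mirrors the behaviour described in the table (email sending is only console-logged), but it is not necessarily how the repository implements them.

```csharp
// Illustrative sketch of the server-side tools; not necessarily the repository's implementation.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class ChatTools
{
    // Creates a text file with the given content.
    public static void CreateFile(string fileName, string content) =>
        File.WriteAllText(fileName, content);

    // Fetches the HTML/content of a public URL.
    public static async Task<string> FetchWebData(string url)
    {
        using var http = new HttpClient();
        return await http.GetStringAsync(url);
    }

    // Simulates sending an email by logging to the console.
    public static void SendEmail(string to, string subject) =>
        Console.WriteLine($"[Simulated email] To: {to}, Subject: {subject}");
}
```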
NEW FEATURE: The RAG (Retrieval-Augmented Generation) mode enables AI to answer questions based on your uploaded documents. This powerful feature allows you to:
- Upload Documents: Support for .txt, .md, and .pdf files
- Intelligent Retrieval: Automatically finds relevant content from your documents
- Source Attribution: AI responses include references to source documents
- Document Management: View, organize, and delete uploaded documents
- User-Specific Storage: Each user has their own document collection
- Text Files: .txt - Plain text documents
- Markdown Files: .md - Formatted markdown documents
- PDF Documents: .pdf - Portable Document Format
- Upload Documents: Use the drag-and-drop interface or browse to upload documents
- Select RAG Mode: Choose "Knowledge-Based (RAG)" from the chat mode dropdown
- Ask Questions: Query your documents using natural language
- Get Contextual Answers: Receive AI responses enriched with your document content
User: "What are the key findings in the latest market report?"AI: Based on the provided context from "market_analysis_2025.pdf", the key findings include:1. Consumer spending increased by 15% in Q42. Digital transformation investments rose by 23%3. Supply chain disruptions decreased significantly*Sources: 1 document(s) referenced*
The AI agent is capable of:
- Understanding high-level tasks
- Selecting and invoking appropriate tools
- Providing intelligent responses based on the outcome
This is powered by an AgentService that works with both local LLMs (via Ollama) and cloud models (via OpenRouter) to determine the right function to execute and handle the response.
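A hypothetical sketch of that flow is shown below: the model is prompted to reply with a JSON tool call, which the server parses and dispatches to one of the tools sketched earlier. The prompt format, parsing, and names here are assumptions, not the repository's actual AgentService.

```csharp
// Hypothetical tool-selection flow; the real AgentService may use a different
// prompt format and parsing strategy.
using System;
using System.Text.Json;
using System.Threading.Tasks;

public static class AgentFlow
{
    // 'askModel' stands in for whatever function sends a prompt to Ollama or OpenRouter
    // and returns the raw completion text (see the HTTP examples earlier in this README).
    public static async Task<string> HandleAsync(string userPrompt, Func<string, Task<string>> askModel)
    {
        var planningPrompt =
            "Available tools: CreateFile(fileName, content), FetchWebData(url), SendEmail(to, subject).\n" +
            "Reply ONLY with JSON like {\"tool\":\"CreateFile\",\"args\":{...}} for this request:\n" + userPrompt;

        var reply = await askModel(planningPrompt);

        // Assumes the model returned valid JSON; real code would validate and fall back gracefully.
        using var doc = JsonDocument.Parse(reply);
        var args = doc.RootElement.GetProperty("args");

        switch (doc.RootElement.GetProperty("tool").GetString())
        {
            case "CreateFile":
                ChatTools.CreateFile(args.GetProperty("fileName").GetString()!, args.GetProperty("content").GetString()!);
                return "File created.";
            case "FetchWebData":
                return await ChatTools.FetchWebData(args.GetProperty("url").GetString()!);
            case "SendEmail":
                ChatTools.SendEmail(args.GetProperty("to").GetString()!, args.GetProperty("subject").GetString()!);
                return "Email sent (simulated).";
            default:
                return reply; // no recognized tool: return the model's answer as-is
        }
    }
}
```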
You can toggle between AI modes via the UI:
- Chat-Only Mode
- AI + Tools Mode
- Knowledge-Based (RAG) Mode
- Agent Mode (multi-step planning, coming soon)
- Choose your preferred model type (local or online)
- Start the backend using .NET 8
- Start the frontend using Angular CLI
- Access AIChatBot at http://localhost:4200/
Upload Documents:
- Click the "Documents for RAG" section in the sidebar
- Drag & drop or browse to upload
.txt
,.md
, or.pdf
files - Wait for successful upload confirmation
Enable Knowledge-Based Mode:
- Select "Knowledge-Based (RAG)" from the chat mode dropdown
- Choose an AI model with good RAG support (⭐⭐⭐⭐ or ⭐⭐⭐⭐⭐)
Start Asking Questions:
Examples:"Summarize the key points from my uploaded documents""What does the report say about market trends?""Find information about [specific topic] in my files"
Review Source Attribution:
- AI responses will include "Sources: X document(s) referenced"
- Responses are enriched with content from your uploaded documents
Pull requests and suggestions are welcome! Feel free to fork the repo and enhance it.
This project is open-source and available under the MIT License.