A comprehensive AI-powered platform for football statistics analysis and real-time insights, built with LangChain v3, multiple LLMs, and advanced agent orchestration.
## 🌟 Key Features

### 1. Multi-Model LLM Integration

- **OpenAI GPT-4 Turbo**: Primary model for complex analysis and reasoning
- **Groq Mixtral-8x7b**: Used for real-time processing and initial query routing
- **Model Selection Logic**: Automatic selection based on task complexity and requirements (see the sketch below)
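A minimal sketch of what this two-model setup could look like with LangChain JS chat models. The packages (`@langchain/openai`, `@langchain/groq`) are real, but the `selectModel` helper and its parameters are illustrative assumptions, not the project's actual code:

```typescript
// Illustrative only: two chat models plus a naive task-based selector.
import { ChatOpenAI } from "@langchain/openai";
import { ChatGroq } from "@langchain/groq";

// GPT-4 Turbo for deep historical analysis and reasoning
const analysisModel = new ChatOpenAI({ model: "gpt-4-turbo", temperature: 0.2 });

// Mixtral on Groq for fast routing and real-time questions
const fastModel = new ChatGroq({ model: "mixtral-8x7b-32768", temperature: 0 });

// Hypothetical selector: choose by task type rather than hard-coding a model
export function selectModel(task: "routing" | "realtime" | "analysis") {
  return task === "analysis" ? analysisModel : fastModel;
}
```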
📹 Demo GIF
### 2. Advanced Agent Architecture

#### Supervisor Agent

- Orchestrates the entire query processing pipeline
- Manages agent delegation and task routing (see the sketch below)
- Handles fallback scenarios and error recovery
- Maintains processing state and debugging information
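A rough illustration of that delegation-and-fallback flow. The class shape, agent keys, and routing heuristic are assumptions for the sketch, not the contents of the real `supervisor.agent.ts`:

```typescript
// Illustrative supervisor: route a query, fall back on failure.
interface Agent {
  name: string;
  handle(query: string): Promise<string>;
}

export class SupervisorAgent {
  constructor(private agents: Record<string, Agent>) {}

  async process(query: string): Promise<string> {
    // Naive routing heuristic: live-sounding queries go to the realtime agent
    const target = /\b(live|now|today|current)\b/i.test(query)
      ? this.agents["realtime"]
      : this.agents["analysis"];
    try {
      return await target.handle(query);
    } catch (err) {
      // Fallback path: log for debugging, then retry with the analysis agent
      console.error(`[supervisor] ${target.name} failed:`, err);
      return this.agents["analysis"].handle(query);
    }
  }
}
```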
#### Specialized Agents

- **Analysis Agent**: Historical data analysis and statistical comparisons
- **Realtime Agent**: Live scores and current match statistics
- **Enhancement Agent**: Query refinement and context enrichment
- **Security Agent**: Query validation and scope verification (see the sketch below)
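As one example, the Security Agent's scope verification could be approximated with a lightweight guard call on the fast model. The prompt and `isInScope` function below are purely illustrative, not the project's actual implementation:

```typescript
// Illustrative scope check: ask the fast model whether a query is football-related.
import { ChatGroq } from "@langchain/groq";

const guardModel = new ChatGroq({ model: "mixtral-8x7b-32768", temperature: 0 });

export async function isInScope(query: string): Promise<boolean> {
  const res = await guardModel.invoke(
    `Answer only YES or NO. Is the following question about football/soccer?\n\n${query}`
  );
  // res.content may be a string or structured content; coerce to string first
  return String(res.content).trim().toUpperCase().startsWith("YES");
}
```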
### 3. RAG (Retrieval-Augmented Generation)

- Vector store integration for semantic search (see the sketch below)
- Redis-based document storage
- Dynamic context retrieval based on query relevance
- Automatic document embedding and indexing
- Support for multiple document types (team stats, player stats, tournament data)
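A hedged sketch of that retrieval path using LangChain JS's Redis vector store. The packages and calls (`@langchain/redis`, `@langchain/openai`, `redis`) are standard, but the index name and placeholder document are assumptions, not the project's actual indexing code:

```typescript
// Sketch only: embed a document into Redis and run a semantic search.
import { createClient } from "redis";
import { OpenAIEmbeddings } from "@langchain/openai";
import { RedisVectorStore } from "@langchain/redis";

export async function searchStats(query: string) {
  const client = createClient({ url: process.env.REDIS_URL });
  await client.connect();

  // Embed and index example documents (team stats, player stats, ...)
  const store = await RedisVectorStore.fromTexts(
    ["Example team-stats document for Manchester United."],
    [{ type: "team_stats" }],
    new OpenAIEmbeddings(),
    { redisClient: client, indexName: "football-docs" }
  );

  // Retrieve the most relevant documents for the query
  const docs = await store.similaritySearch(query, 3);
  await client.disconnect();
  return docs;
}
```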
## Installation

```bash
# Clone the repository
git clone <repository-url>

# Install dependencies
npm install

# Configure environment variables
cp .env.example .env
# Edit .env with your API keys

# Start Redis
docker-compose up -d

# Run the application
npm run start
```
## Usage Examples

### CLI Interface

```bash
# Start the CLI
npm run cli

# Example queries:
# - "How did Manchester United perform in 2023?"
# - "Who was Liverpool's top scorer last season?"
# - "Compare Arsenal and Chelsea's recent performance"
```
### API Endpoints

```
# Query endpoint
POST /query
{ "query": "Tell me about Manchester United's performance" }

# Health check
GET /health
```
## 🔧 Configuration

### Environment Variables

- `OPENAI_API_KEY`: OpenAI API key
- `GROQ_API_KEY`: Groq API key
- `REDIS_URL`: Redis connection string
- `PORT`: API server port

### Model Configuration

- Adjust temperature and other parameters in `config/config.ts` (illustrative shape sketched below)
- Configure model selection logic in `supervisor.agent.ts`
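Purely as an illustration of how these settings might hang together, a config module along these lines could read the environment variables above; the field names and defaults are assumptions, not the actual `config/config.ts`:

```typescript
// Illustrative config shape; the real config/config.ts may differ.
export const config = {
  openaiApiKey: process.env.OPENAI_API_KEY ?? "",
  groqApiKey: process.env.GROQ_API_KEY ?? "",
  redisUrl: process.env.REDIS_URL ?? "redis://localhost:6379",
  port: Number(process.env.PORT ?? 3000),
  models: {
    // Per-task model parameters, tuned in one place
    analysis: { name: "gpt-4-turbo", temperature: 0.2 },
    realtime: { name: "mixtral-8x7b-32768", temperature: 0 },
  },
};
```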
## 📚 Documentation

### Agent Documentation

- Each agent has specific responsibilities and capabilities
- Agents can be extended or modified for custom use cases
- New agents can be added by implementing the base agent interface (see the example below)
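For instance, a new agent could be added by implementing an interface roughly like the one below. The interface members and the `TransferNewsAgent` class are hypothetical, so match them to the project's actual base agent contract:

```typescript
// Hypothetical base agent contract and a custom agent built on it.
export interface BaseAgent {
  name: string;
  description: string;
  handle(query: string): Promise<string>;
}

export class TransferNewsAgent implements BaseAgent {
  name = "transfer-news";
  description = "Answers questions about player transfers and rumours";

  async handle(query: string): Promise<string> {
    // Call an LLM or an external transfers API here; stubbed for the example
    return `Transfer news lookup for: ${query}`;
  }
}
```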
### Tool Documentation

- Tools provide specific functionalities
- New tools can be added by implementing the tool interface (example below)
- Tools are automatically discovered and registered
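As an example, a tool can be declared with LangChain JS's `tool` helper from `@langchain/core/tools` (available in recent versions) together with a zod schema; the live-score lookup below is a stub, not an existing tool in this project:

```typescript
// Stub tool: the name/description/schema wiring is standard; the body is a placeholder.
import { tool } from "@langchain/core/tools";
import { z } from "zod";

export const liveScoreTool = tool(
  async ({ team }: { team: string }) => {
    // A real implementation would call a live-scores API here
    return `No live match found for ${team} (stub).`;
  },
  {
    name: "get_live_score",
    description: "Returns the current score for a team's live match",
    schema: z.object({ team: z.string().describe("Team name, e.g. Arsenal") }),
  }
);
```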
### API Documentation

- RESTful endpoints for query processing
- JSON request/response format (client example below)
- Error codes and handling
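A quick client-side sketch of calling the query endpoint from Node 18+; the host, port, and response handling are assumptions based on the default configuration above:

```typescript
// Assumes the API is running locally on port 3000.
async function askApi(query: string) {
  const res = await fetch("http://localhost:3000/query", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  if (!res.ok) {
    // Non-2xx responses surface the API's error code
    throw new Error(`Query failed with status ${res.status}`);
  }
  return res.json();
}

askApi("Tell me about Manchester United's performance").then(console.log);
```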
## 🤝 Contributing

1. Fork the repository
2. Create a feature branch
3. Submit a pull request
## 📝 License

MIT License
## 🙏 Acknowledgments

- LangChain team for the excellent framework
- OpenAI and Groq for their LLM APIs
- Redis for memory management capabilities
## About

Boilerplate for multi-LLM agent applications using LangChain v3 with multi-model reasoning capabilities.