# NMKR-Support-Agentic-Workflow
An intelligent AI-powered support system for NMKR, designed to provide automated, accurate responses to user queries about NMKR's products and services. This system uses CrewAI to orchestrate multiple AI agents that work together to produce comprehensive support responses.
This project implements an AI support system that:
- Processes support requests through REST API and webhooks
- Automatically crawls and analyzes NMKR documentation
- Provides detailed, context-aware responses about NMKR services
- Handles asynchronous processing with a Redis queue
- Integrates with Plain for customer support workflows
## Prerequisites

- Docker and Docker Compose
- Python 3.10 or higher
- OpenAI API key
- Plain webhook secret (for webhook integration)
## Setup

- Clone the repository:
```bash
git clone <repository-url>
cd nmkr_support_v4
```
- Create a `.env` file with your credentials:
```
MODEL=gpt-4o
OPENAI_API_KEY=your_openai_api_key_here
WEBHOOK_SECRET=your_webhook_secret_here
ANTHROPIC_API_KEY=your_anthropic_key_here   # Optional
SPIDER_API_KEY=your_spider_key_here         # Optional
FIRECRAWL_API_KEY=your_firecrawl_key_here   # Optional
```
- Start the services using the convenience script:
```bash
chmod +x start.sh
./start.sh
```
Or manually with Docker Compose:
```bash
# Build and start services
docker-compose up --build -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down
```
curl -X POST"http://localhost:8000/api/support" \ -H"Content-Type: application/json" \ -d'{"query": "How much does it cost to do an Airdrop with NMKR?"}'
Response:
{"job_id":"123-456-789","status":"queued"}
curl"http://localhost:8000/api/support/status/123-456-789"
curl -X POST"http://localhost:8000/api/webhook" \ -H"Content-Type: application/json" \ -H"Plain-Workspace-Id: ws_123" \ -H"Plain-Event-Type: thread.created" \ -H"Plain-Event-Id: evt_123" \ -H"Plain-Signature: your-webhook-signature" \ -d'{ "id": "evt_123", "type": "thread.created", "payload": { "message": { "content": "How much does it cost to do an Airdrop with NMKR?" } } }'
## Project Structure

```
nmkr_support_v4/
├── src/
│   └── nmkr_support_v4/
│       ├── api.py                          # FastAPI application
│       ├── crew.py                         # CrewAI configuration
│       ├── queue_manager.py                # Redis queue management
│       ├── tools/
│       │   └── custom_tool.py              # Web crawling tools
│       ├── links_with_descriptions.json    # NMKR links data
│       └── docs_links_with_descriptions.json
├── docker-compose.yml                      # Docker services configuration
├── Dockerfile                              # Container build instructions
├── requirements.txt                        # Python dependencies
├── setup.py                                # Package configuration
└── start.sh                                # Convenience startup script
```
## Development

- Install in development mode:
```bash
pip install -e .
```
- Run locally without Docker:
```bash
# Start Redis
redis-server

# Start RQ worker
rq worker nmkr_support

# Start API
uvicorn nmkr_support_v4.api:app --reload
```
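With Redis and the worker running, jobs can also be enqueued directly from a Python shell for testing. The dotted job path below is a placeholder; the real callable is whatever `queue_manager.py` and `api.py` actually enqueue.

```python
# Sketch: push a test job onto the same queue the worker listens on.
# "nmkr_support_v4.crew.run_crew" is a hypothetical job path, not necessarily
# the function this project enqueues in practice.
import os

from redis import Redis
from rq import Queue

redis_conn = Redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379"))
queue = Queue("nmkr_support", connection=redis_conn)

job = queue.enqueue(
    "nmkr_support_v4.crew.run_crew",
    "How much does it cost to do an Airdrop with NMKR?",
)
print(job.id, job.get_status())
```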
## Environment Variables

- `MODEL`: OpenAI model to use (default: gpt-4o)
- `OPENAI_API_KEY`: Your OpenAI API key
- `WEBHOOK_SECRET`: Secret for Plain webhook verification
- `REDIS_URL`: Redis connection URL (default: redis://localhost:6379)
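How these variables are read is up to `api.py` and Docker Compose; the sketch below shows one plausible loading pattern with `python-dotenv` and the defaults listed above (an assumption, not the project's actual config module).

```python
# Sketch of configuration loading with the defaults listed above. Assumes
# python-dotenv; under Docker Compose the variables are usually injected
# directly, in which case load_dotenv() simply finds nothing to load.
import os

from dotenv import load_dotenv

load_dotenv()  # read .env from the working directory, if present

MODEL = os.getenv("MODEL", "gpt-4o")
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]                 # required
WEBHOOK_SECRET = os.environ["WEBHOOK_SECRET"]                 # required for /api/webhook
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379")
```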
## Services

- API: FastAPI application serving endpoints
- Worker: RQ worker processing support requests
- Redis: Queue and cache management
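The split keeps the HTTP layer responsive: the API only enqueues work on Redis, and the worker runs the CrewAI pipeline out of band. A stripped-down sketch of that hand-off is shown below; the names are simplified stand-ins for what `api.py` and `queue_manager.py` actually do.

```python
# Illustrative sketch of the API -> Redis -> worker hand-off (simplified names,
# not the project's actual code).
import os

from fastapi import FastAPI
from pydantic import BaseModel
from redis import Redis
from rq import Queue

app = FastAPI()
queue = Queue(
    "nmkr_support",
    connection=Redis.from_url(os.getenv("REDIS_URL", "redis://localhost:6379")),
)


class SupportRequest(BaseModel):
    query: str


@app.post("/api/support")
def create_support_job(request: SupportRequest) -> dict:
    # The API never runs the crew itself; it only hands the query to the queue.
    # The dotted job path is an assumed placeholder for the worker-side function.
    job = queue.enqueue("nmkr_support_v4.crew.run_crew", request.query)
    return {"job_id": job.id, "status": "queued"}
```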
## Contributing

- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
## License

[Your chosen license]