# LangChain Python Agent with Model Context Protocol (MCP)
This sample demonstrates a production-ready LangChain agent that uses the OpenAI Responses API with Model Context Protocol (MCP) for tool integration. The agent uses Azure OpenAI GPT-5-mini with Entra ID authentication and PostgreSQL with pgvector for semantic search, and is deployed as microservices on Azure Container Apps.

This is a simplified Python version inspired by the Microsoft AI Tour WRK540 workshop, and uses the same product data and instructions.
## Features

- **LangChain with Responses API** - Uses OpenAI's latest Responses API for native MCP tool support
- **Azure OpenAI GPT-5-mini** - Latest reasoning model deployed via Azure
- **PostgreSQL with pgvector** - Semantic search over the product catalog using vector embeddings
- **Entra ID Authentication** - Keyless authentication using Managed Identity (no API keys)
- **MCP Server** - FastMCP server with database and semantic search tools
- **Microservices Architecture** - Agent and MCP server deployed as independent container apps
- **Infrastructure as Code** - Complete Bicep templates following Azure best practices
- **One-command Deployment** - Deploy everything with `azd up`
## Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                         Azure Cloud                         │
│                                                             │
│  ┌──────────────────────────────────────────────────────┐   │
│  │          Azure Container Apps Environment            │   │
│  │                                                      │   │
│  │   ┌─────────────────┐        ┌──────────────────┐    │   │
│  │   │ Agent Container │────────│   MCP Server     │    │   │
│  │   │ - LangChain     │  HTTP  │ - PostgreSQL     │    │   │
│  │   │ - Responses API │◄───────│ - Semantic       │    │   │
│  │   │                 │        │   Search         │    │   │
│  │   └────────┬────────┘        └────────┬─────────┘    │   │
│  │            │                          │              │   │
│  └────────────┼──────────────────────────┼──────────────┘   │
│               │ Entra ID                 │                  │
│               ▼                          ▼                  │
│  ┌─────────────────────────┐   ┌──────────────────────┐     │
│  │ Azure OpenAI            │   │ PostgreSQL           │     │
│  │ - GPT-5-mini            │   │ - pgvector           │     │
│  │ - text-embedding-       │   │ - Zava database      │     │
│  │   3-small               │   │                      │     │
│  └─────────────────────────┘   └──────────────────────┘     │
└─────────────────────────────────────────────────────────────┘

Local Development:

┌──────────────┐      ┌──────────────┐       ┌──────────────┐
│  agent.py    │─HTTP─│  MCP Server  │       │ Azure OpenAI │
│ (localhost)  │◄─────│  (localhost: │──────▶│   (cloud)    │
│              │      │   8000)      │ Entra │              │
└──────────────┘      └──────────────┘  ID   └──────────────┘
```

## Prerequisites

- Azure Subscription - Create one for free
- Azure Developer CLI (azd) - Install azd
- Azure CLI - Install Azure CLI
- Python 3.11+ - Download Python
- Docker Desktop - Install Docker
- GitHub Codespaces - Click the badge above to start in a pre-configured environment with all tools installed!
```bash
# 1. Login to Azure
az login
azd auth login

# 2. Deploy everything
azd up
```

That's it! The `azd up` command will:
- Provision Azure OpenAI with GPT-5-mini
- Create Container Apps environment
- Build and deploy both the agent and MCP server containers
- Configure networking and managed identity
- Set up monitoring with Application Insights
After deployment completes, you'll see output like:
```
SUCCESS: Your application was provisioned and deployed to Azure!

Endpoints:
  - MCP Server: https://ca-mcp-abc123.region.azurecontainerapps.io
  - Agent: https://ca-agent-abc123.region.azurecontainerapps.io
```

## Local Development

### Option 1: Use Azure Database (Recommended)
```bash
# 1. Deploy to Azure first
azd up

# 2. Get configuration and set MCP server URL
azd env get-values > .env.local
echo "MCP_SERVER_URL=http://localhost:8000" >> .env.local

# 3. Start MCP server (Terminal 1)
cd mcp
source ../.env.local
python app.py

# 4. Start agent server (Terminal 2)
cd agent
source ../.env.local
PORT=8001 python app.py

# 5. Open browser to http://localhost:8001
```
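`azd env get-values` in step 2 above prints one `KEY="value"` pair per line. If you prefer to read that output from Python instead of `source`-ing a file, a minimal parsing sketch follows (the helper name is ours, and the double-quoted-value format is an assumption about azd's output):

```python
def parse_env_values(output: str) -> dict:
    """Parse KEY="value" lines (as emitted by `azd env get-values`) into a dict.

    Assumes one assignment per line with double-quoted values; blank lines
    and lines without '=' are skipped.
    """
    values = {}
    for line in output.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, _, raw = line.partition("=")
        values[key] = raw.strip().strip('"')
    return values


sample = '''
AZURE_OPENAI_ENDPOINT="https://example.openai.azure.com/"
MCP_SERVER_URL="http://localhost:8000"
'''
config = parse_env_values(sample)
print(config["MCP_SERVER_URL"])  # http://localhost:8000
```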
### Option 2: Full Local Stack
```bash
# 1. Start PostgreSQL with pgvector
docker-compose up -d

# 2. Configure environment
cp .env.example .env.local
# Edit .env.local with your Azure OpenAI credentials

# 3. Initialize database
cd data
source ../.env.local
python generate_database.py

# 4. Regenerate embeddings (required if your Azure OpenAI uses a different embedding model)
python regenerate_embeddings.py

# 5. Start MCP server (Terminal 1)
cd mcp
source ../.env.local
python app.py

# 6. Start agent server (Terminal 2)
cd agent
source ../.env.local
PORT=8001 python app.py

# 7. Open browser to http://localhost:8001
```
VS Code Tasks:
The project includes pre-configured VS Code tasks. Press `Cmd+Shift+P` (Mac) or `Ctrl+Shift+P` (Windows/Linux) and select "Tasks: Run Task" to see available tasks:
- Start MCP Server
- Start Agent
- Start PostgreSQL (Docker)
- Initialize Database
Ports:

- MCP Server: `8000`
- Agent/Chat UI: `8001` (set via the `PORT` environment variable)
## Project Structure

```
langchain-agent-python/
├── azure.yaml              # Azure Developer CLI configuration
├── README.md               # This file
├── LICENSE                 # MIT License
├── SECURITY.md             # Security policy
├── SUPPORT.md              # Support information
├── CODE_OF_CONDUCT.md      # Code of conduct
├── .env.example            # Template for environment variables
├── .gitignore              # Git ignore rules
│
├── agent/                  # LangChain Agent Service
│   ├── agent.py            # Main agent with Responses API
│   ├── config.py           # Pydantic configuration
│   ├── instructions.txt    # System prompt for agent
│   ├── requirements.txt    # Python dependencies
│   ├── Dockerfile          # Container definition
│   └── .dockerignore       # Docker ignore rules
│
├── mcp/                    # MCP Server Service
│   ├── mcp_server.py       # FastMCP server with tools
│   ├── requirements.txt    # Python dependencies
│   ├── Dockerfile          # Container definition
│   └── .dockerignore       # Docker ignore rules
│
└── infra/                  # Infrastructure as Code
    ├── main.bicep          # Main deployment template
    ├── main.parameters.json # Default parameters
    ├── abbreviations.json  # Resource name abbreviations
    ├── core/               # Reusable Bicep modules
    │   ├── ai/
    │   │   └── cognitiveservices.bicep
    │   ├── host/
    │   │   ├── container-apps-environment.bicep
    │   │   └── container-registry.bicep
    │   ├── monitor/
    │   │   └── monitoring.bicep
    │   └── security/
    │       ├── managed-identity.bicep
    │       └── role.bicep
    └── app/                # Application-specific modules
        ├── agent.bicep
        └── mcp-server.bicep
```

The agent uses LangChain's `ChatOpenAI` with the new Responses API for native MCP tool support:
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5-mini",
    base_url=f"{endpoint}/openai/v1/",
    api_key=token_provider,
    model_kwargs={"use_responses_api": True},
)

# Bind MCP tools from server
mcp_tools = get_mcp_tools(mcp_server_url)
llm = llm.bind_tools(mcp_tools)
```
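Conceptually, `bind_tools` advertises the MCP tools to the model, and the agent loop then dispatches whatever tool calls the model returns back to the MCP server. A framework-free sketch of that dispatch step (the local stand-in tool and function names here are illustrative, not the sample's actual code — in the sample the calls go over HTTP to the MCP server):

```python
from datetime import datetime, timezone


# Local stand-in for an MCP tool; in the sample this is an HTTP call to the MCP server.
def get_current_utc_date() -> dict:
    return {"utc": datetime.now(timezone.utc).isoformat()}


TOOLS = {"get_current_utc_date": get_current_utc_date}


def dispatch_tool_calls(tool_calls: list) -> list:
    """Execute each tool call the model requested and collect the results."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["name"]]
        results.append({"tool": call["name"], "result": fn(**call.get("args", {}))})
    return results


# Simulated model response requesting one tool call:
out = dispatch_tool_calls([{"name": "get_current_utc_date", "args": {}}])
print(out[0]["tool"])  # get_current_utc_date
```

The results would be fed back to the model as tool messages so it can compose its final answer.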
The MCP server exposes tools via FastMCP:
```python
from fastmcp import FastMCP

mcp = FastMCP("Data Analysis Tools")

@mcp.tool()
def execute_query(query: str) -> dict:
    """Execute SQL query on database."""
    # ... implementation
```
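At its core, a `@mcp.tool()`-style decorator is a registry of callables plus their metadata. A stripped-down sketch of that pattern using a toy class of our own (not FastMCP's real API):

```python
class ToyMCP:
    """Minimal stand-in for a FastMCP-style tool registry."""

    def __init__(self, name: str):
        self.name = name
        self.tools = {}

    def tool(self):
        def register(fn):
            # Register under the function's own name; the docstring
            # serves as the tool description shown to the model.
            self.tools[fn.__name__] = fn
            return fn
        return register


mcp = ToyMCP("Data Analysis Tools")

@mcp.tool()
def execute_query(query: str) -> dict:
    """Execute SQL query on database."""
    return {"rows": [], "query": query}

print(sorted(mcp.tools))  # ['execute_query']
```

Because the decorator returns the function unchanged, registered tools remain directly callable in tests.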
Both services use environment-aware configuration:
- Local: Uses the `.env.local` file, connects to `localhost:8000` for MCP
- Production: Uses environment variables from Container Apps, connects via HTTPS to the cloud MCP server
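The environment-aware switch can be as simple as reading variables with local-development defaults. A sketch using a plain dataclass rather than the sample's Pydantic config (the variable names here are assumptions):

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    """Reads config from the environment, falling back to local-dev defaults."""

    # In Container Apps this env var points at the cloud MCP server over HTTPS;
    # locally it is absent, so we fall back to localhost:8000.
    mcp_server_url: str = field(
        default_factory=lambda: os.getenv("MCP_SERVER_URL", "http://localhost:8000")
    )
    port: int = field(default_factory=lambda: int(os.getenv("PORT", "8001")))


settings = Settings()
print(settings.mcp_server_url)
```

The same class works unchanged in both environments; only the process environment differs.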
- Azure OpenAI: Uses Managed Identity (Entra ID) - no API keys
- MCP Server: Internal Container Apps networking
- Monitoring: Application Insights for observability
The MCP server provides these tools to the agent:
- `get_current_utc_date()` - Returns current UTC timestamp for time-sensitive queries
- `get_table_schemas()` - Returns PostgreSQL database schema information
- `execute_sales_query(query: str)` - Executes SQL queries on the PostgreSQL database
- `semantic_search_products(query_description: str)` - Semantic product search using pgvector
The sample uses Azure PostgreSQL Flexible Server with pgvector for semantic product search.
Key Features:
- 10-table retail schema (products, orders, customers, inventory, etc.)
- Vector embeddings for semantic search using Azure OpenAI
- Pre-populated Zava DIY product catalog with ~424 products
- Natural language queries like "waterproof outdoor electrical boxes"
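Conceptually, pgvector ranks rows by vector distance: a query like "waterproof outdoor electrical boxes" is embedded and compared against each product's stored embedding. A pure-Python sketch of that ranking using cosine similarity (toy 3-dimensional vectors; real embeddings from text-embedding-3-small have far more dimensions):

```python
import math


def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Toy embeddings standing in for the real vectors stored in pgvector.
products = {
    "waterproof junction box": [0.9, 0.1, 0.2],
    "indoor light switch": [0.1, 0.8, 0.3],
}
query_embedding = [0.85, 0.15, 0.25]

best = max(products, key=lambda name: cosine_similarity(query_embedding, products[name]))
print(best)  # waterproof junction box
```

In PostgreSQL, the same ranking is expressed with pgvector's distance operators in an `ORDER BY` clause rather than in application code.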
Data Files Included:
This repository includes pre-generated data files in the `data/` folder, so you don't need to download anything:
- `products_pregenerated.json` - 424 products with pre-computed embeddings
- `customers_pregenerated.json` - 500 sample customers
- `orders_pregenerated.json` - 2000 sample orders
Setup:
- Production: Automatically provisioned during `azd up`
- Local: Run `docker-compose up -d`, then `python data/generate_database.py`
Edit `mcp/mcp_server.py`:
```python
@mcp.tool()
def my_custom_tool(param: str) -> dict:
    """Description of what this tool does."""
    # Your implementation
    return {"result": "data"}
```
The tool is automatically exposed via the `/tools` endpoint in OpenAI function format.
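This automatic exposure works because a Python function's signature and docstring carry enough information to build an OpenAI-style function schema. A rough sketch of that conversion using `inspect` (a simplified type mapping of our own, not FastMCP's actual implementation):

```python
import inspect

# Minimal mapping from Python annotations to JSON Schema types.
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean", dict: "object"}


def to_function_schema(fn) -> dict:
    """Build an OpenAI function-format description from a Python callable."""
    sig = inspect.signature(fn)
    properties = {
        name: {"type": _TYPE_MAP.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }


def my_custom_tool(param: str) -> dict:
    """Description of what this tool does."""
    return {"result": "data"}


schema = to_function_schema(my_custom_tool)
print(schema["name"])  # my_custom_tool
```

A real implementation would also handle defaults, optional parameters, and nested types; this sketch only covers flat, required scalar arguments.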
Edit `infra/main.parameters.json`:

```jsonc
{
  "openAiModelName": {
    "value": "gpt-5-mini"  // Change to gpt-4o, etc.
  }
}
```

Edit `agent/instructions.txt` to change the agent's behavior and personality.
View logs and metrics in Azure Portal:
```bash
# Open Application Insights
azd monitor

# View container logs
az containerapp logs show --name <agent-name> --resource-group <rg-name> --follow
```
**"Deployment quota exceeded"**
→ Try a different region: `azd env set AZURE_LOCATION eastus2`

**"Authentication failed"**
→ Ensure you're logged in: `az login && azd auth login`

**"GPT-5-mini not available"**
→ The model may not be available in your region; try eastus or westus

**"Container apps failing to start"**
→ Check logs: `azd monitor`

**"MCP tools not loading"**
→ Ensure `MCP_SERVER_URL` is accessible from the agent
Remove all Azure resources:
```bash
azd down
```
- Azure OpenAI Documentation
- LangChain Documentation
- Model Context Protocol (MCP)
- FastMCP Framework
- Azure Developer CLI (azd)
- Original Workshop (WRK540)
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA). For details, visit https://cla.opensource.microsoft.com.
This project is licensed under the MIT License - see the LICENSE file for details.
Questions or feedback? Open an issue on GitHub or see SUPPORT.md.