Model Context Protocol (MCP) is an open standard developed by Anthropic that enables AI models to seamlessly access external tools and resources. It creates a standardized way for AI models to interact with tools, access the internet, run code, and more, without needing custom integrations for each tool or model.
In this tutorial, we'll build a complete MCP server that integrates with Brave Search, and then connect it to Google's Gemini 2.0 model to demonstrate how MCP creates a flexible architecture for AI-powered applications.
Resources: Find the complete code for this tutorial in the MCP Gemini Demo Repository and explore the official MCP documentation.
MCP provides a standardized interface between AI models and external tools. Benefits include:

- Tools are written once and work with any MCP-compatible model, with no custom integration per model
- Models can discover the available tools and their input schemas at runtime
- Tool logic stays cleanly separated from the model and the application around it
MCP works through a client-server architecture:

- An MCP server exposes tools (and other resources) behind a standard protocol
- An MCP client, typically embedded in an AI application, connects to the server, lists its tools, and calls them on the model's behalf
The key insight: any function that can be coded can be exposed as an MCP tool. This opens up endless possibilities, from API integrations to database access, custom calculations, or even control of physical devices.
To start building with MCP, you'll need the MCP SDK and Bun (for fast TypeScript execution):
```bash
mkdir mcp-gemini
cd mcp-gemini
bun init -y
bun add @modelcontextprotocol/sdk@^1.7.0 @google/generative-ai
```
Note: Check out the MCP SDK Repository for the latest version and features.
Our MCP server will expose two tools:

- `brave_web_search`: general web search via the Brave Search API
- `brave_local_search`: search for local businesses and places
But remember, you could expose virtually any function - PDF processing, database queries, email sending, image generation, etc.
The core of MCP is defining tools. Each tool represents a callable function with a defined input schema:
```typescript
// Web Search Tool Definition (simplified)
export const WEB_SEARCH_TOOL: Tool = {
  name: "brave_web_search",
  description: "Performs a web search using the Brave Search API...",
  inputSchema: {
    type: "object",
    properties: {
      query: {
        type: "string",
        description: "Search query",
      },
      count: {
        type: "number",
        description: "Number of results (1-20, default 10)",
        default: 10,
      },
      // Other parameters...
    },
    required: ["query"],
  },
};
```
The key components of a tool definition are:

- `name`: the unique identifier the model uses to call the tool
- `description`: natural-language guidance that tells the model what the tool does and when to use it
- `inputSchema`: a JSON Schema describing the expected arguments, including which ones are `required`
This declarative approach means models can discover what tools are available and how to use them correctly.
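The server will also register a companion local-search tool. Its exact fields live in the repository; here is a minimal sketch, assuming it mirrors the web search definition above (the description and parameter details are assumptions):

```typescript
import { Tool } from "@modelcontextprotocol/sdk/types.js";

// Local Search Tool Definition (sketch -- see the repository for the exact fields)
export const LOCAL_SEARCH_TOOL: Tool = {
  name: "brave_local_search",
  description:
    "Searches for local businesses and places using the Brave Search API...",
  inputSchema: {
    type: "object",
    properties: {
      query: {
        type: "string",
        description: "Local search query (e.g. 'pizza near Central Park')",
      },
      count: {
        type: "number",
        description: "Number of results (1-20, default 5)",
        default: 5,
      },
    },
    required: ["query"],
  },
};
```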
Once you've defined a tool, you need to implement the actual functionality. This is simply a function that receives the tool's arguments and returns a result:
```typescript
// Web search handler (simplified)
export async function webSearchHandler(args: unknown) {
  // Validate arguments
  if (!isValidArgs(args)) {
    throw new Error("Invalid arguments");
  }
  const { query, count } = args;

  // Call your API, function, or any code you want
  const results = await performWebSearch(query, count);

  // Return formatted results in MCP response format
  return {
    content: [{ type: "text", text: results }],
    isError: false,
  };
}
```
The power of MCP lies in this flexibility. Your tool implementation can:

- Call external APIs (as the Brave Search handler does)
- Query databases or the filesystem
- Run local computations or scripts
- Talk to other services or devices
As long as you can code it in TypeScript/JavaScript, you can expose it as an MCP tool.
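The handler above leans on two helpers that are omitted for brevity: `isValidArgs` and `performWebSearch`. Here is a minimal sketch of what they could look like, assuming the public Brave Search API endpoint and its `X-Subscription-Token` header; the way results are flattened into text is also an assumption, so check the repository for the real implementation:

```typescript
// Shape of the arguments the web search tool expects (sketch)
interface WebSearchArgs {
  query: string;
  count?: number;
}

// Type guard used by the handler to narrow `unknown` args
function isValidArgs(args: unknown): args is WebSearchArgs {
  return (
    typeof args === "object" &&
    args !== null &&
    typeof (args as WebSearchArgs).query === "string"
  );
}

// Minimal Brave Search call (sketch -- endpoint and header per Brave's public API)
async function performWebSearch(query: string, count = 10): Promise<string> {
  const url = new URL("https://api.search.brave.com/res/v1/web/search");
  url.searchParams.set("q", query);
  url.searchParams.set("count", String(count));

  const response = await fetch(url, {
    headers: {
      Accept: "application/json",
      "X-Subscription-Token": process.env.BRAVE_API_KEY ?? "",
    },
  });

  if (!response.ok) {
    throw new Error(`Brave API error: ${response.status}`);
  }

  const data = await response.json();

  // Flatten the results into a readable text block for the MCP response
  return (data.web?.results ?? [])
    .map((r: any) => `${r.title}\n${r.url}\n${r.description}`)
    .join("\n\n");
}
```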
The MCP server connects your tool definitions and implementations together:
```typescript
// Create MCP server (simplified)
export function createServer() {
  const server = new Server(
    {
      name: SERVER_CONFIG.name,
      version: SERVER_CONFIG.version,
    },
    {
      capabilities: {
        tools: {},
      },
    }
  );

  // Register handlers
  server.setRequestHandler(ListToolsRequestSchema, async () => ({
    tools: [WEB_SEARCH_TOOL, LOCAL_SEARCH_TOOL],
  }));

  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    const { name, arguments: args } = request.params;

    // Route to the appropriate handler
    switch (name) {
      case "brave_web_search":
        return await webSearchHandler(args);
      case "brave_local_search":
        return await localSearchHandler(args);
      default:
        // Handle unknown tools
        throw new Error(`Unknown tool: ${name}`);
    }
  });

  return server;
}
```
The server provides two key functions:

- Listing tools: the `ListToolsRequestSchema` handler tells clients which tools are available and what their schemas are
- Calling tools: the `CallToolRequestSchema` handler routes each incoming call to the matching implementation
This separation of concerns makes your server maintainable and extensible. Adding new tools is as simple as defining them and adding a new case to the handler.
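One piece the snippets above don't show is the entry point that actually starts the server. Because the client in the next section spawns it with `bun index.ts`, the server has to be attached to a stdio transport; a minimal sketch (the `./server` import path is an assumption) could look like this:

```typescript
// index.ts (sketch) -- wire the server to stdio so clients can spawn it
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { createServer } from "./server"; // assumed location of createServer()

const server = createServer();
const transport = new StdioServerTransport();
await server.connect(transport);

// Log to stderr: with a stdio transport, stdout is reserved for protocol messages
console.error("Brave Search MCP server running on stdio");
```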
To demonstrate our server, let's create a basic client. This shows how tools are called from external systems:
```typescript
// Create MCP client and connect (simplified)
const transport = new StdioClientTransport({
  command: "bun",
  args: ["index.ts"],
});

const client = new Client(
  { name: "brave-search-demo-client", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Connect to server
await client.connect(transport);

// List available tools
const { tools } = await client.listTools();

// Call a tool
const result = await client.callTool({
  name: "brave_web_search",
  arguments: {
    query: "latest AI research papers",
    count: 3,
  },
});
```
This basic flow shows the core interactions:

- Connect to the server over a transport (here, stdio spawning `bun index.ts`)
- List the available tools and their schemas
- Call a tool by name with JSON arguments and receive its result
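The `callTool` result uses the same content format our handler returns, so a quick way to sanity-check it is to print its text blocks:

```typescript
// Print the text content returned by the tool call (shape as defined in our handler)
for (const item of result.content as Array<{ type: string; text?: string }>) {
  if (item.type === "text") {
    console.log(item.text);
  }
}
```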
The real power of MCP comes when we connect it to AI models. Let's integrate our server with Google's Gemini 2.0:
```typescript
// Configure Gemini with function declarations (simplified)
const model = googleGenAi.getGenerativeModel({
  model: "gemini-2.0-pro-exp-02-05",
  tools: [
    {
      functionDeclarations: [
        {
          name: "brave_web_search",
          description: "Search the web using Brave Search API",
          parameters: {
            // Schema matching our MCP tool
          },
        },
        // Other tools...
      ],
    },
  ],
});

// Process user queries with Gemini and MCP tools
async function processQuery(userQuery) {
  // Generate a response with Gemini
  const result = await model.generateContent({
    contents: [{ role: "user", parts: [{ text: userQuery }] }],
  });

  // Check if Gemini wants to call a function
  if (hasFunctionCall(result)) {
    const functionCall = extractFunctionCall(result);

    // Call our MCP tool
    const searchResults = await client.callTool({
      name: functionCall.name,
      arguments: functionCall.args,
    });

    // Send function results back to Gemini for final response
    return await generateFinalResponse(userQuery, functionCall, searchResults);
  }

  return result.response.text();
}
```
This integration demonstrates the true potential of MCP:

- Gemini decides when a tool is needed and which one to call
- The MCP server executes the call and returns structured results
- Gemini turns those results into a grounded, natural-language answer
The result is a seamless experience where the AI model acts as an intelligent router to the most appropriate functionality.
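The snippet above hides three helpers: `hasFunctionCall`, `extractFunctionCall`, and `generateFinalResponse`. Here is a hedged sketch of how they could be written with the `@google/generative-ai` SDK, reusing the `model` and `client` from above; the way the MCP result is flattened into a `functionResponse` is an assumption rather than the repository's exact code:

```typescript
import type { GenerateContentResult } from "@google/generative-ai";

// Did Gemini ask to call a function? (sketch)
function hasFunctionCall(result: GenerateContentResult): boolean {
  return (result.response.functionCalls() ?? []).length > 0;
}

// Pull out the first requested function call (sketch)
function extractFunctionCall(result: GenerateContentResult) {
  return result.response.functionCalls()![0];
}

// Feed the tool output back to Gemini and return its final answer (sketch)
async function generateFinalResponse(
  userQuery: string,
  functionCall: { name: string; args: object },
  searchResults: { content: Array<{ type: string; text?: string }> }
) {
  // Flatten the MCP text content into a single string (assumed format)
  const toolText = searchResults.content
    .filter((c) => c.type === "text")
    .map((c) => c.text)
    .join("\n");

  const followUp = await model.generateContent({
    contents: [
      { role: "user", parts: [{ text: userQuery }] },
      { role: "model", parts: [{ functionCall }] },
      {
        role: "function",
        parts: [
          {
            functionResponse: {
              name: functionCall.name,
              response: { results: toolText },
            },
          },
        ],
      },
    ],
  });

  return followUp.response.text();
}
```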
Deep dive: Learn more about Google Gemini's function calling capabilities in the official documentation.
While our example focused on search tools, remember that MCP can expose any functionality. You could add tools for:

- PDF processing
- Database queries
- Sending email
- Image generation
- Controlling physical devices
Each tool follows the same pattern:

- Define the tool with a name, description, and input schema
- Implement a handler function that does the actual work
- Register both in the server's tool list and call router
This extensibility makes MCP a powerful architecture for building AI applications that can grow with your needs.
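As a concrete illustration, here is a hypothetical `get_current_time` tool (not part of the demo repository) that follows the same three steps:

```typescript
import { Tool } from "@modelcontextprotocol/sdk/types.js";

// 1. Define the tool (hypothetical example)
export const CURRENT_TIME_TOOL: Tool = {
  name: "get_current_time",
  description: "Returns the current server time for a given IANA timezone",
  inputSchema: {
    type: "object",
    properties: {
      timezone: { type: "string", description: "IANA timezone, e.g. 'Europe/Paris'" },
    },
    required: ["timezone"],
  },
};

// 2. Implement the handler
export async function currentTimeHandler(args: unknown) {
  const { timezone } = args as { timezone: string };
  const now = new Date().toLocaleString("en-US", { timeZone: timezone });
  return {
    content: [{ type: "text", text: `Current time in ${timezone}: ${now}` }],
    isError: false,
  };
}

// 3. Register it: add CURRENT_TIME_TOOL to the tools list and a
//    `case "get_current_time": return await currentTimeHandler(args);`
//    branch to the CallToolRequestSchema switch in createServer().
```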
To run the examples, set up the required environment variables:
```bash
export BRAVE_API_KEY="your_brave_api_key"
export GOOGLE_API_KEY="your_google_api_key"
```
Then run the examples with Bun:
```bash
# Basic client
bun examples/basic-client.ts

# Gemini integration
bun examples/gemini-tool-function.ts
```
Resource: Find ready-to-use example code in the MCP Examples Repository.
In this tutorial, we've explored how to build a complete MCP server and integrate it with Google's Gemini 2.0 model. The key takeaways:

- Tools are declared with a name, description, and JSON input schema, then implemented as ordinary functions
- An MCP server simply wires those definitions and handlers together behind list and call handlers
- Any MCP client, from a simple script to a Gemini-powered application, can discover and invoke those tools the same way
This approach to AI architecture offers significant advantages:

- Clear separation between model reasoning and tool execution
- Tools that are reusable across models and applications
- Easy extensibility: new capabilities mean new tools, not new integrations
As the AI ecosystem continues to evolve, standards like MCP will become increasingly important for building interoperable, extensible systems that combine the best of human-coded functions with the power of large language models.