How to Create a Custom Model Context Protocol (MCP) Client Using Gemini


In this tutorial, we will implement a custom Model Context Protocol (MCP) client using Gemini. By the end of this tutorial, you will be able to connect your own AI applications to MCP servers, unlocking powerful new capabilities to supercharge your projects.

Gemini API

We'll be using the Gemini 2.0 Flash model for this tutorial.

To get your Gemini API key, visit Google's Gemini API Key page and follow the instructions.

Once you have the key, store it safely; you'll need it later.

Node.js

Some MCP servers require Node.js to run. Download the latest version of Node.js from nodejs.org

  • Run the installer.
  • Leave all settings at their defaults and complete the installation.

National Park Service API

For this tutorial, we will be exposing the National Park Service MCP server to our client. To use the National Park Service API, you can request an API key by visiting this link and filling out a short form. Once submitted, the API key will be sent to your email.

Make sure to keep this key accessible; we'll be using it shortly.
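As a quick sanity check of your key (a hypothetical sketch, not part of the client we are building), you can construct a request URL for the NPS `/parks` endpoint with the standard library. The base URL and the `api_key` query parameter follow the public NPS API documentation; the park code and limit below are arbitrary examples:

```python
from urllib.parse import urlencode

def nps_parks_url(api_key: str, park_code: str = "yose", limit: int = 5) -> str:
    """Build a request URL for the NPS /parks endpoint.

    The key is passed as the api_key query parameter, per the NPS API docs.
    """
    base = "https://developer.nps.gov/api/v1/parks"
    query = urlencode({"parkCode": park_code, "limit": limit, "api_key": api_key})
    return f"{base}?{query}"

print(nps_parks_url("DEMO_KEY"))
```

Opening the printed URL in a browser (with your real key substituted) should return JSON describing the requested park.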

Installing Python libraries

In the command prompt, enter the following command to install the Python libraries:

pip install mcp python-dotenv google-genai

Creating mcp.json file

Next, create a file named mcp.json.

This file will store configuration details about the MCP servers your client will connect to.

Once the file is created, add the following initial content:

{
  "mcpServers": {
    "nationalparks": {
      "command": "npx",
      "args": ["-y", "mcp-server-nationalparks"],
      "env": {
        "NPS_API_KEY": "<YOUR_NPS_API_KEY>"
      }
    }
  }
}

Replace <YOUR_NPS_API_KEY> with the key you generated.
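A typo in this file will only surface later as a confusing connection error, so it can help to verify the configuration parses before wiring it into the client. This small check is a sketch (not something MCP requires) that loads the file and confirms each server entry has the mandatory `command` field:

```python
import json

def validate_mcp_config(path: str = "mcp.json") -> list:
    """Return the configured server names, raising if the file is malformed."""
    with open(path) as f:
        config = json.load(f)
    servers = config["mcpServers"]
    for name, cfg in servers.items():
        if "command" not in cfg:
            raise ValueError(f"Server '{name}' is missing the 'command' field")
    return list(servers.keys())
```

Running `validate_mcp_config()` in the project directory should print no errors and return `['nationalparks']` for the configuration above.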

Creating .env file

Create a .env file in the same directory as the mcp.json file and enter the following content:

GEMINI_API_KEY = <YOUR_GEMINI_API_KEY>

Replace <YOUR_GEMINI_API_KEY> with the key you generated.
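Under the hood, `load_dotenv` (which our client calls below) reads `KEY=VALUE` lines from the .env file into the process environment. A minimal stdlib-only sketch of that behavior, to make the mechanism concrete (the real python-dotenv handles more edge cases such as quoting and multiline values):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Read KEY=VALUE lines into os.environ (simplified sketch of load_dotenv)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            # Like load_dotenv's default, don't overwrite variables already set
            os.environ.setdefault(key.strip(), value.strip())
```

After loading, the client retrieves the key with `os.getenv("GEMINI_API_KEY")`.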

We will now create a client.py file to implement our MCP client. Make sure that this file is in the same directory as mcp.json and .env.

Basic Client Structure

We will first import the necessary libraries and create a basic client class:

import asyncio
import json
import os
from typing import List, Optional
from contextlib import AsyncExitStack
import warnings
from google import genai
from google.genai import types
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from dotenv import load_dotenv

load_dotenv()
warnings.filterwarnings("ignore", category=ResourceWarning)

def clean_schema(schema):
    # Cleans the schema by keeping only allowed keys
    allowed_keys = {"type", "properties", "required", "description", "title", "default", "enum"}
    return {k: v for k, v in schema.items() if k in allowed_keys}

class MCPGeminiAgent:
    def __init__(self):
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        self.genai_client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
        self.model = "gemini-2.0-flash"
        self.tools = None
        self.server_params = None
        self.server_name = None

The __init__ method initializes the MCPGeminiAgent by setting up an asynchronous exit stack, loading the Gemini API client, and preparing placeholders for model configuration, tools, and server details.

It lays the foundation for managing server connections and interacting with the Gemini model.

Selecting the MCP Server

    async def select_server(self):
        with open('mcp.json', 'r') as f:
            mcp_config = json.load(f)
        servers = mcp_config['mcpServers']
        server_names = list(servers.keys())
        print("Available MCP servers:")
        for idx, name in enumerate(server_names):
            print(f"  {idx+1}. {name}")
        while True:
            try:
                choice = int(input(f"Please select a server by number [1-{len(server_names)}]: "))
                if 1 <= choice <= len(server_names):
                    break
                else:
                    print("That number is not valid. Please try again.")
            except ValueError:
                print("Please enter a valid number.")
        self.server_name = server_names[choice-1]
        server_cfg = servers[self.server_name]
        command = server_cfg['command']
        args = server_cfg.get('args', [])
        env = server_cfg.get('env', None)
        self.server_params = StdioServerParameters(
            command=command,
            args=args,
            env=env
        )

This method prompts the user to choose a server from the available options listed in mcp.json. It loads and prepares the selected server's connection parameters for later use.

Connecting to the MCP Server

    async def connect(self):
        await self.select_server()
        self.stdio_transport = await self.exit_stack.enter_async_context(stdio_client(self.server_params))
        self.stdio, self.write = self.stdio_transport
        self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))
        await self.session.initialize()
        print(f"Successfully connected to: {self.server_name}")
        # List available tools for this server
        mcp_tools = await self.session.list_tools()
        print("\nAvailable MCP tools for this server:")
        for tool in mcp_tools.tools:
            print(f"- {tool.name}: {tool.description}")

This establishes an asynchronous connection to the selected MCP server using stdio transport. It initializes the MCP session and retrieves the available tools from the server.

Handling user queries and tool calls

    async def agent_loop(self, prompt: str) -> str:
        contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
        mcp_tools = await self.session.list_tools()
        tools = types.Tool(function_declarations=[
            {
                "name": tool.name,
                "description": tool.description,
                "parameters": clean_schema(getattr(tool, "inputSchema", {}))
            }
            for tool in mcp_tools.tools
        ])
        self.tools = tools
        response = await self.genai_client.aio.models.generate_content(
            model=self.model,
            contents=contents,
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tools],
            ),
        )
        contents.append(response.candidates[0].content)
        turn_count = 0
        max_tool_turns = 5
        while response.function_calls and turn_count < max_tool_turns:
            turn_count += 1
            tool_response_parts: List[types.Part] = []
            for fc_part in response.function_calls:
                tool_name = fc_part.name
                args = fc_part.args or {}
                print(f"Invoking MCP tool '{tool_name}' with arguments: {args}")
                tool_response: dict
                try:
                    tool_result = await self.session.call_tool(tool_name, args)
                    print(f"Tool '{tool_name}' executed.")
                    if tool_result.isError:
                        tool_response = {"error": tool_result.content[0].text}
                    else:
                        tool_response = {"result": tool_result.content[0].text}
                except Exception as e:
                    tool_response = {"error": f"Tool execution failed: {type(e).__name__}: {e}"}
                tool_response_parts.append(
                    types.Part.from_function_response(
                        name=tool_name,
                        response=tool_response
                    )
                )
            contents.append(types.Content(role="user", parts=tool_response_parts))
            print(f"Added {len(tool_response_parts)} tool response(s) to the conversation.")
            print("Requesting updated response from Gemini...")
            response = await self.genai_client.aio.models.generate_content(
                model=self.model,
                contents=contents,
                config=types.GenerateContentConfig(
                    temperature=1.0,
                    tools=[tools],
                ),
            )
            contents.append(response.candidates[0].content)
        if turn_count >= max_tool_turns and response.function_calls:
            print(f"Stopped after {max_tool_turns} tool calls to avoid infinite loops.")
        print("All tool calls complete. Displaying Gemini's final response.")
        return response

This method sends the user's prompt to Gemini, processes any tool calls returned by the model, executes the corresponding MCP tools, and iteratively refines the response. It manages multi-turn interactions between Gemini and the server tools.
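A note on the clean_schema helper used above: MCP tools describe their inputs as JSON Schema, which can include fields (such as `additionalProperties` or `$schema`) that Gemini's function-declaration format does not accept, so the helper keeps only an allowed subset. A quick standalone illustration (the input schema here is invented for demonstration):

```python
def clean_schema(schema):
    # Keep only the keys accepted in Gemini function declarations
    # (same helper as in client.py)
    allowed_keys = {"type", "properties", "required", "description", "title", "default", "enum"}
    return {k: v for k, v in schema.items() if k in allowed_keys}

raw = {
    "type": "object",
    "properties": {"parkCode": {"type": "string"}},
    "required": ["parkCode"],
    "additionalProperties": False,  # dropped by clean_schema
    "$schema": "http://json-schema.org/draft-07/schema#",  # dropped by clean_schema
}
print(clean_schema(raw))
# {'type': 'object', 'properties': {'parkCode': {'type': 'string'}}, 'required': ['parkCode']}
```

Note that this is a shallow filter: nested schemas inside "properties" are passed through unchanged.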

Interactive Chat Loop

    async def chat(self):
        print(f"\nMCP-Gemini Assistant is ready and connected to: {self.server_name}")
        print("Enter your question below, or type 'quit' to exit.")
        while True:
            try:
                query = input("\nYour query: ").strip()
                if query.lower() == 'quit':
                    print("Session ended. Goodbye!")
                    break
                print("Processing your request...")
                res = await self.agent_loop(query)
                print("\nGemini's answer:")
                print(res.text)
            except KeyboardInterrupt:
                print("\nSession interrupted. Goodbye!")
                break
            except Exception as e:
                print(f"\nAn error occurred: {str(e)}")

This provides a command-line interface where users can submit queries and receive answers from Gemini, continuing until they exit the session.

Cleaning up resources

    async def cleanup(self):
        await self.exit_stack.aclose()

This gracefully closes the asynchronous context and cleans up all open resources, such as the session and connection stack.
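The reason a single aclose() call suffices is that AsyncExitStack unwinds every context it entered, in reverse order, just as nested async with blocks would. A small standalone illustration (the Resource class is invented here, standing in for the stdio transport and MCP session):

```python
import asyncio
from contextlib import AsyncExitStack

closed = []

class Resource:
    """Toy async context manager that records when it is closed."""
    def __init__(self, name):
        self.name = name
    async def __aenter__(self):
        return self
    async def __aexit__(self, *exc):
        closed.append(self.name)

async def demo():
    stack = AsyncExitStack()
    await stack.enter_async_context(Resource("transport"))
    await stack.enter_async_context(Resource("session"))
    # aclose() exits contexts in reverse order: session first, then transport
    await stack.aclose()

asyncio.run(demo())
print(closed)  # ['session', 'transport']
```

This matters for our client: the MCP session is torn down before the stdio transport it depends on, so nothing is left writing to a closed pipe.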

Main entry point

async def main():
    agent = MCPGeminiAgent()
    try:
        await agent.connect()
        await agent.chat()
    finally:
        await agent.cleanup()

if __name__ == "__main__":
    import sys
    import os
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("Session interrupted. Goodbye!")
    finally:
        sys.stderr = open(os.devnull, "w")

This is the main execution logic.

Apart from main(), all other methods are part of the MCPGeminiAgent class. You can find the complete client.py file here.

Run the following command in the terminal to start your client:

python client.py

The client will:

  • Read the mcp.json file to list the available MCP servers.
  • Prompt the user to select one of the listed servers.
  • Connect to the selected MCP server using the provided configuration and environment settings.
  • Interact with the Gemini model through a series of queries and responses.
  • Allow users to issue prompts, execute tools, and process responses iteratively with the model.
  • Provide a command-line interface for users to engage with the system and receive real-time results.
  • Ensure proper cleanup of resources after the session ends, closing connections and releasing memory.

I am a Civil Engineering Graduate (2022) from Jamia Millia Islamia, New Delhi, and I have a keen interest in Data Science, especially Neural Networks and their application in various areas.
