
Build MCP Server with Tools

  • Writer: Sumit Dey
  • 4 days ago
  • 4 min read

An MCP server is a service that exposes structured data or functionality (like a database, API, or knowledge base) to be consumed by an MCP client and then used by an LLM.


Here’s the basic flow:

  1. MCP Server: Provides access to data or tools (for example, a CRM database, internal documentation, an API, SharePoint, etc.). It exposes these through MCP-compatible APIs or endpoints.

  2. MCP Client: Connects to one or more MCP servers and sends structured queries or requests (like “get weather info” or “perform a math calculation”) to them.

  3. LLM (Large Language Model): The client takes the data returned from the server and passes it to the LLM, which uses that information to generate natural-language output or take further actions.
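Under the hood, MCP messages travel as JSON-RPC 2.0. As a rough sketch of the wire format (not code from this project), a tool invocation from the client to the server looks like this; the tool name and arguments here (`add`, `a`, `b`) match the Math server built below:

```python
import json

# A minimal sketch of an MCP "tools/call" request from client to server.
# MCP messages are JSON-RPC 2.0 objects with a method and params.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add",
        "arguments": {"a": 2, "b": 3},
    },
}

print(json.dumps(request, indent=2))
```

The server replies with a matching JSON-RPC response carrying the tool's result, which the client hands to the LLM.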


Use Case

Step 1: The user sends a prompt → The chatbot receives it.

Step 2: Based on the prompt, the MCP client decides which MCP server to call (the Weather server or the custom Math server).

Step 3: The MCP client interacts with one or more MCP servers to retrieve data or perform actions, and then uses a Large Language Model (LLM) to generate a reasoning-based output from the server’s response.


Step 4: The final response is returned to the user.




Custom Math MCP server


We have built a custom Math MCP server that supports addition, subtraction, multiplication, and division operations. The MCP server communicates using the stdio transport.



from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def subtract(a: int, b: int) -> int:
    """Subtract two numbers"""
    return a - b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

@mcp.tool()
def division(a: int, b: int) -> float:
    """Divide two numbers"""
    return a / b


# Use standard input/output (stdin and stdout) to receive and respond to tool calls.
if __name__ == "__main__":
    mcp.run(transport="stdio")
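One caveat with the `division` tool above: `a / b` returns a float and raises `ZeroDivisionError` when `b` is 0, which surfaces to the agent as a raw exception. A guarded variant (a standalone sketch with a hypothetical `safe_divide` helper, not part of the server code) could fail with a clearer message instead:

```python
# A guarded division helper (illustrative sketch, not part of the server).
# Rejecting b == 0 explicitly gives the LLM a readable error message
# rather than a raw ZeroDivisionError traceback.
def safe_divide(a: int, b: int) -> float:
    """Divide a by b, rejecting division by zero with a clear error."""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b

print(safe_divide(10, 4))  # → 2.5
```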

Weather MCP Server


A Weather MCP server is a custom MCP server designed to handle weather data requests. It provides a tool for retrieving active weather alerts for a given two-letter US state code (e.g., CA, NJ), using the streamable-http transport protocol to exchange messages with the MCP client.


from mcp.server.fastmcp import FastMCP
from typing import Any
import httpx
# Initialize FastMCP server
mcp = FastMCP("weather")

# Constants
WEATHER_API_BASE = "https://api.weather.gov"
USER_AGENT = "weather-app/1.0"


async def make_nws_request(url: str) -> dict[str, Any] | None:
    """Make a request to the NWS API with proper error handling."""
    headers = {
        "User-Agent": USER_AGENT,
        "Accept": "application/geo+json"
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None


def format_alert(feature: dict) -> str:
    """Format an alert feature into a readable string."""
    props = feature["properties"]
    return f"""
        Event: {props.get('event', 'Unknown')}
        Area: {props.get('areaDesc', 'Unknown')}
        Severity: {props.get('severity', 'Unknown')}
        Description: {props.get('description', 'No description available')}
        Instructions: {props.get('instruction', 'No specific instructions provided')}
        """

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get weather alerts for a US state.

    Args:
        state: Two-letter US state code (e.g. CA, NY)
    """
    url = f"{WEATHER_API_BASE}/alerts/active/area/{state}"
    data = await make_nws_request(url)

    if not data or "features" not in data:
        return "Unable to fetch alerts or no alerts found."

    if not data["features"]:
        return "No active alerts for this state."

    alerts = [format_alert(feature) for feature in data["features"]]
    return "\n---\n".join(alerts)

if __name__ == "__main__":
    mcp.run(transport="streamable-http")

MCP Client


An MCP client (Model Context Protocol client) is the component that connects to one or more MCP servers (like the weather and custom Math MCP servers above) to request specialized data, tools, or computations, and then uses the results (often through an LLM, such as Llama or ChatGPT) to generate contextually aware outputs.


from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_groq import ChatGroq
from dotenv import load_dotenv
from fastapi import FastAPI
import uvicorn
from pydantic import BaseModel

load_dotenv()  # Loads GROQ_API_KEY from a .env file

app = FastAPI(title='MCP AI Agent')

class ChatRequest(BaseModel):
    user_input: str

@app.post("/chat")
async def chat(request: ChatRequest):
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["mathserver.py"],  # Ensure correct absolute path
                "transport": "stdio",
            },
            "weather": {
                "url": "http://localhost:8000/mcp",  # Ensure the weather server is running here
                "transport": "streamable_http",
            },
        }
    )

    # Collect the tools exposed by both MCP servers
    tools = await client.get_tools()
    model = ChatGroq(model="llama-3.3-70b-versatile")  # Groq-hosted Llama

    # Build a ReAct agent that decides which tool to call for each prompt
    agent = create_react_agent(model, tools)

    result = await agent.ainvoke({"messages": [{"role": "user", "content": request.user_input}]})
    return result

if __name__ == "__main__":
    uvicorn.run(app, host='127.0.0.1', port=8002)
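The agent's result contains a `messages` list (human, tool, and AI messages). When FastAPI serializes it to JSON, each message becomes a dict with `type` and `content` keys, which is the shape the Streamlit UI below filters on. A mocked example of extracting the AI replies from that shape (the sample payload here is invented for illustration):

```python
# A mocked, serialized agent result. Assumed shape: LangChain messages
# serialized to dicts carrying "type" and "content" keys.
response_data = {
    "messages": [
        {"type": "human", "content": "What is 2 + 3?"},
        {"type": "tool", "content": "5"},
        {"type": "ai", "content": "The answer is 5."},
    ]
}

# Keep only the AI messages; the same filter appears in the Streamlit UI.
ai_responses = [
    m.get("content", "")
    for m in response_data.get("messages", [])
    if m.get("type") == "ai"
]
print(ai_responses)  # → ['The answer is 5.']
```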

Now it’s time to build the UI with Streamlit.


import streamlit as st
import requests

# Streamlit App Configuration
st.set_page_config(page_title="MCP MultiAgent UI", layout="centered")

# Define API endpoint
API_URL = "http://127.0.0.1:8002/chat"

# Streamlit UI Elements
st.title("MCP Chatbot Agent")
st.write("Interact with the MCP using this interface.")


# Input box for user messages
user_input = st.text_area("Enter your prompt:", height=150, placeholder="Please type your prompt here...")


# Button to send the query
if st.button("Submit"):
    if user_input.strip():
        try:

            with st.spinner("Waiting for the agent's response...", show_time=True):
                # Send the input to the FastAPI backend
                payload = {"user_input": user_input}
                response = requests.post(API_URL, json=payload)

            # Display the response
            if response.status_code == 200:
                response_data = response.json()
                if "error" in response_data:
                    st.error(response_data["error"])
                else:
                    ai_responses = [
                        message.get("content", "")
                        for message in response_data.get("messages", [])
                        if message.get("type") == "ai"
                    ]

                    if ai_responses:
                        st.subheader("Agent Response:")
                        for response_text in ai_responses:
                            st.markdown(response_text)
                    else:
                        st.warning("No AI response found in the agent output.")
            else:
                st.error(f"Request failed with status code {response.status_code}.")
        except Exception as e:
            st.error(f"An error occurred: {e}")
    else:
        st.warning("Please enter a message before clicking 'Submit'.")

Output of the agent when using the Math MCP server



Output of the agent when using the weather MCP server


Conclusion

By separating logic and computation from the client, the MCP server enables modularity, scalability, and reusability across different applications. Whether it’s a math server performing calculations or a weather server providing forecasts, MCP servers enhance the capability of AI-driven systems by integrating external tools and data sources seamlessly.

