
Model Context Protocol

(Diagram: MCP Client ↔ MCP Server)


MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.

Learn more here.

Video Demo

Key Features

MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides:

  • A growing list of pre-built integrations that your LLM can directly plug into
  • The flexibility to switch between LLM providers and vendors
  • Best practices for securing your data within your infrastructure

Inspector

In development, explore community MCP servers and your own custom MCP servers with the MCP Inspector at http://localhost:6274.

In the Left Sidebar:

  • Select SSE as the Transport Type
  • Enter http://<mcp server>:<MCP_SERVER_PORT>/sse in the URL field
  • Click Connect
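
The SSE endpoint the Inspector connects to follows a fixed shape; a minimal sketch that assembles it from a host name and the MCP_SERVER_PORT environment variable (the helper name and the 8000 fallback are illustrative, not part of the protocol):

```python
import os

def sse_url(host: str, port: str) -> str:
    # The Inspector expects the server's SSE endpoint at the /sse path
    return f"http://{host}:{port}/sse"

# MCP_SERVER_PORT is assumed to be set in the environment; 8000 is a fallback
print(sse_url("localhost", os.environ.get("MCP_SERVER_PORT", "8000")))
```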

Explore the following tabs in the Top Navbar:

  • Resources
  • Prompts
  • Tools

Community MCP Servers

Before building your own custom MCP servers, explore the growing list of hundreds of community MCP servers. With integrations spanning databases, cloud services, and web resources, the perfect fit might already exist.

DBHub

Learn more here. Explore more in Inspector.

Plug this MCP server into your LLM to allow it to:

  • Perform read-only SQL query validation for secure operations
  • Enable deterministic database introspection
    • List schemas
    • List tables in schemas
    • Retrieve table structures
  • Enrich user queries deterministically
    • Ground database-related queries with database schemas
    • Provide SQL templates for translating natural language to SQL
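
The read-only gate above can be pictured with a deliberately naive stdlib sketch; this is illustrative only, not DBHub's actual validation logic:

```python
import re

# Query must start with a read verb...
READ_ONLY = re.compile(r"^\s*(select|show|explain|describe)\b", re.IGNORECASE)
# ...and contain no mutating keywords anywhere
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|grant)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Naive read-only check for a single SQL string."""
    return bool(READ_ONLY.match(sql)) and not FORBIDDEN.search(sql)
```

A real implementation would parse the SQL rather than pattern-match it, but the contract is the same: mutating statements are rejected before they reach the database.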

YouTube

Learn more here. Explore more in Inspector.

Instead of building logic to:

  • Scrape YouTube content
  • Adapt outputs for LLM compatibility
  • Validate tool invocation by the LLM
  • Chain these steps to fetch transcripts from URLs

Simply plug in this MCP server to enable the LLM to:

  • Fetch transcripts from any YouTube URL on demand
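
Under the hood, a transcript tool first has to identify the video from whatever URL shape the user pastes. A stdlib-only sketch of that parsing step (illustrative, not the server's actual code):

```python
from typing import Optional
from urllib.parse import urlparse, parse_qs

def video_id(url: str) -> Optional[str]:
    """Extract the video ID from common YouTube URL shapes."""
    parsed = urlparse(url)
    # Short links: https://youtu.be/<id>
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/") or None
    # Standard links: https://www.youtube.com/watch?v=<id>
    if parsed.hostname and parsed.hostname.endswith("youtube.com"):
        return parse_qs(parsed.query).get("v", [None])[0]
    return None
```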

Custom MCP

If you need a custom MCP server, a template is provided here for reference during development.

./backend/shared_mcp/tools.py
import os

from mcp.server.fastmcp import FastMCP

mcp = FastMCP(
    "MCP Server",
    # Environment variables are strings; coerce the port to an int
    port=int(os.environ["MCP_SERVER_PORT"]),
)


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b


@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
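
The template's tool and resource functions are plain Python, so they can be sanity-checked directly before wiring up a client. Standalone copies are shown here, without the FastMCP decorators:

```python
# Standalone copies of the template's functions, for illustration only
# (the real versions are registered on the FastMCP instance via decorators).
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"

print(add(2, 3))            # → 5
print(get_greeting("Ada"))  # → Hello, Ada!
```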