Building an MCP Server and Seamlessly Integrating It with LangGraph

Keywords: LangGraph, MCP server

The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. MCP offers a uniform way to connect AI models to different data sources and tools: it lets you define prompts, resources, and tools that can be accessed programmatically. Combined with LangGraph, a library for building stateful, graph-based workflows, you can create sophisticated AI agents that leverage MCP's capabilities. In this post, we walk through creating an MCP server, interacting with it from a client, and integrating it with LangGraph, with sample outputs to demonstrate the results.

What You Will Learn

  • How to set up an MCP server with prompts, resources, and tools.
  • How to interact with the MCP server from a client.
  • How to integrate the MCP server with LangGraph to build an AI agent.

Environment Setup

  • Install the required packages: mcp, langchain, langgraph, langchain-google-genai, and langchain-mcp-adapters (a pip one-liner is shown below).
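
All of these packages can be installed with pip; a typical one-liner (versions unpinned here, adjust as needed) is:

pip install mcp langchain langgraph langchain-google-genai langchain-mcp-adapters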

Step 1: Create the MCP Server

The MCP server is the backbone of our system, exposing prompts, resources, and tools. Below is a simple MCP server for a math assistant.

Reference: https://github.com/modelcontextprotocol/python-sdk

Server code

We create a server named "Math" using the FastMCP class from the MCP Python SDK, then define its prompts, resources, and tools.

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

# Prompts
@mcp.prompt()
def example_prompt(question: str) -> str:
    """Example prompt description"""
    return f"""
    You are a math assistant. Answer the question.
    Question: {question}
    """

@mcp.prompt()
def system_prompt() -> str:
    """System prompt description"""
    return """
    You are an AI assistant use the tools if needed.
    """

# Resources
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"

@mcp.resource("config://app")
def get_config() -> str:
    """Static configuration data"""
    return "App configuration here"

# Tools
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

if __name__ == "__main__":
    mcp.run()  # Run server via stdio

This code:

  • Initializes an MCP server named "Math".
  • Defines two prompts: example_prompt for math questions and system_prompt for general instructions.
  • Defines two resources: a dynamic resource greeting://{name} and a static resource config://app.
  • Defines two tools: add and multiply for basic math operations.
  • Runs the server over stdio.

For streamable HTTP

  • Call mcp.run(transport="streamable-http") instead:

...

if __name__ == "__main__":
    mcp.run(transport="streamable-http")  # Run server via streamable HTTP

Save this as math_mcp_server.py.
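
Optionally, the server can be exercised in the MCP Inspector before writing any client code. This assumes the SDK's optional CLI extra is installed (pip install "mcp[cli]"), which the original article does not cover:

mcp dev math_mcp_server.py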

Step 2: Create an MCP Client

To interact with the server, we use an MCP client. The client communicates with the server over stdio, letting us list prompts, resources, and tools, and invoke them.

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
import asyncio

# Math Server Parameters
server_params = StdioServerParameters(
    command="python",
    args=["math_mcp_server.py"],
    env=None,
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available prompts
            response = await session.list_prompts()
            print("\n/////////////////prompts//////////////////")
            for prompt in response.prompts:
                print(prompt)

            # List available resources
            response = await session.list_resources()
            print("\n/////////////////resources//////////////////")
            for resource in response.resources:
                print(resource)

            # List available resource templates
            response = await session.list_resource_templates()
            print("\n/////////////////resource_templates//////////////////")
            for resource_template in response.resourceTemplates:
                print(resource_template)

            # List available tools
            response = await session.list_tools()
            print("\n/////////////////tools//////////////////")
            for tool in response.tools:
                print(tool)

            # Get a prompt
            prompt = await session.get_prompt("example_prompt", arguments={"question": "what is 2+2"})
            print("\n/////////////////prompt//////////////////")
            print(prompt.messages[0].content.text)

            # Read a resource (the result's contents list holds the text)
            result = await session.read_resource("greeting://Alice")
            print("\n/////////////////content//////////////////")
            print(result.contents[0].text)

            # Call a tool
            result = await session.call_tool("add", arguments={"a": 2, "b": 2})
            print("\n/////////////////result//////////////////")
            print(result.content[0].text)

if __name__ == "__main__":
    asyncio.run(main())

Output

Running the client code produces the following output:

Processing request of type ListPromptsRequest

/////////////////prompts//////////////////
name='example_prompt' description='Example prompt description' arguments=[PromptArgument(name='question', description=None, required=True)]
name='system_prompt' description='System prompt description' arguments=[]

Processing request of type ListResourcesRequest

/////////////////resources//////////////////
uri=AnyUrl('config://app') name='get_config' description='Static configuration data' mimeType='text/plain' size=None annotations=None

Processing request of type ListResourceTemplatesRequest

/////////////////resource_templates//////////////////
uriTemplate='greeting://{name}' name='get_greeting' description='Get a personalized greeting' mimeType=None annotations=None

Processing request of type ListToolsRequest

/////////////////tools//////////////////
name='add' description='Add two numbers' inputSchema={'properties': {'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}}, 'required': ['a', 'b'], 'title': 'addArguments', 'type': 'object'} annotations=None
name='multiply' description='Multiply two numbers' inputSchema={'properties': {'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}}, 'required': ['a', 'b'], 'title': 'multiplyArguments', 'type': 'object'} annotations=None

Processing request of type GetPromptRequest

/////////////////prompt//////////////////
You are a math assistant. Answer the question.
Question: what is 2+2

Processing request of type ReadResourceRequest

/////////////////content//////////////////
Hello, Alice!

Processing request of type CallToolRequest

/////////////////result//////////////////
4

This output shows:

  • The available prompts (example_prompt and system_prompt).
  • The available resource (config://app) and resource template (greeting://{name}).
  • The available tools (add and multiply) with their input schemas.
  • The result of calling example_prompt with "what is 2+2".
  • The result of reading the greeting://Alice resource.
  • The result of calling the add tool with inputs a=2 and b=2 (the same pattern extends to multiply; see the sketch after this list).
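
Calling the other tool is a one-line change inside the same session block; a minimal sketch (the operands 6 and 7 are illustrative):

# multiply works the same way as add
result = await session.call_tool("multiply", arguments={"a": 6, "b": 7})
print(result.content[0].text)  # prints 42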

For streamable HTTP

  • Use streamablehttp_client instead of stdio_client:

from mcp.client.streamable_http import streamablehttp_client

# Math server
math_server_url = "http://localhost:8000/mcp"

async def main():
    async with streamablehttp_client(math_server_url) as (read, write, _):
        async with ClientSession(read, write) as session:
            ...

Step 3: Integrate MCP with LangGraph

LangGraph lets us build stateful workflows using a graph-based approach. We can pair the MCP client with LangGraph to create an AI agent that uses the server's tools and prompts.

from typing import List
from typing_extensions import TypedDict
from typing import Annotated

from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import tools_condition, ToolNode
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import AnyMessage, add_messages
from langgraph.checkpoint.memory import MemorySaver

from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_mcp_adapters.resources import load_mcp_resources
from langchain_mcp_adapters.prompts import load_mcp_prompt
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

import asyncio

# Math Server Parameters
server_params = StdioServerParameters(
    command="python",
    args=["math_mcp_server.py"],
    env=None,
)

async def create_graph(session):
    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash", temperature=0, api_key="your_google_api_key")

    tools = await load_mcp_tools(session)
    llm_with_tool = llm.bind_tools(tools)

    system_prompt = await load_mcp_prompt(session, "system_prompt")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_prompt[0].content),
        MessagesPlaceholder("messages")
    ])
    chat_llm = prompt_template | llm_with_tool

    # State Management
    class State(TypedDict):
        messages: Annotated[List[AnyMessage], add_messages]

    # Nodes
    def chat_node(state: State) -> State:
        state["messages"] = chat_llm.invoke({"messages": state["messages"]})
        return state

    # Building the graph
    graph_builder = StateGraph(State)
    graph_builder.add_node("chat_node", chat_node)
    graph_builder.add_node("tool_node", ToolNode(tools=tools))
    graph_builder.add_edge(START, "chat_node")
    graph_builder.add_conditional_edges("chat_node", tools_condition, {"tools": "tool_node", "__end__": END})
    graph_builder.add_edge("tool_node", "chat_node")
    graph = graph_builder.compile(checkpointer=MemorySaver())
    return graph

async def main():
    config = {"configurable": {"thread_id": 1234}}
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Check available tools
            tools = await load_mcp_tools(session)
            print("Available tools:", [tool.name for tool in tools])

            # Check available prompts
            prompts = await load_mcp_prompt(session, "example_prompt", arguments={"question": "what is 2+2"})
            print("Available prompts:", [prompt.content for prompt in prompts])
            prompts = await load_mcp_prompt(session, "system_prompt")
            print("Available prompts:", [prompt.content for prompt in prompts])

            # Check available resources
            resources = await load_mcp_resources(session, uris=["greeting://Alice", "config://app"])
            print("Available resources:", [resource.data for resource in resources])

            # Use the MCP server in the graph
            agent = await create_graph(session)
            while True:
                message = input("User: ")
                response = await agent.ainvoke({"messages": message}, config=config)
                print("AI: " + response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())

Output

Processing request of type ListToolsRequest
Available tools: ['add', 'multiply']

Processing request of type GetPromptRequest
Available prompts: ['\n You are a math assistant. Answer the question.\n Question: what is 2+2\n ']

Processing request of type GetPromptRequest
Available prompts: ['\n You are an AI assistant use the tools if needed.\n ']

Processing request of type ReadResourceRequest
Processing request of type ReadResourceRequest
Available resources: ['Hello, Alice!', 'App configuration here']

Processing request of type ListToolsRequest
Processing request of type GetPromptRequest

User: Hi
AI: Hi there! How can I help you today?
User: what is 2 + 4
Processing request of type CallToolRequest
AI: 2 + 4 = 6

This output shows:

  • The agent listing the available tools (add and multiply) and the prompts.
  • The agent reading the resources (greeting://Alice and config://app).
  • The agent responding to user input, including calling the add tool to compute 2 + 4 = 6 (a non-interactive alternative to the chat loop is sketched after this list).
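
The while True loop in main() drives the agent interactively. For a single, non-interactive call, a minimal sketch reusing the agent and config objects from the code above (these lines would replace the loop inside main()):

# One-shot invocation instead of the input() loop
response = await agent.ainvoke({"messages": "what is 2+2"}, config=config)
print(response["messages"][-1].content)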

Step 4: Integrate Multiple MCP Servers with LangGraph

We can connect to multiple servers using MultiServerMCPClient.

Create another MCP server

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("BMI")

# Tools
@mcp.tool()
def calculate_bmi(weight: float, height: float) -> str:
    """Calculate BMI from weight (kg) and height (m)"""
    # float (not int) parameters so heights such as 1.75 m are representable
    return "BMI: " + str(weight / (height * height))

if __name__ == "__main__":
    mcp.run(transport="streamable-http")

Save this as bmi_mcp_server.py. This server:

  • Initializes an MCP server named "BMI".
  • Defines a calculate_bmi tool that computes BMI from weight (in kilograms) and height (in meters); the parameters are typed as float so fractional heights work (a quick client-side check is sketched after this list).
  • Runs the server over streamable HTTP at http://localhost:8000/mcp.
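
The following sketch calls calculate_bmi directly, without LangGraph, using the same streamablehttp_client pattern shown earlier. It assumes bmi_mcp_server.py is already running on localhost:8000, and the 70 kg / 1.75 m values are purely illustrative:

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client
import asyncio

async def main():
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # 70 kg at 1.75 m should print roughly "BMI: 22.86"
            result = await session.call_tool("calculate_bmi", arguments={"weight": 70, "height": 1.75})
            print(result.content[0].text)

if __name__ == "__main__":
    asyncio.run(main())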

Multi-Server LangGraph Code (Sessions Closed Per Call)

In the code below we use client.get_tools() and client.get_prompt(), where each tool call opens a new MCP ClientSession.

from typing import List
from typing_extensions import TypedDict
from typing import Annotated
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import tools_condition, ToolNode
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import AnyMessage, add_messages
from langgraph.checkpoint.memory import MemorySaver
from langchain_mcp_adapters.client import MultiServerMCPClient
import asyncio

client = MultiServerMCPClient(
    {
        "math": {
            "command": "python",
            "args": ["math_mcp_server.py"],
            "transport": "stdio",
        },
        "bmi": {
            "url": "http://localhost:8000/mcp",
            "transport": "streamable_http",
        }
    }
)

async def create_graph():
    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash", temperature=0, api_key="your_google_api_key")
    tools = await client.get_tools()
    llm_with_tool = llm.bind_tools(tools)
    system_prompt = await client.get_prompt(server_name="math", prompt_name="system_prompt")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_prompt[0].content),
        MessagesPlaceholder("messages")
    ])
    chat_llm = prompt_template | llm_with_tool

    # State Management
    class State(TypedDict):
        messages: Annotated[List[AnyMessage], add_messages]

    # Nodes
    def chat_node(state: State) -> State:
        state["messages"] = chat_llm.invoke({"messages": state["messages"]})
        return state

    # Building the graph
    graph_builder = StateGraph(State)
    graph_builder.add_node("chat_node", chat_node)
    graph_builder.add_node("tool_node", ToolNode(tools=tools))
    graph_builder.add_edge(START, "chat_node")
    graph_builder.add_conditional_edges("chat_node", tools_condition, {"tools": "tool_node", "__end__": END})
    graph_builder.add_edge("tool_node", "chat_node")
    graph = graph_builder.compile(checkpointer=MemorySaver())
    return graph

async def main():
    config = {"configurable": {"thread_id": 1234}}
    agent = await create_graph()
    while True:
        message = input("User: ")
        response = await agent.ainvoke({"messages": message}, config=config)
        print("AI: " + response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())

Multi-Server LangGraph Code (Persistent Sessions)

We can keep the sessions to both servers open using client.session().

from typing import List
from typing_extensions import TypedDict
from typing import Annotated
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import tools_condition, ToolNode
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import AnyMessage, add_messages
from langgraph.checkpoint.memory import MemorySaver
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_mcp_adapters.prompts import load_mcp_prompt
import asyncio

client = MultiServerMCPClient(
    {
        "math": {
            "command": "python",
            "args": ["math_mcp_server.py"],
            "transport": "stdio",
        },
        "bmi": {
            "url": "http://localhost:8000/mcp",
            "transport": "streamable_http",
        }
    }
)

async def create_graph(math_session, bmi_session):
    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash", temperature=0, api_key="your_google_api_key")

    math_tools = await load_mcp_tools(math_session)
    bmi_tools = await load_mcp_tools(bmi_session)
    tools = math_tools + bmi_tools
    llm_with_tool = llm.bind_tools(tools)

    system_prompt = await load_mcp_prompt(math_session, "system_prompt")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_prompt[0].content),
        MessagesPlaceholder("messages")
    ])
    chat_llm = prompt_template | llm_with_tool

    # State Management
    class State(TypedDict):
        messages: Annotated[List[AnyMessage], add_messages]

    # Nodes
    def chat_node(state: State) -> State:
        state["messages"] = chat_llm.invoke({"messages": state["messages"]})
        return state

    # Building the graph
    graph_builder = StateGraph(State)
    graph_builder.add_node("chat_node", chat_node)
    graph_builder.add_node("tool_node", ToolNode(tools=tools))
    graph_builder.add_edge(START, "chat_node")
    graph_builder.add_conditional_edges("chat_node", tools_condition, {"tools": "tool_node", "__end__": END})
    graph_builder.add_edge("tool_node", "chat_node")
    graph = graph_builder.compile(checkpointer=MemorySaver())
    return graph

async def main():
    config = {"configurable": {"thread_id": 1234}}
    async with client.session("math") as math_session, client.session("bmi") as bmi_session:
        agent = await create_graph(math_session, bmi_session)
        while True:
            message = input("User: ")
            response = await agent.ainvoke({"messages": message}, config=config)
            print("AI: " + response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())

Output

User: Hi
AI: Hi there! How can I help you today?
User: how many tools do you have
AI: I have 3 tools available: add, multiply, and calculate_bmi.
User: find 5 * 4
Processing request of type CallToolRequest
AI: The answer is 20.

This output shows:

  • The agent recognizing all three tools from the two servers.
  • The agent calling the multiply tool for 5 * 4.

Conclusion

By combining MCP with LangGraph, you can build flexible, modular AI systems that leverage structured prompts and tools within stateful workflows. MCP servers provide a clean interface for defining AI capabilities, while LangGraph orchestrates the flow of information.

This article is adapted from: Creating an MCP Server and Integrating with LangGraph
