
langchain-mcp

langchain-mcp provides Model Context Protocol (MCP) tool support in LangChain, exposing tools from MCP servers as LangChain tools that models can call.

Introduction

langchain-mcp

Model Context Protocol tool calling support in LangChain.

Create a langchain_mcp.MCPToolkit with an mcp.ClientSession, then await toolkit.initialize() and call toolkit.get_tools() to get the list of langchain_core.tools.BaseTool tools.

Example:

 import pathlib

 from mcp import ClientSession, StdioServerParameters
 from mcp.client.stdio import stdio_client

 from langchain_mcp import MCPToolkit

 server_params = StdioServerParameters(
     command="npx",
     args=["-y", "@modelcontextprotocol/server-filesystem", str(pathlib.Path(__file__).parent.parent)],
 )
 async with stdio_client(server_params) as (read, write):
     async with ClientSession(read, write) as session:
         toolkit = MCPToolkit(session=session)
         await toolkit.initialize()
         # `prompt` and `run` (a helper that runs the model with the tools) are defined elsewhere in the demo.
         response = await run(toolkit.get_tools(), prompt)
         print(response)
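
The run helper called above is not part of langchain-mcp itself. As a rough sketch, assuming langchain-groq is installed, run could bind the MCP tools to a chat model and execute one round of tool calls; the model name and loop below are illustrative assumptions, not the library's API.

 # Sketch only: ChatGroq, the llama-3.1-8b-instant model name, and this
 # single-round tool-calling loop are assumptions, not part of langchain-mcp.
 from langchain_core.messages import HumanMessage, ToolMessage
 from langchain_groq import ChatGroq

 async def run(tools, prompt):
     model = ChatGroq(model="llama-3.1-8b-instant")
     model_with_tools = model.bind_tools(tools)
     tools_by_name = {tool.name: tool for tool in tools}

     messages = [HumanMessage(prompt)]
     ai_message = await model_with_tools.ainvoke(messages)
     messages.append(ai_message)

     # Run each tool call the model requested and feed the result back.
     for tool_call in ai_message.tool_calls:
         tool = tools_by_name[tool_call["name"]]
         result = await tool.ainvoke(tool_call["args"])
         messages.append(ToolMessage(content=str(result), tool_call_id=tool_call["id"]))

     # Ask the model to answer using the tool results.
     final = await model_with_tools.ainvoke(messages)
     return final.content
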
Demo

You can run the demo against Groq llama-3.1-8b-instant:

$ export GROQ_API_KEY=xxx
$ uv run tests/demo.py "Read and summarize the file ./LICENSE"
Secure MCP Filesystem Server running on stdio
Allowed directories: [ '/users/aw/projects/rectalogic/langchain-mcp' ]
The file ./LICENSE is a MIT License agreement. It states that the software is provided "as is" without warranty and that the authors and copyright holders are not liable for any claims, damages, or other liability arising from the software or its use.
