# Baidu Search MCP Server
This is a Baidu search server implementation for the Model Context Protocol (MCP), allowing AI assistants to perform intelligent searches using the Baidu Wenxin API.
## Features
- Supports intelligent search using the Baidu Wenxin API
- Supports selecting from multiple models (ernie-3.5-8k, ernie-4.0-8k, deepseek-r1, deepseek-v3)
- Provides search results and reference sources
- Supports deep search and time-sensitive filtering
## Installation
```bash
npm install @modelcontextprotocol/sdk axios
```
## Configuration
- First, obtain a Baidu Wenxin API key:
  - Visit Baidu AI Cloud
  - Create an application and obtain the API key
- Then set the environment variable:
```bash
export BAIDU_API_KEY=your_api_key_here
```
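As a rough sketch (not the project's actual source code), a server like this would typically read the key from the environment at startup and fail fast when it is missing:

```typescript
// Hypothetical startup check; the real implementation may differ.
const apiKey: string | undefined = process.env.BAIDU_API_KEY;
if (!apiKey) {
  console.error("BAIDU_API_KEY environment variable is not set");
  process.exit(1);
}
```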
## Usage
### Run as a standalone server
```bash
node build/index.js
```
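Because MCP servers of this kind typically communicate over standard input/output, running the command directly is mainly useful for verifying that the build starts; in normal use the server is launched by an MCP client using the configuration below.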
### Use in MCP configuration
Add the following configuration to your MCP settings file:
```json
{
  "mcpServers": {
    "baidu-search": {
      "command": "node",
      "args": ["/path/to/baidu-search-mcp/build/index.js"],
      "env": {
        "BAIDU_API_KEY": "your_api_key_here"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
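In this configuration, `env` passes the API key to the server process, while `disabled` and `autoApprove` are standard MCP client settings controlling whether the server is active and which of its tools may be called without confirmation. After editing the settings file, you will usually need to reload or restart the MCP client for the change to take effect.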
## API
### `baidu_search`
The search tool supports the following parameters:
- `query` (required): Search query text
- `model`: Name of the model to use
  - Optional values: "ernie-3.5-8k", "ernie-4.0-8k", "deepseek-r1", "deepseek-v3"
  - Default value: "ernie-3.5-8k"
- `search_mode`: Search mode
  - Optional values: "auto", "required", "disabled"
  - Default value: "auto"
- `enable_deep_search`: Whether to enable deep search (default: false)
- `search_recency_filter`: Restricts results to a recent time window
  - Optional values: "week", "month", "semiyear", "year"
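As an illustration only, the sketch below shows how a client might call this tool programmatically with the MCP TypeScript SDK; the server path, the client name, and the printed response shape are assumptions rather than part of this project:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio (the path is an example; point it at your build output).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/baidu-search-mcp/build/index.js"],
    env: { BAIDU_API_KEY: process.env.BAIDU_API_KEY ?? "" },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Call the baidu_search tool with the parameters documented above.
  const result = await client.callTool({
    name: "baidu_search",
    arguments: {
      query: "latest developments in large language models",
      model: "ernie-3.5-8k",
      search_mode: "auto",
      enable_deep_search: false,
      search_recency_filter: "month",
    },
  });

  console.log(JSON.stringify(result.content, null, 2));
  await client.close();
}

main().catch(console.error);
```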
## Development
- Clone the repository
- Install dependencies:

  ```bash
  npm install
  ```

- Compile TypeScript:

  ```bash
  npm run build
  ```
## License
MIT License
## Contributing
Issues and Pull Requests are welcome!
## Disclaimer
This project assumes no responsibility for how API keys are used. Please ensure compliance with the Baidu Wenxin API terms of use and policies.