|
1 | | -# Simple CLI MCP Client Using LangChain / TypeScript [](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) |
| 1 | +# Simple CLI MCP Client to Explore MCP Servers [](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [](https://www.npmjs.com/package/@h1deya/mcp-try-cli) |
2 | 2 |
|
3 | | -This simple [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) |
4 | | -client with command line interface demonstrates the use of MCP server tools by the LangChain ReAct Agent. |
5 | 3 |
|
6 | | -When testing LLM and MCP servers, their settings can be conveniently configured via a configuration file, such as the following: |
| 4 | +**Quickly test and explore MCP servers from the command line!** |
7 | 5 |
|
8 | | -```json5 |
9 | | -{ |
| 6 | +A simple, text-based CLI client for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers built with LangChain and TypeScript. |
| 7 | +Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations. |
| 8 | + |
| 9 | +## Prerequisites |
| 10 | + |
| 11 | +- Node.js 18+ |
| 12 | +- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/) |
| 13 | + installed to run Python-based local (stdio) MCP servers |
| 14 | +- API keys from [Anthropic](https://console.anthropic.com/settings/keys), |
| 15 | + [OpenAI](https://platform.openai.com/api-keys), and/or |
| 16 | + [Google GenAI](https://aistudio.google.com/apikey) |
| 17 | + as needed |
| 18 | + |
| 19 | +## Quick Start |
| 20 | + |
| 21 | +- Install the `mcp-try-cli` tool.
| 22 | +  Installation can take up to a few minutes to complete:
| 23 | + ```bash |
| 24 | + npm install -g @h1deya/mcp-try-cli |
| 25 | + ``` |
| 26 | + |
| 27 | +- Configure the LLM and MCP server settings via the configuration file `llm_mcp_config.json5`:
| 28 | + ```bash |
| 29 | + code llm_mcp_config.json5 |
| 30 | + ``` |
| 31 | + |
| 32 | + The following is a simple configuration for quick testing: |
| 33 | + ```json5 |
| 34 | + { |
10 | 35 | "llm": { |
11 | | - "model_provider": "openai", |
12 | | - "model": "gpt-4o-mini", |
| 36 | + "model_provider": "openai", |
| 37 | + "model": "gpt-4o-mini", |
| 38 | + // "model_provider": "anthropic", |
| 39 | + // "model": "claude-3-5-haiku-latest", |
| 40 | + // "model_provider": "google_genai", |
| 41 | + // "model": "gemini-2.0-flash", |
13 | 42 | }, |
14 | 43 |
|
15 | 44 | "mcp_servers": { |
16 | | - "fetch": { |
17 | | - "command": "uvx", |
18 | | - "args": [ |
19 | | - "mcp-server-fetch" |
20 | | - ] |
21 | | - }, |
22 | | - |
23 | | - "weather": { |
24 | | - "command": "npx", |
25 | | - "args": [ |
26 | | - "-y", |
27 | | - "@h1deya/mcp-server-weather" |
28 | | - ] |
29 | | - }, |
30 | | - |
31 | | - // Auto-detection: tries Streamable HTTP first, falls back to SSE |
32 | | - "remote-mcp-server": { |
33 | | - "url": "https://${SERVER_HOST}:${SERVER_PORT}/..." |
34 | | - }, |
35 | | - |
36 | | - // Example of authentication via Authorization header |
37 | | - "github": { |
38 | | - "type": "http", // recommended to specify the protocol explicitly when authentication is used |
39 | | - "url": "https://api.githubcopilot.com/mcp/", |
40 | | - "headers": { |
41 | | - "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}" |
42 | | - } |
43 | | - }, |
44 | | - } |
45 | | -} |
46 | | -``` |
| 45 | + "weather": { |
| 46 | + "command": "npx", |
| 47 | + "args": ["-y", "@h1deya/mcp-server-weather"] |
| 48 | + }, |
| 49 | + }, |
47 | 50 |
|
48 | | -It leverages a utility function `convertMcpToLangchainTools()` from |
49 | | -[`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools). |
50 | | -This function handles parallel initialization of specified multiple MCP servers |
51 | | -and converts their available tools into an array of LangChain-compatible tools |
52 | | -([`StructuredTool[]`](https://api.js.langchain.com/classes/_langchain_core.tools.StructuredTool.html)). |
| 51 | + "example_queries": [ |
| 52 | + "Tell me how LLMs work in a few sentences", |
| 53 | + "Are there any weather alerts in California?", |
| 54 | + ], |
| 55 | + } |
| 56 | + ``` |
| 57 | + |
| 58 | +- Set up API keys |
| 59 | + ```bash |
| 60 | + echo "ANTHROPIC_API_KEY=sk-ant-... |
| 61 | + OPENAI_API_KEY=sk-proj-... |
| 62 | + GOOGLE_API_KEY=AI..." > .env |
| 63 | + |
| 64 | + code .env |
| 65 | + ``` |
| 66 | + |
| 67 | +- Run the tool |
| 68 | + ```bash |
| 69 | + mcp-try-cli |
| 70 | + ``` |
| 71 | +  By default, it reads the configuration file `llm_mcp_config.json5` from the current directory.
| 72 | +  It then applies the environment variables specified in the `.env` file,
| 73 | +  in addition to those already defined in the environment.
| 74 | + |
| 75 | +## Building from Source |
| 76 | + |
| 77 | +See [README_DEV.md](https://github.com/hideya/mcp-client-langchain-ts/blob/main/README_DEV.md) |
| 78 | + |
| 79 | +## Features |
| 80 | + |
| 81 | +- **Easy setup**: Works out of the box with popular MCP servers |
| 82 | +- **Flexible configuration**: JSON5 config with environment variable support |
| 83 | +- **Multiple LLM providers**: OpenAI, Anthropic, Google Gemini |
| 84 | +- **Command & URL servers**: Support for both local and remote MCP servers |
| 85 | +- **Real-time logging**: Live stdio MCP server logs with customizable log directory |
| 86 | +- **Interactive testing**: Configurable example queries for convenient repeated testing
53 | 87 |
|
54 | | -This client supports both local (stdio) MCP servers as well as |
55 | | -remote (Streamable HTTP/SSE/WebSocket) MCP servers that are accessible via a simple URL. |
56 | | -This client only supports text results of tool calls. |
| 88 | +## Usage |
57 | 89 |
|
58 | | -For the convenience of debugging MCP servers, this client prints local (stdio) MCP server logs to the console. |
| 90 | +### Basic Usage |
59 | 91 |
|
60 | | -LLMs from Anthropic, OpenAI and Google (GenAI) are currently supported. |
| 92 | +```bash |
| 93 | +mcp-try-cli |
| 94 | +``` |
61 | 95 |
|
62 | | -A python version of this MCP client is available |
63 | | -[here](https://github.com/hideya/mcp-client-langchain-py) |
| 96 | +By default, it reads the configuration file `llm_mcp_config.json5` from the current directory.
| 97 | +It then applies the environment variables specified in the `.env` file,
| 98 | +in addition to those already defined in the environment.
| 99 | +It outputs local MCP server logs to the current directory. |
64 | 100 |
|
65 | | -## Prerequisites |
| 101 | +### With Options |
66 | 102 |
|
67 | | -- Node.js 16+ |
68 | | -- npm 7+ (`npx`) to run Node.js-based MCP servers |
69 | | -- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/) |
70 | | - installed to run Python-based MCP servers |
71 | | -- API keys from [Anthropic](https://console.anthropic.com/settings/keys), |
72 | | - [OpenAI](https://platform.openai.com/api-keys), and/or |
73 | | - [Google GenAI](https://aistudio.google.com/apikey) |
74 | | -<!--[Groq](https://console.groq.com/keys)--> |
75 | | - as needed. |
76 | | - |
77 | | -## Setup |
78 | | -1. Install dependencies: |
79 | | - ```bash |
80 | | - npm install |
81 | | - ``` |
82 | | - |
83 | | -2. Setup API keys: |
84 | | - ```bash |
85 | | - cp .env.template .env |
86 | | - ``` |
87 | | - - Update `.env` as needed. |
88 | | - - `.gitignore` is configured to ignore `.env` |
89 | | - to prevent accidental commits of the credentials. |
90 | | - |
91 | | -3. Configure LLM and MCP Servers settings `llm_mcp_config.json5` as needed. |
92 | | - |
93 | | - - [The configuration file format](https://github.com/hideya/mcp-client-langchain-ts/blob/main/llm_mcp_config.json5) |
94 | | - for MCP servers follows the same structure as |
95 | | - [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user), |
96 | | - with one difference: the key name `mcpServers` has been changed |
97 | | - to `mcp_servers` to follow the snake_case convention |
98 | | - commonly used in JSON configuration files. |
99 | | - - The file format is [JSON5](https://json5.org/), |
100 | | - where comments and trailing commas are allowed. |
101 | | - - The format is further extended to replace `${...}` notations |
102 | | - with the values of corresponding environment variables. |
103 | | - - Keep all the credentials and private info in the `.env` file |
104 | | - and refer to them with `${...}` notation as needed. |
| 103 | +```bash |
| 104 | +# Specify the config file to use |
| 105 | +mcp-try-cli --config my-config.json5 |
105 | 106 |
|
| 107 | +# Store local (stdio) MCP server logs in specific directory |
| 108 | +mcp-try-cli --log-dir ./logs |
106 | 109 |
|
107 | | -## Usage |
| 110 | +# Enable verbose logging |
| 111 | +mcp-try-cli --verbose |
108 | 112 |
|
109 | | -Run the app: |
110 | | -```bash |
111 | | -npm start |
| 113 | +# Show help |
| 114 | +mcp-try-cli --help |
112 | 115 | ``` |
113 | 116 |
|
114 | | -Run in verbose mode: |
115 | | -```bash |
116 | | -npm run start:v |
| 117 | +## Supported LLM Providers |
| 118 | + |
| 119 | +- **OpenAI**: `gpt-4o`, `gpt-4o-mini`, etc. |
| 120 | +- **Anthropic**: `claude-sonnet-4-0`, `claude-3-5-haiku-latest`, etc. |
| 121 | +- **Google (GenAI)**: `gemini-2.0-flash`, `gemini-1.5-pro`, etc. |
| 122 | + |
| 123 | +## Configuration |
| 124 | + |
| 125 | +Create an `llm_mcp_config.json5` file:
| 126 | + |
| 127 | +- [The configuration file format](https://github.com/hideya/mcp-client-langchain-ts/blob/main/llm_mcp_config.json5) |
| 128 | + for MCP servers follows the same structure as |
| 129 | + [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user), |
| 130 | + with one difference: the key name `mcpServers` has been changed |
| 131 | + to `mcp_servers` to follow the snake_case convention |
| 132 | + commonly used in JSON configuration files. |
| 133 | +- The file format is [JSON5](https://json5.org/), |
| 134 | + where comments and trailing commas are allowed. |
| 135 | +- The format is further extended to replace `${...}` notations |
| 136 | + with the values of corresponding environment variables. |
| 137 | +- Keep all the credentials and private info in the `.env` file |
| 138 | +  and refer to them with `${...}` notation as needed.
| 139 | + |
| 140 | +```json5 |
| 141 | +{ |
| 142 | + "llm": { |
| 143 | + "model_provider": "openai", |
| 144 | + "model": "gpt-4o-mini", |
| 145 | +    // "model": "o4-mini",
| 146 | + }, |
| 147 | + |
| 148 | + // "llm": { |
| 149 | + // "model_provider": "anthropic", |
| 150 | + // "model": "claude-3-5-haiku-latest", |
| 151 | + // // "model": "claude-sonnet-4-0", |
| 152 | + // }, |
| 153 | + |
| 154 | + // "llm": { |
| 155 | + // "model_provider": "google_genai", |
| 156 | + // "model": "gemini-2.0-flash", |
| 157 | + // // "model": "gemini-2.5-pro-preview-06-05", |
| 158 | + // } |
| 159 | + |
| 160 | + "example_queries": [ |
| 161 | + "Tell me how LLMs work in a few sentences", |
| 162 | + "Are there any weather alerts in California?", |
| 163 | + "Read the news headlines on bbc.com", |
| 164 | + ], |
| 165 | + |
| 166 | + "mcp_servers": { |
| 167 | + // Local MCP server that uses `npx` |
| 168 | + "weather": { |
| 169 | + "command": "npx", |
| 170 | + "args": [ "-y", "@h1deya/mcp-server-weather" ] |
| 171 | + }, |
| 172 | + |
| 173 | + // Another local server that uses `uvx` |
| 174 | + "fetch": { |
| 175 | + "command": "uvx", |
| 176 | + "args": [ "mcp-server-fetch" ] |
| 177 | + }, |
| 178 | + |
| 179 | + "brave-search": { |
| 180 | + "command": "npx", |
| 181 | + "args": [ "-y", "@modelcontextprotocol/server-brave-search" ], |
| 182 | + "env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" } |
| 183 | + }, |
| 184 | + |
| 185 | + // Remote MCP server via URL |
| 186 | + // Auto-detection: tries Streamable HTTP first, falls back to SSE |
| 187 | + "remote-mcp-server": { |
| 188 | + "url": "https://api.example.com/..." |
| 189 | + }, |
| 190 | + |
| 191 | + // Server with authentication |
| 192 | + "github": { |
| 193 | + "type": "http", // recommended to specify the protocol explicitly when authentication is used |
| 194 | + "url": "https://api.githubcopilot.com/mcp/", |
| 195 | + "headers": { |
| 196 | + "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}" |
| 197 | + } |
| 198 | + } |
| 199 | + } |
| 200 | +} |
117 | 201 | ``` |
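The `${...}` notation in the config above is replaced with the values of the corresponding environment variables before use. The behavior can be sketched roughly as follows — this is a hypothetical illustration, not the tool's actual implementation, and the name `expandEnvVars` is invented for the example:

```typescript
// Hypothetical sketch of the `${VAR}` expansion applied to config values.
// Not the actual implementation; `expandEnvVars` is an invented name.
function expandEnvVars(text: string): string {
  return text.replace(/\$\{(\w+)\}/g, (placeholder, name: string) => {
    const value = process.env[name];
    // Leave the placeholder untouched if the variable is not defined,
    // so a missing key is easy to spot.
    return value !== undefined ? value : placeholder;
  });
}
```

For instance, with `GITHUB_PERSONAL_ACCESS_TOKEN` defined in `.env`, the header value `"Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"` would expand to `"Bearer <your token>"`.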
118 | 202 |
|
119 | | -See commandline options: |
| 203 | +### Environment Variables |
| 204 | + |
| 205 | +Create a `.env` file for API keys: |
| 206 | + |
120 | 207 | ```bash |
121 | | -npm run start:h |
| 208 | +OPENAI_API_KEY=sk-proj-...
| 209 | +ANTHROPIC_API_KEY=sk-ant-...
| 210 | +GOOGLE_API_KEY=AI... |
| 211 | + |
| 212 | +# Other services as needed |
| 213 | +GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_... |
| 214 | +BRAVE_API_KEY=BSA... |
122 | 215 | ``` |
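How the `.env` file might be picked up can be sketched as a minimal dotenv-style loader. This is an illustrative assumption (the tool's real loading logic may differ, and `loadDotEnv` is an invented name); the sketch assumes variables already set in the environment are not overridden by the file:

```typescript
import * as fs from "node:fs";

// Minimal dotenv-style loader (illustrative only; `loadDotEnv` is an
// invented name and the tool's real loading logic may differ).
function loadDotEnv(path = ".env"): void {
  if (!fs.existsSync(path)) return;
  for (const line of fs.readFileSync(path, "utf8").split("\n")) {
    const m = line.match(/^\s*([A-Za-z_]\w*)\s*=\s*(.*?)\s*$/);
    if (!m) continue; // skips comments and blank lines
    const [, name, value] = m;
    // Assumed precedence: variables already set in the environment win.
    if (process.env[name] === undefined) process.env[name] = value;
  }
}
```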
123 | 216 |
|
124 | | -At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations. |
| 217 | +## Popular MCP Servers to Try |
| 218 | + |
| 219 | +There are quite a few useful MCP servers already available: |
| 220 | + |
| 221 | +- [MCP Server Listing on the Official Site](https://github.com/modelcontextprotocol/servers?tab=readme-ov-file#model-context-protocol-servers) |
| 222 | + |
| 223 | +## Troubleshooting |
| 224 | + |
| 225 | +### Common Issues |
| 226 | + |
| 227 | +1. **Missing API key**: Make sure your `.env` file contains the required API key |
| 228 | +2. **Server not found**: Ensure the MCP server packages are available via `npx` (or `uvx` for Python-based servers)
| 229 | +3. **Permission errors**: Check file permissions for log directory |
| 230 | + |
| 231 | +### Getting Help |
| 232 | + |
| 233 | +- Check the logs in your specified log directory |
| 234 | +- Use `--verbose` flag for detailed output |
| 235 | +- Refer to [MCP documentation](https://modelcontextprotocol.io/) |
| 236 | + |
| 237 | +## Development |
| 238 | + |
| 239 | +This tool is built with: |
| 240 | +- [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) |
| 241 | +- [LangChain](https://langchain.com/) for LLM integration |
| 242 | +- [TypeScript](https://www.typescriptlang.org/) for type safety |
| 243 | +- [Yargs](https://yargs.js.org/) for CLI parsing |
| 244 | + |
| 245 | +## License |
| 246 | + |
| 247 | +MIT License - see [LICENSE](LICENSE) file for details. |
| 248 | + |
| 249 | +## Contributing |
125 | 250 |
|
126 | | -Example queries can be configured in `llm_mcp_config.json5` |
| 251 | +Issues and pull requests welcome! This tool aims to make MCP server testing as simple as possible. |