# Simple CLI MCP Client to Explore MCP Servers / TypeScript [](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE) [](https://www.npmjs.com/package/@h1deya/mcp-try-cli)

**Quickly test and explore MCP servers from the command line!**

A simple, text-based CLI client for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers, built with LangChain and TypeScript.
Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.

Internally it uses the [LangChain ReAct Agent](https://github.com/langchain-ai/react-agent-js) and
the utility function `convertMcpToLangchainTools()` from [`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).

## Prerequisites

- Node.js 18+
- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/),
  installed to run Python-based local (stdio) MCP servers
- LLM API keys from
  [OpenAI](https://platform.openai.com/api-keys),
  [Anthropic](https://console.anthropic.com/settings/keys),
  and/or
  [Google GenAI](https://aistudio.google.com/apikey),
  as needed
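The API keys are typically supplied via environment variables. A minimal sketch, assuming the conventional LangChain variable names; the values below are placeholders, not real keys:

```shell
# Set the API key(s) for whichever LLM provider(s) you plan to use.
# Variable names follow common LangChain conventions; values are placeholders.
export OPENAI_API_KEY="sk-placeholder"
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export GOOGLE_API_KEY="ai-placeholder"
```

Only the key for the provider you actually select needs to be set.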

## Building from Source

See [README_DEV.md](https://github.com/hideya/mcp-client-langchain-ts/blob/main/README_DEV.md) for details.

## Features

- **Easy setup**: Works out of the box with popular MCP servers
- **Flexible configuration**: JSON5 config with environment variable support
- **Multiple LLM providers**: OpenAI, Anthropic, and Google (GenAI)
- **Command & URL servers**: Supports both local and remote MCP servers
- **Real-time logging**: Live stdio MCP server logs with a customizable log directory
- **Interactive testing**: Example queries for convenient repeated testing
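To illustrate the JSON5-plus-environment-variables idea, a configuration might look roughly like the following. This is a sketch for illustration only: the field names (`llm`, `provider`, `mcpServers`, etc.) and the `${...}` expansion syntax are assumptions, so consult the project's bundled example config for the authoritative format:

```json5
{
  llm: {
    provider: "openai",        // or "anthropic", "google_genai" (hypothetical values)
    model: "gpt-4o-mini",
  },
  mcpServers: {
    // local (stdio) server launched as a command
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
    },
    // remote server reached via URL
    "example-remote": {
      url: "https://example.com/mcp",
      headers: { Authorization: "Bearer ${EXAMPLE_API_TOKEN}" },  // value taken from the environment
    },
  },
}
```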

## Limitations

- **Tool return types**: Only text results of tool calls are supported.
  The client internally uses LangChain's `response_format: 'content'` (the default), which only supports text strings.
  While MCP tools can return multiple content types (text, images, etc.), only the text content is used; other content types are filtered out.
- **MCP features**: Only MCP [Tools](https://modelcontextprotocol.io/docs/concepts/tools) are supported. Other MCP features, such as Resources, Prompts, and Sampling, are not implemented.

## Usage

### Basic Usage