**Quickly test and explore MCP servers from the command line!**

A simple, text-based CLI client for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers, built with LangChain and TypeScript.
This tool performs automatic schema adjustments for LLM compatibility.
Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.

Internally it uses the [LangChain ReAct Agent](https://github.com/langchain-ai/react-agent-js) and
a utility function `convertMcpToLangchainTools()` from
[`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
This function performs the aforementioned MCP tool schema transformations for LLM compatibility;
see [this page](https://github.com/hideya/langchain-mcp-tools-ts/blob/main/README.md#llm-provider-schema-compatibility)
for details.

A Python equivalent of this utility is available [here](https://pypi.org/project/mcp-chat/).
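
To illustrate how these pieces fit together, here is a minimal sketch of the pattern this CLI follows (not its actual implementation). It assumes the `{ tools, cleanup }` return shape documented in the `@h1deya/langchain-mcp-tools` README; the model name and the filesystem MCP server are only examples.

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

// Convert MCP server definitions into LangChain tools
// (assumed return shape: { tools, cleanup }).
const { tools, cleanup } = await convertMcpToLangchainTools({
  filesystem: {
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
  },
});

// Hand the converted tools to a LangChain ReAct agent.
const agent = createReactAgent({
  llm: new ChatAnthropic({ model: "claude-3-5-haiku-latest" }),
  tools,
});

const result = await agent.invoke({
  messages: [{ role: "user", content: "List the files in the current directory." }],
});
console.log(result.messages.at(-1)?.content);

await cleanup(); // shut down the MCP server processes when done
```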

## Prerequisites

- API keys from
  [OpenAI](https://platform.openai.com/api-keys),
  [Anthropic](https://console.anthropic.com/settings/keys),
  and/or
  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey)
  as needed

## Quick Start
4350 // "model_provider": "anthropic",
4451 // "model": "claude-3-5-haiku-latest",
4552 // "model_provider": "google_genai",
46- // "model": "gemini-2.0 -flash",
53+ // "model": "gemini-2.5 -flash",
4754 },
4855
4956 " mcp_servers" : {

## Supported LLM Providers

- **OpenAI**: `o4-mini`, `gpt-4o-mini`, etc.
- **Anthropic**: `claude-sonnet-4-0`, `claude-3-5-haiku-latest`, etc.
- **Google (GenAI)**: `gemini-2.5-pro`, `gemini-2.5-flash`, etc.

## Configuration

Create a `llm_mcp_config.json5` file:

{
  "llm": {
    "model_provider": "openai",
    "model": "gpt-4.1-nano",
    // model: "o4-mini",
  },

  // "llm": {
  //   "model_provider": "google_genai",
  //   "model": "gemini-2.5-flash",
  //   // "model": "gemini-2.5-pro",
  // }

  "example_queries": [

- Use the `--verbose` flag for detailed output
- Refer to the [MCP documentation](https://modelcontextprotocol.io/)

## License

MIT License - see [LICENSE](LICENSE) file for details.