A simple Model Context Protocol (MCP) application that provides filesystem operations using a local LLM served by Ollama. This project demonstrates how to integrate local AI models with filesystem operations through a client-server architecture.
- Client-server architecture using MCP protocol
- Integration with a local LLM via Ollama (Microsoft's Phi-3 3.8B model)
- Directory creation capabilities
- Conversational CLI interface
- Secure file system operations within allowed directories
- Node.js
- Ollama installed, with the Phi-3 3.8B model pulled
- Basic understanding of terminal operations
- Clone the repository
- Install dependencies:
```bash
npm install
```

The server accepts allowed directories as command-line arguments. These directories are the only locations where the application can perform filesystem operations.
Start the server:
```bash
npm start
```

The client will automatically start the server and connect to Ollama. You can then interact with the system through natural-language commands.
Example commands:
- "Create a directory named 'test'"
- "List the files in the current directory"
- "Delete the file 'test.txt'"
- `client.js`: MCP client implementation with Ollama integration
- `server.js`: MCP server implementation with filesystem operations
- `@modelcontextprotocol/sdk`: Core MCP SDK
- Ollama: local runtime that serves the model
- Microsoft Phi-3 3.8B: the model used for natural language processing
