**`api/extension-guides/ai/mcp.md`** (5 additions, 0 deletions)
Model Context Protocol (MCP) is an open standard that enables AI models to interact with external tools and services through a unified interface. Visual Studio Code implements the full MCP specification, enabling you to create MCP servers that provide tools, prompts, and resources for extending the capabilities of AI agents in VS Code.
MCP servers provide one of the three types of tools available in VS Code, alongside built-in tools and extension-contributed tools. Learn more about [tool types](/docs/copilot/chat/chat-tools.md#types-of-tools).
This guide covers everything you need to know to build MCP servers that work seamlessly with VS Code and other MCP clients.
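At the protocol level, an MCP server is a process that answers JSON-RPC requests such as `tools/list` and `tools/call`, typically over stdio. The following dependency-free TypeScript sketch shows the shape of that request handling; the `get_weather` tool and its output are hypothetical, and a real server would normally be built on the official MCP SDK rather than hand-rolled like this:

```typescript
type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: unknown };
type JsonRpcResponse = { jsonrpc: "2.0"; id: number; result?: any; error?: { code: number; message: string } };

// One hypothetical tool this server exposes to clients.
const tools = [
  {
    name: "get_weather",
    description: "Returns the weather for a city.",
    inputSchema: { type: "object", properties: { city: { type: "string" } } },
  },
];

function handleRequest(req: JsonRpcRequest): JsonRpcResponse {
  switch (req.method) {
    case "tools/list":
      // Advertise the available tools so the client can offer them to the model.
      return { jsonrpc: "2.0", id: req.id, result: { tools } };
    case "tools/call": {
      const { name, arguments: args } = req.params as { name: string; arguments: { city: string } };
      // Dispatch to the named tool; only get_weather exists in this sketch.
      const text = name === "get_weather" ? `Sunny in ${args.city}` : `Unknown tool: ${name}`;
      return { jsonrpc: "2.0", id: req.id, result: { content: [{ type: "text", text }] } };
    }
    default:
      return { jsonrpc: "2.0", id: req.id, error: { code: -32601, message: "Method not found" } };
  }
}

console.log(JSON.stringify(handleRequest({ jsonrpc: "2.0", id: 1, method: "tools/list" })));
```

The `inputSchema` advertised in `tools/list` is what lets the client validate and construct arguments before issuing `tools/call`.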
> [!IMPORTANT]
> MCP support in VS Code is currently in preview.
> [!TIP]
> For information about using MCP servers as an end user, see [Use MCP servers in VS Code](/docs/copilot/customization/mcp-servers.md).
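For example, a workspace can point VS Code at a local MCP server through a `.vscode/mcp.json` file (a hedged sketch: the server name and launch command here are hypothetical):

```json
{
  "servers": {
    "my-tools": {
      "type": "stdio",
      "command": "node",
      "args": ["./out/mcp-server.js"]
    }
  }
}
```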
## Why use MCP servers?
Implementing an MCP server to extend chat in VS Code with language model tools has the following benefits:
**`api/extension-guides/ai/tools.md`** (9 additions, 3 deletions)
# Language Model Tool API
Language model tools enable you to extend the functionality of a large language model (LLM) in chat with domain-specific capabilities. To process a user's chat prompt, [agent mode](/docs/copilot/chat/chat-agent-mode) in VS Code can automatically invoke these tools to perform specialized tasks as part of the conversation.
By contributing a language model tool in your VS Code extension, you can extend the agentic coding workflow while also providing deep integration with the editor. Extension tools are one of three types of tools available in VS Code, alongside [built-in tools and MCP tools](/docs/copilot/chat/chat-tools.md#types-of-tools).
In this extension guide, you learn how to create a language model tool by using the Language Model Tools API and how to implement tool calling in a chat extension.
You can also extend the chat experience with specialized tools by contributing an [MCP server](/api/extension-guides/ai/mcp). See the [AI Extensibility Overview](/api/extension-guides/ai/ai-extensibility-overview) for details on the different options and how to decide which approach to use.
> [!TIP]
> For information about using tools as an end user, see [Use tools in chat](/docs/copilot/chat/chat-tools.md).
## What is tool calling in an LLM?
A language model tool is a function that can be invoked as part of a language model request. For example, you might have a function that retrieves information from a database, performs some calculation, or calls an online API. When you contribute a tool in a VS Code extension, agent mode can then invoke the tool based on the context of the conversation.
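Conceptually, the host sits in a small dispatch loop: the model emits a tool call by name, the host runs the matching function, and the result goes back into the conversation. A minimal framework-free sketch of that loop (this is not the VS Code API; the `queryDatabase` tool is hypothetical):

```typescript
type ToolCall = { name: string; input: Record<string, unknown> };
type Tool = (input: Record<string, unknown>) => string;

const tools: Record<string, Tool> = {
  // Hypothetical example tool: looks up a value in an in-memory "database".
  queryDatabase: (input) => `rows matching ${String(input.filter)}: 42`,
};

function handleToolCall(call: ToolCall): string {
  const tool = tools[call.name];
  if (!tool) {
    throw new Error(`Unknown tool: ${call.name}`);
  }
  // Run the tool and return its result so it can be appended to the conversation.
  return tool(call.input);
}

// Simulated model output requesting a tool invocation.
const call: ToolCall = { name: "queryDatabase", input: { filter: "untyped" } };
console.log(handleToolCall(call)); // rows matching untyped: 42
```

In VS Code, agent mode plays the role of this dispatcher for you: it decides when to invoke a contributed tool and feeds the result back to the model.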
Implementing a language model tool in your extension has several benefits:
- **Extend agent mode** with specialized, domain-specific tools that are automatically invoked as part of responding to a user prompt. For example, enable database scaffolding and querying to dynamically provide the LLM with relevant context.
- **Deeply integrate with VS Code** by using the broad set of extension APIs. For example, use the [debug APIs](/api/extension-guides/debugger-extension) to get the current debugging context and use it as part of the tool's functionality.
- **Distribute and deploy** tools via the Visual Studio Marketplace, providing a reliable and seamless experience for users. Users don't need a separate installation and update process for your tool.
You might consider implementing a language model tool with an [MCP server](/api/extension-guides/ai/mcp) instead in the following cases:
- Your tool is hosted remotely as a service.
- You don't need access to VS Code APIs.
Learn more about the [differences between tool types](/docs/copilot/chat/chat-tools.md#types-of-tools).
## Create a language model tool
Implementing a language model tool consists of two main parts:
The first step to define a language model tool in your extension is to define it in your extension's `package.json`.
1. If the tool can be used in [agent mode](/docs/copilot/chat/chat-agent-mode) or referenced in a chat prompt with `#`, add the following properties:
Users can enable or disable the tool in the Chat view, similar to how this is done for [Model Context Protocol (MCP) tools](/docs/copilot/chat/chat-tools.md#mcp-tools).
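Putting these properties together, a `languageModelTools` contribution might look like the following sketch in the extension's `package.json` (the tool name, descriptions, and schema are hypothetical):

```json
{
  "contributes": {
    "languageModelTools": [
      {
        "name": "myExtension_queryDatabase",
        "displayName": "Query Database",
        "modelDescription": "Runs a read-only query against the project's database and returns the matching rows.",
        "toolReferenceName": "queryDatabase",
        "canBeReferencedInChat": true,
        "inputSchema": {
          "type": "object",
          "properties": {
            "query": { "type": "string", "description": "The query to run." }
          },
          "required": ["query"]
        }
      }
    ]
  }
}
```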
---

You can start a chat session directly from the command line by using the `chat` subcommand in the VS Code CLI. This enables you to open a chat session in your current working directory with a prompt you provide.
For example, the following command opens chat for the current directory and asks "Find and fix all untyped variables":
```bash
code chat Find and fix all untyped variables
```
The `chat` subcommand has the following command-line options:
* `-m`, `--mode <mode>`: The chat mode to use for the chat session. Available options: `ask`, `edit`, `agent`, or the identifier of a custom mode. Defaults to `agent`.
* `-a`, `--add-file <path>`: Add files as context to the chat session.
* `--maximize`: Maximize the chat session view.
* `-r`, `--reuse-window`: Use the last active window for the chat session.
* `-n`, `--new-window`: Open an empty window for the chat session.
The `chat` subcommand also supports piping input from `stdin` by passing `-` at the end of the command. For example:
```bash
python app.py | code chat why does it fail -
```
## Advanced CLI options
There are several CLI options that help with reproducing errors and advanced setup.