Commit be6c142

Merge remote-tracking branch 'public-docs/main' into sync-public-docs

2 parents: 41c07df + 8efcf48

File tree: 82 files changed, +1189 −1384 lines


api/extension-guides/ai/mcp.md

Lines changed: 5 additions & 0 deletions

```diff
@@ -11,8 +11,13 @@ MetaDescription: A comprehensive guide for developers building MCP servers that
 
 Model Context Protocol (MCP) is an open standard that enables AI models to interact with external tools and services through a unified interface. Visual Studio Code implements the full MCP specification, enabling you to create MCP servers that provide tools, prompts, and resources for extending the capabilities of AI agents in VS Code.
 
+MCP servers provide one of three types of tools available in VS Code, alongside built-in tools and extension-contributed tools. Learn more about [tool types](/docs/copilot/chat/chat-tools.md#types-of-tools).
+
 This guide covers everything you need to know to build MCP servers that work seamlessly with VS Code and other MCP clients.
 
+> [!TIP]
+> For information about using MCP servers as an end user, see [Use MCP servers in VS Code](/docs/copilot/customization/mcp-servers.md).
+
 ## Why use MCP servers?
 
 Implementing an MCP server to extend chat in VS Code with language model tools has the following benefits:
```

api/extension-guides/ai/tools.md

Lines changed: 9 additions & 3 deletions

```diff
@@ -10,12 +10,16 @@ MetaDescription: A guide to creating a language model tool and how to implement
 # Language Model Tool API
 
 Language model tools enable you to extend the functionality of a large language model (LLM) in chat with domain-specific capabilities. To process a user's chat prompt, [agent mode](/docs/copilot/chat/chat-agent-mode) in VS Code can automatically invoke these tools to perform specialized tasks as part of the conversation.
-By contributing a language model tool in your VS Code extension, you can extend the agentic coding workflow while also providing deep integration with the editor.
+
+By contributing a language model tool in your VS Code extension, you can extend the agentic coding workflow while also providing deep integration with the editor. Extension tools are one of three types of tools available in VS Code, alongside [built-in tools and MCP tools](/docs/copilot/chat/chat-tools.md#types-of-tools).
 
 In this extension guide, you learn how to create a language model tool by using the Language Model Tools API and how to implement tool calling in a chat extension.
 
 You can also extend the chat experience with specialized tools by contributing an [MCP server](/api/extension-guides/ai/mcp). See the [AI Extensibility Overview](/api/extension-guides/ai/ai-extensibility-overview) for details on the different options and how to decide which approach to use.
 
+> [!TIP]
+> For information about using tools as an end user, see [Use tools in chat](/docs/copilot/chat/chat-tools.md).
+
 ## What is tool calling in an LLM?
 
 A language model tool is a function that can be invoked as part of a language model request. For example, you might have a function that retrieves information from a database, performs some calculation, or calls an online API. When you contribute a tool in a VS Code extension, agent mode can then invoke the tool based on the context of the conversation.
@@ -32,7 +36,7 @@ Read more about [function calling](https://platform.openai.com/docs/guides/funct
 
 Implementing a language model tool in your extension has several benefits:
 
-- **Extend agent mode** with specialized, domain-specific, tools that are automatically invoked as part of responding to a user prompt. For example, enable database scaffolding and querying to dynamically provide the LLM with relevant context.
+- **Extend agent mode** with specialized, domain-specific tools that are automatically invoked as part of responding to a user prompt. For example, enable database scaffolding and querying to dynamically provide the LLM with relevant context.
 - **Deeply integrate with VS Code** by using the broad set of extension APIs. For example, use the [debug APIs](/api/extension-guides/debugger-extension) to get the current debugging context and use it as part of the tool's functionality.
 - **Distribute and deploy** tools via the Visual Studio Marketplace, providing a reliable and seamless experience for users. Users don't need a separate installation and update process for your tool.
 
@@ -43,6 +47,8 @@ You might consider implementing a language model tool with an [MCP server](/api/
 - Your tool is hosted remotely as a service.
 - You don't need access to VS Code APIs.
 
+Learn more about the [differences between tool types](/docs/copilot/chat/chat-tools.md#types-of-tools).
+
 ## Create a language model tool
 
 Implementing a language model tool consists of two main parts:
@@ -67,7 +73,7 @@ The first step to define a language model tool in your extension is to define it
 
 1. If the tool can be used in [agent mode](/docs/copilot/chat/chat-agent-mode) or referenced in a chat prompt with `#`, add the following properties:
 
-   Users can enable or disable the tool in the Chat view, similar to how this is done for [Model Context Protocol (MCP) tools](/docs/copilot/chat/mcp-servers).
+   Users can enable or disable the tool in the Chat view, similar to how this is done for [Model Context Protocol (MCP) tools](/docs/copilot/chat/chat-tools.md#mcp-tools).
 
    | Property | Description |
   | -------- | ----------- |
```
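The truncated table above enumerates entries for the `languageModelTools` contribution point in the extension's `package.json`. As a hedged sketch of such a contribution (the tool name, icon, and input schema here are hypothetical examples, not values from the changed file; the property names follow the Language Model Tools API docs):

```json
{
  "contributes": {
    "languageModelTools": [
      {
        "name": "my-extension_tabCount",
        "displayName": "Tab Count",
        "modelDescription": "Returns the number of open editor tabs, optionally for a specific tab group.",
        "canBeReferencedInChat": true,
        "toolReferenceName": "tabCount",
        "icon": "$(files)",
        "inputSchema": {
          "type": "object",
          "properties": {
            "tabGroup": {
              "type": "number",
              "description": "Optional tab group to count tabs in."
            }
          }
        }
      }
    ]
  }
}
```

With `canBeReferencedInChat` set, a user could invoke the tool as `#tabCount` in a chat prompt, and agent mode can call it automatically.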

api/references/contribution-points.md

Lines changed: 0 additions & 9 deletions

````diff
@@ -451,15 +451,6 @@ You can set `ignoreSync` to `true` to prevent the setting from being synchronize
     }
   }
 }
-
-```json
-{
-  "remoteTunnelAccess.machineName": {
-    "type": "string",
-    "default": '',
-    "ignoreSync": true
-  }
-}
 ```
 
 #### Linking to settings
````
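The block deleted in this hunk was a near-duplicate of the preceding example and was also invalid JSON (`'default': ''` uses single quotes). For reference, a valid form of an `ignoreSync` setting declaration looks like this (a sketch consistent with the surviving example, not new content from the commit):

```json
{
  "remoteTunnelAccess.machineName": {
    "type": "string",
    "default": "",
    "ignoreSync": true
  }
}
```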
Lines changed: 68 additions & 0 deletions (new file; all lines added)

---
Order: 107
TOCTitle: Expanding Model Choice
PageTitle: Expanding Model Choice in VS Code with Bring Your Own Key
MetaDescription: Learn how the new Language Model Chat Provider API in VS Code is enabling more model choice and extensibility for chat experiences via the Bring Your Own Key experience.
MetaSocialImage: expanding-model-choice.png
Date: 2025-10-22
Author: Olivia Guzzardo McVicker, Pierce Boggan
---

# Expanding Model Choice in VS Code with Bring Your Own Key

October 22, 2025 by [Olivia Guzzardo McVicker](https://github.com/olguzzar), [Pierce Boggan](https://github.com/pierceboggan)

We know that model choice is important to you. Our team has been hard at work making the latest models like [Claude Haiku 4.5](https://github.blog/changelog/2025-10-15-anthropics-claude-haiku-4-5-is-in-public-preview-for-github-copilot/) and [GPT 5](https://github.blog/changelog/2025-08-07-openai-gpt-5-is-now-in-public-preview-for-github-copilot/) available to you on the same day they were announced. But we've also heard your feedback that you want support for even more models in VS Code, whether they run locally or in the cloud.

In March, we released the [bring your own key (BYOK)](https://code.visualstudio.com/docs/copilot/customization/language-models#_bring-your-own-language-model-key) functionality to let you pick from hundreds of models from supported providers like OpenRouter, Ollama, Google, OpenAI, and more to power chat experiences in VS Code.

Now, we're taking BYOK to the next level. In the [v1.104 release](https://code.visualstudio.com/updates/v1_104), we introduced the [Language Model Chat Provider API](https://code.visualstudio.com/api/extension-guides/ai/language-model-chat-provider), which enables model providers to contribute their models directly through VS Code extensions.

<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/xXFTlPZJJoo?si=UrgdYjNbOzVbSysl" title="BYOK in VS Code" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

## What is Bring Your Own Key (BYOK)?

BYOK lets you use any model from a supported provider by bringing your own API key for that provider. This means you can access a vast ecosystem of models beyond those built into VS Code. Whether you want to use a specialized model for code generation, a different model for general chat, or experiment with local models through providers like Ollama, BYOK makes it possible with just your API key. You can configure this through the **Chat: Manage Language Models** command.

<video src="manage-language-models-command.mp4" title="Video demonstrating the Chat: Manage Language Models command in VS Code." autoplay muted controls></video>

But managing an ever-growing list of supported providers presented challenges for both users and our team. That's why we've released the Language Model Chat Provider API, allowing model providers to contribute their models directly through VS Code extensions.

## The Language Model Chat Provider API

The [Language Model Chat Provider API](https://code.visualstudio.com/api/extension-guides/ai/language-model-chat-provider) shifts BYOK from a centralized system to an open, extensible ecosystem where any provider can offer their models with a simple extension install. We will still support a subset of built-in providers, but this extensible ecosystem will allow us to scale out our model choice to meet developers' needs.

> [!NOTE]
> Models provided through the Language Model Chat Provider API are currently available to users on individual GitHub Copilot plans (Free, Pro, and Pro+).

Here are some of our favorite extensions you can install right now to get access to more models in VS Code:

* [The AI Toolkit for Visual Studio Code extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio&ssr=false#overview) gives you access to its provided models directly in VS Code, whether it's a custom model you've tuned in Azure AI Foundry, a local model via Foundry Local, or any of the models in GitHub Models.

* [Cerebras Inference](https://marketplace.visualstudio.com/items?itemName=cerebras.cerebras-chat) powers the world's top coding models, making code generation near-instant and great for rapid iteration. It runs Qwen3 Coder and GPT OSS 120B at 2,000 tokens/s, which is 20x faster than most inference APIs.

* [The Hugging Face Provider for GitHub Copilot Chat extension](https://marketplace.visualstudio.com/items?itemName=HuggingFace.huggingface-vscode-chat) enables you to use frontier open LLMs like Kimi K2, DeepSeek V3.1, and GLM 4.5 directly in VS Code. Hugging Face's Inference Providers give developers access to hundreds of LLMs, powered by world-class inference providers built for high availability and low latency.

For extension developers interested in contributing their own models, check out our [Language Model Chat Provider API documentation](https://code.visualstudio.com/api/extension-guides/ai/language-model-chat-provider) and [sample extension](https://github.com/microsoft/vscode-extension-samples/tree/main/chat-model-provider-sample) to get started building today.
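As a rough sketch of what such a provider extension declares, its `package.json` might contribute a provider entry like the following. This is an assumption modeled on the linked sample, not an exact schema from the post; the `sample` vendor ID and display name are hypothetical, and the extension would pair this with a registration call in its activation code.

```json
{
  "contributes": {
    "languageModelChatProviders": [
      {
        "vendor": "sample",
        "displayName": "Sample Model Provider"
      }
    ]
  }
}
```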
## OpenAI-compatible Models

For developers using OpenAI-compatible models, you can use the custom **OpenAI Compatible** provider for any OpenAI-compatible API endpoint and [configure the models for use in chat](https://code.visualstudio.com/docs/copilot/customization/language-models#_use-an-openaicompatible-model). This feature is currently available in VS Code Insiders only.

![Screenshot showing OpenAI-compatible model configuration in VS Code.](manage-openai-compatible.png)

Additionally, you can explicitly configure the list of edit tools through the `github.copilot.chat.customOAIModels` setting, giving you fine-grained control over which capabilities are available for your custom models.
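A sketch of what that setting could look like in `settings.json`. Every key and value below is illustrative only (the model ID, endpoint URL, and field names are assumptions, not taken from the post); consult the setting's schema in the Settings editor for the actual shape.

```json
{
  "github.copilot.chat.customOAIModels": {
    "my-local-model": {
      "name": "My Local Model",
      "url": "http://localhost:11434/v1",
      "toolCalling": true,
      "maxInputTokens": 128000,
      "maxOutputTokens": 8192
    }
  }
}
```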
## What's Next?

The Language Model Chat Provider API is just the beginning of bringing more model choice to you. As this ecosystem grows, we expect to see:

* A model management UI that lets you learn about model capabilities and manage models
* A smoother flow for installing extensions that contribute language models
* Improvements to the built-in language model providers, using the latest provider APIs and specialized prompts for each model

We're continuously investing in the BYOK experience. [Recent enhancements](https://code.visualstudio.com/updates/v1_105#_improved-edit-tools-for-custom-models) include improved edit tools for better integration with VS Code's built-in tools, but we know there's work to be done to make the experience feel more native to VS Code; for example, BYOK does not currently work with completions. We'd love to hear your feedback on our [GitHub repository](https://github.com/microsoft/vscode)!

Happy coding!
Lines changed: 3 additions & 0 deletions (Git LFS pointer, new file)

```
version https://git-lfs.github.com/spec/v1
oid sha256:d8b605fb707e5ae74e082842707eb0a5eca64bdfc8c8e51f21b4cfeb6fe1a595
size 294086
```

Lines changed: 3 additions & 0 deletions (content not loaded)

build/sitemap.xml

Lines changed: 17 additions & 7 deletions

```diff
@@ -356,32 +356,37 @@
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-agent-mode</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-checkpoints</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-ask-mode</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-sessions</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/copilot-chat-context</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-debug-view</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/copilot-chat</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/chat-tools</loc>
+  <changefreq>weekly</changefreq>
+  <priority>0.8</priority>
+</url>
+<url>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/review-code-edits</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/copilot-edits</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/copilot-chat-context</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/getting-started-chat</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/copilot-chat</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
@@ -391,7 +396,12 @@
   <priority>0.8</priority>
 </url>
 <url>
-  <loc>https://code.visualstudio.com/docs/copilot/chat/prompt-crafting</loc>
+  <loc>https://code.visualstudio.com/docs/copilot/guides/prompt-engineering-guide</loc>
+  <changefreq>weekly</changefreq>
+  <priority>0.8</priority>
+</url>
+<url>
+  <loc>https://code.visualstudio.com/docs/copilot/chat/prompt-examples</loc>
   <changefreq>weekly</changefreq>
   <priority>0.8</priority>
 </url>
```

docs/configure/command-line.md

Lines changed: 24 additions & 0 deletions

````diff
@@ -93,6 +93,30 @@ Argument|Description
 
 ![install extension](images/command-line/install-extension.png)
 
+## Start chat from the command line
+
+You can start a chat session directly from the command line by using the `chat` subcommand in the VS Code CLI. This enables you to open a chat session in your current working directory with a prompt you provide.
+
+For example, the following command opens chat for the current directory and asks "Find and fix all untyped variables":
+
+```bash
+code chat Find and fix all untyped variables
+```
+
+The `chat` subcommand has the following command-line options:
+
+* `-m`, `--mode <mode>`: The chat mode to use for the chat session. Available options: `ask`, `edit`, `agent`, or the identifier of a custom mode. Defaults to `agent`.
+* `-a`, `--add-file <path>`: Add files as context to the chat session.
+* `--maximize`: Maximize the chat session view.
+* `-r`, `--reuse-window`: Use the last active window for the chat session.
+* `-n`, `--new-window`: Open an empty window for the chat session.
+
+The `chat` subcommand also supports piping input from `stdin` by passing `-` at the end of the command. For example:
+
+```bash
+python app.py | code chat why does it fail -
+```
+
 ## Advanced CLI options
 
 There are several CLI options that help with reproducing errors and advanced setup.
````
