diff --git a/docs/platforms/dotnet/common/tracing/instrumentation/ai-agents-module.mdx b/docs/platforms/dotnet/common/tracing/instrumentation/ai-agents-module.mdx
new file mode 100644
index 00000000000000..7641108d9f061c
--- /dev/null
+++ b/docs/platforms/dotnet/common/tracing/instrumentation/ai-agents-module.mdx
@@ -0,0 +1,158 @@
+---
+title: Instrument AI Agents
+sidebar_order: 500
+description: "Learn how to instrument your code to use Sentry's AI Agents module with Microsoft.Extensions.AI."
+---
+
+With Sentry AI Agent Monitoring, you can monitor and debug your AI systems with full-stack context. You'll be able to track key insights like token usage, latency, tool usage, and error rates. AI Agent Monitoring data will be fully connected to your other Sentry data like logs, errors, and traces.
+
+As a prerequisite to setting up AI Agent Monitoring with .NET, you'll need to first set up tracing. Once this is done, you can use the `Sentry.Extensions.AI` package to automatically instrument AI agents created with `Microsoft.Extensions.AI`.
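+
+If you haven't enabled tracing yet, a minimal initialization for a console app could look like the following sketch (replace the DSN placeholder with your project's DSN):
+
+```csharp
+// Minimal sketch: initialize Sentry with tracing enabled before creating your AI client.
+SentrySdk.Init(options =>
+{
+    options.Dsn = "___PUBLIC_DSN___";
+    // Capture 100% of transactions; lower this in production if needed.
+    options.TracesSampleRate = 1.0;
+});
+```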
+
+## Installation
+
+Install the `Sentry.Extensions.AI` package:
+
+```shell {tabTitle:.NET CLI}
+dotnet add package Sentry.Extensions.AI
+```
+
+```shell {tabTitle:Package Manager}
+Install-Package Sentry.Extensions.AI
+```
+
+The `Sentry.Extensions.AI` integration depends on the `Microsoft.Extensions.AI.Abstractions` package (version 9.7.0 or higher).
+
+## Automatic Instrumentation
+
+The `Sentry.Extensions.AI` package provides automatic instrumentation for AI agents built with [Microsoft.Extensions.AI](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/). This works with any AI provider that implements the `IChatClient` interface, including:
+
+- [Microsoft.Extensions.AI.OpenAI](https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI/)
+- [Microsoft.Extensions.AI.AzureAIInference](https://www.nuget.org/packages/Microsoft.Extensions.AI.AzureAIInference/)
+- [Anthropic.SDK](https://www.nuget.org/packages/Anthropic.SDK)
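+
+Because the instrumentation targets the `IChatClient` abstraction rather than a specific provider, the same wrapping code works for any of these packages. As a minimal sketch (the helper name below is just for illustration), you can instrument any provider's client like this:
+
+```csharp
+// Minimal sketch: instrument any IChatClient, regardless of the underlying provider.
+static IChatClient WithSentry(IChatClient innerClient) =>
+    innerClient.AddSentry(options =>
+    {
+        options.Experimental.AgentName = "MyAgent";
+    });
+```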
+
+### Basic Setup
+
+AI Agent Monitoring support in the .NET SDK is currently experimental, which is why the related configuration options are exposed under `Experimental`.
+
+To instrument your AI agent, wrap your `IChatClient` with the `AddSentry()` extension method. If your agent uses tools (function calling), you can instrument them as well by calling the `AddSentryToolInstrumentation()` extension method on your `ChatOptions`.
+
+You must wrap your `IChatClient` with `AddSentry()` before creating a `ChatClientBuilder` from it. If you call `AddSentry()` on an `IChatClient` that already has function invocation enabled, spans will not show up correctly.
+
+```csharp
+// Wrap your IChatClient with Sentry instrumentation
+var openAiClient = new OpenAI.Chat.ChatClient("gpt-4o-mini", apiKey)
+ .AsIChatClient()
+ .AddSentry(options =>
+ {
+ options.Experimental.RecordInputs = true;
+ options.Experimental.RecordOutputs = true;
+ options.Experimental.AgentName = "MyAgent";
+ });
+
+// Wrap your client with FunctionInvokingChatClient
+var chatClient = new ChatClientBuilder(openAiClient)
+ .UseFunctionInvocation()
+ .Build();
+
+// Create chat options with tools and add Sentry instrumentation
+var options = new ChatOptions
+{
+ ModelId = "gpt-4o-mini",
+ MaxOutputTokens = 1024,
+ Tools =
+ [
+ // Sample Tool
+ AIFunctionFactory.Create(async (string location) =>
+ {
+ await Task.Delay(500);
+ return $"The weather in {location} is sunny";
+ }, "GetWeather", "Gets the current weather for a location")
+ ]
+}.AddSentryToolInstrumentation();
+
+var response = await chatClient.GetResponseAsync(
+ "What's the weather in New York?",
+ options);
+```
+
+
+## Configuration Options
+
+The `AddSentry()` method accepts an optional configuration delegate to customize the instrumentation:
+
+### `Experimental.RecordInputs`
+
+Whether to include request messages in spans. When enabled, the content of messages sent to the AI model will be recorded in the span data.
+
+### `Experimental.RecordOutputs`
+
+Whether to include response content in spans. When enabled, the content of responses from the AI model will be recorded in the span data.
+
+### `Experimental.AgentName`
+
+The name of the AI agent. This name is used to identify the agent in the Sentry UI and helps differentiate between multiple agents in your application.
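+
+As an example of how these options fit together, here's a minimal sketch that gates prompt and response recording behind an environment variable, so potentially sensitive content is only captured when you opt in (the variable name and `baseChatClient` are placeholders):
+
+```csharp
+// Minimal sketch: only record prompts and responses when explicitly enabled.
+// "RECORD_AI_CONTENT" and baseChatClient are placeholders for illustration.
+var recordContent = Environment.GetEnvironmentVariable("RECORD_AI_CONTENT") == "true";
+
+IChatClient instrumentedClient = baseChatClient.AddSentry(options =>
+{
+    options.Experimental.RecordInputs = recordContent;
+    options.Experimental.RecordOutputs = recordContent;
+    options.Experimental.AgentName = "SupportAgent";
+});
+```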
+
+## ASP.NET Core Integration
+
+For ASP.NET Core applications, you can integrate Sentry AI Agent monitoring as follows:
+
+```csharp
+var builder = WebApplication.CreateBuilder(args);
+
+// Initialize Sentry for ASP.NET Core
+builder.WebHost.UseSentry(options =>
+{
+ options.Dsn = "___PUBLIC_DSN___";
+ options.TracesSampleRate = 1.0;
+});
+
+// Set up the AI client with Sentry instrumentation
+var openAiClient = new OpenAI.Chat.ChatClient("gpt-4o-mini", apiKey)
+ .AsIChatClient()
+ .AddSentry(options =>
+ {
+ options.Experimental.RecordInputs = true;
+ options.Experimental.RecordOutputs = true;
+ });
+
+var chatClient = new ChatClientBuilder(openAiClient)
+ .UseFunctionInvocation()
+ .Build();
+
+// Register as a singleton
+builder.Services.AddSingleton(chatClient);
+
+var app = builder.Build();
+
+// Use in endpoints
+app.MapGet("/chat", async (IChatClient client, string message) =>
+{
+ var options = new ChatOptions
+ {
+ ModelId = "gpt-4o-mini",
+ Tools = [ /* your tools */ ]
+ }.AddSentryToolInstrumentation();
+
+ var response = await client.GetResponseAsync(message, options);
+    return Results.Ok(response.Text);
+});
+
+app.Run();
+```
+
diff --git a/docs/product/insights/ai/agents/getting-started.mdx b/docs/product/insights/ai/agents/getting-started.mdx
index c390ea6ab88346..f429674f24e34d 100644
--- a/docs/product/insights/ai/agents/getting-started.mdx
+++ b/docs/product/insights/ai/agents/getting-started.mdx
@@ -194,6 +194,58 @@ result = await agents.Runner.run(
```
+### .NET - Microsoft.Extensions.AI SDK
+
+The Sentry .NET SDK supports AI agent monitoring through its Microsoft.Extensions.AI integration (`Sentry.Extensions.AI`), which automatically captures spans for your AI agent workflows when you wrap your `IChatClient` with Sentry instrumentation.
+
+#### Supported Platforms
+
+- [.NET](/platforms/dotnet/tracing/instrumentation/ai-agents-module/)
+
+#### Quick Start with Microsoft.Extensions.AI
+
+```csharp
+// Wrap your IChatClient with Sentry instrumentation
+var openAiClient = new OpenAI.Chat.ChatClient("gpt-4o-mini", apiKey)
+ .AsIChatClient()
+ .AddSentry(options =>
+ {
+ options.Experimental.RecordInputs = true;
+ options.Experimental.RecordOutputs = true;
+ options.Experimental.AgentName = "MyAgent";
+ });
+
+// Wrap your client with FunctionInvokingChatClient
+var chatClient = new ChatClientBuilder(openAiClient)
+ .UseFunctionInvocation()
+ .Build();
+
+// Create chat options with tools and add Sentry instrumentation
+var options = new ChatOptions
+{
+ ModelId = "gpt-4o-mini",
+ MaxOutputTokens = 1024,
+ Tools =
+ [
+ // Sample Tool
+ AIFunctionFactory.Create(async (string location) =>
+ {
+ await Task.Delay(500);
+ return $"The weather in {location} is sunny";
+ }, "GetWeather", "Gets the current weather for a location")
+ ]
+}.AddSentryToolInstrumentation();
+
+var response = await chatClient.GetResponseAsync(
+ "What's the weather in New York?",
+ options);
+```
+
+
You can also instrument AI agents manually by following our [manual instrumentation guides](/platforms/python/tracing/instrumentation/custom-instrumentation/ai-agents-module).