diff --git a/agent-framework/user-guide/agents/agent-observability.md b/agent-framework/user-guide/agents/agent-observability.md
index ed8b3205..abcaa991 100644
--- a/agent-framework/user-guide/agents/agent-observability.md
+++ b/agent-framework/user-guide/agents/agent-observability.md
@@ -332,6 +332,18 @@ This trace shows:
 
 We have a number of samples in our repository that demonstrate these capabilities, see the [observability samples folder](https://github.com/microsoft/agent-framework/tree/main/python/samples/getting_started/observability) on Github. That includes samples for using zero-code telemetry as well.
 
+## Third-party observability integrations
+
+Traces generated by Agent Framework can be exported to any OpenTelemetry-compatible backend of your choice.
+
+### MLflow
+
+[MLflow](https://mlflow.org/) is a popular open-source platform that provides observability and reproducibility for LLM applications. Agent Framework can export traces to MLflow through MLflow's OTLP endpoint to keep a durable record of agent and workflow runs, inputs and outputs, and derived metrics.
+
+![MLflow Traces](https://mlflow.org/docs/latest/images/llms/tracing/microsoft-agent-framework-tracing.png)
+
+See [MLflow Microsoft Agent Framework integration](https://mlflow.org/docs/latest/genai/tracing/integrations/listing/microsoft-agent-framework/) for how to set up MLflow to collect traces from Agent Framework.
+
 ::: zone-end
 
 ## Next steps
diff --git a/agent-framework/user-guide/workflows/observability.md b/agent-framework/user-guide/workflows/observability.md
index 4f173118..dc03069c 100644
--- a/agent-framework/user-guide/workflows/observability.md
+++ b/agent-framework/user-guide/workflows/observability.md
@@ -49,6 +49,18 @@ For example:
 
 ![Span Relationships](./resources/images/workflow-trace.png)
 
+## Third-party observability integrations
+
+Traces generated by Agent Framework can be exported to any OpenTelemetry-compatible backend of your choice.
+
+### MLflow
+
+[MLflow](https://mlflow.org/) is a popular open-source platform that provides observability and reproducibility for LLM applications. Agent Framework can export traces to MLflow through MLflow's OTLP endpoint to keep a durable record of workflow runs, inputs and outputs, and derived metrics.
+
+![MLflow Traces](https://mlflow.org/docs/latest/images/llms/tracing/microsoft-agent-framework-tracing.png)
+
+See [MLflow Microsoft Agent Framework integration](https://mlflow.org/docs/latest/genai/tracing/integrations/listing/microsoft-agent-framework/) for how to set up MLflow to collect traces from Agent Framework.
+
 ## Next Steps
 
 - [Learn how to use agents in workflows](./using-agents.md) to build intelligent workflows.
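Both new MLflow sections above describe exporting traces over OTLP without showing the wiring. Below is a minimal sketch using the standard OpenTelemetry Python SDK; the endpoint URL, any required headers, and how Agent Framework picks up the tracer provider are assumptions specific to your MLflow deployment, so confirm them against the MLflow integration page linked above.

```python
# Minimal sketch: export spans over OTLP/HTTP to an assumed local MLflow endpoint.
# Requires the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# "http://127.0.0.1:5000/v1/traces" is a placeholder; use the OTLP traces URL of your MLflow server.
exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:5000/v1/traces")

provider = TracerProvider(resource=Resource.create({"service.name": "agent-framework-app"}))
provider.add_span_processor(BatchSpanProcessor(exporter))

# Spans emitted through OpenTelemetry by the application now flow to the configured endpoint.
trace.set_tracer_provider(provider)
```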
diff --git a/semantic-kernel/concepts/enterprise-readiness/observability/TOC.yml b/semantic-kernel/concepts/enterprise-readiness/observability/TOC.yml
index 15b0b896..bced2a36 100644
--- a/semantic-kernel/concepts/enterprise-readiness/observability/TOC.yml
+++ b/semantic-kernel/concepts/enterprise-readiness/observability/TOC.yml
@@ -8,5 +8,7 @@
   href: telemetry-with-aspire-dashboard.md
 - name: 'Example: Azure AI Foundry Tracing'
   href: telemetry-with-azure-ai-foundry-tracing.md
+- name: 'Example: MLflow Tracing'
+  href: telemetry-with-mlflow.md
 - name: 'Advanced telemetry with Semantic Kernel'
   href: telemetry-advanced.md
\ No newline at end of file
diff --git a/semantic-kernel/concepts/enterprise-readiness/observability/index.md b/semantic-kernel/concepts/enterprise-readiness/observability/index.md
index 3363d039..2c57856b 100644
--- a/semantic-kernel/concepts/enterprise-readiness/observability/index.md
+++ b/semantic-kernel/concepts/enterprise-readiness/observability/index.md
@@ -73,6 +73,24 @@ Semantic Kernel follows the [OpenTelemetry Semantic Convention](https://opentele
 > [!Note]
 > Currently, the [Semantic Conventions for Generative AI](https://github.com/open-telemetry/semantic-conventions/blob/main/docs/gen-ai/README.md) are in experimental status. Semantic Kernel strives to follow the OpenTelemetry Semantic Convention as closely as possible, and provide a consistent and meaningful observability experience for AI solutions.
 
+## Third-party observability integrations
+
+Traces generated by Semantic Kernel can be exported to any OpenTelemetry-compatible backend of your choice.
+
+### MLflow
+
+[MLflow](https://mlflow.org/) is a popular open-source platform that provides observability and reproducibility for LLM applications. MLflow supports Semantic Kernel with a one-line automatic tracing setup:
+
+```python
+import mlflow
+
+mlflow.semantic_kernel.autolog()
+```
+
+![MLflow Traces](https://mlflow.org/docs/latest/images/llms/tracing/semantic-kernel-tracing.png)
+
+See [Telemetry with MLflow](telemetry-with-mlflow.md) for more details on how to set up MLflow to collect traces from Semantic Kernel.
+
 ## Next steps
 
 Now that you have a basic understanding of observability in Semantic Kernel, you can learn more about how to output telemetry data to the console or use APM tools to visualize and analyze telemetry data.
diff --git a/semantic-kernel/concepts/enterprise-readiness/observability/telemetry-with-mlflow.md b/semantic-kernel/concepts/enterprise-readiness/observability/telemetry-with-mlflow.md
new file mode 100644
index 00000000..bd2e569b
--- /dev/null
+++ b/semantic-kernel/concepts/enterprise-readiness/observability/telemetry-with-mlflow.md
@@ -0,0 +1,94 @@
+---
+title: Telemetry with MLflow Tracing
+description: Collect Semantic Kernel traces in MLflow using autologging.
+zone_pivot_groups: programming-languages
+author: TaoChenOSU
+ms.topic: conceptual
+ms.author: taochen
+ms.date: 11/25/2025
+ms.service: semantic-kernel
+---
+
+# Inspection of telemetry data with MLflow
+
+[MLflow](https://mlflow.org/) provides tracing for LLM applications and includes a built-in integration for Microsoft Semantic Kernel. With a single line of code, you can capture spans from Semantic Kernel and browse them in the MLflow UI alongside parameters, metrics, and artifacts.
+
+## Prerequisites
+
+- Python 3.10, 3.11, or 3.12.
+- An LLM provider. The example below uses Azure OpenAI chat completions.
+- MLflow UI or Tracking Server (a local Tracking Server is shown below).
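+
+The quickstart below reads your Azure OpenAI settings from environment variables. As a minimal sketch (the variable names match the script later in this article; the values are placeholders for your deployment):
+
+```bash
+export AZURE_OPENAI_API_KEY="<your-api-key>"
+export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
+export AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="<your-chat-deployment>"
+```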
+
+## Setup
+
+::: zone pivot="programming-language-python"
+
+### 1) Install packages
+
+```bash
+pip install semantic-kernel mlflow
+```
+
+### 2) Start the MLflow Tracking Server (local)
+
+```bash
+mlflow server --port 5000 --backend-store-uri sqlite:///mlflow.db
+```
+
+### 3) Create a simple Semantic Kernel script and enable MLflow autologging
+
+Create `telemetry_mlflow_quickstart.py` with the content below and fill in the environment variables for your Azure OpenAI deployment.
+
+```python
+import os
+import asyncio
+import mlflow
+
+from semantic_kernel import Kernel
+from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
+
+# One-line enablement of MLflow tracing for Semantic Kernel
+mlflow.semantic_kernel.autolog()
+
+# Point MLflow at the local Tracking Server and choose an experiment name
+mlflow.set_tracking_uri("http://127.0.0.1:5000")
+mlflow.set_experiment("telemetry-mlflow-quickstart")
+
+
+async def main():
+    # Configure the kernel and add an Azure OpenAI chat service
+    kernel = Kernel()
+    kernel.add_service(AzureChatCompletion(
+        api_key=os.environ.get("AZURE_OPENAI_API_KEY"),
+        endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
+        deployment_name=os.environ.get("AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"),
+    ))
+
+    # Issue a simple prompt; MLflow records spans automatically
+    answer = await kernel.invoke_prompt("Why is the sky blue in one sentence?")
+    print(answer)
+
+
+if __name__ == "__main__":
+    asyncio.run(main())
+```
+
+Run the script:
+
+```bash
+python telemetry_mlflow_quickstart.py
+```
+
+### 4) Inspect traces in MLflow
+
+Open the MLflow UI (default at `http://127.0.0.1:5000`). Navigate to the Traces view to see spans emitted by Semantic Kernel, including function execution and model calls.
+
+![MLflow Traces](https://mlflow.org/docs/latest/images/llms/tracing/semantic-kernel-tracing.png)
+
+::: zone-end
+
+## Next steps
+
+- Explore the [Observability overview](./index.md) for additional exporters and patterns.
+- Review [Advanced telemetry with Semantic Kernel](./telemetry-advanced.md) to customize signals and attributes.
+- Visit [MLflow Semantic Kernel integration](https://mlflow.org/docs/latest/genai/tracing/integrations/listing/semantic-kernel/) for more details on using MLflow with Semantic Kernel.