
Commit bf1de41

updating readme for tracing (#43877)
1 parent 14c1d13 commit bf1de41

sdk/ai/azure-ai-projects/README.md

Lines changed: 104 additions & 4 deletions
@@ -384,7 +384,7 @@ The code below shows some evaluation operations. Full list of sample can be foun

**Note:** Tracing functionality is in preliminary preview and is subject to change. Spans, attributes, and events may be modified in future versions.

-You can add an Application Insights Azure resource to your Azure AI Foundry project. See the Tracing tab in your AI Foundry project. If one was enabled, you can get the Application Insights connection string, configure your AI Projects client, and observe the full execution path through Azure Monitor. Typically, you might want to start tracing before you create a client or Agent.
+You can add an Application Insights Azure resource to your Azure AI Foundry project. See the Tracing tab in your AI Foundry project. If one is enabled, you can get the Application Insights connection string, configure your AI Projects client, and observe traces in Azure Monitor. Typically, you might want to start tracing before you create a client or Agent.

### Installation

@@ -404,7 +404,53 @@ pip install opentelemetry-exporter-otlp

### How to enable tracing

-TBD

Here is a code sample that shows how to enable Azure Monitor tracing:

<!-- SNIPPET:sample_agent_basic_with_azure_monitor_tracing.setup_azure_monitor_tracing -->

```python
# Enable Azure Monitor tracing
application_insights_connection_string = project_client.telemetry.get_application_insights_connection_string()
configure_azure_monitor(connection_string=application_insights_connection_string)
```

<!-- END SNIPPET -->

You may also want to create a span for your scenario:

<!-- SNIPPET:sample_agent_basic_with_azure_monitor_tracing.create_span_for_scenario -->

```python
tracer = trace.get_tracer(__name__)
scenario = os.path.basename(__file__)

with tracer.start_as_current_span(scenario):
```

<!-- END SNIPPET -->

See the full sample code in [sample_agent_basic_with_azure_monitor_tracing.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/telemetry/sample_agent_basic_with_azure_monitor_tracing.py).

In addition, you might find it helpful to see the tracing logs in the console. You can achieve this with the following code:

<!-- SNIPPET:sample_agent_basic_with_console_tracing.setup_console_tracing -->

```python
# Setup tracing to console
# Requires opentelemetry-sdk
span_exporter = ConsoleSpanExporter()
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
trace.set_tracer_provider(tracer_provider)
tracer = trace.get_tracer(__name__)

# Enable instrumentation with content tracing
AIProjectInstrumentor().instrument()
```

<!-- END SNIPPET -->

See the full sample code in [sample_agent_basic_with_console_tracing.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/telemetry/sample_agent_basic_with_console_tracing.py).

### Enabling content recording

@@ -427,13 +473,67 @@ Binary data are images and files sent to the service as input messages. When you

**Important:** Binary data can contain sensitive information and may significantly increase trace size. Some trace backends and tracing implementations may have limitations on the maximum size of trace data that can be sent to and/or supported by the backend. Ensure your observability backend and tracing implementation support the expected trace payload sizes when enabling binary data tracing.
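
As a minimal sketch, content recording is typically toggled with an environment variable before instrumenting. The variable name below is the one referenced in the note of the next section; its exact behavior in this package, and the `AIProjectInstrumentor` import path, are assumptions to verify against the full README and linked samples:

```python
import os

# Assumption: this environment variable (referenced in the next section) controls
# whether message contents are recorded by the built-in instrumentation.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# Import path assumed; verify against the package's telemetry module.
from azure.ai.projects.telemetry import AIProjectInstrumentor

# Instrument after setting the variable so the setting is picked up.
AIProjectInstrumentor().instrument()
```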

### How to trace your own functions

The decorator `trace_function` is provided for tracing your own function calls using OpenTelemetry. By default, the function name is used as the name for the span. Alternatively, you can provide the name for the span as a parameter to the decorator; see the sketch below.

**Note:** The `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` environment variable does not affect custom function tracing. When you use the `trace_function` decorator, all parameters and return values are always traced by default.

This decorator handles various data types for function parameters and return values, and records them as attributes in the trace span. The supported data types include:
* Basic data types: str, int, float, bool
* Collections: list, dict, tuple, set
* Special handling for collections:
  - If a collection (list, dict, tuple, set) contains nested collections, the entire collection is converted to a string before being recorded as an attribute.
  - Sets and dictionaries are always converted to strings to ensure compatibility with span attributes.

Object types are omitted, and the corresponding parameter is not traced.

The parameters are recorded in attributes `code.function.parameter.<parameter_name>`, and the return value is recorded in the attribute `code.function.return.value`.
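
For illustration, here is a minimal sketch of decorating a function with `trace_function`. The import path and the optional span-name argument are assumptions based on the description above; check the package's telemetry samples for the exact usage.

```python
# Import path assumed; verify against the package's telemetry module.
from azure.ai.projects.telemetry import trace_function


@trace_function()  # span name defaults to the function name, "fetch_weather"
def fetch_weather(location: str) -> str:
    # "location" is recorded as code.function.parameter.location and the
    # return value as code.function.return.value on the resulting span.
    return f"Sunny in {location}"


@trace_function("convert_temperature")  # span name passed explicitly (assumed parameter form)
def to_fahrenheit(celsius: float) -> float:
    return celsius * 9 / 5 + 32
```

Spans created this way are exported by whichever tracer provider you configured in the earlier sections.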

#### Adding custom attributes to spans

You can add custom attributes to spans by creating a custom span processor. Here's how to define one:

<!-- SNIPPET:sample_agent_basic_with_console_tracing_custom_attributes.custom_attribute_span_processor -->

```python
class CustomAttributeSpanProcessor(SpanProcessor):
    def __init__(self):
        pass

    def on_start(self, span: Span, parent_context=None):
        # Add this attribute to all spans
        span.set_attribute("trace_sample.sessionid", "123")

        # Add another attribute only to create_thread spans
        if span.name == "create_thread":
            span.set_attribute("trace_sample.create_thread.context", "abc")

    def on_end(self, span: ReadableSpan):
        # Clean-up logic can be added here if necessary
        pass
```

<!-- END SNIPPET -->

Then add the custom span processor to the global tracer provider:

<!-- SNIPPET:sample_agent_basic_with_console_tracing_custom_attributes.add_custom_span_processor_to_tracer_provider -->

```python
provider = cast(TracerProvider, trace.get_tracer_provider())
provider.add_span_processor(CustomAttributeSpanProcessor())
```

<!-- END SNIPPET -->

See the full sample code in [sample_agent_basic_with_console_tracing_custom_attributes.py](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/telemetry/sample_agent_basic_with_console_tracing_custom_attributes.py).
531+
### Additional resources

For more information see:

* [Trace AI applications using OpenAI SDK](https://learn.microsoft.com/azure/ai-foundry/how-to/develop/trace-application)
-* Chat-completion samples with console or Azure Monitor tracing enabled. See `samples\inference\azure-openai` folder.
-* The Tracing section in the [README.md file of the azure-ai-agents package](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-agents/README.md#tracing).

## Troubleshooting
