Commit 3e8cb2f

OpenAI agents docs (#99)

Authored by: CakeCrusher
Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

1 parent 60c8e56 commit 3e8cb2f

File tree

7 files changed (+7 lines, -7 lines)


monitoring/introduction.mdx

Lines changed: 1 addition & 2 deletions
@@ -6,8 +6,7 @@ description: "Detect hallucinations and regressions in the quality of your LLMs"
 One of the key features of Traceloop is the ability to monitor the quality of your LLM outputs. It helps you to detect hallucinations and regressions in the quality of your models and prompts.
 
 To start monitoring your LLM outputs, make sure you installed OpenLLMetry and configured it to send data to Traceloop. If you haven't done that yet, you can follow the instructions in the [Getting Started](/openllmetry/getting-started) guide.
-Next, if you're not using a framework like LangChain or LlamaIndex, [make sure to annotate workflows and tasks](/openllmetry/tracing/decorators).
-
+Next, if you're not using a [supported LLM framework](/openllmetry/tracing/supported#frameworks), [make sure to annotate workflows and tasks](/openllmetry/tracing/annotations).
 You can then define any of the following [monitors](https://app.traceloop.com/monitors/prd) to track the quality of your LLM outputs.
 
 <Frame>

openllmetry/getting-started-nextjs.mdx

Lines changed: 1 addition & 1 deletion
@@ -175,7 +175,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply wrap i
 We also have compatible Typescript decorators for class methods which are more convenient.
 
 <Tip>
-If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
 we'll do that for you. No need to add any annotations to your code.
 </Tip>

openllmetry/getting-started-python.mdx

Lines changed: 1 addition & 1 deletion
@@ -58,7 +58,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply add `@
 </Warning>
 
 <Tip>
-If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
 we'll do that for you. No need to add any annotations to your code.
 </Tip>

openllmetry/getting-started-ts.mdx

Lines changed: 1 addition & 1 deletion
@@ -73,7 +73,7 @@ Assume you have a function that renders a prompt and calls an LLM, simply wrap i
 We also have compatible Typescript decorators for class methods which are more convenient.
 
 <Tip>
-If you're using an LLM framework like Haystack, Langchain or LlamaIndex -
+If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) -
 we'll do that for you. No need to add any annotations to your code.
 </Tip>

openllmetry/introduction.mdx

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ Tracing is done in a non-intrusive way, built on top of OpenTelemetry.
 You can choose to export the traces to Traceloop, or to your existing observability stack.
 
 <Tip>
-You can use OpenLLMetry whether you use a framework like LangChain, or
+You can use OpenLLMetry whether you use a [supported LLM framework](/openllmetry/tracing/supported#frameworks), or
 directly interact with a foundation model API.
 </Tip>

openllmetry/tracing/annotations.mdx

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ description: "Enrich your traces by annotating chains and workflows in your app"
 Traceloop SDK supports several ways to annotate workflows, tasks, agents and tools in your code to get a more complete picture of your app structure.
 
 <Tip>
-If you're using a framework like Langchain, Haystack or LlamaIndex - no need
+If you're using a [supported LLM framework](/openllmetry/tracing/supported#frameworks) - no need
 to do anything! OpenLLMetry will automatically detect the framework and
 annotate your traces.
 </Tip>

openllmetry/tracing/supported.mdx

Lines changed: 1 addition & 0 deletions
@@ -50,3 +50,4 @@ In the meantime, you can still use OpenLLMetry to report the [LLM and vector DB
 | [Haystack by deepset](https://haystack.deepset.ai/) |||
 | [Langchain](https://www.langchain.com/) |||
 | [LlamaIndex](https://www.llamaindex.ai/) |||
+| [OpenAI Agents](https://github.com/openai/openai-agents-python) |||

0 commit comments
