
Commit e20a448

viniciusdsmello authored and stainless-app[bot] committed
refactor: Implement conditional imports for all integration modules (#480)

* feat(tests): add integration tests for conditional imports in modules
  - Introduced a new test suite to validate that integration modules handle optional dependencies correctly.
  - Ensured modules can be imported when dependencies are missing and provide helpful error messages.
  - Verified that all integration modules exist and can be imported when dependencies are available.
  - Implemented comprehensive checks for availability flags and graceful import handling.
  - This addition prevents regressions in conditional import handling across all integrations.

* feat(tracer): enhance conditional imports and type hinting for the Anthropic integration
  - Implemented conditional import handling for the `anthropic` library, allowing graceful degradation when it is not installed.
  - Added type hints for `anthropic` types using forward references to improve code clarity and maintainability.
  - Introduced an informative error message when the `anthropic` library is missing, guiding users on how to install it.

* feat(tracer): improve conditional imports and type hinting for the OpenAI integration
  - Implemented conditional import handling for the `openai` library, allowing graceful degradation when it is not installed.
  - Enhanced type hints using forward references for `openai` types.
  - Introduced informative error messages when the `openai` library is missing, guiding users on how to install it.

* feat(tracer): enhance conditional imports and type hinting for the Mistral integration
  - Implemented conditional import handling for the `mistralai` library, allowing graceful degradation when it is not installed.
  - Improved type hints using forward references for `mistralai` types.
  - Introduced an informative error message when the `mistralai` library is missing, guiding users on how to install it.

* feat(tracer): enhance conditional imports and type hinting for the Groq integration
  - Implemented conditional import handling for the `groq` library, allowing graceful degradation when it is not installed.
  - Improved type hints using forward references for `groq` types.
  - Introduced an informative error message when the `groq` library is missing, guiding users on how to install it.

* feat(tracer): enhance conditional imports and type hinting for the OpenAI integration
  - Improved conditional import handling for the `openai` library, ensuring graceful degradation when it is not installed.
  - Enhanced type hints using forward references for `openai` types.
  - Added an informative error message when the `openai` library is missing, guiding users on how to install it.

* feat(langchain): enhance conditional imports and type hinting for the LangChain integration
  - Implemented conditional import handling for the `langchain` library, allowing graceful degradation when it is not installed.
  - Improved type hints using forward references for `langchain` types.
  - Introduced an informative error message when the `langchain` library is missing, guiding users on how to install it.

* fix(tests): improve exception handling in integration test for conditional imports
  - Narrowed the exception clause in `run_integration_test` to `FileNotFoundError` and `OSError`, ensuring more precise error management.
  - This prevents potential silent failures when unlinking temporary files, improving the robustness of the conditional-import tests.
1 parent 6cb0cd6 commit e20a448
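The pattern this commit applies across the integration modules can be sketched in isolation. A minimal sketch, assuming a stand-in optional dependency (`nonexistent_sdk` is illustrative and deliberately not installed, so the fallback branch runs; names are not the repo's exact code):

```python
from typing import TYPE_CHECKING

# `nonexistent_sdk` stands in for an optional dependency such as `anthropic`
# or `groq`; importing it fails, so the availability flag is set to False.
try:
    import nonexistent_sdk
    HAVE_SDK = True
except ImportError:
    HAVE_SDK = False

# Type checkers evaluate this block, but at runtime TYPE_CHECKING is False,
# so this import never executes and cannot fail.
if TYPE_CHECKING:
    import nonexistent_sdk


def trace_client(client: "nonexistent_sdk.Client") -> "nonexistent_sdk.Client":
    # Fail only when the feature is actually used, with an actionable message.
    if not HAVE_SDK:
        raise ImportError(
            "nonexistent_sdk is not installed. Please install it with: pip install nonexistent_sdk"
        )
    return client
```

The module itself always imports cleanly; only calling `trace_client` without the dependency raises.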

File tree

7 files changed: +437 -39 lines changed


src/openlayer/lib/integrations/anthropic_tracer.py

Lines changed: 18 additions & 6 deletions
@@ -4,18 +4,25 @@
 import logging
 import time
 from functools import wraps
-from typing import Any, Dict, Iterator, Optional, Union
+from typing import Any, Dict, Iterator, Optional, Union, TYPE_CHECKING

-import anthropic
+try:
+    import anthropic
+    HAVE_ANTHROPIC = True
+except ImportError:
+    HAVE_ANTHROPIC = False
+
+if TYPE_CHECKING:
+    import anthropic

 from ..tracing import tracer

 logger = logging.getLogger(__name__)


 def trace_anthropic(
-    client: anthropic.Anthropic,
-) -> anthropic.Anthropic:
+    client: "anthropic.Anthropic",
+) -> "anthropic.Anthropic":
     """Patch the Anthropic client to trace chat completions.

     The following information is collected for each chat completion:
@@ -42,6 +49,11 @@ def trace_anthropic(
     anthropic.Anthropic
         The patched Anthropic client.
     """
+    if not HAVE_ANTHROPIC:
+        raise ImportError(
+            "Anthropic library is not installed. Please install it with: pip install anthropic"
+        )
+
     create_func = client.messages.create

     @wraps(create_func)
@@ -180,7 +192,7 @@ def handle_non_streaming_create(
     *args,
     inference_id: Optional[str] = None,
     **kwargs,
-) -> anthropic.types.Message:
+) -> "anthropic.types.Message":
     """Handles the create method when streaming is disabled.

     Parameters
@@ -227,7 +239,7 @@ def handle_non_streaming_create(


 def parse_non_streaming_output_data(
-    response: anthropic.types.Message,
+    response: "anthropic.types.Message",
 ) -> Union[str, Dict[str, Any], None]:
     """Parses the output data from a non-streaming completion.

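The quoted annotations above (e.g. `"anthropic.Anthropic"`) work because Python stores string annotations verbatim and never resolves them at definition time. A minimal sketch, with `some_missing_dep` as a hypothetical absent dependency:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import some_missing_dep  # hypothetical; only type checkers see this


def wrap(client: "some_missing_dep.Client") -> "some_missing_dep.Client":
    # The quoted annotation is stored as a plain string, so this function
    # can be defined even though some_missing_dep is not installed.
    return client


print(wrap.__annotations__["client"])  # -> some_missing_dep.Client
```

Static type checkers still see the real types through the `TYPE_CHECKING` import, so type safety is preserved without a hard runtime dependency.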
src/openlayer/lib/integrations/async_openai_tracer.py

Lines changed: 16 additions & 4 deletions
@@ -4,9 +4,16 @@
 import logging
 import time
 from functools import wraps
-from typing import Any, AsyncIterator, Optional, Union
+from typing import Any, AsyncIterator, Optional, Union, TYPE_CHECKING

-import openai
+try:
+    import openai
+    HAVE_OPENAI = True
+except ImportError:
+    HAVE_OPENAI = False
+
+if TYPE_CHECKING:
+    import openai

 from .openai_tracer import (
     get_model_parameters,
@@ -19,8 +26,8 @@


 def trace_async_openai(
-    client: Union[openai.AsyncOpenAI, openai.AsyncAzureOpenAI],
-) -> Union[openai.AsyncOpenAI, openai.AsyncAzureOpenAI]:
+    client: Union["openai.AsyncOpenAI", "openai.AsyncAzureOpenAI"],
+) -> Union["openai.AsyncOpenAI", "openai.AsyncAzureOpenAI"]:
     """Patch the AsyncOpenAI or AsyncAzureOpenAI client to trace chat completions.

     The following information is collected for each chat completion:
@@ -47,6 +54,11 @@ def trace_async_openai(
     Union[openai.AsyncOpenAI, openai.AsyncAzureOpenAI]
         The patched AsyncOpenAI client.
     """
+    if not HAVE_OPENAI:
+        raise ImportError(
+            "OpenAI library is not installed. Please install it with: pip install openai"
+        )
+
     is_azure_openai = isinstance(client, openai.AsyncAzureOpenAI)
     create_func = client.chat.completions.create

src/openlayer/lib/integrations/groq_tracer.py

Lines changed: 16 additions & 4 deletions
@@ -4,18 +4,25 @@
 import logging
 import time
 from functools import wraps
-from typing import Any, Dict, Iterator, Optional, Union
+from typing import Any, Dict, Iterator, Optional, Union, TYPE_CHECKING

-import groq
+try:
+    import groq
+    HAVE_GROQ = True
+except ImportError:
+    HAVE_GROQ = False
+
+if TYPE_CHECKING:
+    import groq

 from ..tracing import tracer

 logger = logging.getLogger(__name__)


 def trace_groq(
-    client: groq.Groq,
-) -> groq.Groq:
+    client: "groq.Groq",
+) -> "groq.Groq":
     """Patch the Groq client to trace chat completions.

     The following information is collected for each chat completion:
@@ -42,6 +49,11 @@ def trace_groq(
     groq.Groq
         The patched Groq client.
     """
+    if not HAVE_GROQ:
+        raise ImportError(
+            "Groq library is not installed. Please install it with: pip install groq"
+        )
+
     create_func = client.chat.completions.create

     @wraps(create_func)

src/openlayer/lib/integrations/langchain_callback.py

Lines changed: 31 additions & 13 deletions
@@ -2,11 +2,19 @@

 # pylint: disable=unused-argument
 import time
-from typing import Any, Dict, List, Optional, Union
+from typing import Any, Dict, List, Optional, Union, TYPE_CHECKING
 from uuid import UUID

-from langchain import schema as langchain_schema
-from langchain.callbacks.base import BaseCallbackHandler
+try:
+    from langchain import schema as langchain_schema
+    from langchain.callbacks.base import BaseCallbackHandler
+    HAVE_LANGCHAIN = True
+except ImportError:
+    HAVE_LANGCHAIN = False
+
+if TYPE_CHECKING:
+    from langchain import schema as langchain_schema
+    from langchain.callbacks.base import BaseCallbackHandler

 from ..tracing import tracer, steps, traces, enums
 from .. import utils
@@ -18,10 +26,20 @@
 }


-class OpenlayerHandler(BaseCallbackHandler):
+if HAVE_LANGCHAIN:
+    BaseCallbackHandlerClass = BaseCallbackHandler
+else:
+    BaseCallbackHandlerClass = object
+
+
+class OpenlayerHandler(BaseCallbackHandlerClass):  # type: ignore[misc]
     """LangChain callback handler that logs to Openlayer."""

     def __init__(self, **kwargs: Any) -> None:
+        if not HAVE_LANGCHAIN:
+            raise ImportError(
+                "LangChain library is not installed. Please install it with: pip install langchain"
+            )
         super().__init__()
         self.metadata: Dict[str, Any] = kwargs or {}
         self.steps: Dict[UUID, steps.Step] = {}
@@ -197,7 +215,7 @@ def _convert_step_objects_recursively(self, step: steps.Step) -> None:
     def _convert_langchain_objects(self, obj: Any) -> Any:
         """Recursively convert LangChain objects to JSON-serializable format."""
         # Explicit check for LangChain BaseMessage and its subclasses
-        if isinstance(obj, langchain_schema.BaseMessage):
+        if HAVE_LANGCHAIN and isinstance(obj, langchain_schema.BaseMessage):
             return self._message_to_dict(obj)

         # Handle ChatPromptValue objects which contain messages
@@ -249,7 +267,7 @@ def _convert_langchain_objects(self, obj: Any) -> Any:
         # For everything else, convert to string
         return str(obj)

-    def _message_to_dict(self, message: langchain_schema.BaseMessage) -> Dict[str, str]:
+    def _message_to_dict(self, message: "langchain_schema.BaseMessage") -> Dict[str, str]:
         """Convert a LangChain message to a JSON-serializable dictionary."""
         message_type = getattr(message, "type", "user")
@@ -262,7 +280,7 @@ def _message_to_dict(self, message: langchain_schema.BaseMessage) -> Dict[str, s
         return {"role": role, "content": str(message.content)}

     def _messages_to_prompt_format(
-        self, messages: List[List[langchain_schema.BaseMessage]]
+        self, messages: List[List["langchain_schema.BaseMessage"]]
     ) -> List[Dict[str, str]]:
         """Convert LangChain messages to Openlayer prompt format using
         unified conversion."""
@@ -302,7 +320,7 @@ def _extract_model_info(
         }

     def _extract_token_info(
-        self, response: langchain_schema.LLMResult
+        self, response: "langchain_schema.LLMResult"
     ) -> Dict[str, Any]:
         """Extract token information generically from LLM response."""
         llm_output = response.llm_output or {}
@@ -340,7 +358,7 @@ def _extract_token_info(
             "tokens": token_usage.get("total_tokens", 0),
         }

-    def _extract_output(self, response: langchain_schema.LLMResult) -> str:
+    def _extract_output(self, response: "langchain_schema.LLMResult") -> str:
         """Extract output text from LLM response."""
         output = ""
         for generations in response.generations:
@@ -384,7 +402,7 @@ def on_llm_start(
     def on_chat_model_start(
         self,
         serialized: Dict[str, Any],
-        messages: List[List[langchain_schema.BaseMessage]],
+        messages: List[List["langchain_schema.BaseMessage"]],
         *,
         run_id: UUID,
         parent_run_id: Optional[UUID] = None,
@@ -414,7 +432,7 @@ def on_chat_model_start(

     def on_llm_end(
         self,
-        response: langchain_schema.LLMResult,
+        response: "langchain_schema.LLMResult",
         *,
         run_id: UUID,
         parent_run_id: Optional[UUID] = None,
@@ -590,7 +608,7 @@ def on_text(self, text: str, **kwargs: Any) -> Any:

     def on_agent_action(
         self,
-        action: langchain_schema.AgentAction,
+        action: "langchain_schema.AgentAction",
         *,
         run_id: UUID,
         parent_run_id: Optional[UUID] = None,
@@ -612,7 +630,7 @@ def on_agent_action(

     def on_agent_finish(
         self,
-        finish: langchain_schema.AgentFinish,
+        finish: "langchain_schema.AgentFinish",
         *,
         run_id: UUID,
         parent_run_id: Optional[UUID] = None,

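The LangChain case needs one extra trick: the handler subclasses a framework class, so a conditional import alone is not enough for the class statement itself to succeed. Falling back to `object` as the base keeps module import working; the failure is deferred to instantiation. A minimal sketch with a hypothetical absent framework (`missing_framework` is illustrative):

```python
try:
    from missing_framework.callbacks import BaseCallbackHandler  # hypothetical
    HAVE_FRAMEWORK = True
except ImportError:
    HAVE_FRAMEWORK = False

# Fall back to `object` so the `class` statement never fails at import time;
# the framework's hooks only matter once the handler is instantiated.
BaseHandlerClass = BaseCallbackHandler if HAVE_FRAMEWORK else object


class Handler(BaseHandlerClass):  # type: ignore[misc]
    def __init__(self) -> None:
        # Raise with an actionable message only when someone tries to use it.
        if not HAVE_FRAMEWORK:
            raise ImportError(
                "missing_framework is not installed. Please install it first."
            )
        super().__init__()
```

With the dependency absent, `Handler` is defined with `object` as its base and raises only on construction.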
src/openlayer/lib/integrations/mistral_tracer.py

Lines changed: 18 additions & 6 deletions
@@ -4,18 +4,25 @@
 import logging
 import time
 from functools import wraps
-from typing import Any, Dict, Iterator, Optional, Union
+from typing import Any, Dict, Iterator, Optional, Union, TYPE_CHECKING

-import mistralai
+try:
+    import mistralai
+    HAVE_MISTRAL = True
+except ImportError:
+    HAVE_MISTRAL = False
+
+if TYPE_CHECKING:
+    import mistralai

 from ..tracing import tracer

 logger = logging.getLogger(__name__)


 def trace_mistral(
-    client: mistralai.Mistral,
-) -> mistralai.Mistral:
+    client: "mistralai.Mistral",
+) -> "mistralai.Mistral":
     """Patch the Mistral client to trace chat completions.

     The following information is collected for each chat completion:
@@ -42,6 +49,11 @@ def trace_mistral(
     mistralai.Mistral
         The patched Mistral client.
     """
+    if not HAVE_MISTRAL:
+        raise ImportError(
+            "Mistral library is not installed. Please install it with: pip install mistralai"
+        )
+
     stream_func = client.chat.stream
     create_func = client.chat.complete

@@ -184,7 +196,7 @@ def handle_non_streaming_create(
     *args,
     inference_id: Optional[str] = None,
     **kwargs,
-) -> mistralai.models.ChatCompletionResponse:
+) -> "mistralai.models.ChatCompletionResponse":
     """Handles the create method when streaming is disabled.

     Parameters
@@ -231,7 +243,7 @@ def handle_non_streaming_create(


 def parse_non_streaming_output_data(
-    response: mistralai.models.ChatCompletionResponse,
+    response: "mistralai.models.ChatCompletionResponse",
 ) -> Union[str, Dict[str, Any], None]:
     """Parses the output data from a non-streaming completion.

src/openlayer/lib/integrations/openai_tracer.py

Lines changed: 23 additions & 6 deletions
@@ -4,18 +4,25 @@
 import logging
 import time
 from functools import wraps
-from typing import Any, Dict, Iterator, List, Optional, Union
+from typing import Any, Dict, Iterator, List, Optional, Union, TYPE_CHECKING

-import openai
+try:
+    import openai
+    HAVE_OPENAI = True
+except ImportError:
+    HAVE_OPENAI = False
+
+if TYPE_CHECKING:
+    import openai

 from ..tracing import tracer

 logger = logging.getLogger(__name__)


 def trace_openai(
-    client: Union[openai.OpenAI, openai.AzureOpenAI],
-) -> Union[openai.OpenAI, openai.AzureOpenAI]:
+    client: Union["openai.OpenAI", "openai.AzureOpenAI"],
+) -> Union["openai.OpenAI", "openai.AzureOpenAI"]:
     """Patch the OpenAI or AzureOpenAI client to trace chat completions.

     The following information is collected for each chat completion:
@@ -42,6 +49,11 @@ def trace_openai(
     Union[openai.OpenAI, openai.AzureOpenAI]
         The patched OpenAI client.
     """
+    if not HAVE_OPENAI:
+        raise ImportError(
+            "OpenAI library is not installed. Please install it with: pip install openai"
+        )
+
     is_azure_openai = isinstance(client, openai.AzureOpenAI)
     create_func = client.chat.completions.create

@@ -358,12 +370,17 @@ def parse_non_streaming_output_data(

 # --------------------------- OpenAI Assistants API -------------------------- #
 def trace_openai_assistant_thread_run(
-    client: openai.OpenAI, run: "openai.types.beta.threads.run.Run"
+    client: "openai.OpenAI", run: "openai.types.beta.threads.run.Run"
 ) -> None:
     """Trace a run from an OpenAI assistant.

     Once the run is completed, the thread data is published to Openlayer,
     along with the latency, and number of tokens used."""
+    if not HAVE_OPENAI:
+        raise ImportError(
+            "OpenAI library is not installed. Please install it with: pip install openai"
+        )
+
     _type_check_run(run)

     # Do nothing if the run is not completed
@@ -398,7 +415,7 @@ def trace_openai_assistant_thread_run(

 def _type_check_run(run: "openai.types.beta.threads.run.Run") -> None:
     """Validate the run object."""
-    if not isinstance(run, openai.types.beta.threads.run.Run):
+    if HAVE_OPENAI and not isinstance(run, openai.types.beta.threads.run.Run):
         raise ValueError(f"Expected a Run object, but got {type(run)}.")

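The test suite described in the commit message verifies that each module imports even when its dependency is absent. One way to exercise that without uninstalling anything is to poison `sys.modules` before loading a module that uses the pattern. A sketch of the idea (an assumed approach with illustrative names, not the repo's actual test code):

```python
import importlib.util
import sys
import tempfile
import textwrap
from pathlib import Path

# Source for a toy integration module using the same conditional-import
# pattern as the diffs above; `fakedep` is an illustrative dependency name.
MODULE_SRC = textwrap.dedent(
    """
    try:
        import fakedep
        HAVE_FAKEDEP = True
    except ImportError:
        HAVE_FAKEDEP = False

    def use_feature():
        if not HAVE_FAKEDEP:
            raise ImportError("fakedep is not installed. pip install fakedep")
    """
)

# A None entry in sys.modules makes `import fakedep` raise ImportError,
# simulating an uninstalled dependency even on machines that have it.
sys.modules["fakedep"] = None

# Write the toy module to a temp file and import it from there.
path = Path(tempfile.mkdtemp()) / "toy_integration.py"
path.write_text(MODULE_SRC)
spec = importlib.util.spec_from_file_location("toy_integration", path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)  # must not raise despite the missing dep

assert module.HAVE_FAKEDEP is False
```

Importing succeeds and sets the availability flag to `False`; only calling `use_feature()` raises, which is exactly the regression the new tests guard against.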