
Conversation

@xiangnuans

Fixes #34153

This PR ensures that reasoning blocks are preserved in AIMessage and AIMessageChunk when using OpenAI compatible models. Previously, reasoning blocks in response content were being filtered out or not properly normalized, causing them to be missing from the message content.

Changes:

  • Updated _convert_dict_to_message to detect and normalize reasoning blocks in content lists (a simplified sketch of the idea follows this list)
  • Modified _format_message_content to preserve reasoning blocks instead of filtering them
  • Enhanced _convert_delta_to_message_chunk to handle reasoning blocks in streaming scenarios
  • Added test_reasoning_blocks_compatible_models to verify reasoning blocks are preserved
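
For illustration, here is a rough sketch of the normalization idea behind the first two bullets. It is a simplified stand-in, not the actual code in the PR; the reasoning_content field and the "reasoning" block shape are placeholders that vary by provider:

from langchain_core.messages import AIMessage


def convert_compatible_message(raw_message: dict) -> AIMessage:
    # Sketch: keep provider reasoning alongside text instead of dropping it.
    content: list = []

    # Some OpenAI-compatible servers return reasoning as a separate field.
    reasoning = raw_message.get("reasoning_content")
    if reasoning:
        content.append({"type": "reasoning", "reasoning": reasoning})

    # Ordinary text stays as a text block next to it.
    text = raw_message.get("content")
    if text:
        content.append({"type": "text", "text": text})

    return AIMessage(content=content)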

Testing:

  • Ran make format, make lint, and make test in libs/partners/openai
  • All checks pass successfully

AI Usage:
Part of this change was drafted with the help of an AI assistant and then reviewed and edited by me.

@github-actions github-actions bot added the integration (Related to a provider partner package integration), openai, and fix labels on Dec 1, 2025
@codspeed-hq

codspeed-hq bot commented Dec 1, 2025

CodSpeed Performance Report

Merging #34157 will not alter performance

Comparing xiangnuans:fix/openai-reasoning-blocks (1b359f7) with master (78c10f8)

Summary

✅ 6 untouched
⏩ 28 skipped ¹

Footnotes

  1. 28 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, click here and archive them to remove them from the performance reports.

Filter out reasoning_content from user messages to match expected behavior
in test__get_request_payload. Reasoning blocks should only be preserved
for assistant messages from OpenAI compatible models.
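
A minimal sketch of that rule (a hypothetical helper, not the PR's code): reasoning blocks survive only when the message being serialized comes from the assistant.

def strip_reasoning_for_role(role: str, content: list) -> list:
    # Keep reasoning blocks only for assistant messages; drop them elsewhere
    # so user messages in the request payload stay clean.
    if role == "assistant":
        return content
    return [
        block
        for block in content
        if not (isinstance(block, dict) and block.get("type") == "reasoning")
    ]
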
@TomaszKaleczyc

Is it possible to make this compatible with the OpenRouter format for passing reasoning details:

# Preserve the assistant message with reasoning_details
messages = [
  {"role": "user", "content": "How many r's are in the word 'strawberry'?"},
  {
    "role": "assistant",
    "content": response.get('content'),
    "reasoning_details": response.get('reasoning_details')  # Pass back unmodified
  },
  {"role": "user", "content": "Are you sure? Think carefully."}
]

For an example, see the OpenRouter documentation.

@xiangnuans
Author

@TomaszKaleczyc Thanks for the suggestion!

This PR focuses specifically on fixing the missing reasoning blocks in OpenAI-compatible responses — ensuring that reasoning content is correctly preserved in AIMessage and AIMessageChunk.

The reasoning_details field used by OpenRouter is an extended, provider-specific compatibility feature. I agree it’s valuable, but it would be cleaner to handle that as a separate, dedicated PR so it doesn’t scope-creep this fix.

Happy to discuss approaches for supporting that extension in a follow-up PR!

@TomaszKaleczyc

OK, but I just want to put it out there: this would really be valuable.

@xiangnuans
Author

Happy to open a follow-up PR for OpenRouter-style reasoning_details once this one lands — just let me know if you’d like me to draft an RFC first.

@sa411022
Contributor

sa411022 commented Dec 5, 2025

Does this work with the chat completions API?

@xiangnuans
Author

Yes — it works with the Chat Completions API.

This PR preserves reasoning blocks in both AIMessage and AIMessageChunk as long as the provider follows the OpenAI-compatible response format. For streaming responses, the reasoning deltas are also handled correctly.

If the provider returns reasoning content in the standard OpenAI-compatible response shape, the logic applies automatically without additional configuration.
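
As a usage sketch (assuming a provider that exposes an OpenAI-compatible Chat Completions endpoint; the model name, base_url, and the exact reasoning block shape below are placeholders):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="some-reasoning-model",          # placeholder model name
    base_url="http://localhost:8000/v1",   # any OpenAI-compatible endpoint
    api_key="not-needed-locally",
)

for chunk in llm.stream("Why is the sky blue?"):
    if isinstance(chunk.content, list):
        for block in chunk.content:
            if isinstance(block, dict) and block.get("type") == "reasoning":
                # With this fix the reasoning deltas should reach AIMessageChunk.
                print("reasoning:", block)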


Labels

fix, integration (Related to a provider partner package integration), openai

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Openai compatible models: "reasoning" block missing in AIMessageChunk, and AIMessage

4 participants