[Feature] Enable Prompt caching for anthropic / openai #8925

@onel

Description

What feature would you like to see?

I couldn't find anything in the docs / code regarding a way to enable prompt caching with a model provider.
Is this possible with dspy?

Sending cache=True to dspy.LM enables local caching but not prompt caching.
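
For context, a minimal sketch of the existing flag (the model name is just an example): cache=True turns on DSPy's client-side cache, so repeated identical requests are served locally, but nothing is cached on the provider's servers.

```python
import dspy

# cache=True enables DSPy's local cache: identical requests are
# answered from the client-side cache instead of re-calling the API.
# It does NOT enable provider-side prompt caching.
lm = dspy.LM("anthropic/claude-3-5-sonnet-20241022", cache=True)
dspy.configure(lm=lm)
```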

This would be extremely useful when using dspy.ReAct(), where the same large prompt prefix is sent on every step.
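
Since dspy.LM calls models through LiteLLM, one possible workaround today (and one shape this feature could take) is Anthropic's cache_control content blocks, which LiteLLM forwards to the API. A minimal sketch using litellm directly; the model name and prompt are placeholders:

```python
import litellm

# Placeholder for the large, stable prefix (e.g. the ReAct instructions).
SYSTEM_PROMPT = "...long system prompt repeated on every call..."

# Marking the prefix with cache_control asks Anthropic to cache it
# server-side; later calls that reuse the same prefix read it from the
# cache at a reduced input-token rate.
response = litellm.completion(
    model="anthropic/claude-3-5-sonnet-20241022",
    messages=[
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": SYSTEM_PROMPT,
                    "cache_control": {"type": "ephemeral"},
                }
            ],
        },
        {"role": "user", "content": "First question"},
    ],
)
```

For OpenAI models, prompt caching is applied automatically to sufficiently long prompt prefixes with no request changes, so the ask here mainly concerns exposing Anthropic-style cache_control markers through dspy.LM.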

Would you like to contribute?

  • Yes, I'd like to help implement this.
  • No, I just want to request it.

Additional Context

Using dspy 2.6

Metadata

Labels

enhancement (New feature or request)
