
Conversation


pawarbi commented Nov 13, 2025

Hi, I work on Microsoft Fabric and have been using DSPy. I would like to add a Microsoft Fabric client to DSPy so our users can use it seamlessly. I have added the client and tested it in Fabric as below:

import dspy

# Route requests through the Fabric-hosted GPT-4.1 deployment.
print("Using GPT-4.1")
lm = dspy.LM("microsoftfabric/gpt-4.1")
dspy.configure(lm=lm)

predictor = dspy.Predict("question -> answer")
result = predictor(question="What is Microsoft Fabric?")
print(result.answer)

print("=" * 250)

# Switch the same program over to the Fabric-hosted GPT-5 deployment.
print("Using GPT-5")
lm = dspy.LM("microsoftfabric/gpt-5")
dspy.configure(lm=lm)

predictor = dspy.Predict("question -> answer")
result = predictor(question="What is Microsoft Fabric?")
print(result.answer)

[Screenshot: output of the above run in a Fabric notebook]

I have also updated the Getting Started page to add Microsoft Fabric. Please review and let me know if you would like any changes. Thanks.

Copilot AI and others added 5 commits November 10, 2025 23:51

…ureOpenAI routing
feat(dspy): Add Microsoft Fabric Azure OpenAI integration with automatic authentication

Co-authored-by: pawarbi <62612119+pawarbi@users.noreply.github.com>

beapirate commented Nov 13, 2025

You'll probably want to elaborate on how this is different from the Azure provider support already provided by the LiteLLM LM backend (https://docs.litellm.ai/docs/providers/azure), and why Microsoft does not update the Azure provider support in LiteLLM to include the new APIs when old versions are deprecated.
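
For reference, the existing LiteLLM route already covers plain Azure OpenAI deployments through dspy.LM; a minimal sketch, with the deployment name, endpoint, and key as placeholders:

import dspy

# Existing path: the "azure/" prefix is routed by LiteLLM's Azure provider,
# so a plain Azure OpenAI deployment needs no Fabric-specific client.
lm = dspy.LM(
    "azure/my-gpt-4o-deployment",                      # placeholder deployment name
    api_base="https://my-resource.openai.azure.com/",  # placeholder endpoint
    api_key="...",                                     # or set AZURE_API_KEY in the environment
    api_version="2024-08-01-preview",                  # example API version
)
dspy.configure(lm=lm)

The difference this PR presumably adds is resolving the Fabric endpoint and token automatically inside Fabric notebooks, which would be worth spelling out in the description.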

dspy.configure(lm=lm)
```

Learn more about [Azure OpenAI in Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/data-science/ai-services/ai-services-overview).

beapirate commented Nov 13, 2025


Missing information about the requirement to install additional packages. And why add the advertisement for your service?

"""

# Supported models in Microsoft Fabric
SUPPORTED_MODELS: ClassVar[set[str]] = {"gpt-5", "gpt-4.1", "gpt-4.1-mini"}


Why hardcode a static list of supported models? It's already outdated after the release of GPT-5.1.
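
One way to keep such a list from going stale, sketched here with hypothetical names (KNOWN_MODELS and check_model are not part of the PR): treat the set as "tested" models and warn on anything else instead of rejecting it.

import warnings

# Hypothetical alternative to a hard allowlist: pass unrecognized model names
# through with a warning, so newly released models (e.g. gpt-5.1) keep
# working without a code change.
KNOWN_MODELS = {"gpt-5", "gpt-4.1", "gpt-4.1-mini"}

def check_model(model_name: str) -> str:
    if model_name not in KNOWN_MODELS:
        warnings.warn(
            f"'{model_name}' has not been validated against Microsoft Fabric; "
            "forwarding it to the service anyway.",
            stacklevel=2,
        )
    return model_name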

"""

# Supported models in Microsoft Fabric
SUPPORTED_MODELS: ClassVar[set[str]] = {"gpt-5", "gpt-4.1", "gpt-4.1-mini"}

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Why hardcode a static list of supported models? It's already outdated after the release of GPT 5.1

raise ImportError(
"Microsoft Fabric SDK packages are required to use FabricAzureOpenAI. "
"These packages are only available in Microsoft Fabric notebooks. "
"Please ensure you are running in a Fabric environment."


This makes it virtually impossible for any project to accept and maintain this LM provider. Not only would you need a Microsoft account to test it, but you'd also need to figure out how to run parts of CI/CD inside your proprietary notebook service?
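
For what it's worth, the usual way around the CI/CD problem is to stub the notebook-only packages in tests; a rough pytest sketch, where fabric_sdk is a placeholder for whichever modules FabricAzureOpenAI actually imports (an assumption):

import sys
import types
from unittest import mock

# "fabric_sdk" is a stand-in name, not the real import; the point is only that
# notebook-only dependencies can be injected into sys.modules so the provider's
# logic is testable outside a Fabric notebook.
def test_client_with_stubbed_sdk(monkeypatch):
    fake_sdk = types.ModuleType("fabric_sdk")
    fake_sdk.get_token = mock.Mock(return_value="fake-token")
    monkeypatch.setitem(sys.modules, "fabric_sdk", fake_sdk)
    # ...construct the Fabric client here and assert it picked up the stubbed token...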

A language model supporting chat or text completion requests for use with DSPy modules.
"""

def __new__(


If this were the right way to do it, you wouldn't have had to define __new__ to add this. No other client has done this, so why should this client be an exception?
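
For comparison, this can usually be expressed as a thin subclass without overriding __new__; a hedged sketch, assuming the Fabric endpoint is ultimately just an Azure OpenAI deployment reachable through LiteLLM:

import dspy

# Sketch only: resolve Fabric-specific details in __init__ and defer the rest
# to the standard LiteLLM-backed dspy.LM.
class FabricLM(dspy.LM):
    def __init__(self, model: str, **kwargs):
        # The "azure/" routing and credential handling here are assumptions;
        # the real client would plug in Fabric's endpoint and token resolution.
        super().__init__(f"azure/{model}", **kwargs)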

messages: list[dict[str, Any]] | None = None,
**kwargs
):
def forward(self, prompt: str | None = None, messages: list[dict[str, Any]] | None = None, **kwargs):


Formatting-only change. Why mess with this?

@beapirate

Considering the limited audience this would be interesting to and the dependency on paid proprietary services, perhaps it would be more suitable to add support for this in your own Synapse modules, or to publish it as an add-on package on PyPI that your users can install?
