Adding Microsoft Fabric to the list of supported providers and clients #9052
base: main
Conversation
Co-authored-by: pawarbi <62612119+pawarbi@users.noreply.github.com>
…ureOpenAI routing Co-authored-by: pawarbi <62612119+pawarbi@users.noreply.github.com>
Co-authored-by: pawarbi <62612119+pawarbi@users.noreply.github.com>
feat(dspy): Add Microsoft Fabric Azure OpenAI integration with automatic authentication
You'll probably want to elaborate on how this is different from the Azure provider support already provided by the litellm LM backend (https://docs.litellm.ai/docs/providers/azure), and why Microsoft does not update the Azure provider support in litellm to include the new APIs when you deprecate old versions.
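For reference, a minimal sketch of how Azure OpenAI is already reachable through the existing litellm-backed `dspy.LM` client; the deployment name, endpoint, and API version below are placeholders, not values from this PR:

```python
import dspy

# Sketch only: "my-deployment" and the endpoint/version values are placeholders.
# litellm routes "azure/<deployment>" model strings to Azure OpenAI.
lm = dspy.LM(
    "azure/my-deployment",
    api_base="https://my-resource.openai.azure.com/",
    api_key="...",
    api_version="2024-08-01-preview",
)
dspy.configure(lm=lm)
```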
```
dspy.configure(lm=lm)
```

Learn more about [Azure OpenAI in Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/data-science/ai-services/ai-services-overview).
Missing information about requirement to install additional packages. And why add the advertisement for your service?
| """ | ||
|
|
||
| # Supported models in Microsoft Fabric | ||
| SUPPORTED_MODELS: ClassVar[set[str]] = {"gpt-5", "gpt-4.1", "gpt-4.1-mini"} |
Why hardcode a static list of supported models? It's already outdated after the release of GPT 5.1
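To illustrate the alternative this review is pointing at, here is a minimal sketch (names are illustrative, not from this PR) that accepts any model string and lets the endpoint reject unsupported deployments instead of maintaining a static set:

```python
# Sketch only: no static SUPPORTED_MODELS allowlist. Unknown or retired models
# surface the service's own error at call time, so the client never goes stale
# when new models (e.g. a GPT-5.1 deployment) are released.
def resolve_model(model: str) -> str:
    if not model or not model.strip():
        raise ValueError("A model/deployment name is required.")
    return model.strip()
```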
| """ | ||
|
|
||
| # Supported models in Microsoft Fabric | ||
| SUPPORTED_MODELS: ClassVar[set[str]] = {"gpt-5", "gpt-4.1", "gpt-4.1-mini"} |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Why hardcode a static list of supported models? It's already outdated after the release of GPT 5.1
    raise ImportError(
        "Microsoft Fabric SDK packages are required to use FabricAzureOpenAI. "
        "These packages are only available in Microsoft Fabric notebooks. "
        "Please ensure you are running in a Fabric environment."
This makes it virtually impossible for any project to accept and maintain this LM provider. Not only would you need a Microsoft account to test it, but you'd also need to figure out how to run parts of CI/CD inside your proprietary notebook service?
    A language model supporting chat or text completion requests for use with DSPy modules.
    """

    def __new__(
If this were the right way to do it, you wouldn't have had to define `__new__` to add this. No other client has done this, so why should this client be an exception?
        messages: list[dict[str, Any]] | None = None,
        **kwargs
    ):
    def forward(self, prompt: str | None = None, messages: list[dict[str, Any]] | None = None, **kwargs):
Formatting-only change. Why mess with this?
Considering the limited audience this would be interesting to, and the dependency on paid proprietary services, perhaps it would be more suitable to add support for this in your own synapse modules, or to publish it as an add-on package on PyPI that your users can install?
Hi, I work at Microsoft Fabric and have been using DSPy. I would like to add a Microsoft Fabric client to DSPy so our users can use it seamlessly. I have added the client and tested it in Fabric as below:
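(The original test snippet did not survive extraction; what follows is a minimal sketch of the usage pattern implied by the PR diff, assuming the client is exposed as `dspy.FabricAzureOpenAI` and resolves the workspace's Azure OpenAI endpoint and credentials automatically inside a Fabric notebook:)

```python
import dspy

# Sketch based on the PR diff, not the author's exact test code.
# "gpt-4.1" is one of the models listed in SUPPORTED_MODELS in the diff.
lm = dspy.FabricAzureOpenAI(model="gpt-4.1")
dspy.configure(lm=lm)

qa = dspy.Predict("question -> answer")
print(qa(question="What is Microsoft Fabric?").answer)
```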
I have also updated the Getting Started page to add Microsoft Fabric. Please review and let me know if you have any changes. Thanks.