Commit 4fa1e33

update availability & UI for llm providers
1 parent b96f5ba commit 4fa1e33

File tree

2 files changed: +34 −12 lines changed


docs/ai-interfaces/port-ai/llm-providers-management/overview.md

Lines changed: 9 additions & 12 deletions
```diff
@@ -9,11 +9,6 @@ import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
 
 <BetaFeatureNotice id="ai-form" />
 
-:::warning Limited Availability
-The ability to configure your own LLM providers has limited availability. Please reach out to the Port support team for additional information and access.
-:::
-
-
 Manage and configure the Large Language Model (LLM) providers that power all AI interactions in Port. This feature gives you control over which AI models are used across Port AI Assistant, AI Agents, and other AI-powered features.
 
 ## LLM Approach Overview
```
```diff
@@ -76,8 +71,6 @@ Consider bringing your own LLM provider when you need:
 - **Custom models**: Define custom configuration on models not available through Port's managed infrastructure.
 - **Integration requirements**: Connect with existing AI infrastructure.
 
-**Note**: This feature has limited availability. Contact the Port support team for access.
-
 </details>
 
 <details>
```
```diff
@@ -144,13 +137,17 @@ Yes, you can opt out of data storage even when using your own LLM provider. Howe
 </details>
 
 <details>
-<summary><b>How do I get access to bring your own LLM functionality? (Click to expand)</b></summary>
+<summary><b>How do I configure my own LLM providers? (Click to expand)</b></summary>
+
+To configure your own LLM providers:
 
-The bring your own LLM feature has limited availability. To get access:
+1. **Configure your providers** - Set up your preferred LLM providers using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API endpoint.
+2. **Select defaults** - Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or via the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API.
 
-1. **Contact Port support** - Reach out to the Port support team or your account manager for additional information.
-2. **Get approval** - If approved, you'll receive access to configure your own providers.
-4. **Configure your providers** - Set up your preferred LLM providers and models using the API endpoints.
+:::info UI vs API
+- **Viewing and selecting defaults**: Available in both UI and API
+- **Adding new custom providers**: Requires the API
+:::
 
 This feature is designed for organizations with specific compliance, privacy, or integration requirements that cannot be met by Port's managed infrastructure.
 
```
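The two-step flow added in the FAQ above can be sketched as request-building code. This is a minimal sketch only: the base URL, endpoint path, and body field names are assumptions for illustration, not the authoritative shapes — consult the linked [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API reference for the real contract.

```python
import json

# Assumed base URL for Port's REST API; confirm against the API reference.
BASE_URL = "https://api.getport.io/v1"


def build_create_provider_request(provider_type: str, secret_name: str) -> dict:
    """Step 1 (API only): create or connect an LLM provider.

    The endpoint path and body fields below are illustrative assumptions.
    """
    return {
        "method": "POST",
        "url": f"{BASE_URL}/llm-providers",    # assumed path
        "body": {
            "provider": provider_type,         # e.g. "openai", "anthropic"
            "credentialsSecret": secret_name,  # Port secret holding the API key
        },
    }


# Build (but do not send) a request connecting a hypothetical OpenAI provider.
print(json.dumps(build_create_provider_request("openai", "my-openai-key"), indent=2))
```

Step 2 (selecting defaults) can then be done in the UI or via the Change default LLM provider and model API, as the info admonition in the diff notes.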

docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md

Lines changed: 25 additions & 0 deletions
```diff
@@ -115,12 +115,24 @@ For more details on managing secrets, see the [Port Secrets documentation](/sso-
 
 Use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API to configure your providers. The interactive API reference provides detailed examples and allows you to test the configuration for each provider type (OpenAI, Anthropic, Azure OpenAI, AWS Bedrock).
 
+:::info After configuration
+Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or continue using the API for all operations.
+:::
+
 ## Step 3: Validate Configuration
 
 Test your provider configuration with connection validation using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API with the `validate_connection=true` parameter. The interactive API reference shows how to test your configuration before saving it.
 
 ## Getting Your Current Configuration
 
+You can view your organization's current LLM provider defaults through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. View all configured providers and models.
+3. See which provider and model are currently set as defaults.
+
+**Using the API:**
 Retrieve your organization's current LLM provider defaults using the [Get default LLM provider and model](/api-reference/get-default-llm-provider-and-model) API. The interactive API reference shows the response format and allows you to test the endpoint.
 
 ### System Defaults
```
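The `validate_connection=true` check from Step 3 can be sketched as follows. Only the `validate_connection` query parameter name comes from the docs; the base URL, endpoint path, and body fields are assumptions for illustration — the interactive API reference is the authoritative source.

```python
import json
from urllib.parse import urlencode

BASE_URL = "https://api.getport.io/v1"  # assumed base URL


def build_validated_create_request(provider_type: str, secret_name: str) -> dict:
    """Create a provider with connection validation enabled.

    `validate_connection=true` asks Port to test the configuration before
    saving it. Everything besides that flag is an illustrative assumption.
    """
    query = urlencode({"validate_connection": "true"})
    return {
        "method": "POST",
        "url": f"{BASE_URL}/llm-providers?{query}",  # assumed path
        "body": {
            "provider": provider_type,               # e.g. "azure-openai"
            "credentialsSecret": secret_name,
        },
    }


# Build (but do not send) a validated-create request for inspection.
print(json.dumps(build_validated_create_request("azure-openai", "azure-key"), indent=2))
```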
```diff
@@ -131,6 +143,19 @@ When no organization-specific defaults are configured, Port uses these system de
 
 ## Changing Default Providers
 
+You can change your organization's default LLM provider and model through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. Select your preferred **Default LLM provider** from the dropdown.
+3. Select your preferred **Default model** from the dropdown.
+4. Click **Save** to apply your changes.
+
+:::info Adding new providers
+To add a new custom LLM provider, you still need to use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API. Once a provider is configured, it will appear in the UI dropdown for selection.
+:::
+
+**Using the API:**
 Update your organization's default LLM provider and model using the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API. The interactive API reference provides the request format and response examples.
 
 ## Validation Flow
```
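The "Using the API" path for changing defaults can be sketched in the same request-building style. The HTTP method, endpoint path, field names, and the example provider/model values are all illustrative assumptions; the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API reference has the real request format.

```python
import json

BASE_URL = "https://api.getport.io/v1"  # assumed base URL


def build_change_defaults_request(provider: str, model: str) -> dict:
    """Set the organization-wide default LLM provider and model.

    Mirrors the UI flow (Builder → Organization Settings → AI tab);
    method, path, and field names are illustrative assumptions.
    """
    return {
        "method": "PUT",
        "url": f"{BASE_URL}/llm-providers/default",  # assumed path
        "body": {"provider": provider, "model": model},
    }


# Build (but do not send) a request setting hypothetical defaults.
print(json.dumps(build_change_defaults_request("anthropic", "claude-sonnet"), indent=2))
```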
