### docs/ai-interfaces/port-ai/llm-providers-management/overview.md (9 additions, 12 deletions)
```diff
@@ -9,11 +9,6 @@ import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
 
 <BetaFeatureNotice id="ai-form" />
 
-:::warning Limited Availability
-The ability to configure your own LLM providers has limited availability. Please reach out to the Port support team for additional information and access.
-:::
-
-
 Manage and configure the Large Language Model (LLM) providers that power all AI interactions in Port. This feature gives you control over which AI models are used across Port AI Assistant, AI Agents, and other AI-powered features.
 
 ## LLM Approach Overview
```
```diff
@@ -76,8 +71,6 @@ Consider bringing your own LLM provider when you need:
 - **Custom models**: Define custom configuration on models not available through Port's managed infrastructure.
 - **Integration requirements**: Connect with existing AI infrastructure.
 
-**Note**: This feature has limited availability. Contact the Port support team for access.
-
 </details>
 
 <details>
```
```diff
@@ -144,13 +137,17 @@ Yes, you can opt out of data storage even when using your own LLM provider. Howe
 </details>
 
 <details>
-<summary><b>How do I get access to bring your own LLM functionality? (Click to expand)</b></summary>
+<summary><b>How do I configure my own LLM providers? (Click to expand)</b></summary>
+
+To configure your own LLM providers:
 
-The bring your own LLM feature has limited availability. To get access:
+1. **Configure your providers** - Set up your preferred LLM providers using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API endpoint.
+2. **Select defaults** - Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or via the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API.
 
-1. **Contact Port support** - Reach out to the Port support team or your account manager for additional information.
-2. **Get approval** - If approved, you'll receive access to configure your own providers.
-4. **Configure your providers** - Set up your preferred LLM providers and models using the API endpoints.
+:::info UI vs API
+- **Viewing and selecting defaults**: Available in both UI and API
+- **Adding new custom providers**: Requires the API
+:::
 
 
 This feature is designed for organizations with specific compliance, privacy, or integration requirements that cannot be met by Port's managed infrastructure.
```
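The two-step flow added to this FAQ answer could be sketched as an offline helper that only assembles the API calls. This is a minimal illustration, not Port's documented schema: the base URL, the `/llm-providers` and `/llm-providers/default` paths, and every field name below are assumptions; only the two endpoint names come from the docs.

```python
# Offline sketch of the two FAQ steps. Nothing is sent over the network;
# paths and field names are illustrative assumptions, not Port's real schema.
PORT_API = "https://api.getport.io/v1"  # assumed base URL


def step1_create_provider(provider: str, secret_name: str) -> dict:
    """Step 1: describe a 'Create or connect an LLM provider' call."""
    return {
        "method": "POST",
        "url": f"{PORT_API}/llm-providers",  # assumed path
        "json": {"provider": provider, "secretName": secret_name},  # assumed fields
    }


def step2_set_default(provider: str, model: str) -> dict:
    """Step 2: describe a 'Change default LLM provider and model' call."""
    return {
        "method": "PUT",
        "url": f"{PORT_API}/llm-providers/default",  # assumed path
        "json": {"provider": provider, "model": model},  # assumed fields
    }


requests_to_send = [
    step1_create_provider("anthropic", "my-anthropic-api-key"),
    step2_set_default("anthropic", "claude-sonnet"),
]
```

The point of the sketch is only the ordering: a provider must exist (step 1) before it can be chosen as the default (step 2).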
### docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md (25 additions, 0 deletions)
```diff
@@ -115,12 +115,24 @@ For more details on managing secrets, see the [Port Secrets documentation](/sso-
 
 Use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API to configure your providers. The interactive API reference provides detailed examples and allows you to test the configuration for each provider type (OpenAI, Anthropic, Azure OpenAI, AWS Bedrock).
 
+:::info After configuration
+Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or continue using the API for all operations.
+:::
+
 ## Step 3: Validate Configuration
 
 Test your provider configuration with connection validation using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API with the `validate_connection=true` parameter. The interactive API reference shows how to test your configuration before saving it.
 
 ## Getting Your Current Configuration
 
+You can view your organization's current LLM provider defaults through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. View all configured providers and models.
+3. See which provider and model are currently set as defaults.
+
+**Using the API:**
 Retrieve your organization's current LLM provider defaults using the [Get default LLM provider and model](/api-reference/get-default-llm-provider-and-model) API. The interactive API reference shows the response format and allows you to test the endpoint.
 
 ### System Defaults
```
```diff
@@ -131,6 +143,19 @@ When no organization-specific defaults are configured, Port uses these system de
 
 ## Changing Default Providers
 
+You can change your organization's default LLM provider and model through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. Select your preferred **Default LLM provider** from the dropdown.
+3. Select your preferred **Default model** from the dropdown.
+4. Click **Save** to apply your changes.
+
+:::info Adding new providers
+To add a new custom LLM provider, you still need to use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API. Once a provider is configured, it will appear in the UI dropdown for selection.
+:::
+
+**Using the API:**
 Update your organization's default LLM provider and model using the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API. The interactive API reference provides the request format and response examples.
```
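The read-then-update pattern for defaults described in this hunk could be sketched as below. This is a hedged illustration only: the shared `/llm-providers/default` path, the HTTP methods, and the field names are assumptions, not the documented request format (the interactive API reference is the authority for that).

```python
# Offline sketch of the get/change-default endpoints described above.
# Hypothetical: path, methods, and fields are assumptions for illustration.
PORT_API = "https://api.getport.io/v1"          # assumed base URL
DEFAULTS_PATH = f"{PORT_API}/llm-providers/default"  # assumed path


def get_defaults_request() -> dict:
    """'Get default LLM provider and model' as a request description."""
    return {"method": "GET", "url": DEFAULTS_PATH}


def change_defaults_request(provider: str, model: str) -> dict:
    """'Change default LLM provider and model' as a request description."""
    return {
        "method": "PUT",
        "url": DEFAULTS_PATH,
        "json": {"provider": provider, "model": model},  # assumed fields
    }
```

A typical client would issue the GET first to record the current defaults, then the PUT, so the change can be rolled back if the new model misbehaves.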