diff --git a/docs/ai-interfaces/AGENTS.md b/docs/ai-interfaces/AGENTS.md
new file mode 100644
index 0000000000..1d4c1bc15e
--- /dev/null
+++ b/docs/ai-interfaces/AGENTS.md
@@ -0,0 +1,148 @@
+---
+sidebar_class_name: hidden
+title: AGENTS.md - AI Documentation Maintenance Guide
+---
+
+# AGENTS.md - AI Documentation Maintenance Guide
+
+This hidden guide explains how AI agents should maintain and update the AI interfaces documentation in Port Docs.
+
+## File Structure
+
+The AI interfaces documentation files are located in:
+- `/docs/ai-interfaces/` - Main documentation files
+- `/src/data/mcpTools.js` - MCP tool definitions (JavaScript data module)
+- `/src/components/ToolsList/` - React component that displays tools from `mcpTools.js`
+
+## Common Tasks
+
+### Update MCP Tools
+
+To add, update, or remove MCP tools:
+
+1. **Update the file**: `src/data/mcpTools.js`
+2. **Ask for clarification**: Determine if the tool is for `builder` only or for both `developer` and `builder` roles
+3. The `ToolsList` component automatically reads from this file and displays tools in the documentation
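The exact schema of `src/data/mcpTools.js` is not reproduced in this guide, so the field names below are illustrative assumptions only; mirror the existing entries in the file rather than this sketch:

```javascript
// Hypothetical entry for src/data/mcpTools.js — the field names here are
// illustrative assumptions; copy the schema of the existing entries instead.
const exampleTool = {
  name: "count_entities",          // exact tool identifier
  description: "Count entities matching a query.",
  roles: ["developer", "builder"], // or ["builder"] for builder-only tools
};

console.log(exampleTool.name); // → count_entities
```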
+
+### Set Feature Beta Status
+
+When asked to mark a feature as **open beta** or **closed beta**, use the appropriate component:
+
+#### For Open Beta Features
+
+Use the `BetaFeatureNotice` component:
+
+```markdown
+import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
+
+<BetaFeatureNotice />
+```
+
+**Location**: Place this after the page title (H1) and before the main content.
+
+#### For Closed Beta Features
+
+Use the `ClosedBetaFeatureNotice` component:
+
+```markdown
+import ClosedBetaFeatureNotice from '/docs/generalTemplates/_closed_beta_feature_notice.md'
+
+<ClosedBetaFeatureNotice id="slack-app" />
+```
+
+**Location**: Place this after the page title (H1) and before the main content.
+
+**ID**: Use a specific feature identifier (e.g., `"slack-app"`). Check `/docs/generalTemplates/_closed_beta_feature_notice_defs.js` for available IDs and their custom messages. You might need to create a new definition; ask the user before creating one.
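The definitions file's exact shape is not shown in this guide, so treat the structure below as a hedged assumption of what a new entry might look like:

```javascript
// Hypothetical sketch of /docs/generalTemplates/_closed_beta_feature_notice_defs.js —
// the key and field names are assumptions; check the real file before adding an ID.
const closedBetaDefs = {
  "slack-app": {
    message: "The Slack app is currently in closed beta.", // custom notice message
  },
};

console.log(Object.keys(closedBetaDefs)); // → [ 'slack-app' ]
```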
+
+### Add New LLM Model
+
+To add a new LLM model to the documentation:
+
+1. **Identify the model name and providers**: Determine the exact model identifier (e.g., `gpt-5`) and which providers support it (e.g., Anthropic, AWS Bedrock).
+
+2. **Update the main documentation file**: Edit `docs/ai-interfaces/port-ai/llm-providers-management/overview.md` and add the model to the supported providers list.
+
+3. **Note about API reference files**: The API reference files in `docs/api-reference/` (like `general-purpose-ai-interactions.api.mdx`, `invoke-a-specific-agent.api.mdx`, etc.) are auto-generated from OpenAPI specifications. These will be updated automatically when the backend API is updated. You don't need to manually edit these files.
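For example, the supported providers list in that file, after adding a Claude Haiku model to the Anthropic and AWS Bedrock entries, reads:

```markdown
- **OpenAI**: `gpt-5`
- **Anthropic**: `claude-sonnet-4-20250514`, `claude-haiku-4-5-20251001`
- **Azure OpenAI**: `gpt-5`
- **AWS Bedrock**: `claude-sonnet-4-20250514`, `claude-haiku-4-5-20251001`
```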
+
+### Update MCP Installation Instructions
+
+The MCP installation instructions are maintained in a reusable component that is imported into the main documentation page.
+
+**Location**: `/docs/generalTemplates/_mcp-installation.md`
+
+**Usage**: This component is imported in `docs/ai-interfaces/port-mcp-server/overview-and-installation.md`.
+
+**What you could update**:
+- Add new client support
+- Update client instructions
+- Update disclaimers/warnings/admonitions
+
+### Update Feature Support Matrix
+
+The Feature Support Matrix table shows which capabilities are supported across Port's AI interfaces.
+
+**Location**: `docs/ai-interfaces/overview.md` - in the "Feature Support Matrix" section
+
+**Structure**:
+- **Rows**: Each Port AI interface (Port MCP Server, Port AI Invocation, Port AI Agents, Port AI Chat Widget, Port Slack App, Port AI Assistant)
+- **Columns**: Capabilities/features (Context Lake Query, Run Actions, Manage Blueprints, etc.)
+- **Indicators**:
+ - ✅ = Supported
+ - ❌ = Not supported
+
+**Column Ordering Rules**:
+1. **First columns**: Features that have at least one ✅ (supported by at least one interface)
+2. **Last columns**: Features that have all ❌ (not supported by any interface)
+3. **Final column**: "Manage Data Mapping" should always be the last column (it has all ❌)
+
+**Current column order** (from first to last):
+1. Feature (row identifier)
+2. Context Lake Query
+3. Run Actions
+4. Manage Blueprints
+5. Manage Entities
+6. Manage Scorecards
+7. Manage Actions
+8. Reuse Prompts
+9. Invoke AI Agents
+10. Manage Pages & Widgets (all ❌)
+11. Manage Integrations (all ❌)
+12. Manage Data Mapping (all ❌ - must be last)
+
+**When updating**:
+- If a new feature is added that has all ❌, add it before "Manage Data Mapping" (which must remain last)
+- If a feature's support status changes from all ❌ to having at least one ✅, move it to the appropriate position in the first group
+- If a new AI interface is added, add it as a new row
+- If a new capability is added, determine its position based on whether it has any ✅ or all ❌
+
+**Format**:
+- Use markdown table format
+- Wrap the table in a scrollable container so it scrolls horizontally on smaller screens
+- Use emojis ✅ and ❌ (not text symbols)
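A truncated sketch of the expected table format (first two capability columns only):

```markdown
| Feature             | Context Lake Query | Run Actions |
|---------------------|--------------------|-------------|
| **Port MCP Server** | ✅                 | ✅          |
| **Port AI Agents**  | ✅                 | ✅          |
```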
+
+### Update Monthly Limits
+
+When asked to update the monthly quota/limit for AI invocations:
+
+1. **Identify all locations**: Search for mentions of monthly limits/quota in the AI interfaces documentation:
+ - `docs/ai-interfaces/port-ai/api-interaction.md` - Contains rate limits section and example JSON responses
+ - `docs/ai-interfaces/port-ai/overview.md` - Contains limits section and FAQ entries
+
+2. **Update text descriptions**: Replace the quota number in all text descriptions (e.g., "20 AI invocations per month" → "50 AI invocations per month")
+
+3. **Update example JSON**: Update the `monthlyLimit` value in example JSON responses to match the new limit. Also update `remainingQuota` to be one less than the limit (e.g., if limit is 50, remainingQuota should be 49)
+
+4. **Files to check**:
+ - `docs/ai-interfaces/port-ai/api-interaction.md` - Lines with "Monthly Quota" section and JSON examples
+ - `docs/ai-interfaces/port-ai/overview.md` - Lines with "Monthly Quota" section and FAQ entries
+
+5. **Search pattern**: Look for patterns like:
+ - "20 AI invocations per month"
+ - `"monthlyLimit": 20`
+ - "Default quota: 20"
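The replacements in steps 2-3 can be sketched as follows (an illustrative snippet, not a script that exists in the repo; it assumes the limit changes from 20 to 50):

```javascript
// Illustrative sketch of the quota bump described above (not a repo script).
const OLD = 20;
const NEW = 50;

function bumpQuota(text) {
  return text
    .replace(new RegExp(`${OLD} AI invocations per month`, "g"), `${NEW} AI invocations per month`)
    .replace(new RegExp(`"monthlyLimit": ${OLD}`, "g"), `"monthlyLimit": ${NEW}`)
    // remainingQuota in the JSON examples is one less than the limit
    .replace(new RegExp(`"remainingQuota": ${OLD - 1}`, "g"), `"remainingQuota": ${NEW - 1}`)
    .replace(new RegExp(`Default quota: ${OLD}`, "g"), `Default quota: ${NEW}`);
}

console.log(bumpQuota('"monthlyLimit": 20, "remainingQuota": 19'));
// → "monthlyLimit": 50, "remainingQuota": 49
```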
+
+**Note**: The API reference files in `docs/api-reference/` are auto-generated from OpenAPI specifications and will be updated automatically when the backend API is updated. You don't need to manually edit these files.
+
+## Other Tasks
+
+For other tasks not listed above, ask for clarification from the user. Once understood, add a new section to the "Common Tasks" section of this file.
diff --git a/docs/ai-interfaces/ai-agents/build-an-ai-agent.md b/docs/ai-interfaces/ai-agents/build-an-ai-agent.md
index b2e31a0cc6..47baf88669 100644
--- a/docs/ai-interfaces/ai-agents/build-an-ai-agent.md
+++ b/docs/ai-interfaces/ai-agents/build-an-ai-agent.md
@@ -5,12 +5,11 @@ title: Build an AI agent
# Build an AI agent
-:::info Closed Beta
-Port's AI offerings are currently in closed beta and will be gradually rolled out to users by the end of 2025.
-:::
-
import Tabs from "@theme/Tabs"
import TabItem from "@theme/TabItem"
+import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
+
+<BetaFeatureNotice />
:::info Built on Port AI
AI Agents are built on top of [Port AI](/ai-interfaces/port-ai/overview) and leverage its underlying capabilities for data access, security, and execution. This guide focuses on agent-specific building techniques.
@@ -118,7 +117,7 @@ From [john-123](https://github.com/john-123)
I don't see an option to add an AI agent (Click to expand)
-Make sure you have [access to the AI agents feature](/ai-interfaces/ai-agents/overview#access-to-the-feature). Note that it's currently in closed beta and requires special access. If you believe you should have access, please contact our support.
+Make sure you have [access to the AI agents feature](/ai-interfaces/ai-agents/overview#access-to-the-feature). If you believe you should have access, please contact our support.
diff --git a/docs/ai-interfaces/ai-agents/interact-with-ai-agents.md b/docs/ai-interfaces/ai-agents/interact-with-ai-agents.md
index d9cad197ef..a0eec86598 100644
--- a/docs/ai-interfaces/ai-agents/interact-with-ai-agents.md
+++ b/docs/ai-interfaces/ai-agents/interact-with-ai-agents.md
@@ -5,12 +5,11 @@ title: Interact with AI agents
# Interact with AI agents
-:::info Closed Beta
-Port's AI offerings are currently in closed beta and will be gradually rolled out to users by the end of 2025.
-:::
-
import Tabs from "@theme/Tabs"
import TabItem from "@theme/TabItem"
+import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
+
+<BetaFeatureNotice />
:::info Built on Port AI
AI Agents are specialized implementations built on top of [Port AI](/ai-interfaces/port-ai/overview), designed for machine-to-machine communication and autonomous operations within defined domains.
@@ -54,6 +53,8 @@ The widget provides a chat interface where you can ask questions and receive res
The widget will inherit all the agent's configuration including the prompts, conversation starters, tool access, etc.
+**Context Awareness**: The widget automatically understands the context of the page and entity where it's located. For example, when placed on a team entity page, you can ask questions like "What is this team's lead time for change?" or "How many open bugs does the team have?" without needing to specify the team name.
+
Conversation starters appear in the initial state, helping users understand what they can ask the agent. Users can either click a starter to begin a new chat or type their own question.
diff --git a/docs/ai-interfaces/ai-agents/overview.md b/docs/ai-interfaces/ai-agents/overview.md
index 85ee137bcb..0738b625c6 100644
--- a/docs/ai-interfaces/ai-agents/overview.md
+++ b/docs/ai-interfaces/ai-agents/overview.md
@@ -6,13 +6,11 @@ title: Overview
import Tabs from "@theme/Tabs"
import TabItem from "@theme/TabItem"
import PortTooltip from "/src/components/tooltip/tooltip.jsx"
-import ClosedBetaFeatureNotice from '/docs/generalTemplates/_closed_beta_feature_notice.md'
+import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
# AI agents overview
-:::info Closed Beta
-Port's AI offerings are currently in closed beta and will be gradually rolled out to users by the end of 2025.
-:::
+<BetaFeatureNotice />
:::info Built on Port AI
AI Agents are specialized implementations built on top of [Port AI](/ai-interfaces/port-ai/overview), designed for specific domains and machine-to-machine communication. For general AI capabilities and human interaction, see [Port AI](/ai-interfaces/port-ai/overview).
diff --git a/docs/ai-interfaces/ai-chat-widget.md b/docs/ai-interfaces/ai-chat-widget.md
index da4d27dbea..c3dad74575 100644
--- a/docs/ai-interfaces/ai-chat-widget.md
+++ b/docs/ai-interfaces/ai-chat-widget.md
@@ -24,6 +24,7 @@ The AI Chat Widget is a dashboard component that:
- **Uses configured tools**: Operates with the specific [Port AI tools](/ai-interfaces/port-ai/overview#port-ai-tools) you select.
- **Customizable placement**: Can be embedded in any dashboard layout.
- **Respects permissions**: Only accesses data based on user permissions.
+- **Context-aware**: Automatically understands the page and entity context where it's located, allowing you to ask questions about the specific entity or page without explicitly mentioning it.
## Widget Configuration
@@ -82,6 +83,17 @@ Users can type their own questions and requests directly into the chat interface
- Show visual indicators when tools are being used.
- Provide links to relevant Port pages and actions.
+### Context Awareness
+
+The AI Chat Widget automatically understands the context of the page and entity where it's located. This means you can ask questions about the specific entity or page without needing to explicitly mention it.
+
+**Examples:**
+- On a **team entity page**: "What is this team's lead time for change?" or "How many open bugs does the team have?"
+- On a **service entity page**: "What's the deployment status?" or "Show me recent incidents"
+- On a **dashboard**: Questions will be answered in the context of the dashboard's scope
+
+The widget uses this context to provide more relevant and accurate responses, making it easier to get information about the specific entity or page you're viewing.
+
### Tool Transparency
The widget interface provides enhanced capabilities and visual indicators showing which tools are being used:
@@ -314,4 +326,17 @@ For comprehensive information, see [Security Considerations](#security-considera
- [Data Privacy & Retention](/ai-interfaces/port-ai/security-and-data-controls#data-privacy--retention) - How your data is handled and stored.
+
+<details>
+<summary>Does the widget understand the context of the page it's on? (Click to expand)</summary>
+
+Yes! The AI Chat Widget automatically understands the context of the page and entity where it's located. This means you can ask questions about the specific entity or page without needing to explicitly mention it.
+
+**Examples:**
+- On a **team entity page**: "What is this team's lead time for change?" or "How many open bugs does the team have?"
+- On a **service entity page**: "What's the deployment status?" or "Show me recent incidents"
+- On a **dashboard**: Questions will be answered in the context of the dashboard's scope
+
+The widget uses this context to provide more relevant and accurate responses. Learn more in [Context Awareness](#context-awareness).
+
+</details>
The AI Chat Widget provides a powerful way to bring [Port AI](/ai-interfaces/port-ai/overview) capabilities directly into your team's daily workflows through customized dashboard experiences.
diff --git a/docs/ai-interfaces/overview.md b/docs/ai-interfaces/overview.md
index c925c26da2..ddc06f5942 100644
--- a/docs/ai-interfaces/overview.md
+++ b/docs/ai-interfaces/overview.md
@@ -5,7 +5,7 @@ title: Overview
# AI Interfaces Overview
-Port's AI interfaces provide intelligent assistance across your entire software development lifecycle. Our AI capabilities are currently in **open beta**, with Slack App in **closed beta**.
+Port's AI interfaces provide intelligent assistance across your entire software development lifecycle. All AI features are currently in **open beta**.
We're committed to developing AI responsibly, maintaining high standards of data privacy and security. **[Learn more about our security & data controls →](/ai-interfaces/port-ai/security-and-data-controls)**
@@ -82,7 +82,7 @@ The MCP Server provides AI machine interface capabilities that are compatible wi
**[Set up MCP Server →](/ai-interfaces/port-mcp-server/overview-and-installation)**
### Slack App
-Interact with Port's AI capabilities directly from Slack. Ask questions, run actions, and get insights without leaving your team communication platform. **Currently in closed beta** - we are not accepting new applications at the moment and will update once it moves to open beta.
+Interact with Port's AI capabilities directly from Slack. Ask questions, run actions, and get insights without leaving your team communication platform.
**[Learn about the Slack App →](/ai-interfaces/slack-app)**
@@ -100,6 +100,23 @@ Set up the **MCP Server** to bring Port's AI capabilities directly into your dev
### For Team Collaboration
Use the **Slack App** to make AI insights available to your entire team in your communication platform.
+## Feature Support Matrix
+
+The following table shows which capabilities are supported across Port's AI interfaces:
+
+
+
+| Feature | Context Lake Query | Run Actions | Manage Blueprints | Manage Entities | Manage Scorecards | Manage Actions | Reuse Prompts | Invoke AI Agents | Manage Pages & Widgets | Manage Integrations | Manage Data Mapping |
+|--------------------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|
+| **Port MCP Server** | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ |
+| **Port AI Invocation** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+| **Port AI Agents** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+| **Port AI Chat Widget** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ |
+| **Port Slack App** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+| **Port AI Assistant** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+
+
+
## Frequently Asked Questions
@@ -109,7 +126,7 @@ Port offers four main AI interfaces:
- **Port AI Assistant**: Chat interface for quick questions and insights.
- **AI Agents**: Customizable workflows for automations.
- **MCP Server**: IDE integration for development workflows.
-- **Slack App**: Team collaboration interface (closed beta).
+- **Slack App**: Team collaboration interface.
@@ -129,7 +146,7 @@ If you're a developer who works primarily in an IDE, consider starting with the
For custom workflows or automation, explore **AI Agents**. **[Learn about AI Agents →](/ai-interfaces/ai-agents/overview)**
-For team collaboration, try the **Slack App** to bring AI insights into your communication platform (closed beta). **[Explore Slack App →](/ai-interfaces/slack-app)**
+For team collaboration, try the **Slack App** to bring AI insights into your communication platform. **[Explore Slack App →](/ai-interfaces/slack-app)**
@@ -138,7 +155,7 @@ For team collaboration, try the **Slack App** to bring AI insights into your com
- **Port AI Assistant**: Open beta - available to all users.
- **MCP Server**: Open beta - available to all users.
- **AI Agents**: Open beta - available to all users.
-- **Slack App**: Closed beta - not accepting new applications at the moment.
+- **Slack App**: Open beta - available to all users.
diff --git a/docs/ai-interfaces/port-ai/api-interaction.md b/docs/ai-interfaces/port-ai/api-interaction.md
index 1d25595064..f094efe45f 100644
--- a/docs/ai-interfaces/port-ai/api-interaction.md
+++ b/docs/ai-interfaces/port-ai/api-interaction.md
@@ -54,7 +54,7 @@ curl 'https://api.port.io/v1/ai/invoke' \
-H 'Content-Type: application/json' \
--data-raw '{
"prompt":"What services are failing health checks?",
- "tools": ["^(list|get|search)_.*"],
+ "tools": ["^(list|get|search|count)_.*"],
"labels": {
"source": "monitoring_system",
"environment": "production",
@@ -103,8 +103,8 @@ data: {
"remainingTimeMs": 903
},
"monthlyQuotaUsage": {
- "monthlyLimit": 20,
- "remainingQuota": 19,
+ "monthlyLimit": 50,
+ "remainingQuota": 49,
"month": "2025-09",
"remainingTimeMs": 1766899073
}
@@ -174,8 +174,8 @@ Signals that Port AI has finished processing and the response stream is complete
"remainingTimeMs": 903
},
"monthlyQuotaUsage": {
- "monthlyLimit": 20,
- "remainingQuota": 19,
+ "monthlyLimit": 50,
+ "remainingQuota": 49,
"month": "2025-09",
"remainingTimeMs": 1766899073
}
@@ -245,7 +245,7 @@ Port acts as a bridge to leading LLM providers and doesn't host LLM models inter
- These limits reset hourly.
### Monthly Quota
-- **Default quota**: 20 AI invocations per month.
+- **Default quota**: 50 AI invocations per month.
- Each interaction with Port AI counts as one request against your quota.
- Quota resets monthly.
@@ -322,14 +322,15 @@ Include a `tools` parameter in your API request with an array of regex patterns.
Perfect for monitoring dashboards and reporting systems where no modifications should be made.
```json
-["^(list|get|search|track|describe)_.*"]
+["^(list|get|search|count|track|describe)_.*"]
```
**What this matches:**
- `get_entities`, `get_blueprint`, `get_scorecard`.
- `list_entities`, `search_entities`.
+- `count_entities`.
- `describe_user_details`.
-- `search_port_docs_sources`, `ask_port_docs`.
+- `search_port_sources`.
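These patterns can be sanity-checked as ordinary regular expressions (a standalone sketch; it assumes the server applies the patterns with standard regex matching semantics):

```javascript
// Standalone sketch: check which tool names the read-only pattern matches,
// assuming the tools filter is evaluated as an ordinary regular expression.
const readOnly = /^(list|get|search|count|track|describe)_.*/;

const tools = [
  "get_entities",
  "list_entities",
  "count_entities",
  "search_port_sources",
  "delete_entity", // should NOT match
  "run_action",    // should NOT match
];

const allowed = tools.filter((t) => readOnly.test(t));
console.log(allowed);
// → [ 'get_entities', 'list_entities', 'count_entities', 'search_port_sources' ]
```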
@@ -371,11 +372,12 @@ Target specific third-party service integrations.
Enables entity operations while preventing accidental deletions.
```json
-["(?!delete_)\\w+_entity$", "get_.*", "list_.*"]
+["(?!delete_)\\w+_entity$", "get_.*", "list_.*", "count_.*"]
```
**What this matches:**
- `get_entity`, `list_entities`, `create_entity`, `update_entity`.
+- `count_entities`.
- **Excludes:** `delete_entity`.
@@ -390,7 +392,7 @@ Focus on documentation search and help functionality.
```
**What this matches:**
-- `search_port_docs_sources`, `ask_port_docs`.
+- `search_port_sources`.
- `describe_user_details`.
@@ -401,13 +403,13 @@ Focus on documentation search and help functionality.
Focus on catalog structure and quality metrics without action execution.
```json
-[".*blueprint.*", ".*scorecard.*", "^(get|list)_.*"]
+[".*blueprint.*", ".*scorecard.*", "^(get|list|count)_.*"]
```
**What this matches:**
- `get_blueprints`, `get_blueprint`.
- `get_scorecards`, `get_scorecard`.
-- All get/list operations.
+- All get/list/count operations.
@@ -448,7 +450,7 @@ curl 'https://api.port.io/v1/ai/invoke' \
-H 'Content-Type: application/json' \
--data-raw '{
"prompt": "What services are failing health checks?",
- "tools": ["^(list|get|search)_.*"],
+ "tools": ["^(list|get|search|count)_.*"],
"labels": {
"source": "monitoring_system",
"check_type": "health_analysis"
@@ -476,7 +478,7 @@ async function checkServiceHealth(serviceName) {
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
prompt: `Analyze the health of service ${serviceName}`,
- tools: ['^(list|get|search)_.*'],
+ tools: ['^(list|get|search|count)_.*'],
labels: {
source: 'monitoring_dashboard',
service: serviceName,
@@ -531,7 +533,7 @@ Automatically trigger Port AI based on catalog events using Port's automation sy
},
"body": {
"prompt": "Infrastructure component {{ .event.diff.after.title }} is unhealthy. Analyze the issue and suggest remediation steps based on current state and recent changes.",
- "tools": ["^(list|get|search)_.*", "run_.*incident.*", "run_.*notification.*"],
+ "tools": ["^(list|get|search|count)_.*", "run_.*incident.*", "run_.*notification.*"],
"labels": {
"source": "automation",
"entity_type": "{{ .event.diff.after.blueprint }}",
@@ -571,7 +573,7 @@ Create actions that invoke Port AI for on-demand analysis:
},
"body": {
"prompt": "Analyze the health of service {{ .entity.title }}. Check metrics, recent deployments, incidents, and provide actionable recommendations.",
- "tools": ["^(list|get|search)_.*", "run_.*incident.*"],
+ "tools": ["^(list|get|search|count)_.*", "run_.*incident.*"],
"labels": {
"source": "self_service",
"service_name": "{{ .entity.identifier }}",
diff --git a/docs/ai-interfaces/port-ai/llm-providers-management/overview.md b/docs/ai-interfaces/port-ai/llm-providers-management/overview.md
index 435d39f05a..aff8b1ba09 100644
--- a/docs/ai-interfaces/port-ai/llm-providers-management/overview.md
+++ b/docs/ai-interfaces/port-ai/llm-providers-management/overview.md
@@ -9,11 +9,6 @@ import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
-:::warning Limited Availability
-The ability to configure your own LLM providers has limited availability. Please reach out to the Port support team for additional information and access.
-:::
-
-
Manage and configure the Large Language Model (LLM) providers that power all AI interactions in Port. This feature gives you control over which AI models are used across Port AI Assistant, AI Agents, and other AI-powered features.
## LLM Approach Overview
@@ -39,9 +34,9 @@ For organizations requiring additional control, Port also supports configuring y
Port supports the following LLM providers and models:
- **OpenAI**: `gpt-5`
-- **Anthropic**: `claude-sonnet-4-20250514`
+- **Anthropic**: `claude-sonnet-4-20250514`, `claude-haiku-4-5-20251001`
- **Azure OpenAI**: `gpt-5`
-- **AWS Bedrock**: `claude-sonnet-4-20250514`
+- **AWS Bedrock**: `claude-sonnet-4-20250514`, `claude-haiku-4-5-20251001`
Port AI leverages `gpt-5` and `claude-sonnet-4-20250514` by default when no custom provider is configured.
@@ -76,8 +71,6 @@ Consider bringing your own LLM provider when you need:
- **Custom models**: Define custom configuration on models not available through Port's managed infrastructure.
- **Integration requirements**: Connect with existing AI infrastructure.
-**Note**: This feature has limited availability. Contact the Port support team for access.
-
@@ -144,13 +137,17 @@ Yes, you can opt out of data storage even when using your own LLM provider. Howe
-How do I get access to bring your own LLM functionality? (Click to expand)
+How do I configure my own LLM providers? (Click to expand)
+
+To configure your own LLM providers:
-The bring your own LLM feature has limited availability. To get access:
+1. **Configure your providers** - Set up your preferred LLM providers using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API endpoint.
+2. **Select defaults** - Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or via the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API.
-1. **Contact Port support** - Reach out to the Port support team or your account manager for additional information.
-2. **Get approval** - If approved, you'll receive access to configure your own providers.
-4. **Configure your providers** - Set up your preferred LLM providers and models using the API endpoints.
+:::info UI vs API
+- **Viewing and selecting defaults**: Available in both UI and API.
+- **Adding new custom providers**: Requires the API.
+:::
This feature is designed for organizations with specific compliance, privacy, or integration requirements that cannot be met by Port's managed infrastructure.
diff --git a/docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md b/docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md
index 1f7725608e..14c62ccdfb 100644
--- a/docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md
+++ b/docs/ai-interfaces/port-ai/llm-providers-management/setup-and-configuration.md
@@ -115,12 +115,24 @@ For more details on managing secrets, see the [Port Secrets documentation](/sso-
Use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API to configure your providers. The interactive API reference provides detailed examples and allows you to test the configuration for each provider type (OpenAI, Anthropic, Azure OpenAI, AWS Bedrock).
+:::info After configuration
+Once providers are configured, you can view and select default providers and models through the UI (**Builder** → **Organization Settings** → **AI** tab) or continue using the API for all operations.
+:::
+
## Step 3: Validate Configuration
Test your provider configuration with connection validation using the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API with the `validate_connection=true` parameter. The interactive API reference shows how to test your configuration before saving it.
## Getting Your Current Configuration
+You can view your organization's current LLM provider defaults through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. View all configured providers and models.
+3. See which provider and model are currently set as defaults.
+
+**Using the API:**
Retrieve your organization's current LLM provider defaults using the [Get default LLM provider and model](/api-reference/get-default-llm-provider-and-model) API. The interactive API reference shows the response format and allows you to test the endpoint.
### System Defaults
@@ -131,6 +143,19 @@ When no organization-specific defaults are configured, Port uses these system de
## Changing Default Providers
+You can change your organization's default LLM provider and model through the UI or API:
+
+**Using the UI:**
+1. Go to **Builder** → **Organization Settings** → **AI** tab.
+2. Select your preferred **Default LLM provider** from the dropdown.
+3. Select your preferred **Default model** from the dropdown.
+4. Click **Save** to apply your changes.
+
+:::info Adding new providers
+To add a new custom LLM provider, you still need to use the [Create or connect an LLM provider](/api-reference/create-or-connect-an-llm-provider) API. Once a provider is configured, it will appear in the UI dropdown for selection.
+:::
+
+**Using the API:**
Update your organization's default LLM provider and model using the [Change default LLM provider and model](/api-reference/change-default-llm-provider-and-model) API. The interactive API reference provides the request format and response examples.
## Validation Flow
diff --git a/docs/ai-interfaces/port-ai/overview.md b/docs/ai-interfaces/port-ai/overview.md
index 7b4bae7808..282a99887c 100644
--- a/docs/ai-interfaces/port-ai/overview.md
+++ b/docs/ai-interfaces/port-ai/overview.md
@@ -162,7 +162,7 @@ Port acts as a bridge to leading LLM providers and doesn't host LLM models inter
- These limits reset hourly.
### Monthly Quota
-- **Default quota**: 20 AI invocations per month.
+- **Default quota**: 50 AI invocations per month.
- Each interaction with Port AI counts as one request against your quota.
- Quota resets monthly.
@@ -335,7 +335,7 @@ Yes, Port AI has usage limits to ensure fair usage across all customers:
- These limits reset hourly
**Monthly Quota:**
-- Default quota: 20 AI invocations per month
+- Default quota: 50 AI invocations per month
- Each interaction with Port AI counts as one request
- Quota resets monthly
diff --git a/docs/ai-interfaces/port-ai/security-and-data-controls.md b/docs/ai-interfaces/port-ai/security-and-data-controls.md
index 95959e1496..110b052fdf 100644
--- a/docs/ai-interfaces/port-ai/security-and-data-controls.md
+++ b/docs/ai-interfaces/port-ai/security-and-data-controls.md
@@ -161,11 +161,11 @@ AI features typically start streaming responses within 5 seconds and complete wi
- Current system load
- Which AI interface is being used
-During the closed beta, response times may occasionally be longer as we optimize performance. This is expected behavior and will improve over time.
+Response times may occasionally be longer as we optimize performance. This is expected behavior and will improve over time.
### What should I do if AI responses seem slow?
-Response times up to 30 seconds are normal and expected for AI processing during the closed beta. If you experience consistently longer response times:
+Response times up to 30 seconds are normal and expected for AI processing. If you experience consistently longer response times:
- Check the AI invocation details for any errors
- Verify your usage hasn't hit rate limits
- Contact support if problems persist
diff --git a/docs/ai-interfaces/port-mcp-server/overview-and-installation.md b/docs/ai-interfaces/port-mcp-server/overview-and-installation.md
index 9a8e642a15..68f6e58826 100644
--- a/docs/ai-interfaces/port-mcp-server/overview-and-installation.md
+++ b/docs/ai-interfaces/port-mcp-server/overview-and-installation.md
@@ -28,9 +28,9 @@ import MCPInstallation from '/docs/generalTemplates/_mcp-installation.md'
The Port Model Context Protocol (MCP) Server acts as a bridge, enabling Large Language Models (LLMs)—like those powering Claude, Cursor, or GitHub Copilot—to interact directly with your Port.io developer portal. This allows you to leverage natural language to query your software catalog, analyze service health, manage resources, and even streamline development workflows, all from your preferred interfaces.
:::info AI Agents vs. MCP Server
-The Port MCP Server is currently in open beta and provides significant standalone value, independent of our [AI Agents feature](/ai-interfaces/ai-agents/overview). Port AI Agents are currently in closed beta with limited access, while the MCP Server gives you immediate access to streamline building in Port, query your catalog, analyze service health, and streamline development workflows using natural language.
+The Port MCP Server provides significant standalone value, independent of our [AI Agents feature](/ai-interfaces/ai-agents/overview). Both the MCP Server and AI Agents are in open beta and available to all users. The MCP Server gives you immediate access to build in Port, query your catalog, analyze service health, and streamline development workflows using natural language.
-While the MCP Server can interact with Port AI Agents when available, the core MCP functionality can be used freely without requiring access to the closed beta AI Agents feature.
+While the MCP Server can interact with Port AI Agents, the core MCP functionality can be used freely on its own.
:::
## Why integrate LLMs with your developer portal?
diff --git a/docs/ai-interfaces/slack-app.md b/docs/ai-interfaces/slack-app.md
index e656c5393a..5c3b63671f 100644
--- a/docs/ai-interfaces/slack-app.md
+++ b/docs/ai-interfaces/slack-app.md
@@ -3,13 +3,13 @@ sidebar_position: 7
title: Slack App
---
-import ClosedBetaFeatureNotice from '/docs/generalTemplates/_closed_beta_feature_notice.md'
+import BetaFeatureNotice from '/docs/generalTemplates/_beta_feature_notice.md'
# Slack Application
-<ClosedBetaFeatureNotice id="slack-app" />
+<BetaFeatureNotice />
-Port's Slack app brings your developer portal experience into your team's daily communication flow — allowing you to interact with Port directly from Slack and receive real-time notifications from Port right where your team collaborates.
+Port's Slack app brings your developer portal experience into your team's daily communication flow — allowing you to interact with Port directly from Slack and receive real-time notifications from Port right where your team collaborates. The Slack app uses the [Port AI API](/ai-interfaces/port-ai/overview) to provide intelligent answers about your software catalog, similar to the [Port AI Assistant](/ai-interfaces/port-ai-assistant).
@@ -27,7 +27,7 @@ This can be used to communicate important notifications to people in your organi
### Interact with AI capabilities
-Another powerful use-case of the Slack app is to interact with Port's AI capabilities directly from Slack.
+Another powerful use-case of the Slack app is to interact with Port's AI capabilities directly from Slack. This capability is powered by the **[Port AI API](/ai-interfaces/port-ai/overview)**, the same engine behind the **[Port AI Assistant](/ai-interfaces/port-ai-assistant)**, so you get intelligent answers about your software catalog.
This can be used to get quick answers to questions about your resources, such as:
@@ -39,16 +39,11 @@ This can be used to get quick answers to questions about your resources, such as
- A Port account with **admin** permissions.
-- To install the Slack app, you will first need to apply for access to Port's AI program by filling out [this form](https://forms.gle/krhMY7c9JM8MyJJf7).
-
-- To interact with AI agents, you need to have at least one agent deployed in your portal.
- See the [Build an AI agent](/ai-interfaces/ai-agents/build-an-ai-agent) page to learn more.
-
## Installation
To install the Slack app, follow these steps:
-- Navigate to the [Slack app installation page](https://app.port.io/settings/slack-app). This page will be accessible only after being approved for the AI program (see prerequisites above).
+- Navigate to the [Slack app installation page](https://app.port.io/settings/slack-app).
- Click on the "Add to Slack" button.
@@ -132,7 +127,6 @@ Once the user is authenticated, they can:
The Slack app responds to the `/port` slash command with these options:
- `/port` (or `/port help`) - Shows general help and available actions.
-- `/port agents` - Lists all Port AI agents in your organization.
To ask the app a question, simply mention `@Port` and ask away, for example:
diff --git a/docs/generalTemplates/_mcp-installation.md b/docs/generalTemplates/_mcp-installation.md
index 498247b44f..c700d50e99 100644
--- a/docs/generalTemplates/_mcp-installation.md
+++ b/docs/generalTemplates/_mcp-installation.md
@@ -62,12 +62,6 @@ To connect Cursor to Port's remote MCP, follow these steps:
-:::warning Authentication window behavior
-In some cases, after clicking "Accept" in the authentication popup, the window won't get closed but the connection is established successfully. You can safely close the window.
-
-If you still don't see the tool, try it a couple of times. We are aware of this behavior and working to improve it.
-:::
-
To connect VSCode to Port's remote MCP server, follow these detailed steps. For complete instructions, refer to the [official VS Code MCP documentation](https://code.visualstudio.com/docs/copilot/chat/mcp-servers).
@@ -89,14 +83,6 @@ If you encounter errors:
- **Permission issues**: You may need to run with appropriate permissions
:::
-:::warning VSCode action tool issue
-In some versions of VS Code, the MCP server might not start or return an error in the chat because of a configuration issue with the action related tools. To deal with it, [deselect](/ai-interfaces/port-mcp-server/available-tools#select-which-tools-to-use) the tools `create_action`, `update_action`, and `delete_action`.
-This is relevant for cases where you see an error like this one:
-```
-Failed to validate tool mcp_port_create_action: Error: tool parameters array type must have items. Please open a Github issue for the MCP server or extension which provides this tool
-```
-:::
-
**Step 1: Configure MCP Server Settings**
1. Open VS Code settings
diff --git a/docs/guides/all/send-slack-message-to-user.md b/docs/guides/all/send-slack-message-to-user.md
index f1b494ef8b..8d0f2edda2 100644
--- a/docs/guides/all/send-slack-message-to-user.md
+++ b/docs/guides/all/send-slack-message-to-user.md
@@ -29,10 +29,6 @@ In this guide, we will use the email addresses from Port integrations (such as G
- Port's [Slack app](/ai-interfaces/slack-app) installed in your workspace.
- Access to the Slack app bot token (automatically created as a system secret).
-:::tip Slack app setup
-If you haven't installed Port's Slack app yet, you'll need to apply for access to Port's AI program by filling out [this form](https://forms.gle/krhMY7c9JM8MyJJf7) first.
-:::
-
## Set up the automations
This guide creates two automations that work together:
diff --git a/src/data/mcpTools.js b/src/data/mcpTools.js
index f650b4e477..ea775f6adb 100644
--- a/src/data/mcpTools.js
+++ b/src/data/mcpTools.js
@@ -53,6 +53,12 @@ export const mcpTools = [
apiReference: '/api-reference/get-an-entity',
roles: ['developer', 'builder']
},
+ {
+ name: 'count_entities',
+    description: 'Count entities matching specified filters without retrieving entity data. Returns only the count, enabling efficient queries like "how many services are in production?"',
+ apiReference: '/api-reference/get-all-entities-of-a-blueprint',
+ roles: ['developer', 'builder']
+ },
{
name: 'update_entity',
description: 'Update an existing entity. Only the fields provided will be updated.',
@@ -157,13 +163,8 @@ export const mcpTools = [
// Documentation and user tools
{
- name: 'ask_port_docs',
- description: 'Ask a question about Port documentation',
- roles: ['developer', 'builder']
- },
- {
- name: 'search_port_docs_sources',
- description: 'Search for relevant Port documentation sources based on a query',
+ name: 'search_port_sources',
+    description: 'Search the official Port documentation and return the most relevant sections for a user query. Each returned section includes its URL and its content in Markdown. Use this tool for all queries that require Port knowledge.',
roles: ['developer', 'builder']
},
{
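As a footnote to the `mcpTools.js` changes above: each entry carries a `roles` array, and the `ToolsList` component renders tools from this file. A minimal sketch of how such role-based filtering could work (the `toolsForRole` helper and the second sample entry are illustrative, not taken from the real component):

```javascript
// Illustrative data mirroring the shape of entries in `mcpTools.js`.
const sampleTools = [
  { name: 'count_entities', roles: ['developer', 'builder'] },
  { name: 'builder_only_tool', roles: ['builder'] }, // hypothetical entry
];

// Hypothetical helper: keep only the tools available to a given role.
function toolsForRole(tools, role) {
  return tools.filter((tool) => tool.roles.includes(role));
}
```

Here `toolsForRole(sampleTools, 'developer')` yields only `count_entities`, while `'builder'` yields both entries.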