---
displayed_sidebar: null
description: Use Port AI agents with AI coding assistants to streamline Infrastructure as Code (IaC) by automatically provisioning cloud resources like S3 buckets
---

# Streamline IaC with AI

Creating new infrastructure should be fast, consistent, and policy-driven. Instead of manually writing Terraform for every new cloud resource, such as an S3 bucket, you can let an AI coding agent safely generate the Terraform files, open a pull request against your IaC repository, and surface the change for review, all from within Port. The entire process remains governed, compliant, and aligned with your organization’s engineering standards, ensuring that every new resource is created securely and according to best practices.

This guide demonstrates how to create a self-service action that lets developers request a new cloud resource through Port, which then automatically triggers a coding assistant to generate the Terraform code and open a pull request.

<img src='/img/guides/ai-iac-workflow.jpg' border="1px" width="100%" />

## Prerequisites

This guide assumes you have:

- Completed the [onboarding process](/getting-started/overview)
- A Port account with [AI agents feature enabled](/ai-interfaces/ai-agents/overview#access-to-the-feature)
- [Port's GitHub app](/build-your-software-catalog/sync-data-to-catalog/git/github/) installed in your account
- Completed the setup in the [Trigger GitHub Copilot from Port guide](/guides/all/trigger-github-copilot-from-port)
- An existing Terraform repository for your infrastructure code
- AWS resources synced into Port (e.g., via [AWS integration](/build-your-software-catalog/sync-data-to-catalog/cloud-providers/aws/))

:::tip Alternative coding agents
While this guide uses GitHub Copilot as the coding agent, you can easily substitute it with other AI coding assistants like [Claude Code](/guides/all/trigger-claude-code-from-port) or [Google Gemini](/guides/all/trigger-gemini-assistant-from-port). Simply update the action's webhook URL and payload structure in the automation to match your preferred coding agent's API.
:::

## Set up data model

First, ensure your data model includes the necessary blueprints for S3 buckets and AWS accounts.

When installing the AWS integration in Port, the `AWS Account` blueprint is created by default.
However, the `S3` blueprint is not created automatically, so we will need to create it manually.

### Create S3 bucket blueprint

1. Go to the [builder](https://app.getport.io/settings/data-model) page of your portal
2. Click on `+ Blueprint`
3. Click on the `{...} Edit JSON` button
4. Copy and paste the following JSON configuration:

<details>
<summary><b>S3 bucket blueprint (Click to expand)</b></summary>

```json showLineNumbers
{
  "identifier": "awsS3Bucket",
  "description": "This blueprint represents an AWS S3 bucket in our software catalog",
  "title": "S3",
  "icon": "Bucket",
  "schema": {
    "properties": {
      "link": {
        "format": "url",
        "type": "string",
        "title": "Link"
      },
      "regionalDomainName": {
        "type": "string",
        "title": "Regional Domain Name"
      },
      "versioningStatus": {
        "type": "string",
        "title": "Versioning Status",
        "enum": [
          "Enabled",
          "Suspended"
        ]
      },
      "encryption": {
        "type": "array",
        "title": "Encryption"
      },
      "lifecycleRules": {
        "type": "array",
        "title": "Lifecycle Rules"
      },
      "publicAccessConfig": {
        "type": "object",
        "title": "Public Access"
      },
      "tags": {
        "type": "array",
        "title": "Tags"
      },
      "arn": {
        "type": "string",
        "title": "ARN"
      },
      "region": {
        "type": "string",
        "title": "Region"
      },
      "blockPublicAccess": {
        "type": "boolean",
        "title": "Block Public Access"
      }
    },
    "required": []
  },
  "mirrorProperties": {},
  "calculationProperties": {},
  "aggregationProperties": {},
  "relations": {
    "account": {
      "title": "account",
      "target": "awsAccount",
      "required": false,
      "many": false
    }
  }
}
```
</details>

5. Click `Create` to save the blueprint


### Update integration mapping

1. Go to the [data sources](https://app.getport.io/settings/data-sources) page of your portal
2. Find your AWS integration and click on it
3. Go to the `Mapping` tab
4. Add the following YAML configuration:

<details>
<summary><b>AWS integration mapping (Click to expand)</b></summary>

```yaml showLineNumbers
deleteDependentEntities: true
createMissingRelatedEntities: true
enableMergeEntity: true
resources:
  - kind: AWS::Organizations::Account
    selector:
      query: 'true'
    port:
      entity:
        mappings:
          identifier: .Id
          title: .Name
          blueprint: '"awsAccount"'
          properties:
            arn: .Arn
            email: .Email
            status: .Status
            joined_method: .JoinedMethod
            joined_timestamp: .JoinedTimestamp | sub(" "; "T")
  - kind: AWS::S3::Bucket
    selector:
      query: 'true'
      useGetResourceAPI: true
    port:
      entity:
        mappings:
          identifier: .Identifier
          title: .Identifier
          blueprint: '"awsS3Bucket"'
          properties:
            regionalDomainName: .Properties.RegionalDomainName
            encryption: .Properties.BucketEncryption.ServerSideEncryptionConfiguration
            lifecycleRules: .Properties.LifecycleConfiguration.Rules
            publicAccessConfig: .Properties.PublicAccessBlockConfiguration
            blockPublicAccess: >-
              .Properties.PublicAccessBlockConfiguration | (.BlockPublicAcls and
              .IgnorePublicAcls and .BlockPublicPolicy and
              .RestrictPublicBuckets)
            tags: .Properties.Tags
            arn: .Properties.Arn
            region: .Properties.RegionalDomainName | capture(".*\\.(?<region>[^\\.]+)\\.amazonaws\\.com") | .region
            link: .Properties | select(.Arn != null) | "https://console.aws.amazon.com/go/view?arn=" + .Arn
          relations:
            account: .__AccountId
```
</details>

5. Click `Save & Resync` to apply the mapping
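
After the resync completes, S3 buckets from your AWS account should appear as entities of the `S3` blueprint. As an illustration only, an ingested bucket might look roughly like the entity below; the bucket name, account ID, and property values are hypothetical and will differ in your catalog:

```json showLineNumbers
{
  "identifier": "my-example-bucket",
  "title": "my-example-bucket",
  "blueprint": "awsS3Bucket",
  "properties": {
    "regionalDomainName": "my-example-bucket.s3.us-east-1.amazonaws.com",
    "region": "us-east-1",
    "arn": "arn:aws:s3:::my-example-bucket",
    "link": "https://console.aws.amazon.com/go/view?arn=arn:aws:s3:::my-example-bucket",
    "blockPublicAccess": true,
    "tags": [
      { "Key": "managed_by", "Value": "terraform" }
    ]
  },
  "relations": {
    "account": "123456789012"
  }
}
```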


## Create AI agent

Next, you'll create an AI agent that analyzes cloud resource requests and dispatches the coding agent to process the request.

1. Go to the [AI Agents](https://app.getport.io/_ai_agents) page of your portal
2. Click on `+ AI Agent`
3. Toggle `Json mode` on
4. Copy and paste the following JSON schema:
| 191 | + |
| 192 | +<details> |
| 193 | +<summary><b>Terraform IaC Creator AI agent (Click to expand)</b></summary> |
| 194 | + |
| 195 | +```json showLineNumbers |
| 196 | +{ |
| 197 | + "identifier": "terraform_ai_agent", |
| 198 | + "title": "Terraform IaC Creator", |
| 199 | + "icon": "AI", |
| 200 | + "properties": { |
| 201 | + "description": "An AI-powered agent that generates Terraform files to provision cloud resources such as S3 buckets directly from Port.", |
| 202 | + "status": "active", |
| 203 | + "prompt": "You are a **Terraform Infrastructure AI Agent**. Your role is to generate **technical requirements** for new cloud resources using Terraform — not to write Terraform code directly.\n\n### 🎯 Objective\nWhen a user requests a new cloud resource (e.g., AWS S3 bucket, EC2 instance), analyze the request and create a detailed **GitHub issue** describing what Terraform configuration should be created.\n\n### 🧩 Inputs\nUse available input data such as:\n- Resource type (e.g., S3 bucket, EC2 instance)\n- Resource name or identifier\n- Configuration options (e.g., encryption, tags, versioning, lifecycle rules, instance type, VPC ID)\n\n### 🧠 Task\nGenerate a **GitHub issue** with:\n\n#### 🏷️ Title:\n`Provision New <Resource Type>: <resource_name>`\n\n#### 📝 Description (in Markdown):\n1. **Resource Details** – Describe the resource, configuration fields, and intended purpose.\n2. **Terraform Specification Requirements** – Outline what the Terraform configuration must include:\n - Correct resource definition (e.g., `aws_s3_bucket`, `aws_instance`, etc.)\n - Secure defaults:\n * Encryption enabled where supported\n * Public access blocked for resources that support ACLs or network exposure\n * Versioning/lifecycle rules where applicable\n * IAM policies following least privilege\n * Tags including `created_by = \"port-ai\"` and `managed_by = \"terraform\"`\n - Outputs that expose essential identifiers (e.g., ARN, domain name, instance ID)\n3. **Suggested File Path** – Suggest a logical file location (e.g., `terraform/aws/<resource>.tf` or `modules/<type>` if modules exist).\n4. **Acceptance Criteria** – Define success conditions:\n - Terraform configuration passes `terraform validate`\n - Required tags are applied\n - Sensitive data is not exposed in outputs\n\n#### 🏷️ Labels\nAlways include: `iac`, `terraform`, `aws`, and `auto_assign`.\n\n### ⚙️ Action\nAlways call the `create_github_issue` self-service action to create the GitHub issue with the generated **title**, **description**, and **labels**.\n\n### 🧭 Guidelines\n- Do **not** generate Terraform code directly.\n- Focus on clarity, correctness, and compliance with engineering best practices.\n- Use Markdown formatting for readability.\n- Keep each issue focused on a single resource.\n- Always include the `auto_assign` label for issue tracking.", |
| 204 | + "execution_mode": "Automatic", |
| 205 | + "tools": [ |
| 206 | + "^(list|get|search|track|describe)_.*", |
| 207 | + "run_create_github_issue" |
| 208 | + ] |
| 209 | + }, |
| 210 | + "relations": {} |
| 211 | +} |
| 212 | +``` |
| 213 | +</details> |
| 214 | + |
| 215 | +5. Click `Create` to save the agent. |
| 216 | + |
| 217 | +:::tip MCP Enhanced Capabilities |
| 218 | +The AI agent uses MCP (Model Context Protocol) enhanced capabilities to automatically discover important and relevant blueprint entities via its tools. The `^(list|get|search|track|describe)_.*` pattern allows the agent to access and analyze related entities in your software catalog, providing richer context for infrastructure analysis. Additionally, we explicitly add `run_create_github_issue` to the tools, which instructs the AI agent to call this specific action to create GitHub issues for Terraform resource provisioning. |
| 219 | +::: |
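
For orientation, when the agent decides to act it calls the `create_github_issue` self-service action you configured in the prerequisite guide. The exact input names depend on how that action is defined in your portal, so the snippet below is only a hypothetical sketch of the kind of issue the agent would request, following the title format and labels defined in the prompt above:

```json showLineNumbers
{
  "title": "Provision New S3 Bucket: my-kafka-log-east-1-bucket",
  "description": "## Resource Details\n- Development bucket for Kafka logs ...\n\n## Terraform Specification Requirements\n- `aws_s3_bucket` with KMS encryption, versioning, and public access blocked ...\n\n## Suggested File Path\n`terraform/aws/my_kafka_log_east_1_bucket.tf`\n\n## Acceptance Criteria\n- Passes `terraform validate` ...",
  "labels": ["iac", "terraform", "aws", "auto_assign"]
}
```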

## Create self-service action

Now, you'll create a self-service action that allows developers to request new cloud resources. This action will invoke the AI agent to analyze the request.

1. Go to the [self-service](https://app.getport.io/self-serve) page of your portal
2. Click on `+ New Action`
3. Click on the `{...} Edit JSON` button
4. Copy and paste the following JSON configuration:

<details>
<summary><b>Provision cloud resource action (Click to expand)</b></summary>

```json showLineNumbers
{
  "identifier": "provision_cloud_resource",
  "title": "Provision Cloud Resource",
  "icon": "AWS",
  "description": "Request a new cloud resource such as an S3 bucket to be provisioned via Terraform",
  "trigger": {
    "type": "self-service",
    "operation": "CREATE",
    "userInputs": {
      "properties": {
        "prompt": {
          "type": "string",
          "title": "Description",
          "description": "Describe the resource you want to create. Include details like name, region, encryption requirements, versioning, lifecycle rules, tags, and any other specific requirements",
          "format": "markdown"
        },
        "terraform_repository": {
          "title": "Terraform Repository",
          "icon": "DefaultProperty",
          "type": "string",
          "blueprint": "service",
          "format": "entity"
        }
      },
      "required": [
        "prompt",
        "terraform_repository"
      ],
      "order": [
        "prompt",
        "terraform_repository"
      ]
    }
  },
  "invocationMethod": {
    "type": "WEBHOOK",
    "url": "https://api.getport.io/v1/agent/terraform_ai_agent/invoke",
    "agent": false,
    "synchronized": true,
    "method": "POST",
    "headers": {
      "RUN_ID": "{{ .run.id }}",
      "Content-Type": "application/json"
    },
    "body": {
      "prompt": "You are an expert Terraform author. A user has requested the creation of a new cloud resource using Infrastructure as Code.\n\nRepository: {{ .inputs.terraform_repository.identifier }}\nUser Request: {{ .inputs.prompt }}\n",
      "labels": {
        "source": "provision_cloud_resource_action"
      }
    }
  },
  "requiredApproval": false
}
```
</details>

5. Click `Save` to create the action.
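
To see how the template in `invocationMethod.body` resolves at run time, here is an illustrative example of the payload Port sends to the agent's invoke URL when the action is executed. The repository identifier and request text below are placeholders substituted from the user's inputs:

```json showLineNumbers
{
  "prompt": "You are an expert Terraform author. A user has requested the creation of a new cloud resource using Infrastructure as Code.\n\nRepository: my-terraform-repo\nUser Request: Create an S3 bucket named \"my-kafka-log-east-1-bucket\" with versioning and KMS encryption.\n",
  "labels": {
    "source": "provision_cloud_resource_action"
  }
}
```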


## Test your workflow

Now it's time to test your complete cloud resource provisioning workflow:

1. Click on the `Provision Cloud Resource` action in the [self-service](https://app.getport.io/self-serve) page of your portal
2. Fill out the `prompt` field with your resource requirements, for example:
   ```
   Create an S3 bucket named "my-kafka-log-east-1-bucket" for the development environment.

   Requirements:
   - Enable server-side encryption with AWS KMS
   - Enable versioning
   - Block all public access
   - Add tags: Environment=dev, Project=kafka-logs, Owner=platform-team
   ```
3. Select the repository containing your Terraform configuration
4. Click **Execute**

The workflow will then:

1. **AI Agent analyzes** your request and generates a detailed GitHub issue with technical requirements
2. **Issue is automatically assigned to GitHub Copilot** (handled by the prerequisites setup)
3. **Copilot generates Terraform code** based on the detailed technical requirements
4. **Pull request is opened** with the generated Terraform files

<img src='/img/guides/streamline-iac-test-pr.png' border="1px" width="70%" />

<img src='/img/guides/streamline-iac-terraform-files-pr.png' border="1px" width="70%" />


## Related guides

- [Trigger Claude Code from Port](/guides/all/trigger-claude-code-from-port)
- [Trigger Google Gemini from Port](/guides/all/trigger-gemini-assistant-from-port)
- [Track AI-driven pull requests](/guides/all/track-ai-driven-pull-requests)