
Commit 3c664d5

committed
Added new pattern for SQS-Lambda-S3 using Terraform and Python
1 parent 5c54b45 commit 3c664d5

File tree

5 files changed: +450 -0 lines changed

Lines changed: 170 additions & 0 deletions
@@ -0,0 +1,170 @@
# Amazon SQS to Amazon S3 integration using AWS Lambda

This pattern creates an SQS queue, a Lambda function, and an S3 bucket, along with an event source mapping for the Lambda function and the permissions needed for these resources to work together.

Learn more about this pattern at Serverless Land Patterns: [SQS to Lambda to S3](https://serverlessland.com/patterns/sqs-lambda-s3)

**Important:** this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.

## Requirements

* **AWS Resources**<br>
Creation of AWS resources requires the following:
    * [AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) - An AWS account is required for creating the various resources. If you do not already have one, create an account and log in. The IAM user that you use must have sufficient permissions to make the necessary AWS service calls and manage AWS resources.
    * [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) - This is required for cloning this repo.
    * [Terraform](https://learn.hashicorp.com/tutorials/terraform/install-cli?in=terraform/aws-get-started) - Terraform is an IaC (Infrastructure as Code) tool used for creating and managing AWS resources through a declarative configuration language.

* **Test Setup**<br>
In order to test this integration, the following are required:
    * [Python](https://wiki.python.org/moin/BeginnersGuide/Download) is required to run the test script.
    * [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html) is a prerequisite for using the boto3 module in the test script (see the install command below).

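The test script additionally depends on the `boto3` and `maskpass` Python modules; if they are not already installed, they can be added with pip:

```
pip install boto3 maskpass
```
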
## Deployment Instructions

1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:

    ```
    git clone https://github.com/aws-samples/serverless-patterns
    ```

1. Change directory to the pattern directory:

    ```
    cd sqs-lambda-s3-terraform-python
    ```

1. Pick a unique name for the target S3 bucket, e.g. `my-bucket-20250329`. Replace the bucket name in the following files:

    * Lambda handler - `handler.py`

        ```
        resp = s3_client.put_object(
            Body=str(request_body).encode(encoding="utf-8"),
            Bucket='my-bucket-20250329',
            Key=file_name
        )
        ```
    * Terraform configuration - `main.tf`

        ```
        # S3 bucket
        resource "aws_s3_bucket" "event-storage" {
          bucket        = "my-bucket-20250329"
          force_destroy = true
          tags = {
            Name = "event-storage"
          }
        }
        ```

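    Note that S3 bucket names must be globally unique across all AWS accounts, so pick a name that is unlikely to collide.
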
1. Update the AWS region in the following files with the region in which the resources will be created:

    * Lambda handler - `handler.py`

        ```
        config = Config(region_name='ap-south-1')
        ```
    * Terraform configuration - `main.tf`

        ```
        provider "aws" {
          region = "ap-south-1"
        }
        ```

1. Create a zip archive containing the Lambda handler:

    ```
    zip sqs-lambda-s3.zip handler.py
    ```

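    The `archive_file` data source in `main.tf` also regenerates `sqs-lambda-s3.zip` from `handler.py` during `terraform apply`, so the archive stays in sync with the handler code.
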
1. Deploy the AWS resources through Terraform:

    ```
    terraform init -upgrade
    terraform fmt
    terraform validate
    terraform apply -auto-approve
    ```

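    Here `terraform init` downloads the required providers, `terraform fmt` and `terraform validate` check formatting and configuration validity, and `terraform apply -auto-approve` creates the resources without an interactive approval prompt.
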
## How it works

The AWS resources created as a part of this integration are as follows:

* Amazon SQS queue
* AWS Lambda function
* Amazon S3 bucket
* IAM policies and roles

The SQS queue is configured as a trigger for the Lambda function. Whenever a message is posted to the SQS queue, the Lambda function is invoked synchronously. This is useful in scenarios where the message requires some pre-processing before storage.

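For reference, the Lambda function receives each SQS message wrapped in an event envelope similar to the following (abridged; the IDs and ARN are illustrative):

```
{
    "Records": [
        {
            "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d",
            "body": "{\"status\": 200, \"uniqueID\": \"a1b2c3d4-e5f6-...\"}",
            "eventSource": "aws:sqs",
            "eventSourceARN": "arn:aws:sqs:ap-south-1:123456789012:event-collector-queue",
            "awsRegion": "ap-south-1"
        }
    ]
}
```
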
## Testing

1. Create an IAM user which will be used for writing messages to the SQS queue.

1. Add permissions for the IAM user through the following inline policy:

    ```
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "sqs:DeleteMessage",
                    "sqs:GetQueueUrl",
                    "sqs:TagQueue",
                    "sqs:UntagQueue",
                    "sqs:ReceiveMessage",
                    "sqs:SendMessage"
                ],
                "Resource": "arn:aws:sqs:[AWS_REGION]:[AWS_ACCOUNT]:event-collector-queue"
            }
        ]
    }
    ```

    Replace `[AWS_REGION]` and `[AWS_ACCOUNT]` in the above policy before attaching it to the IAM user.

1. Generate an access key pair (access key and secret access key) for the IAM user. The key pair will be used while running the test script.

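    For example, assuming the IAM user is named `sqs-test-user` (a hypothetical name), a key pair can be generated from the AWS CLI with:

    ```
    aws iam create-access-key --user-name sqs-test-user
    ```
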
1. Update the AWS region in the test script `send_sqs_event.py` with the region in which the SQS queue was created:

    ```
    config = Config(region_name='ap-south-1')
    ```

1. Run the test script:

    ```
    python send_sqs_event.py
    ```

1. Enter the Access Key and Secret Access Key when prompted:

    ```
    Enter Access Key: ********************
    Enter Secret Access Key: ****************************************
    ```

1. Check the S3 bucket to see if a new JSON object has been created:

    ```
    aws s3 ls --region ap-south-1 my-bucket-20250329
    ```

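    To inspect the contents of an uploaded object, it can be copied to stdout, where `<uniqueID>` stands for the UUID shown in the object key by `aws s3 ls`:

    ```
    aws s3 cp --region ap-south-1 s3://my-bucket-20250329/request_<uniqueID>.json -
    ```
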
## Cleanup

1. Delete the AWS resources through Terraform:

    ```
    terraform apply -destroy -auto-approve
    ```

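    Since the S3 bucket is created with `force_destroy = true`, it is deleted even if it still contains objects.
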
## Resources

* [Amazon Simple Queue Service (Amazon SQS)](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html)
* [AWS Lambda](https://docs.aws.amazon.com/lambda/latest/dg/welcome.html)
* [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/userguide/Welcome.html)

----

Copyright 2025 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0
Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
import json

import boto3
from botocore.config import Config


def lambda_handler(event, context):
    # Log the incoming SQS record body and the invocation context
    print(event['Records'][0]['body'])
    print(context)

    # Name the S3 object after the uniqueID carried in the message body
    file_name = 'request_' + json.loads(event['Records'][0]['body'])["uniqueID"] + '.json'
    request_body = event['Records'][0]['body']

    # Write the raw message body to the S3 bucket as a JSON object
    config = Config(region_name='ap-south-1')
    s3_client = boto3.client('s3', config=config)
    resp = s3_client.put_object(
        Body=str(request_body).encode(encoding="utf-8"),
        Bucket='my-bucket-20250329',
        Key=file_name
    )
    print(resp)
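
A minimal sketch for exercising the handler locally, assuming AWS credentials with `s3:PutObject` access to the bucket are available in the environment (the event mimics the SQS envelope; the unused `context` argument is passed as `None`):

```
import json
from handler import lambda_handler

# Fake SQS event carrying the JSON body the pattern expects
fake_event = {
    "Records": [
        {"body": json.dumps({"status": 200, "uniqueID": "local-test-1"})}
    ]
}
lambda_handler(fake_event, None)
```
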
Lines changed: 186 additions & 0 deletions
@@ -0,0 +1,186 @@
1+
terraform {
2+
required_providers {
3+
aws = {
4+
source = "hashicorp/aws"
5+
version = "~>5.41"
6+
}
7+
}
8+
9+
required_version = ">=1.2.0"
10+
}
11+
12+
provider "aws" {
13+
region = "ap-south-1"
14+
}
15+
16+
data "archive_file" "lambda_handler_zip_file" {
17+
type = "zip"
18+
source_file = "${path.module}/handler.py"
19+
output_path = "${path.module}/sqs-lambda-s3.zip"
20+
}
21+
22+
# Lambda function
resource "aws_lambda_function" "event-processor" {
  function_name    = "event-processor"
  filename         = data.archive_file.lambda_handler_zip_file.output_path
  source_code_hash = data.archive_file.lambda_handler_zip_file.output_base64sha256
  handler          = "handler.lambda_handler"
  runtime          = "python3.12"
  role             = aws_iam_role.event-processor-exec-role.arn
}

# Lambda execution role
resource "aws_iam_role" "event-processor-exec-role" {
  name = "event-processor-exec-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
        Action = [
          "sts:AssumeRole"
        ]
      }
    ]
  })
}

# Lambda exec role policy
resource "aws_iam_policy" "event-processor-policy" {
  name = "event-processor-policy"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = [
          "sqs:ReceiveMessage",
          "sqs:GetQueueAttributes",
          "sqs:DeleteMessage"
        ]
        Resource = aws_sqs_queue.event-collector.arn
      },
      {
        Effect = "Allow"
        Action = [
          "s3:PutObject"
        ]
        Resource = [
          aws_s3_bucket.event-storage.arn,
          "${aws_s3_bucket.event-storage.arn}/*",
        ]
      }
    ]
  })
}

# Attach policy to Lambda execution role for SQS and S3 permissions
resource "aws_iam_role_policy_attachment" "lambda-exec-role-policy" {
  policy_arn = aws_iam_policy.event-processor-policy.arn
  role       = aws_iam_role.event-processor-exec-role.name
}

# Attach policy to Lambda exec role for CloudWatch permissions
resource "aws_iam_role_policy_attachment" "lambda-policy" {
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
  role       = aws_iam_role.event-processor-exec-role.name
}

# Event source mapping to create a trigger for Lambda to read from the SQS queue
resource "aws_lambda_event_source_mapping" "event-processor-event-src-map" {
  function_name    = aws_lambda_function.event-processor.arn
  event_source_arn = aws_sqs_queue.event-collector.arn
  enabled          = true
  depends_on = [
    aws_lambda_function.event-processor,
    aws_sqs_queue.event-collector,
    aws_sqs_queue_policy.event-collector-policy,
    aws_iam_policy.event-processor-policy
  ]
}

# SQS queue
resource "aws_sqs_queue" "event-collector" {
  name             = "event-collector-queue"
  max_message_size = 2048
}

# SQS queue policy
resource "aws_sqs_queue_policy" "event-collector-policy" {
  queue_url = aws_sqs_queue.event-collector.url
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Principal = {
          Service = "lambda.amazonaws.com"
        }
        Action = [
          "sqs:ReceiveMessage",
          "sqs:GetQueueAttributes",
          "sqs:DeleteMessage"
        ]
        Resource = aws_sqs_queue.event-collector.arn
        Condition = {
          ArnEquals = {
            "aws:SourceArn" = aws_lambda_function.event-processor.arn
          }
        }
      }
    ]
  })

  depends_on = [
    aws_sqs_queue.event-collector,
    aws_lambda_function.event-processor
  ]
}

# S3 bucket
resource "aws_s3_bucket" "event-storage" {
  bucket        = "my-bucket-20250329"
  force_destroy = true
  tags = {
    Name = "event-storage"
  }
}

# Bucket policy document
data "aws_iam_policy_document" "bucket-policy" {
  statement {
    effect  = "Allow"
    actions = ["s3:PutObject"]
    principals {
      type = "Service"
      identifiers = [
        "lambda.amazonaws.com"
      ]
    }
    resources = [
      aws_s3_bucket.event-storage.arn,
      "${aws_s3_bucket.event-storage.arn}/*",
    ]
    condition {
      test     = "ArnEquals"
      variable = "aws:SourceArn"
      values   = [aws_lambda_function.event-processor.arn]
    }
  }
}

# Bucket policy
resource "aws_s3_bucket_policy" "event-storage-bucket-policy" {
  bucket = aws_s3_bucket.event-storage.id
  policy = data.aws_iam_policy_document.bucket-policy.json
}
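
Optionally (not part of this commit), an `output` block can surface the queue URL after `terraform apply`, which is handy when pointing the test script at the right queue:

```
output "queue_url" {
  value = aws_sqs_queue.event-collector.url
}
```
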
Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
import json
import uuid

import boto3
import maskpass
from botocore.config import Config

# Prompt for the IAM user's key pair without echoing it to the terminal
access_key = maskpass.askpass('Enter Access Key: ')
secret_access_key = maskpass.askpass('Enter Secret Access Key: ')

config = Config(region_name='ap-south-1')
sqs_client = boto3.client('sqs',
                          aws_access_key_id=access_key,
                          aws_secret_access_key=secret_access_key,
                          config=config)

# Resolve the queue URL from the queue name, then send a test message
# carrying a unique ID that the Lambda uses to name the S3 object
queue_url = sqs_client.get_queue_url(QueueName='event-collector-queue')['QueueUrl']
uniq_id = str(uuid.uuid4())
response = sqs_client.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"status": 200, "uniqueID": uniq_id})
)
print(response)
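
The script prints the raw `SendMessage` response; a `MessageId` in the output confirms that SQS accepted the message.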
