├── coffee.jpg
├── image-resizing-s3.py
└── README.md
/coffee.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/mathesh-me/image-resizing-using-s3-lambda-sns/HEAD/coffee.jpg
--------------------------------------------------------------------------------
/image-resizing-s3.py:
--------------------------------------------------------------------------------
import boto3
from io import BytesIO
from urllib.parse import unquote_plus

from PIL import Image

# Initialize AWS clients
s3 = boto3.client('s3')
sns = boto3.client('sns')

# Define the S3 buckets and SNS topic
bucket_1 = 'image-non-sized-1'  # your-source-bucket
bucket_2 = 'image-sized-1'  # your-destination-bucket
sns_topic_arn = 'arn:aws:sns:ap-south-1:804937851364:image-resizing-topic'  # your-sns-topic

# Maximum dimensions (width, height) for resized images
MAX_SIZE = (800, 800)


def lambda_handler(event, context):
    if 'Records' in event:
        # Handle an S3 event containing one or more records
        for record in event['Records']:
            handle_s3_record(record)
    else:
        # Handle a single S3 event record
        handle_s3_record(event)


def handle_s3_record(record):
    # Ensure the event record structure is correct
    if ('s3' in record
            and 'name' in record['s3'].get('bucket', {})
            and 'key' in record['s3'].get('object', {})):
        # Get the bucket name and object key from the S3 event record;
        # keys arrive URL-encoded (e.g. spaces become '+'), so decode them
        source_bucket = record['s3']['bucket']['name']
        object_key = unquote_plus(record['s3']['object']['key'])

        # Download the file from the source bucket
        response = s3.get_object(Bucket=source_bucket, Key=object_key)
        content_type = response['ContentType']
        image_data = response['Body'].read()

        # Resize and compress the image
        resized_image = resize_and_compress_image(image_data)

        # Upload the resized and compressed image to the destination bucket
        destination_key = f"resized/{object_key}"
        s3.put_object(Bucket=bucket_2, Key=destination_key,
                      Body=resized_image, ContentType=content_type)

        # Send a notification to the SNS topic
        message = f"Image {object_key} has been resized and uploaded to {bucket_2}"
        sns.publish(TopicArn=sns_topic_arn, Message=message)
    else:
        # Log an error message if the event record structure is unexpected
        print("Error: Invalid S3 event record structure")


def resize_and_compress_image(image_data, quality=75):
    # Open the image using PIL
    image = Image.open(BytesIO(image_data))
    image_format = image.format  # remember the format before transforming

    # Shrink the image in place to fit within MAX_SIZE, preserving aspect ratio
    image.thumbnail(MAX_SIZE)

    # Re-encode the image at the reduced quality
    image_io = BytesIO()
    image.save(image_io, format=image_format, quality=quality)

    return image_io.getvalue()
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Automated Image Resizing and Transfer System Using AWS Services
2 |
3 | ## Project Description:
4 | This project focuses on building an automated system for image processing and management within the AWS ecosystem. The goal is to streamline the handling of images by automatically resizing them and transferring them to a designated storage location while keeping stakeholders informed through notifications. Key AWS services, such as Lambda, S3, and SNS, are used to orchestrate this workflow.
5 |
6 | ## Key Features:
7 | 1. Image processing automation: Automatically resize and optimize images upon upload.
8 | 2. Secure storage: Store processed images in a secure and reliable S3 bucket.
9 | 3. Real-time notifications: Receive immediate updates about image processing via SNS.
10 | 4. Scalable architecture: Design for scalability to handle image processing demands.
11 | 5. Cost-efficient solution: Leverage AWS serverless technologies to minimize operational costs.
12 |
13 | ## Overview :
14 |
15 | 
16 |
17 |
18 |
19 | ## Steps :
20 | ### Step 1 :
21 | ### Creating Source and Destination S3 Buckets :
22 |
23 | 1. Navigate to the S3 Console.
24 | 2. Follow the Outlined Steps below.
25 |
26 |
27 | 
28 |
29 |
30 | 
31 |
32 |
33 | 
34 |
35 | 3. Create the destination bucket by following the same steps, and give it a unique name.
36 |
37 | 
38 |
39 | 4. As you can see above, I created two buckets: one source bucket and one destination bucket.
40 |
41 | ### Step 2 :
42 | ### Creating the SNS Notification :
43 |
44 | 1. Navigate to the SNS console.
45 | 2. Follow the Outlined Steps below.
46 |
47 |
48 | 
49 |
50 |
51 | 
52 |
53 |
54 |
55 | 
56 |
57 |
58 | 
59 |
60 |
61 | 
62 |
63 |
64 | 
65 | 3. Scroll down and click "Create subscription".
66 | 4. After this, you will receive a subscription confirmation email, which you must confirm.
67 | 5. You can also use other protocols, such as SQS, HTTP, or SMS.
68 |
69 |
70 | 
71 |
72 |
73 | 
74 |
75 |
76 | 
77 |
78 |
79 |
80 | ### Step 3 :
81 | ### Creating the Lambda :
82 |
83 | 1. Navigate to the Lambda Console.
84 | 2. Follow the Outlined steps below.
85 |
86 | 
87 |
88 |
89 |
90 | 
91 |
92 | 3. Now replace the default code with image-resizing-s3.py and deploy the changes. Don't test the code yet; we have a few more actions to complete before testing.
93 | 4. After that, we have to grant our Lambda function the permissions it needs for resizing. Navigate to the IAM Console and follow the steps below.
94 |
95 |
96 | 
97 |
98 | 
99 |
100 |
101 | 
102 |
103 |
104 | 
105 |
106 |
107 | 
108 |
109 |
110 | 
111 |
112 |
113 | 
114 |
115 | 5. Now navigate to the Lambda Console and follow the steps below.
116 |
117 |
118 | 
119 |
120 |
121 | 
122 |
123 |
124 | 
125 |
126 |
127 | 6. Now we have to add a trigger to the function.
128 |
129 |
130 | 
131 |
132 |
133 | 
134 |
135 |
136 | 
137 |
138 |
139 | 7. Now go to the Code section and scroll down to "Layers".
140 | 8. We have to add a layer.
141 | 9. You may wonder why.
142 | 10. To resize the images uploaded to our source S3 bucket, our code needs the Python library Pillow. We could package Pillow manually, but that is time-consuming and involves extra work, so instead we add it through a Lambda layer.
143 | 11. Follow the outlined steps below.
144 |
145 |
146 | 
147 |
148 |
149 | 
150 | 12. You can copy the layer ARN from below.
151 |
152 | ```
153 | arn:aws:lambda:ap-south-1:770693421928:layer:Klayers-p39-pillow:1
154 | ```
155 |
156 | 13. After completing all the actions above, we can test our code.
157 |
158 | 
159 |
160 |
161 | 
162 |
163 | 14. It will show results like those below. The code runs successfully but returns an error, because we have not uploaded any images to S3 yet.
164 |
165 |
166 | 
167 |
168 |
169 | ### Step 4 :
170 | ### Results :
171 |
172 | 1. Navigate to the S3 Console.
173 | 2. Upload some images into the source bucket.
174 |
175 |
176 | 
177 |
178 |
179 |
180 | 
181 |
182 |
183 | 
184 |
185 |
186 | 
187 |
188 |
189 | 
190 |
191 |
192 | 
193 |
194 |
195 | 
196 |
197 | ### It successfully resized the image and sent me the notification.
198 |
199 |
200 |
--------------------------------------------------------------------------------