AWS Lambda and Storage in S3

AWS Lambda and S3 Storage integration with monitoring, logs, and alarms

☁️ Use AWS Lambda and S3 Storage to create cloud services with scalable infrastructure, automatic monitoring, and CloudWatch alarms.

Amazon Web Services (AWS) for DevOps in the Cloud

AWS Lambda with S3 Storage


Amazon S3 and AWS Lambda are two of the most widely used Amazon Web Services.
You can integrate them to perform custom tasks in your workflow.

⚡ AWS Lambda Functions


AWS Lambda is a serverless compute service that runs code in response to events, without provisioning or managing servers.
- Scales automatically based on demand.
- Event-driven: triggered by actions like file uploads to S3, API Gateway requests, DynamoDB updates, or Step Functions transitions.
- Pay-per-use: charged only for execution time, billed in milliseconds.
- Supports multiple runtimes: Python, Node.js, Java, Go, and other languages.
- Automates infrastructure tasks like patching, scaling, and monitoring.
• Some use cases of Lambda functions:
- Automating operational tasks split into microservices.
- Real-time file processing, such as resizing images or transforming data as it is uploaded to S3.
- Backend logic for web and mobile apps.
- Data transformation pipelines and Extract, Transform, Load (ETL) processes.

📀 S3 Simple Storage Service


🌐 Amazon S3 stands for Simple Storage Service.
- S3 is an object storage service designed for scalability, availability, and durability.
- Organizes data into buckets and objects.
- Can store any type of data: images, videos, audio, backups, logs, etc.
- Offers multiple storage classes for cost optimization, such as Standard, Glacier, and Intelligent-Tiering.
- Provides fine-grained access control with IAM policies and bucket policies.
- Integrates with other AWS services for analytics, monitoring, alerting, machine learning, and content delivery.
• Common applications include:
- Hosting static websites with html pages, images, audios or videos.
- Backup and disaster recovery.
- Store data lakes for big data analytics.
- Media storage and distribution.

AWS Lambda and S3 Integration


🔗 AWS Lambda and S3 can work together to provide microservices for different applications.
One of the most common patterns is the S3 + Lambda integration architecture:
- A file is uploaded to an S3 bucket.
- The upload event triggers a Lambda function.
- The Lambda function processes the file (e.g., compresses, resizes, or extracts metadata) and stores the result back in S3 or another service.
This architecture combines these services into event-driven processes that are cost-efficient and highly scalable.
This project adds logs, monitoring, and alerts with AWS CloudWatch.
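As a concrete sketch of the pattern above, this is roughly the shape of the S3 notification event a Lambda receives on an ObjectCreated trigger, and how a handler pulls out the bucket and key. The bucket and key names here are made up for illustration:

```python
# Illustrative S3 ObjectCreated event, trimmed to the fields a handler reads.
# Bucket and key names are made up for this example.
sample_event = {
    "Records": [
        {
            "eventSource": "aws:s3",
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "my-upload-bucket"},
                "object": {"key": "uploads/report.txt"},
            },
        }
    ]
}

def extract_s3_info(event):
    """Pull the bucket name and object key out of an S3 notification event."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]

bucket, key = extract_s3_info(sample_event)
print(bucket, key)
```

This same `Records[0]['s3']` access pattern appears in the full handler example later in this article.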

AWS Lambda and S3 in Python


📝 This is a simple Python example showing how to integrate AWS S3 with AWS Lambda. This pattern is commonly used when you want Lambda to process files automatically after they’re uploaded to S3.

Example: S3 → Lambda integration in Python
# Python AWS Lambda function to access S3 storage
import json
import boto3

# Lambda handler to process the event
def lambda_handler(event, context):
    # Initialize S3 client
    s3 = boto3.client('s3')

    # Extract bucket and object key from the event
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    object_key = event['Records'][0]['s3']['object']['key']
    print(f"New file uploaded: {object_key} in bucket {bucket_name}")

    # Download the file from S3
    download_path = f"/tmp/{object_key.split('/')[-1]}"
    s3.download_file(bucket_name, object_key, download_path)

    # Read file content
    with open(download_path, 'r') as f:
        file_content = f.read()
    print(f"File content: {file_content}")

    # Example task: text transform, then write processed content back to S3
    processed_key = f"processed/{object_key.split('/')[-1]}"
    s3.put_object(
        Bucket=bucket_name,
        Key=processed_key,
        Body=file_content.upper()  # Example transformation
    )

    return {
        'statusCode': 200,
        'body': json.dumps(f"Processed {object_key} and saved to {processed_key}")
    }



⚙️ Steps to Set Up this example:
- Create an S3 bucket (e.g., my-upload-bucket).
- Create a Lambda function in AWS Console with runtime Python 3.x.
• Attach an IAM role to the Lambda function with these permissions:
- s3:GetObject
- s3:PutObject
• Add an S3 trigger:
- Configure the bucket to trigger the Lambda function on ObjectCreated events.
• Test the integration:
- Upload a file to the S3 bucket.
- Lambda will automatically process it and save the transformed file in a processed/ folder.

This is a basic template. In real-world scenarios, you might:
- Process images (resize, compress).
- Extract metadata.
- Push data into DynamoDB or another service.
- Trigger workflows with Step Functions.
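The processing step in the template (uppercasing the file content) can be exercised locally without AWS, which is useful for unit-testing the logic before deploying. This is a minimal sketch; `process_file` is a hypothetical helper mirroring the handler's read-transform steps:

```python
import os
import tempfile

def process_file(download_path):
    """Mimic the template's processing step: read the downloaded file and
    apply the example transformation (uppercase)."""
    with open(download_path, "r") as f:
        content = f.read()
    return content.upper()

# Local stand-in for the /tmp download the Lambda performs.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("hello lambda")
    path = tmp.name

result = process_file(path)
print(result)  # HELLO LAMBDA
os.remove(path)
```

Separating the transformation from the S3 calls this way keeps the handler thin and the logic testable.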

Lambda and S3 from AWS CLI


This is a step-by-step guide to deploying the S3 + Lambda integration from the AWS CLI.

1. Create an S3 Bucket
aws s3 mb s3://my-upload-bucket
- Replace my-upload-bucket with a unique bucket name.

2. Create IAM Role for Lambda
Create a trust policy file (trust-policy.json):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}


Create the role:

aws iam create-role \
  --role-name LambdaS3Role \
  --assume-role-policy-document file://trust-policy.json

Attach policies for S3 and CloudWatch Logs:

aws iam attach-role-policy \
  --role-name LambdaS3Role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

aws iam attach-role-policy \
  --role-name LambdaS3Role \
  --policy-arn arn:aws:iam::aws:policy/CloudWatchLogsFullAccess




3. Package Lambda Function
Save your Python code (e.g., lambda_function.py) and zip it:

zip function.zip lambda_function.py



4. Create Lambda Function
aws lambda create-function \
  --function-name MyS3ProcessorFunction \
  --runtime python3.9 \
  --role arn:aws:iam::myACCOUNT_ID:role/LambdaS3Role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip


- Replace myACCOUNT_ID with your AWS account ID.

5. Add S3 Trigger
Create a notification configuration file (s3-notification.json):
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:REGION:ACCOUNT_ID:function:MyS3ProcessorFunction",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}


Apply the notification to the bucket:
aws s3api put-bucket-notification-configuration \
  --bucket my-upload-bucket \
  --notification-configuration file://s3-notification.json



6. Grant S3 Permission to Invoke Lambda
aws lambda add-permission \
  --function-name MyS3ProcessorFunction \
  --statement-id S3InvokePermission \
  --action "lambda:InvokeFunction" \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::my-upload-bucket



✅ Test the Setup
Upload a file to your bucket:
aws s3 cp test.txt s3://my-upload-bucket/



- This should trigger the Lambda function, process the file, and (in the example code) save the transformed version in a processed/ folder.

Monitor logs in CloudWatch


✅ Use CloudWatch logs to verify that your Lambda executed successfully.

Workflow of tasks:
- CloudWatch metrics track Lambda performance.
- Alarms notify you when thresholds are exceeded.
- SNS topics deliver alerts to email/SMS.
- Lambda Insights provides advanced monitoring.
To monitor your Lambda function execution logs after setting up the S3 + Lambda integration, you’ll use Amazon CloudWatch Logs.
Here’s how to do it with the AWS CLI:

1. Find the Log Group
Each Lambda function automatically creates a CloudWatch log group named: /aws/lambda/function-name

For example:
/aws/lambda/MyS3ProcessorFunction

List log groups with:
aws logs describe-log-groups



2. List Log Streams
Log streams are created for each instance of your Lambda execution environment.
aws logs describe-log-streams \
  --log-group-name "/aws/lambda/MyS3ProcessorFunction" \
  --order-by LastEventTime \
  --descending



3. View Logs
Retrieve the latest logs:
aws logs get-log-events \
  --log-group-name "/aws/lambda/MyS3ProcessorFunction" \
  --log-stream-name "mylog-stream-name"


Replace mylog-stream-name with one from the previous step.

4. Tail Logs in Real Time
You can continuously stream logs to your terminal:
aws logs tail "/aws/lambda/MyS3ProcessorFunction" --follow


This is the most convenient way to watch your Lambda output live as files are uploaded to S3.

✅ You’ll See
- START and END markers for each invocation.
- Any print() or logging output from your Python code.
- Errors or stack traces if something fails.
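Each invocation also ends with a REPORT line summarizing duration and memory. A small parser can pull those numbers out for quick analysis; the log line below is a representative sample with made-up values, not real output:

```python
import re

# Representative REPORT line as emitted by the Lambda runtime (values made up).
report = ("REPORT RequestId: abc123 Duration: 123.45 ms "
          "Billed Duration: 124 ms Memory Size: 128 MB Max Memory Used: 54 MB")

def parse_report(line):
    """Extract the duration (ms) and max memory used (MB) from a REPORT line."""
    duration = float(re.search(r"Duration: ([\d.]+) ms", line).group(1))
    max_mem = int(re.search(r"Max Memory Used: (\d+) MB", line).group(1))
    return duration, max_mem

print(parse_report(report))  # (123.45, 54)
```

Feeding lines from `aws logs tail` through a parser like this is a quick way to spot slow or memory-hungry invocations.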

CloudWatch metrics and alarms


Set up CloudWatch metrics and alarms (e.g., trigger an alert if your Lambda errors exceed a threshold).
That way you don’t have to manually check logs all the time.
Here’s how you can set up CloudWatch monitoring and alarms for your Lambda function using the AWS CLI:

🔹 Key Metrics to Monitor
CloudWatch automatically tracks several Lambda metrics:
- Invocations – number of times the function runs.
- Errors – failed executions.
- Duration – execution time in milliseconds.
- Throttles – when requests exceed concurrency limits.
- IteratorAge – for stream-based invocations (Kinesis/DynamoDB).
These are the most common metrics to attach alarms to.
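To build intuition for how an alarm evaluates these metrics, here is a simplified local model of the comparison CloudWatch applies (Sum of a metric per period, compared against a threshold over the evaluation periods). This is a sketch of the idea, not the CloudWatch implementation:

```python
def alarm_state(period_sums, threshold, evaluation_periods):
    """Simplified model of a CloudWatch alarm using the
    GreaterThanOrEqualToThreshold comparison on summed datapoints:
    ALARM when the last `evaluation_periods` datapoints all breach."""
    recent = period_sums[-evaluation_periods:]
    if len(recent) < evaluation_periods:
        return "INSUFFICIENT_DATA"
    return "ALARM" if all(s >= threshold for s in recent) else "OK"

# Error counts per 5-minute period, with threshold 1 and one evaluation
# period (matching the error alarm configured in this section).
print(alarm_state([0, 0, 2], threshold=1, evaluation_periods=1))  # ALARM
print(alarm_state([0, 0, 0], threshold=1, evaluation_periods=1))  # OK
```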

1. Create a CloudWatch Alarm for Errors
This alarm triggers if your Lambda function has more than 1 error in a 5-minute period.
aws cloudwatch put-metric-alarm \
  --alarm-name LambdaErrorAlarm \
  --alarm-description "Alarm when Lambda errors exceed threshold" \
  --metric-name Errors \
  --namespace AWS/Lambda \
  --statistic Sum \
  --period 300 \
  --threshold 1 \
  --comparison-operator GreaterThanOrEqualToThreshold \
  --dimensions Name=FunctionName,Value=MyS3ProcessorFunction \
  --evaluation-periods 1 \
  --alarm-actions arn:aws:sns:myREGION:myACCOUNT_ID:MySNSTopic


- Replace myREGION and myACCOUNT_ID with your values.
- --alarm-actions points to an SNS topic that sends notifications (email, SMS, etc.).

2. Create a CloudWatch Alarm for Duration
This alarm triggers if execution time exceeds 3 seconds.
aws cloudwatch put-metric-alarm \
  --alarm-name LambdaDurationAlarm \
  --alarm-description "Alarm when Lambda duration exceeds 3s" \
  --metric-name Duration \
  --namespace AWS/Lambda \
  --statistic Average \
  --period 300 \
  --threshold 3000 \
  --comparison-operator GreaterThanThreshold \
  --dimensions Name=FunctionName,Value=MyS3ProcessorFunction \
  --evaluation-periods 1 \
  --alarm-actions arn:aws:sns:myREGION:myACCOUNT_ID:MySNSTopic




3. Set Up Notifications (SNS)
Create an SNS topic for alerts:
aws sns create-topic --name MySNSTopic


Subscribe your email:
aws sns subscribe \
  --topic-arn arn:aws:sns:myREGION:myACCOUNT_ID:MySNSTopic \
  --protocol email \
  --notification-endpoint you@example.com


Confirm the subscription via the email you receive.

4. Enable Lambda Insights (Optional)
For deeper monitoring (memory usage, CPU, network):
aws lambda update-function-configuration \
  --function-name MyS3ProcessorFunction \
  --layers arn:aws:lambda:myREGION:myACCOUNT_ID:layer:LambdaInsightsExtension:myVERSION


This adds Lambda Insights, giving you detailed dashboards in CloudWatch.

Trigger another Lambda function


How to automatically trigger another Lambda function when an alarm fires (for example, to clean up resources or send custom alerts).

You can trigger another Lambda function when a CloudWatch alarm fires. The trick is to use Amazon SNS or Amazon EventBridge as the intermediary, since CloudWatch alarms themselves don’t directly invoke Lambda.

Here’s how to set up this workflow:
🔹 Option 1: Using SNS (Simple Notification Service)
- Create an SNS topic:
aws sns create-topic --name AlarmTriggerTopic
- Subscribe your Lambda function to the topic:
aws lambda add-permission \
  --function-name SecondaryLambdaFunction \
  --statement-id SNSInvokePermission \
  --action "lambda:InvokeFunction" \
  --principal sns.amazonaws.com \
  --source-arn arn:aws:sns:myREGION:myACCOUNT_ID:AlarmTriggerTopic

aws sns subscribe \
  --topic-arn arn:aws:sns:myREGION:myACCOUNT_ID:AlarmTriggerTopic \
  --protocol lambda \
  --notification-endpoint arn:aws:lambda:myREGION:myACCOUNT_ID:function:SecondaryLambdaFunction

• Attach the SNS topic to your CloudWatch alarm:

aws cloudwatch put-metric-alarm \
  --alarm-name LambdaErrorAlarm \
  --metric-name Errors \
  --namespace AWS/Lambda \
  --statistic Sum \
  --period 300 \
  --threshold 1 \
  --comparison-operator GreaterThanOrEqualToThreshold \
  --dimensions Name=FunctionName,Value=MyS3ProcessorFunction \
  --evaluation-periods 1 \
  --alarm-actions arn:aws:sns:myREGION:myACCOUNT_ID:AlarmTriggerTopic



👉 Now, whenever the alarm fires, SNS will invoke your SecondaryLambdaFunction.
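When SNS invokes the secondary function, the CloudWatch alarm details arrive as a JSON string in the SNS message body. A hypothetical handler for SecondaryLambdaFunction might unpack it like this (the event shape follows the standard SNS-to-Lambda format; the alarm payload values are illustrative):

```python
import json

def lambda_handler(event, context):
    """Hypothetical SecondaryLambdaFunction: unpack the CloudWatch alarm
    notification that SNS delivers as a JSON string in the message body."""
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    alarm = message["AlarmName"]
    state = message["NewStateValue"]
    print(f"Alarm {alarm} is now {state}")
    # ... clean up resources or forward a custom alert here ...
    return {"alarm": alarm, "state": state}

# Simulated SNS event wrapping an alarm notification (trimmed; values made up).
sample_event = {
    "Records": [
        {"Sns": {"Message": json.dumps(
            {"AlarmName": "LambdaErrorAlarm", "NewStateValue": "ALARM"})}}
    ]
}
print(lambda_handler(sample_event, None))
```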

🔹 Option 2: Using EventBridge
EventBridge can directly route alarm state changes to Lambda.
- Create an EventBridge rule:

aws events put-rule \
  --name AlarmStateChangeRule \
  --event-pattern '{"source":["aws.cloudwatch"],"detail-type":["CloudWatch Alarm State Change"]}'

• Add Lambda as the target:

aws events put-targets \
  --rule AlarmStateChangeRule \
  --targets "Id"="1","Arn"="arn:aws:lambda:myREGION:myACCOUNT_ID:function:SecondaryLambdaFunction"

• Grant EventBridge permission to invoke Lambda:

aws lambda add-permission \
  --function-name SecondaryLambdaFunction \
  --statement-id EventBridgeInvokePermission \
  --action "lambda:InvokeFunction" \
  --principal events.amazonaws.com \
  --source-arn arn:aws:events:myREGION:myACCOUNT_ID:rule/AlarmStateChangeRule


👉 This way, when the alarm changes state (e.g., goes into ALARM), EventBridge directly triggers your Lambda.
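On the EventBridge path, the alarm state change arrives directly in the event's detail field rather than wrapped in an SNS message. A hypothetical handler for this variant might look like the following (the event shape follows EventBridge's "CloudWatch Alarm State Change" format; values are made up):

```python
def lambda_handler(event, context):
    """Hypothetical SecondaryLambdaFunction for the EventBridge path: the
    alarm name and new state arrive directly in the event's detail field."""
    detail = event["detail"]
    alarm = detail["alarmName"]
    state = detail["state"]["value"]
    print(f"Alarm {alarm} changed state to {state}")
    return {"alarm": alarm, "state": state}

# Simulated "CloudWatch Alarm State Change" event (trimmed; values made up).
sample_event = {
    "source": "aws.cloudwatch",
    "detail-type": "CloudWatch Alarm State Change",
    "detail": {"alarmName": "LambdaErrorAlarm", "state": {"value": "ALARM"}},
}
print(lambda_handler(sample_event, None))
```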

Best options to use:
- SNS → Best if you want multiple subscribers (e.g., email + Lambda).
- EventBridge → Best if you want fine-grained filtering and routing of alarm events.

✅ Real-world example workflow: an alarm fires → the secondary Lambda cleans up resources or sends a Slack alert, showing how this pattern fits into a practical system.