AWS Lambda and Serverless Computing


☁️ AWS Lambda Serverless Computing: empowering cloud services with scalable infrastructure and streamlined processes.

Amazon Web Services (AWS) for DevOps in the Cloud

⚡ AWS Lambda


⛅ AWS Lambda is the core of serverless compute on Amazon Web Services.
✨ AWS Lambda and serverless computing redefine how applications are built: event-driven, scalable, cost-efficient, and tightly integrated with AWS services. They are ideal for APIs, automation, data pipelines, and microservices, but require careful design around state, latency, and monitoring.
📘 Definition: AWS Lambda is a serverless compute service that runs code in response to events without provisioning or managing servers.
🖥️ Execution Model: You upload code (Node.js, Python, Java, Go, etc.), define triggers, and Lambda executes it automatically.
💰 Billing: Charged based on execution time and number of requests, not idle server capacity.
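To make the billing model concrete, here is a back-of-the-envelope cost estimator. The per-million-requests and per-GB-second prices are illustrative placeholders, not current AWS rates; check the Lambda pricing page for real figures.

```python
def estimate_monthly_cost(requests, avg_duration_ms, memory_mb,
                          price_per_million=0.20, price_per_gb_s=0.0000166667):
    """Rough Lambda cost sketch: request charge + compute charge in GB-seconds.

    Prices here are illustrative assumptions, not official AWS rates.
    """
    request_cost = (requests / 1_000_000) * price_per_million
    # Compute is billed on memory-time: duration (s) x memory (GB) per invocation
    gb_seconds = requests * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_cost = gb_seconds * price_per_gb_s
    return round(request_cost + compute_cost, 2)

# e.g., 5M requests/month at 120 ms average on a 256 MB function
print(estimate_monthly_cost(5_000_000, 120, 256))
```

Note how idle time costs nothing: with zero requests the estimate is zero, unlike an always-on server.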

🧩 Key Features of Lambda Functions


✔ Event-driven: Triggered by AWS services (S3 uploads, DynamoDB streams, API Gateway requests, CloudWatch events).
🤖 Automatic scaling: Each event runs in its own isolated environment; Lambda scales horizontally without manual intervention.
⚡ Stateless execution: Functions don’t retain state between invocations, but can integrate with databases or caches.
🔑 Security: Integrated with IAM for fine-grained permissions.

🛠️ Common Workflows with Lambda

With Lambda functions you can build web applications, data processing services, automation workflows, and IoT applications.
• Web Applications:
- API Gateway → Lambda → DynamoDB
- Example: A REST API backend with serverless compute.
• Data Processing:
- S3 (file upload) → Lambda → Transform data → Store in RDS/Redshift.
- Example: Image resizing or log processing pipelines.
• Automation:
- CloudWatch Event → Lambda → Execute scripts (e.g., start/stop EC2).
- Example: Scheduled tasks replacing cron jobs.
• IoT Applications:
- IoT Core → Lambda → Process sensor data → Store in DynamoDB.
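As a sketch of the data-processing workflow above, a minimal S3-triggered handler might look like this. The event shape follows the standard S3 notification format; the actual transform step (resizing, parsing logs) is elided and would normally fetch the object with boto3.

```python
def lambda_handler(event, context):
    """Minimal sketch of an S3-triggered processor (transform step elided)."""
    results = []
    # An S3 notification can deliver multiple records per invocation
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # A real pipeline would download s3://bucket/key here and transform it
        results.append(f"processed s3://{bucket}/{key}")
    return {"statusCode": 200, "body": results}
```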

🏗️ Serverless Architectures in AWS


• Backend APIs: API Gateway + Lambda + DynamoDB → fully serverless REST/GraphQL APIs.
• Event-driven pipelines: S3 + Lambda + Kinesis → scalable data ingestion and transformation.
• Microservices: Each Lambda function represents a microservice endpoint, orchestrated via Step Functions.
• Hybrid workflows: Lambda integrates with ECS/EKS for mixed serverless + container workloads.

📊 Advantages of Serverless Compute


✅ Benefits and Impacts of Serverless Computing
- No server management: focus on code, not infrastructure.
- Cost efficiency: pay only for execution time.
- Scalability: automatic scaling per request.
- Rapid development: faster prototyping and deployment.
- Integration: native triggers from the AWS ecosystem.

🌧️ Challenges


⚠️ Some challenges and considerations when implementing serverless computing:
- Cold starts: First invocation after inactivity may have latency.
- Execution limits: Max runtime (15 minutes), memory (10 GB), and ephemeral storage (512 MB default, up to 10 GB).
- State management: Requires external services (DynamoDB, S3) for persistence.
- Complex debugging: Distributed, event-driven workflows can be harder to trace.

🎓 How to create an AWS Lambda function


Learn about the process both in the AWS Management Console and using the AWS CLI, so you can choose whichever workflow fits your style.
👣 Here’s a step‑by‑step guide to creating an AWS Lambda function with CLI commands.
🖥️ Creating a Lambda Function via AWS Console
• Sign in to AWS Console
- Go to the AWS Management Console and select Lambda from the services list.
• Create Function
- Click Create function.
- Choose Author from scratch.
- Enter a function name (e.g., MyFirstLambda).
- Select a runtime (Node.js, Python, Java, Go, etc.).
• Set Permissions
- Choose or create an IAM role with permissions (e.g., access to S3 if your function processes files).
• Write Your Code
- In the inline editor, paste your function code. Example (Python):

def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda!'
    }

• Configure Trigger
- Add a trigger (e.g., API Gateway, S3 bucket event, DynamoDB stream).
- Example: If you want a REST API, select API Gateway.
• Deploy & Test
- Click Deploy.
- Use the Test button to run with sample input.
- Check logs in CloudWatch for debugging.

💻 Creating a Lambda Function via AWS CLI


• Prepare Your Code
Save your function in a file, e.g., lambda_function.py.
def lambda_handler(event, context):
    return "Hello from Lambda CLI!"

• Zip the Code:
zip function.zip lambda_function.py

• Create IAM Role:
aws iam create-role --role-name lambda-ex-role \
  --assume-role-policy-document file://trust-policy.json
- (trust-policy.json defines Lambda’s permission to assume the role.)
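For reference, trust-policy.json is the standard Lambda trust policy (the same document that appears inline in the Terraform example later in this article):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```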
- Attach Policy with:

aws iam attach-role-policy --role-name lambda-ex-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

• Create the Lambda Function:
aws lambda create-function \
  --function-name MyFirstLambda \
  --runtime python3.9 \
  --role arn:aws:iam::<account-id>:role/lambda-ex-role \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://function.zip

• Invoke Function from terminal:
aws lambda invoke --function-name MyFirstLambda output.json

🎯 Best Practices


- Use environment variables for configuration.
- Keep functions small and focused on one task.
- Monitor with CloudWatch for logs and metrics.
- Versioning & Aliases help manage deployments safely.
- Combine with API Gateway for serverless APIs.

👩‍💻 AWS Lambda function in Python


🐍 Here’s a practical AWS Lambda function in Python that connects to a database. There are two common scenarios: DynamoDB (NoSQL, serverless) and RDS (Relational Database Service).

🟢 Example 1: Lambda with DynamoDB
import boto3
import json

# Initialize DynamoDB client
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('ItemsTable')

def lambda_handler(event, context):
    # Example: Insert a new item
    item_id = event.get('id', '123')
    name = event.get('name', 'Sample Item')

    table.put_item(
        Item={
            'id': item_id,
            'name': name
        }
    )

    return {
        'statusCode': 200,
        'body': json.dumps({'message': 'Item inserted', 'id': item_id})
    }


- IAM Role: Lambda needs permissions (dynamodb:PutItem, dynamodb:GetItem).
- Trigger: API Gateway can pass JSON payloads to this function.
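For the read path, a companion handler might look like the sketch below. The name get_item_handler is hypothetical, and the table argument is an assumption added so the logic can be exercised locally without AWS; deployed as-is it falls back to the ItemsTable used above.

```python
import json

def get_item_handler(event, context, table=None):
    """Hypothetical read-path companion to the insert example.

    `table` is injectable for local testing; defaults to the real DynamoDB table.
    """
    if table is None:
        import boto3  # only needed on the real AWS path
        table = boto3.resource("dynamodb").Table("ItemsTable")

    resp = table.get_item(Key={"id": event.get("id", "123")})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"message": "Not found"})}
    return {"statusCode": 200, "body": json.dumps(item)}
```

This handler would also need dynamodb:GetItem in its IAM role, per the note above.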

📀 AWS Lambda function with RDS (MySQL/PostgreSQL)

🔷 Example 2: Lambda with RDS (MySQL/PostgreSQL)
import pymysql
import json

# Database connection settings (use environment variables for security)
db_host = "mydb.cluster-xyz.us-east-1.rds.amazonaws.com"
db_user = "admin"
db_password = "mypassword"
db_name = "mydatabase"

def lambda_handler(event, context):
    connection = pymysql.connect(
        host=db_host,
        user=db_user,
        password=db_password,
        database=db_name
    )

    cursor = connection.cursor()
    cursor.execute("SELECT * FROM users LIMIT 5;")
    rows = cursor.fetchall()

    cursor.close()
    connection.close()

    return {
        'statusCode': 200,
        'body': json.dumps(rows, default=str)
    }


⛓️ Dependencies: Package pymysql must be included in your Lambda deployment package or Lambda layer.
🔌 Networking: Lambda must be in the same VPC/subnet as your RDS instance.
🔑 Security: Use Secrets Manager or SSM Parameter Store for credentials.

🖥️ Best Practices
- Environment Variables: Store DB host, user, and table names securely.
- Connection Management: Open/close connections inside the handler to avoid leaks.
- Error Handling: Wrap DB calls in try/except blocks.
- Monitoring: Use CloudWatch Logs to track queries and errors.

👉 You can package dependencies (like pymysql) into a Lambda layer so you can deploy the RDS example cleanly.
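A hardened variant of the RDS example applying these best practices (environment variables, try/except, guaranteed cleanup) might look like this sketch. The connect parameter is an assumption added here so the handler can be exercised without a real database; in Lambda it defaults to pymysql.connect.

```python
import json
import os

def lambda_handler(event, context, connect=None):
    """RDS example hardened with env-var config and error handling (a sketch).

    `connect` is injectable for testing; defaults to pymysql.connect on AWS.
    """
    if connect is None:
        import pymysql  # packaged in a Lambda layer, per the note above
        connect = pymysql.connect
    try:
        conn = connect(
            host=os.environ["DB_HOST"],
            user=os.environ["DB_USER"],
            password=os.environ["DB_PASSWORD"],
            database=os.environ["DB_NAME"],
        )
    except Exception as exc:
        # Surface connection failures as a 500 instead of an unhandled crash
        return {"statusCode": 500, "body": json.dumps({"error": str(exc)})}
    try:
        with conn.cursor() as cursor:
            cursor.execute("SELECT * FROM users LIMIT 5;")
            rows = cursor.fetchall()
        return {"statusCode": 200, "body": json.dumps(rows, default=str)}
    finally:
        conn.close()  # always release the connection, even on query errors
```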

Serverless API with Lambda and DynamoDB


📋 This is a reference architecture for a serverless API using API Gateway + Lambda + DynamoDB so you can visualize how these pieces fit together.
This architecture is a classic serverless pattern:
- API Gateway = entry point and request manager.
- Lambda = compute engine for business logic.
- DynamoDB = persistent storage layer.
🧱 Together, they form a highly scalable, cost-efficient, and resilient API without managing servers.

🏗️ Serverless API Reference Architecture
Flow of Requests
• Client Request
- A user (via web app, mobile app, or IoT device) sends an HTTP request.
- Example: GET /items or POST /items.
• Amazon API Gateway
- Acts as the front door to your API.
- Handles routing, authentication, throttling, and monitoring.
- Forwards the request to the appropriate Lambda function.
• AWS Lambda Function
- Executes your business logic.
- Example: Fetching items from DynamoDB, validating input, or transforming data.
- Stateless, event-driven, and scales automatically.
• Amazon DynamoDB
- A fully managed NoSQL database.
- Stores and retrieves application data with millisecond latency.
- Lambda interacts with DynamoDB using the AWS SDK.
• Response Path
- Lambda returns the result to API Gateway.
- API Gateway formats the response (JSON, XML, etc.) and sends it back to the client.

🔧 Example Workflow
- POST /items
- Client sends JSON payload → API Gateway → Lambda → DynamoDB (insert record).
- Response: { "status": "success", "id": "12345" }
- GET /items/{id}
- Client requests item → API Gateway → Lambda → DynamoDB (fetch record).
- Response: { "id": "12345", "name": "Sample Item" }
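One way to implement both routes in a single function is to branch on the REST event's httpMethod; this is a hedged sketch of that flow, with the DynamoDB calls left as comments and the sample responses mirroring the workflow above.

```python
import json

def lambda_handler(event, context):
    """Sketch of a single handler routing API Gateway (REST) events by method."""
    method = event.get("httpMethod")

    if method == "POST":  # POST /items: insert a record
        payload = json.loads(event.get("body") or "{}")
        # A real handler would call table.put_item(Item=payload) here
        return {"statusCode": 200,
                "body": json.dumps({"status": "success",
                                    "id": payload.get("id")})}

    if method == "GET":  # GET /items/{id}: fetch a record
        item_id = (event.get("pathParameters") or {}).get("id")
        # A real handler would call table.get_item(Key={"id": item_id}) here
        return {"statusCode": 200,
                "body": json.dumps({"id": item_id, "name": "Sample Item"})}

    return {"statusCode": 405,
            "body": json.dumps({"error": f"Unsupported method: {method}"})}
```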

📊 Benefits of This Architecture
- Scalability: Automatically scales with demand.
- Cost Efficiency: Pay only for requests and execution time.
- Security: IAM roles and API Gateway authentication (Cognito, OAuth).
- Flexibility: Easy to add new endpoints by creating new Lambda functions.

Infrastructure as Code (IaC)


🧩 Infrastructure as Code (IaC) lets you deploy this architecture automatically instead of clicking through the console. Below are two equivalent examples for the serverless API (API Gateway + Lambda + DynamoDB): one in AWS CloudFormation (YAML) and one in Terraform (HCL).

📝 CloudFormation Example (YAML)
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyLambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: MyServerlessAPI
      Runtime: python3.9
      Handler: lambda_function.lambda_handler
      Role: arn:aws:iam::123456789012:role/lambda-ex-role
      Code:
        S3Bucket: my-code-bucket
        S3Key: function.zip

  MyApiGateway:
    Type: AWS::ApiGateway::RestApi
    Properties:
      Name: MyServerlessAPI

  MyApiResource:
    Type: AWS::ApiGateway::Resource
    Properties:
      RestApiId: !Ref MyApiGateway
      ParentId: !GetAtt MyApiGateway.RootResourceId
      PathPart: items

  MyApiMethod:
    Type: AWS::ApiGateway::Method
    Properties:
      RestApiId: !Ref MyApiGateway
      ResourceId: !Ref MyApiResource
      HttpMethod: GET
      AuthorizationType: NONE
      Integration:
        Type: AWS
        IntegrationHttpMethod: POST
        Uri:
          Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${MyLambdaFunction.Arn}/invocations

  MyDynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: ItemsTable
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST

📝 Terraform Example (HCL)


- Terraform is multi-cloud and widely used for portability.
- CloudFormation is AWS-native and tightly integrated with the console.
- Both examples define Lambda, API Gateway, and DynamoDB resources, wiring them together for a serverless API.
- You can extend these templates with IAM policies, logging, and environment variables for production readiness.
provider "aws" {
  region = "us-east-1"
}

resource "aws_dynamodb_table" "items" {
  name         = "ItemsTable"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

resource "aws_iam_role" "lambda_role" {
  name               = "lambda-ex-role"
  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}

resource "aws_lambda_function" "my_lambda" {
  function_name = "MyServerlessAPI"
  runtime       = "python3.9"
  handler       = "lambda_function.lambda_handler"
  role          = aws_iam_role.lambda_role.arn
  filename      = "function.zip"
}

resource "aws_apigatewayv2_api" "api" {
  name          = "MyServerlessAPI"
  protocol_type = "HTTP"
}

resource "aws_apigatewayv2_integration" "lambda_integration" {
  api_id           = aws_apigatewayv2_api.api.id
  integration_type = "AWS_PROXY"
  integration_uri  = aws_lambda_function.my_lambda.arn
}

resource "aws_apigatewayv2_route" "route" {
  api_id    = aws_apigatewayv2_api.api.id
  route_key = "GET /items"
  target    = "integrations/${aws_apigatewayv2_integration.lambda_integration.id}"
}

CI/CD pipeline


You can extend the serverless API architecture with a CI/CD pipeline so updates are deployed automatically. This section shows an AWS CodePipeline + CodeBuild setup, first as a CloudFormation template and then as the equivalent Terraform workflow.

📝 CloudFormation Example: CI/CD Pipeline
Resources:
  MyCodeCommitRepo:
    Type: AWS::CodeCommit::Repository
    Properties:
      RepositoryName: MyServerlessAPIRepo
      Code:
        BranchName: main
        S3:
          Bucket: my-source-bucket
          Key: source.zip

  MyCodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: MyServerlessAPIBuild
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0
        Type: LINUX_CONTAINER
      Source:
        Type: CODECOMMIT
        Location: !GetAtt MyCodeCommitRepo.CloneUrlHttp
      Artifacts:
        Type: CODEPIPELINE

  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      Name: MyServerlessAPIPipeline
      RoleArn: arn:aws:iam::123456789012:role/CodePipelineRole
      Stages:
        - Name: Source
          Actions:
            - Name: SourceAction
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: 1
              OutputArtifacts:
                - Name: SourceOutput
              Configuration:
                RepositoryName: !Ref MyCodeCommitRepo
                BranchName: main
        - Name: Build
          Actions:
            - Name: BuildAction
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              InputArtifacts:
                - Name: SourceOutput
              OutputArtifacts:
                - Name: BuildOutput
              Configuration:
                ProjectName: !Ref MyCodeBuildProject
        - Name: Deploy
          Actions:
            - Name: DeployLambda
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: Lambda
                Version: 1
              InputArtifacts:
                - Name: BuildOutput
              Configuration:
                FunctionName: MyServerlessAPI

📝 Terraform with CI/CD Pipeline


Terraform Example with a CI/CD Pipeline
resource "aws_codecommit_repository" "repo" {
  repository_name = "MyServerlessAPIRepo"
}

resource "aws_codebuild_project" "build" {
  name         = "MyServerlessAPIBuild"
  service_role = aws_iam_role.codebuild_role.arn

  artifacts {
    type = "CODEPIPELINE"
  }

  environment {
    compute_type = "BUILD_GENERAL1_SMALL"
    image        = "aws/codebuild/standard:5.0"
    type         = "LINUX_CONTAINER"
  }

  source {
    type     = "CODECOMMIT"
    location = aws_codecommit_repository.repo.clone_url_http
  }
}

resource "aws_codepipeline" "pipeline" {
  name     = "MyServerlessAPIPipeline"
  role_arn = aws_iam_role.codepipeline_role.arn

  stage {
    name = "Source"

    action {
      name             = "SourceAction"
      category         = "Source"
      owner            = "AWS"
      provider         = "CodeCommit"
      version          = "1"
      output_artifacts = ["SourceOutput"]

      configuration = {
        RepositoryName = aws_codecommit_repository.repo.repository_name
        BranchName     = "main"
      }
    }
  }

  stage {
    name = "Build"

    action {
      name             = "BuildAction"
      category         = "Build"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["SourceOutput"]
      output_artifacts = ["BuildOutput"]

      configuration = {
        ProjectName = aws_codebuild_project.build.name
      }
    }
  }

  stage {
    name = "Deploy"

    action {
      name            = "DeployLambda"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "Lambda"
      version         = "1"
      input_artifacts = ["BuildOutput"]

      configuration = {
        FunctionName = aws_lambda_function.my_lambda.function_name
      }
    }
  }
}



🔑 How It Works
- Source Stage: CodeCommit repository holds your Lambda code.
- Build Stage: CodeBuild compiles, tests, and packages the Lambda function.
- Deploy Stage: CodePipeline deploys the new package to Lambda automatically.

✅ Benefits
- Automation: No manual redeployment needed.
- Consistency: Same pipeline ensures repeatable builds.
- Integration: Works seamlessly with CloudFormation/Terraform.
- Scalability: Multiple environments (dev, test, prod) can be added as stages.

CloudFormation with Testing Stage


You can extend this pipeline with automated testing (unit tests in CodeBuild plus integration tests against API Gateway) so your serverless API (API Gateway + Lambda + DynamoDB) follows a full, production-ready workflow.
📝 CloudFormation Example with Testing Stage
Resources:
  MyCodeCommitRepo:
    Type: AWS::CodeCommit::Repository
    Properties:
      RepositoryName: MyServerlessAPIRepo

  MyCodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: MyServerlessAPIBuild
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0
        Type: LINUX_CONTAINER
      Source:
        Type: CODECOMMIT
        Location: !GetAtt MyCodeCommitRepo.CloneUrlHttp
      Artifacts:
        Type: CODEPIPELINE

  # buildspec.yml will run unit + integration tests
  # Example: pytest for Lambda, curl for API Gateway
  MyTestProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: MyServerlessAPITest
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0
        Type: LINUX_CONTAINER
      Source:
        Type: CODEPIPELINE
      Artifacts:
        Type: CODEPIPELINE

  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      Name: MyServerlessAPIPipeline
      RoleArn: arn:aws:iam::123456789012:role/CodePipelineRole
      Stages:
        - Name: Source
          Actions:
            - Name: SourceAction
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeCommit
                Version: 1
              OutputArtifacts:
                - Name: SourceOutput
              Configuration:
                RepositoryName: !Ref MyCodeCommitRepo
                BranchName: main
        - Name: Build
          Actions:
            - Name: BuildAction
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              InputArtifacts:
                - Name: SourceOutput
              OutputArtifacts:
                - Name: BuildOutput
              Configuration:
                ProjectName: !Ref MyCodeBuildProject
        - Name: Test
          Actions:
            - Name: TestAction
              ActionTypeId:
                Category: Test
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              InputArtifacts:
                - Name: BuildOutput
              OutputArtifacts:
                - Name: TestOutput
              Configuration:
                ProjectName: !Ref MyTestProject
        - Name: Deploy
          Actions:
            - Name: DeployLambda
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: Lambda
                Version: 1
              InputArtifacts:
                - Name: BuildOutput
              Configuration:
                FunctionName: MyServerlessAPI

Terraform with Testing Stage


📝 Terraform Example with Testing Stage
resource "aws_codebuild_project" "test" {
  name         = "MyServerlessAPITest"
  service_role = aws_iam_role.codebuild_role.arn

  artifacts {
    type = "CODEPIPELINE"
  }

  environment {
    compute_type = "BUILD_GENERAL1_SMALL"
    image        = "aws/codebuild/standard:5.0"
    type         = "LINUX_CONTAINER"
  }

  source {
    type = "CODEPIPELINE"
  }
}

resource "aws_codepipeline" "pipeline" {
  name     = "MyServerlessAPIPipeline"
  role_arn = aws_iam_role.codepipeline_role.arn

  stage {
    name = "Source"

    action {
      name             = "SourceAction"
      category         = "Source"
      owner            = "AWS"
      provider         = "CodeCommit"
      version          = "1"
      output_artifacts = ["SourceOutput"]

      configuration = {
        RepositoryName = aws_codecommit_repository.repo.repository_name
        BranchName     = "main"
      }
    }
  }

  stage {
    name = "Build"

    action {
      name             = "BuildAction"
      category         = "Build"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["SourceOutput"]
      output_artifacts = ["BuildOutput"]

      configuration = {
        ProjectName = aws_codebuild_project.build.name
      }
    }
  }

  stage {
    name = "Test"

    action {
      name             = "TestAction"
      category         = "Test"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["BuildOutput"]
      output_artifacts = ["TestOutput"]

      configuration = {
        ProjectName = aws_codebuild_project.test.name
      }
    }
  }

  stage {
    name = "Deploy"

    action {
      name            = "DeployLambda"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "Lambda"
      version         = "1"
      input_artifacts = ["BuildOutput"]

      configuration = {
        FunctionName = aws_lambda_function.my_lambda.function_name
      }
    }
  }
}



🔧 Testing Workflow
- Unit Tests: Run inside CodeBuild using pytest (Python) or jest (Node.js).
- Integration Tests: Use curl or Postman scripts to hit API Gateway endpoints after build.
- Fail Fast: If tests fail, pipeline stops before deployment.
- Logs: Results stored in CloudWatch Logs for debugging.
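As a minimal sketch of the unit-test step, a pytest file like the following could run inside CodeBuild. The handler is defined inline here so the example is self-contained; in a real repository you would instead import it from lambda_function.py.

```python
# test_lambda_function.py -- minimal pytest sketch for the "Hello from Lambda!"
# handler shown earlier. Run with: pytest test_lambda_function.py
def lambda_handler(event, context):
    # Inline copy of the console example; normally imported from lambda_function
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda!'
    }

def test_returns_200():
    result = lambda_handler({}, None)
    assert result["statusCode"] == 200
    assert "Hello" in result["body"]
```

If this test fails in the Test stage, the pipeline stops before the Deploy stage runs.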

✅ Benefits
- Quality Assurance: Prevents broken code from reaching production.
- Automation: Every commit triggers build, test, and deploy.
- Scalability: Add more test stages (security scans, performance tests).
- Confidence: Ensures Lambda + API Gateway + DynamoDB integration works end‑to‑end.

CloudFormation with Performance Testing


To validate scalability under heavy traffic, you can extend the pipeline with performance testing. This is typically done in the Test stage by running load-testing tools such as Artillery (Node.js) or Locust (Python) inside CodeBuild.
📝 CloudFormation Example with Performance Testing
Resources:
  # buildspec.yml runs load tests
  # Example: artillery run perf-test.yml
  MyPerfTestProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: MyServerlessAPIPerfTest
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/standard:5.0
        Type: LINUX_CONTAINER
      Source:
        Type: CODEPIPELINE
      Artifacts:
        Type: CODEPIPELINE

  MyPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      Name: MyServerlessAPIPipeline
      RoleArn: arn:aws:iam::123456789012:role/CodePipelineRole
      Stages:
        - Name: Source
          Actions: [...]
        - Name: Build
          Actions: [...]
        - Name: UnitTest
          Actions: [...]
        - Name: PerfTest
          Actions:
            - Name: PerfTestAction
              ActionTypeId:
                Category: Test
                Owner: AWS
                Provider: CodeBuild
                Version: 1
              InputArtifacts:
                - Name: BuildOutput
              OutputArtifacts:
                - Name: PerfTestOutput
              Configuration:
                ProjectName: !Ref MyPerfTestProject
        - Name: Deploy
          Actions: [...]

Terraform with Performance Testing


📝 Terraform Example with Performance Testing
resource "aws_codebuild_project" "perf_test" {
  name         = "MyServerlessAPIPerfTest"
  service_role = aws_iam_role.codebuild_role.arn

  artifacts {
    type = "CODEPIPELINE"
  }

  environment {
    compute_type = "BUILD_GENERAL1_SMALL"
    image        = "aws/codebuild/standard:5.0"
    type         = "LINUX_CONTAINER"
  }

  source {
    type = "CODEPIPELINE"
  }
}

resource "aws_codepipeline" "pipeline" {
  name     = "MyServerlessAPIPipeline"
  role_arn = aws_iam_role.codepipeline_role.arn

  stage {
    name = "PerfTest"

    action {
      name             = "PerfTestAction"
      category         = "Test"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["BuildOutput"]
      output_artifacts = ["PerfTestOutput"]

      configuration = {
        ProjectName = aws_codebuild_project.perf_test.name
      }
    }
  }
}



🔧 Example buildspec.yml for Performance Testing
version: 0.2
phases:
  install:
    commands:
      - npm install -g artillery
  build:
    commands:
      - echo "Running performance tests..."
      - artillery run perf-test.yml
artifacts:
  files:
    - perf-results.json


Where perf-test.yml defines load scenarios (e.g., 1000 requests/minute to your API Gateway endpoint).
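A minimal perf-test.yml for Artillery might look like the sketch below. The target URL is a placeholder for your own API Gateway stage endpoint, and an arrival rate of 17 requests/second approximates 1,000 requests/minute.

```yaml
config:
  target: "https://your-api-id.execute-api.us-east-1.amazonaws.com/prod"  # placeholder
  phases:
    - duration: 60     # run the load phase for 60 seconds
      arrivalRate: 17  # ~17 new virtual users/second ~= 1,000 requests/minute
scenarios:
  - flow:
      - get:
          url: "/items"
```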

📊 Benefits of Adding Performance Testing
- Scalability Validation: Ensures Lambda + API Gateway + DynamoDB can handle peak traffic.
- Early Detection: Identifies bottlenecks before production deployment.
- Automated QA: Every commit is tested for performance, not just functionality.
- Confidence: Guarantees your serverless API meets SLAs under load.

This makes your pipeline production‑grade: Source → Build → Unit Tests → Integration Tests → Performance Tests → Deploy.

Monitoring and Alerting


You can also add a monitoring and alerting layer (CloudWatch alarms + SNS notifications) so you get real-time visibility and are notified when performance or reliability thresholds are breached.

🖥️ Monitoring with CloudWatch
- Metrics:
- Lambda: Invocation count, duration, errors, throttles.
- API Gateway: Latency, 4xx/5xx error rates.
- DynamoDB: Read/write capacity usage, throttled requests.
- Logs:
- Lambda logs automatically stream to CloudWatch Logs.
- API Gateway access logs can be enabled for request tracing.
- Dashboards:
- Create CloudWatch dashboards to visualize latency, error rates, and throughput.

🔔 Alerting with CloudWatch Alarms + SNS
- Define Alarms
- Example: Trigger if Lambda error rate > 5% for 5 minutes.
- Example: Trigger if API Gateway latency > 500ms.
Resources:
  LambdaErrorAlarm:
    Type: AWS::CloudWatch::Alarm
    Properties:
      AlarmName: LambdaErrorRateHigh
      MetricName: Errors
      Namespace: AWS/Lambda
      Statistic: Sum
      Period: 300
      EvaluationPeriods: 1
      Threshold: 5
      ComparisonOperator: GreaterThanThreshold
      Dimensions:
        - Name: FunctionName
          Value: MyServerlessAPI
      AlarmActions:
        - !Ref MySNSTopic

  MySNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: ServerlessAlerts

- Subscribe to SNS Topic
- Add email, SMS, or webhook subscriptions.
- Example: DevOps team receives email alerts when alarms fire.

Terraform with Monitoring and Alerts

📝 Terraform Example for Monitoring + Alerts
resource "aws_cloudwatch_metric_alarm" "lambda_errors" {
  alarm_name          = "LambdaErrorRateHigh"
  comparison_operator = "GreaterThanThreshold"
  evaluation_periods  = 1
  metric_name         = "Errors"
  namespace           = "AWS/Lambda"
  period              = 300
  statistic           = "Sum"
  threshold           = 5

  dimensions = {
    FunctionName = aws_lambda_function.my_lambda.function_name
  }

  alarm_actions = [aws_sns_topic.alerts.arn]
}

resource "aws_sns_topic" "alerts" {
  name = "ServerlessAlerts"
}

resource "aws_sns_topic_subscription" "email" {
  topic_arn = aws_sns_topic.alerts.arn
  protocol  = "email"
  endpoint  = "devops-team@example.com"
}



📊 Benefits of Monitoring + Alerts
- Real‑time visibility: Track latency, errors, and throughput.
- Proactive response: Alerts notify teams before customers notice issues.
- Scalability insights: Monitor DynamoDB capacity and Lambda concurrency.
- Automation: Alarms can trigger automated remediation (e.g., scale DynamoDB, restart services).

By combining CloudWatch metrics, dashboards, and alarms with SNS notifications, you create a monitoring and alerting layer that ensures your serverless API is reliable under load. This closes the loop in your CI/CD pipeline: Source → Build → Test → Performance → Deploy → Monitor & Alert.
✅ You can also add an automated remediation workflow (e.g., using CloudWatch alarms + Lambda to auto‑scale DynamoDB or adjust concurrency limits) so the system can self‑heal without manual intervention.
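A sketch of such a remediation function: it parses the CloudWatch alarm message that SNS delivers and caps the affected function's reserved concurrency via the Lambda API. The handler name, the concurrency limit of 10, and the injectable client are illustrative assumptions; a production version would also decide when to lift the cap again.

```python
import json

def remediation_handler(event, context, lambda_client=None):
    """Hypothetical self-healing hook triggered by an SNS-delivered alarm.

    `lambda_client` is injectable for testing; defaults to the real boto3 client.
    """
    if lambda_client is None:
        import boto3
        lambda_client = boto3.client("lambda")

    for record in event.get("Records", []):
        # CloudWatch alarm notifications arrive as JSON in the SNS message body
        alarm = json.loads(record["Sns"]["Message"])
        if alarm.get("NewStateValue") == "ALARM":
            # The alarm's first dimension is the FunctionName (see alarm above)
            fn = alarm["Trigger"]["Dimensions"][0]["value"]
            # Throttle the noisy function while the team investigates
            lambda_client.put_function_concurrency(
                FunctionName=fn, ReservedConcurrentExecutions=10)
    return {"statusCode": 200}
```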
