Cloud Courier: Automated Bulk Email Delivery with AWS Serverless Services
1. Overview
Cloud Courier is a cloud-native solution for automating bulk email notifications using AWS serverless services. It enables users to upload CSV files with recipient details and automatically dispatch personalized emails through Amazon SES. The entire process is event-driven and highly scalable—uploading a CSV to Amazon S3 triggers an AWS Lambda function that parses the file, personalizes email content for each recipient, and sends out the emails. Supporting services like Amazon EventBridge, Amazon CloudWatch, and AWS IAM work together to ensure the workflow is seamless, secure, and requires minimal manual intervention. This automation streamlines large-scale email campaigns and minimizes the operational effort in delivering personalized emails to hundreds or thousands of recipients.
Emails are automatically sent via Amazon SES.
2. Purpose and Motivation
Sending bulk emails efficiently, securely, and with personalized content presents significant challenges. Traditional approaches often struggle with scaling infrastructure for large recipient lists, handling dynamic content for each user, and monitoring delivery in real-time. Cloud Courier addresses these challenges by leveraging AWS serverless architecture in an event-driven design.
- Automating Infrastructure: By using AWS Lambda and managed services, the system removes the need for managing servers or provisioning capacity for bulk email jobs.
- Strengthening Security: End-to-end security is built in through AWS features—data is encrypted at rest in S3, transmissions to SES are secure, and IAM enforces strict access controls for all components.
- Enhancing Monitoring: AWS CloudWatch and SNS provide real-time monitoring and alerts, so every email dispatch is tracked and any issues trigger immediate notifications, improving reliability.
By addressing these areas, Cloud Courier provides a robust solution for large-scale email delivery that is easier to maintain and adapt than a traditional self-managed bulk email system.
3. Architecture
Cloud Courier follows a serverless architecture composed of several AWS services working in concert. The key components include:
- Amazon S3: Serves as the ingestion point for email jobs. Users upload CSV files containing recipient information (in this case names and email addresses) to an S3 bucket. Each upload event acts as a trigger for the email dispatch workflow.
CSV files are uploaded to an S3 bucket
- Amazon EventBridge: Acts as the event router. It listens for Object Created events from the S3 bucket and, when a new CSV file is uploaded, EventBridge triggers the Lambda function to start processing that file.
- AWS Lambda: Handles the core logic of the system. The Lambda function (running a Python script) is invoked for each uploaded CSV. It reads the file from S3, parses the data (‘name’ and ‘email’), personalizes a predefined email template for each recipient, and sends out emails via Amazon SES. This model scales automatically with the number of files and size of recipient lists.
- Amazon SES (Simple Email Service): Responsible for sending the emails. SES dispatches the personalized email messages to all recipients. It manages the actual email delivery, including handling bounces or complaints according to its configuration.
- Amazon CloudWatch: Monitors the execution of the Lambda function and overall system health. CloudWatch Logs capture the Lambda function’s runtime information (e.g., how many emails were sent, any errors encountered), and CloudWatch Alarms can be set to alert on failures or performance issues.
- AWS IAM (Identity and Access Management): Provides security through permissions controls. IAM roles and policies ensure that the Lambda function only has access to the specific S3 bucket, SES sending capabilities, and CloudWatch logging that it needs. This principle of least privilege enhances the security of the system.

All these components work together in an event-driven pipeline. In a typical workflow, a user dropping a CSV file into the S3 bucket will seamlessly initiate the EventBridge rule, which invokes the Lambda function. The Lambda function then orchestrates reading data, sending emails via SES, and reporting any issues via CloudWatch and SNS. The architecture is fully serverless, meaning it can automatically scale to handle spikes in email volume and you only pay for the actual usage of resources, making the solution cost-efficient for both small and large email campaigns.
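If you route events through EventBridge as described here (rather than attaching a direct S3 trigger to the Lambda function), the rule needs an event pattern matching S3 Object Created events. A sketch of such a pattern is shown below; the bucket name cloudcourier-source is the example used later in the setup, and S3 event notifications to EventBridge must be enabled on the bucket for this to fire.

```json
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["cloudcourier-source"]
    }
  }
}
```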
4. Setup Instructions
To deploy and configure the Cloud Courier system, follow these steps. The setup is divided into stages that mirror the architecture components. By the end of this setup, you will have the S3 bucket, Lambda function, and all necessary configurations in place to run the bulk email workflow.
4.1 Stage 1 – Create the S3 Buckets and a CSV file
- Access the S3 Console: Log in to your AWS Management Console and navigate to the Amazon S3 service.
- Create an S3 Bucket: Create a new S3 bucket that will hold the CSV files (created in the next step) for your email campaigns. For example, something like cloudcourier-source (the name must be globally unique, so this exact name cannot be used).
- Create a CSV file: Create a CSV file (you can use any CSV file creator) containing the recipients of your emails. Put the names of the email recipients in one column (the column must be called name, without capitalization) and their emails in the next (the column must be titled email, without capitalization). Note that the column headers name and email must be reproduced exactly in order to be compatible with the Lambda function.
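Since the original example file is not reproduced here, the sketch below shows a minimal CSV matching the expected layout, along with a quick local check that the headers are exactly name and email. This is for local validation only (the function and sample data are illustrative, not part of the Lambda code).

```python
import csv
import io

# Minimal example of a valid recipients file: lowercase "name" and "email" headers.
sample_csv = """name,email
Alice,alice@example.com
Bob,bob@example.com
"""

def validate_recipients(csv_text):
    """Return the recipient rows if the required headers are present."""
    reader = csv.DictReader(io.StringIO(csv_text))
    required = {"name", "email"}
    if not required.issubset(reader.fieldnames or []):
        raise ValueError(f"CSV must contain columns {sorted(required)}, got {reader.fieldnames}")
    return list(reader)

rows = validate_recipients(sample_csv)
print([r["email"] for r in rows])  # ['alice@example.com', 'bob@example.com']
```

Running this check before uploading can save a failed Lambda invocation caused by capitalized or misspelled headers.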
4.2 Stage 2 – Set up SES
- Open the SES Console: In the AWS Console, navigate to Amazon SES (Simple Email Service).
- Create New Identities in SES: In the left column, under Configuration, click on Identities. Then, create identities out of the email addresses you will use to send emails.

In Amazon SES, create and verify the email addresses or domains you will use as sender identities.
4.3 Stage 3 – Deploy the Lambda Function
- Access the Lambda Console: Go to the AWS Lambda service in the console and click Create Function.
- Create a New Lambda Function: Choose “Author from scratch” for the creation method. Provide a name for your function, for example, S3TriggerFunction. Select Python 3.11 for the runtime. For architecture, use the default (x86_64).
- Add Function Code: Add the Python code found in the appendix of this document. Remember to replace the sender email found in the code with whichever verified email you’re going to use.
- Deploy (Save) the Function: Once you have added the code and configured the settings, click Deploy to save the Lambda function.
- Add a trigger to the function: Configure the trigger so that the function is invoked whenever an object is uploaded to your S3 bucket.
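The trigger delivers an S3 event record to the function, and the handler’s first step is to pull the bucket name and object key out of it. A minimal sketch with a synthetic event follows (the real event contains many more fields; the bucket and key values here are just the examples used in this guide):

```python
# Synthetic S3 "ObjectCreated" event, trimmed to the fields the handler reads.
event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "cloudcourier-source"},
                "object": {"key": "recipients.csv"},
            },
        }
    ]
}

record = event["Records"][0]
bucket_name = record["s3"]["bucket"]["name"]
file_key = record["s3"]["object"]["key"]
print(bucket_name, file_key)  # cloudcourier-source recipients.csv
```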


4.4 Stage 4 – Create and Configure a Cloudwatch Alarm for Lambda Errors
- Open the CloudWatch Console: In the navigation pane, choose Alarms, then click Create an Alarm.
- Select the Lambda Metric: When prompted to select a metric for your new alarm, scroll and find your Lambda function and click on the iteration that tracks Errors.
- Set Threshold and Conditions: Use the Sum statistic over a 5-minute period, and set the alarm to trigger when Errors is greater than 5.
- Configure Alarm Actions (SNS Notification): In the Configure actions section, choose an existing SNS topic or create a new one, providing a name such as LambdaErrorNotifications if creating a new one. Add a subscription, such as an email address where alarm notifications will be sent.
- Name and Review the Alarm: Give the alarm a descriptive name (e.g., LambdaAlarm), then click Create alarm.

This configuration means that if there are more than 5 errors in a single 5-minute window, the alarm will fire.
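For reference, an equivalent alarm could also be created from the command line with aws cloudwatch put-metric-alarm. The settings above, expressed as a --cli-input-json fragment, are sketched below; the function name, region, and account ID are placeholders taken from the examples in this guide and must be replaced with your own.

```json
{
  "AlarmName": "LambdaAlarm",
  "Namespace": "AWS/Lambda",
  "MetricName": "Errors",
  "Dimensions": [
    { "Name": "FunctionName", "Value": "S3TriggerFunction" }
  ],
  "Statistic": "Sum",
  "Period": 300,
  "EvaluationPeriods": 1,
  "Threshold": 5,
  "ComparisonOperator": "GreaterThanThreshold",
  "AlarmActions": [
    "arn:aws:sns:us-east-1:123456789012:LambdaErrorNotifications"
  ]
}
```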

4.5 Stage 5 – Create the IAM roles and permissions for your Lambda Function and set a Bucket Policy
- Go to the IAM Console: Select Roles in the left navigation pane and click Create Role.
- For service and use case, select Lambda. Leave everything else default, select a name for the role, and create the role.
- Open the New Role: In the role’s Permissions tab, create an inline policy.
- Enter a JSON Policy: Use the example provided in the appendix that grants least-privilege permissions for accessing S3, SES, and CloudWatch. Remember to replace the given bucket name with your own unique bucket name.
- Update Lambda Permissions: Navigate back to your Lambda function, go to Configuration > Permissions, and click Edit.
- Select Existing Role: Choose “Use an existing role” and then select the IAM role you created from the dropdown.
- Set the S3 Bucket Policy: Navigate to your S3 bucket, click on the Permissions tab, and set a bucket policy (example given in the appendix). Remember to fill in your own AWS account number, IAM role name, and unique bucket name.
This bucket policy grants your IAM role permission to read objects from your S3 bucket.

5. Testing and Validation
5.1 Test Strategy
- Upload a new CSV into your S3 bucket: Perform a test run by uploading a sample CSV file to the S3 bucket. Make sure the CSV columns are named precisely name and email, as in the example shown previously.
- Monitor the Lambda Execution: Once the CSV file is uploaded, Cloud Courier’s event chain will begin. In the AWS Lambda console, you should see that your function was invoked (you can check the Monitoring tab for this). The Lambda will read the file and attempt to send emails via SES.
An example of a CloudWatch log of a successful run.
- Verify Email Delivery: Check the recipient inboxes to confirm that the emails arrived.
Throughout testing, you can use the AWS Console to observe each service: S3 (to see the object), EventBridge (the rule’s metrics to see if it fired), Lambda (invocations and logs), SES (delivery status), and SNS (any alerts sent). This will give you a full picture of the system’s behavior.

Conclusion
Cloud Courier demonstrates how a combination of AWS serverless services can be orchestrated to create an efficient, secure, and scalable bulk email notification system. The project achieves its goals of eliminating manual email dispatch processes and infrastructure management, instead leveraging AWS S3 for triggering events, Lambda for on-the-fly processing, and SES for reliable email delivery. Throughout its design and implementation, Cloud Courier adheres to AWS best practices in security (through IAM roles and encryption), monitoring (via CloudWatch and SNS alerts), and performance (scaling automatically with demand), while remaining highly customizable and easy to deploy for different use cases.
This solution not only meets the immediate needs of bulk email sending with personalization and monitoring but also provides a flexible framework for future expansion. Future enhancements could further increase its capabilities, such as supporting multiple email templates for different campaigns, integrating a database (e.g., Amazon DynamoDB) to record email delivery status and recipient preferences, adding advanced analytics for open and bounce rates, or implementing a more sophisticated retry mechanism for failed sends to improve deliverability. By incorporating these improvements, Cloud Courier could evolve into an even more powerful platform for cloud-based communications.
In summary, Cloud Courier proves the value of a serverless architecture in a real-world application domain: it simplifies operations, scales effortlessly, and maintains reliability. It serves as a blueprint for how modern cloud services can be combined to solve the challenges of bulk communication in an elegant, cost-effective manner.
Appendix
Lambda Function Code
import json
import boto3
import csv
import io

s3 = boto3.client('s3')
ses = boto3.client('ses')

def lambda_handler(event, context):
    # Extract the bucket, key, and event type from the S3 event record.
    try:
        bucket_name = event['Records'][0]['s3']['bucket']['name']
        file_key = event['Records'][0]['s3']['object']['key']
        event_name = event['Records'][0]['eventName']
    except KeyError as e:
        print(f"Error extracting event data: {e}")
        return {'statusCode': 400, 'body': json.dumps('Error extracting event data.')}

    # Read the CSV from S3 and collect the recipient addresses.
    recipients = []
    try:
        response = s3.get_object(Bucket=bucket_name, Key=file_key)
        csv_content = response['Body'].read().decode('utf-8')
        print("CSV Content:", csv_content)
        reader = csv.DictReader(io.StringIO(csv_content))
        for row in reader:
            try:
                print("Row Data:", row)
                recipients.append(row['email'])
            except KeyError as e:
                print(f"Error: Missing required field in CSV: {e}")
    except Exception as e:
        print(f"Error processing S3 object: {str(e)}")
        return {'statusCode': 500, 'body': json.dumps('Error processing S3 object.')}

    if "ObjectCreated:" in event_name:
        print(f"Handling object creation event for bucket {bucket_name} with key {file_key}")
        subject = "The service is live!"
        body = "Welcome to our service!"
        for recipient in recipients:
            try:
                # Replace the Source address with your own SES-verified sender.
                response = ses.send_email(
                    Source='johnlee0207@gmail.com',
                    Destination={'ToAddresses': [recipient]},
                    Message={'Subject': {'Data': subject}, 'Body': {'Text': {'Data': body}}}
                )
                print(f"Email sent to {recipient}! Message ID: {response['MessageId']}")
            except Exception as e:
                print(f"Error sending email to {recipient}: {str(e)}")
    elif "ObjectRemoved:" in event_name:
        # Note: on a deletion event the object can no longer be fetched from S3,
        # so the recipient list above will be empty unless it was cached elsewhere.
        print(f"Handling object deletion event for bucket {bucket_name} with key {file_key}")
        subject = "You have been removed from the service"
        body = (f"The object with key {file_key} has been deleted from bucket {bucket_name}. "
                "You have been removed from the service.")
        for recipient in recipients:
            try:
                response = ses.send_email(
                    Source='johnlee0207@gmail.com',
                    Destination={'ToAddresses': [recipient]},
                    Message={'Subject': {'Data': subject}, 'Body': {'Text': {'Data': body}}}
                )
                print(f"Deletion notification sent to {recipient}! Message ID: {response['MessageId']}")
            except Exception as e:
                print(f"Error sending deletion notification to {recipient}: {str(e)}")

    return {'statusCode': 200, 'body': json.dumps('Processed S3 event successfully!')}
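Note that the code above sends the same body to every recipient. To personalize the message per recipient as described in the Overview, the send loop could fill a template from the CSV’s name column. A sketch follows; the template text and helper are illustrative, not part of the deployed function.

```python
# Sketch: build a personalized body per recipient from the CSV's "name" column.
TEMPLATE = "Hello {name},\n\nWelcome to our service!"

def personalize(rows):
    """Yield (email, body) pairs with the template filled in per recipient."""
    for row in rows:
        yield row["email"], TEMPLATE.format(name=row["name"])

rows = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "Bob", "email": "bob@example.com"},
]
messages = dict(personalize(rows))
print(messages["alice@example.com"])  # starts with "Hello Alice,"
```

In the Lambda function itself, the loop would pass each personalized body to ses.send_email in place of the fixed body string.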
Lambda Function JSON policy (Remember to replace cloudcourier-source with your unique bucket name):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3Access",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::cloudcourier-source",
        "arn:aws:s3:::cloudcourier-source/*"
      ]
    },
    {
      "Sid": "SESAccess",
      "Effect": "Allow",
      "Action": [
        "ses:SendEmail",
        "ses:SendRawEmail"
      ],
      "Resource": "*"
    },
    {
      "Sid": "CloudWatchLogs",
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
S3 Bucket Policy: Remember to replace the AWS account number (e.g. 600627315963) with your own unique AWS account number. Also remember to replace the bucket name with your own, as well as the name of your IAM role.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::600627315963:role/S3CloudCourierRole"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::cloudcourier-source/*"
    }
  ]
}