
Testing Your AWS Serverless App Locally Like a Boss (or at Least Like Someone Who Writes Tests šŸ˜…)

Writing tests for serverless apps often feels like herding cats in the cloud. In this tutorial, learn how to run real integration tests on your AWS stack, including S3, Lambda, SQS, and DynamoDB, using LocalStack, pytest, and boto3.

You deployed your serverless app locally. It worked. You even saw data show up in DynamoDB and felt pretty good about yourself. But here is the question: will it still work tomorrow? Will it still work after your coworker ā€œoptimizesā€ the Lambda with some mysterious regex he copy-pasted from ChatGPT? That is why we write automated tests: to catch bugs early and avoid debugging production over the weekend.

Today we are going to write tests for the inventory app we deployed in my previous post, still running locally with LocalStack. We will use Python, pytest, and boto3 to make sure the pipeline from S3 to DynamoDB actually behaves.

Why Test Locally?

Some developers still ā€œtest in productionā€ and call it a day. But testing locally gives you:

  • Fast feedback since you are not waiting on AWS.
  • A zero-dollar AWS bill, which your manager will love.
  • Confidence that uploading a blank CSV won’t break everything.

LocalStack lets you use real AWS services locally and write proper integration tests without mocking the universe.
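
To make that concrete, ā€œusing real AWS services locallyā€ mostly means keeping your normal SDK calls and only overriding the endpoint. Here is a minimal sketch with boto3, assuming LocalStack is running on its default edge port 4566 and using the dummy credentials it accepts:

import boto3

# LocalStack does not validate credentials, so any non-empty values work.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",  # LocalStack's default edge endpoint
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)

# Same API as real AWS, just served from your machine.
print(s3.list_buckets()["Buckets"])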

What We’re Going to Do

Ok so here is the plan:

  1. Deploy the app with LocalStack if you have not already.
  2. Write a test that uploads a sample CSV to S3.
  3. Wait for the pipeline to finish processing.
  4. Assert that DynamoDB contains the expected records.

Let’s get started.

Set Up the Test Environment

If you already have the repo cloned and LocalStack running, skip this part. If not, run these commands to catch up:

Terminal window
# Clone the app
git clone https://github.com/aws-samples/amazon-sqs-best-practices-cdk.git
cd amazon-sqs-best-practices-cdk
# Start LocalStack
localstack start
# Set up virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
# Bootstrap and deploy
cdklocal bootstrap
cdklocal deploy

Now install the tools you need for testing:

Terminal window
pip install pytest boto3
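
Before writing the test, it helps to peek at what CDK actually named things inside LocalStack. Here is a rough sketch using boto3 (it assumes the default LocalStack endpoint and the same dummy credentials the test below uses):

import boto3

kwargs = dict(
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)

# List every bucket and table LocalStack knows about, so you can see the
# suffixed names CDK generated for this stack.
for bucket in boto3.client("s3", **kwargs).list_buckets()["Buckets"]:
    print("bucket:", bucket["Name"])
for table in boto3.client("dynamodb", **kwargs).list_tables()["TableNames"]:
    print("table:", table)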

Writing the Test Using Prefixes

When you deploy your stack with CDK, it politely appends random gibberish to your bucket and table names. This is to avoid collisions. So instead of getting a nice clean inventory-updates-bucket, you end up with something like:

Terminal window
sqsblogstack-inventoryupdatesbucketfe-7z8g3s9d

Charming, right?

If you hardcode that name in your test, it will break the next time you destroy and redeploy the stack. That’s why we use prefix-based searching.

Let’s create a file named test_integration.py in the project root. It will search all available buckets and tables and pick the ones that start with the known CDK prefixes.

Here’s the code:

import os
import time

import boto3

# Set dummy AWS credentials for LocalStack
os.environ['AWS_ACCESS_KEY_ID'] = 'test'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'test'
os.environ['AWS_DEFAULT_REGION'] = 'us-east-1'

dynamodb = boto3.client('dynamodb', endpoint_url="http://localhost:4566")
s3 = boto3.client('s3', endpoint_url="http://localhost:4566")


def test_s3_to_dynamodb_flow():
    # Look for the S3 bucket by prefix
    target_bucket_prefix = "sqsblogstack-inventoryupdatesbucketfe-"
    response = s3.list_buckets()
    bucket_name = next(
        (bucket['Name'] for bucket in response['Buckets'] if bucket['Name'].startswith(target_bucket_prefix)),
        None
    )
    assert bucket_name is not None, "Bucket not found"

    # Look for the DynamoDB table by prefix
    target_ddb_prefix = "SqsBlogStack-InventoryUpdates"
    response = dynamodb.list_tables()
    table_name = next(
        (table for table in response['TableNames'] if table.startswith(target_ddb_prefix)),
        None
    )
    assert table_name is not None, "DynamoDB table not found"

    # Upload the test CSV to S3
    test_file = "sqs_blog/sample_file.csv"
    s3.upload_file(test_file, bucket_name, test_file)
    print(f"Uploaded {test_file} to bucket {bucket_name}")

    # Wait for the pipeline to process
    time.sleep(5)

    # Scan DynamoDB for records
    response = dynamodb.scan(TableName=table_name)
    items = response.get("Items", [])
    assert len(items) > 0, "No items found in DynamoDB"
    print(f"Found {len(items)} items in DynamoDB")

Why Use Prefixes?

  • CDK creates resources with long names that include random suffixes to avoid collisions.
  • Prefix searching allows your tests to remain flexible, even if you destroy and redeploy the stack (a reusable version of that lookup is sketched after this list).
  • This is especially useful in CI/CD pipelines where every deployment can create slightly different resource names.
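
If you end up writing more than one test, you can pull the prefix lookups into pytest fixtures so every test gets the resolved names for free. This is just a sketch of that refactor, using the same assumed prefixes and LocalStack endpoint as test_integration.py, not something the stack requires:

import os

import boto3
import pytest

# Same dummy credentials and endpoint as in test_integration.py
os.environ.setdefault("AWS_ACCESS_KEY_ID", "test")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "test")
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

ENDPOINT = "http://localhost:4566"


@pytest.fixture(scope="session")
def bucket_name():
    # Resolve the bucket once per test session by its CDK prefix
    s3 = boto3.client("s3", endpoint_url=ENDPOINT)
    prefix = "sqsblogstack-inventoryupdatesbucketfe-"
    names = [b["Name"] for b in s3.list_buckets()["Buckets"] if b["Name"].startswith(prefix)]
    assert names, f"No bucket starting with {prefix}"
    return names[0]


@pytest.fixture(scope="session")
def table_name():
    # Same idea for the DynamoDB table
    ddb = boto3.client("dynamodb", endpoint_url=ENDPOINT)
    prefix = "SqsBlogStack-InventoryUpdates"
    names = [t for t in ddb.list_tables()["TableNames"] if t.startswith(prefix)]
    assert names, f"No table starting with {prefix}"
    return names[0]


def test_s3_to_dynamodb_flow(bucket_name, table_name):
    # The test body from above can now use the resolved names directly.
    ...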

Run the Test

Run it using pytest:

Terminal window
pytest -v test_integration.py

If everything works, you will see green and feel like a responsible adult.

A Few Tips

  • You might need to adjust the time.sleep() to give the pipeline enough time to finish; polling the table until records appear is more reliable (see the sketch after this list).
  • If you want to clean up the DynamoDB table between tests, add a helper that deletes all items at the start of each test (also sketched below).
  • If Python is not your favorite, the same approach works with any language that has an AWS SDK.
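
For the first two tips, here is a hedged sketch of what those helpers could look like. The helper names are just placeholders, the polling function stands in for the fixed sleep, and clear_table assumes your table has a simple partition key, so check the actual key name in your stack before using it.

import time

import boto3

dynamodb = boto3.client(
    "dynamodb",
    endpoint_url="http://localhost:4566",
    aws_access_key_id="test",
    aws_secret_access_key="test",
    region_name="us-east-1",
)


def wait_for_items(table_name, expected=1, timeout=30, interval=2):
    # Poll the table until at least `expected` items appear, instead of sleeping a fixed time.
    deadline = time.time() + timeout
    while time.time() < deadline:
        items = dynamodb.scan(TableName=table_name).get("Items", [])
        if len(items) >= expected:
            return items
        time.sleep(interval)
    raise AssertionError(f"Timed out waiting for {expected} item(s) in {table_name}")


def clear_table(table_name, key_name):
    # Delete every item so each test starts from an empty table.
    # `key_name` is the table's partition key; adjust it to match your schema.
    for item in dynamodb.scan(TableName=table_name).get("Items", []):
        dynamodb.delete_item(TableName=table_name, Key={key_name: item[key_name]})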

Why Not Just Unit Test?

Unit tests are great, but they do not tell you if your services actually work together. These integration tests make sure that S3 really triggers Lambda, that Lambda really sends to SQS, and that SQS really ends up in DynamoDB.

It’s like checking that your car actually moves, not just that the tires look round.

Quick Reference: All the Commands

Here’s a cheat sheet with all the commands you’ll use for this tutorial.

Terminal window
# Clone the app
git clone https://github.com/aws-samples/amazon-sqs-best-practices-cdk.git
cd amazon-sqs-best-practices-cdk
# Start LocalStack
localstack start
# Set up virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
# Synthesize template
cdklocal synth
# Bootstrap and deploy
cdklocal bootstrap
cdklocal deploy
# Install dependencies
pip install pytest boto3
# Run the tests
pytest -v test_integration.py

The Payoff

With these tests in place, you save yourself from shipping broken code and spending your weekend reading CloudWatch logs. You can make changes with confidence, and your coworker’s ā€œoptimizationsā€ get caught before they reach production.

Want to follow along in real time? Watch the video where we walk through everything step by step.

In the next post, we will hook this into a CI/CD pipeline so tests run automatically on every push.

Until then, happy testing.


Kiah Imani
DevRel at LocalStack
Kiah Imani is a Senior Dev Advocate at LocalStack, where she turns cloud chaos into clarity. She’s all about making AWS dev feel local, fun, and way less stressful.