Testing Your AWS Serverless App Locally Like a Boss (or at Least Like Someone Who Writes Tests)
Writing tests for serverless apps often feels like herding cats in the cloud. In this tutorial, learn how to run real integration tests on your AWS stack, including S3, Lambda, SQS, and DynamoDB, using LocalStack, pytest, and boto3.

You deployed your serverless app locally. It worked. You even saw data show up in DynamoDB and felt pretty good about yourself. But here is the question: will it still work tomorrow? Will it still work after your coworker "optimizes" the Lambda with some mysterious regex he copy-pasted from ChatGPT? This is why we write automated tests: so we can catch bugs early and avoid debugging production over the weekend.
Today we are going to write tests for the inventory app we deployed in my previous post, still running locally with LocalStack. We will use Python, pytest, and boto3 to make sure the pipeline from S3 to DynamoDB actually behaves.
Why Test Locally?
Some developers still "test in production" and call it a day. But testing locally gives you:
- Fast feedback since you are not waiting on AWS.
- A zero-dollar AWS bill, which your manager will love.
- Confidence that uploading a blank CSV won't break everything.
LocalStack lets you use real AWS services locally and write proper integration tests without mocking the universe.
What We're Going to Do
OK, so here is the plan:
- Deploy the app with LocalStack if you have not already.
- Write a test that uploads a sample CSV to S3.
- Wait for the pipeline to finish processing.
- Assert that DynamoDB contains the expected records.
Let's get started.
Set Up the Test Environment
If you already have the repo cloned and LocalStack running, skip this part. If not, run these commands to catch up:
# Clone the app
git clone https://github.com/aws-samples/amazon-sqs-best-practices-cdk.git
cd amazon-sqs-best-practices-cdk

# Start LocalStack
localstack start

# Set up virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# Bootstrap and deploy
cdklocal bootstrap
cdklocal deploy
Now install the tools you need for testing:
pip install pytest boto3
Writing the Test Using Prefixes
When you deploy your stack with CDK, it politely appends random gibberish to your bucket and table names. This is to avoid collisions.
So instead of getting a nice clean inventory-updates-bucket, you end up with something like:
sqsblogstack-inventoryupdatesbucketfe-7z8g3s9d
Charming, right?
If you hardcode that name in your test, it will break the next time you destroy and redeploy the stack. That's why we use prefix-based searching.
Let's create a file named test_integration.py in the project root. It will search all available buckets and tables and pick the ones that start with the known CDK prefixes.
Here's the code:
import os
import time

import boto3

# Set dummy AWS credentials for LocalStack
os.environ['AWS_ACCESS_KEY_ID'] = 'test'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'test'
os.environ['AWS_DEFAULT_REGION'] = 'us-east-1'

dynamodb = boto3.client('dynamodb', endpoint_url="http://localhost:4566")
s3 = boto3.client('s3', endpoint_url="http://localhost:4566")

def test_s3_to_dynamodb_flow():
    # Look for the S3 bucket by prefix
    target_bucket_prefix = "sqsblogstack-inventoryupdatesbucketfe-"
    response = s3.list_buckets()
    bucket_name = next(
        (bucket['Name'] for bucket in response['Buckets']
         if bucket['Name'].startswith(target_bucket_prefix)),
        None
    )
    assert bucket_name is not None, "Bucket not found"

    # Look for the DynamoDB table by prefix
    target_ddb_prefix = "SqsBlogStack-InventoryUpdates"
    response = dynamodb.list_tables()
    table_name = next(
        (table for table in response['TableNames']
         if table.startswith(target_ddb_prefix)),
        None
    )
    assert table_name is not None, "DynamoDB table not found"

    # Upload the test CSV to S3
    test_file = "sqs_blog/sample_file.csv"
    s3.upload_file(test_file, bucket_name, test_file)
    print(f"Uploaded {test_file} to bucket {bucket_name}")

    # Wait for the pipeline to process
    time.sleep(5)

    # Scan DynamoDB for records
    response = dynamodb.scan(TableName=table_name)
    items = response.get("Items", [])
    assert len(items) > 0, "No items found in DynamoDB"
    print(f"Found {len(items)} items in DynamoDB")
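One thing to know: the low-level boto3 client returns scan results in DynamoDB's attribute-value format, where every value is wrapped in a type marker like {'S': 'abc'}. If you want to assert on actual field values instead of just counting records, boto3 ships a TypeDeserializer that unwraps items into plain Python dicts. Here is a minimal sketch; the product_id attribute and its value are made up for illustration, so swap in whatever your table actually stores:

from boto3.dynamodb.types import TypeDeserializer

deserializer = TypeDeserializer()

def to_plain_dict(item):
    # Unwrap {'name': {'S': 'abc'}} into {'name': 'abc'}
    return {key: deserializer.deserialize(value) for key, value in item.items()}

# Inside the test, after the scan:
# records = [to_plain_dict(item) for item in items]
# assert any(r.get('product_id') == 'SKU-123' for r in records)  # hypothetical field and value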
Why Use Prefixes?
- CDK creates resources with long names that include random suffixes to avoid collisions.
- Prefix searching allows your tests to remain flexible, even if you destroy and redeploy the stack.
- This is especially useful in CI/CD pipelines where every deployment can create slightly different resource names.
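If you end up writing several tests against this stack, the prefix lookup is worth pulling into a tiny helper so you do not repeat the next(...) dance everywhere. A minimal sketch, assuming the same s3 and dynamodb clients from test_integration.py (the helper name is mine, not part of the sample repo):

def find_by_prefix(names, prefix):
    # Return the first name starting with the prefix, or None if nothing matches
    return next((name for name in names if name.startswith(prefix)), None)

bucket_name = find_by_prefix(
    [b['Name'] for b in s3.list_buckets()['Buckets']],
    "sqsblogstack-inventoryupdatesbucketfe-",
)
table_name = find_by_prefix(
    dynamodb.list_tables()['TableNames'],
    "SqsBlogStack-InventoryUpdates",
)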
Run the Test
Run it using pytest:
pytest -v test_integration.py
If everything works, you will see green and feel like a responsible adult.
A Few Tips
- You might need to adjust the time.sleep() to give the pipeline enough time to finish, or replace it with polling entirely; see the sketch after this list.
- If you want to clean up the DynamoDB table between tests, you can add a helper to delete all items at the start of each test; a minimal version is also sketched below.
- If Python is not your favorite, you can use the same concept with any language you like.
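On the first tip: a fixed sleep is the classic source of flaky tests, so instead of tuning the number you can poll DynamoDB until items appear or a deadline passes. A minimal sketch, reusing the dynamodb client from the test file; the timeout and interval are arbitrary defaults:

import time

def wait_for_items(table_name, timeout=30, interval=2):
    # Poll the table until at least one item shows up, or give up at the deadline
    deadline = time.time() + timeout
    while time.time() < deadline:
        items = dynamodb.scan(TableName=table_name).get("Items", [])
        if items:
            return items
        time.sleep(interval)
    raise AssertionError(f"No items in {table_name} after {timeout}s")

The test then becomes items = wait_for_items(table_name), which passes as soon as the pipeline finishes instead of always waiting the full five seconds.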
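On the second tip, here is one way to write that cleanup helper. It looks up the table's key schema at runtime, so it keeps working when CDK renames things, and deletes items one by one. Note that it ignores scan pagination, which is fine for small test tables:

def clear_table(table_name):
    # Discover the key attributes so we can delete each item by its key
    key_schema = dynamodb.describe_table(TableName=table_name)['Table']['KeySchema']
    key_names = [k['AttributeName'] for k in key_schema]
    for item in dynamodb.scan(TableName=table_name).get('Items', []):
        key = {name: item[name] for name in key_names}
        dynamodb.delete_item(TableName=table_name, Key=key)

Call it at the start of each test so leftovers from a previous run cannot turn a broken pipeline into a passing test.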
Why Not Just Unit Test?
Unit tests are great, but they do not tell you if your services actually work together. These integration tests make sure that S3 really triggers Lambda, that Lambda really sends to SQS, and that SQS really ends up in DynamoDB.
It's like checking that your car actually moves, not just that the tires look round.
Quick Reference: All the Commands
Here's a cheat sheet with all the commands you'll use for this tutorial.
# Clone the app
git clone https://github.com/aws-samples/amazon-sqs-best-practices-cdk.git
cd amazon-sqs-best-practices-cdk

# Start LocalStack
localstack start

# Set up virtual environment
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# Synthesize the template (optional sanity check)
cdklocal synth

# Bootstrap and deploy
cdklocal bootstrap
cdklocal deploy

# Install test dependencies
pip install pytest boto3

# Run the tests
pytest -v test_integration.py
The Payoff
With these tests in place, you save yourself from shipping broken code and spending your weekend reading CloudWatch logs. You can make changes with confidence, and your coworker's "optimizations" are more likely to fail gracefully.
Want to follow along in real time? Watch the video where we walk through everything step by step.
In the next post, we will hook this into a CI/CD pipeline so tests run automatically on every push.
Until then, happy testing.