Efficient infrastructure testing with LocalStack & the Terraform test framework
The Terraform test framework integrates with LocalStack to test your AWS cloud infrastructure locally. We'll explore how to use it to test a serverless workflow and enable rapid, cost-effective validation of your Terraform configurations right on your local machine.
Introduction
With the introduction of Terraform 1.6, the Terraform test framework became generally available. However, using Terraform tests to create real cloud infrastructure presents challenges, such as long deployment times and unnecessary costs, which lead to slower development and testing cycles. LocalStack addresses these issues by allowing integration testing of cloud solutions and configurations locally and in CI/CD environments. With LocalStack's Terraform integration (`tflocal`), you can now use the testing framework to test your IaC configurations locally without creating actual cloud resources.
In this blog, we will walk through setting up an event-driven serverless workflow on your local machine using Terraform and LocalStack, and configuring Terraform tests to run locally. This method eliminates the need for actual AWS services, thereby avoiding the costs of managing resources in AWS. It also sets up a rapid feedback loop for accelerated cloud development and testing with Terraform & LocalStack.
How LocalStack works with Terraform
LocalStack runs as a Docker container on your local machine or in an automated environment. Once running, you can use LocalStack with Terraform to create and manage AWS resources locally. For local deployment and testing of Terraform configurations, LocalStack provides a CLI wrapper called `tflocal`. `tflocal` uses the Terraform override mechanism and creates a temporary file, `localstack_providers_override.tf`, which sets the AWS provider endpoints to point to the LocalStack API (`http://localhost:4566`).
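An abridged look at what that override file sets (the generated file lists many more service endpoints):

```hcl
provider "aws" {
  access_key                  = "test"
  secret_key                  = "test"
  skip_credentials_validation = true
  skip_metadata_api_check     = true

  endpoints {
    s3     = "http://localhost:4566"
    lambda = "http://localhost:4566"
    # ...one entry per supported AWS service
  }
}
```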
To set up `tflocal`, install the `terraform-local` PyPI package:
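```bash
pip install terraform-local
```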
Since `tflocal` acts as a wrapper around the `terraform` CLI, you can use all the Terraform CLI commands you are used to, including `terraform test`. Instead of deploying and testing resources in the real cloud, resources are deployed locally, and the tests verify their correctness and availability.
Image Resizer with Lambda & S3
In this tutorial, we'll set up a serverless workflow that resizes images uploaded to an S3 bucket. For simplicity, we'll configure S3 bucket notifications to trigger a Python Lambda that runs an image-resizing operation using Pillow and uploads the resized image to another S3 bucket. The infrastructure will be defined with Terraform, and we'll use LocalStack to deploy and test it locally.
Prerequisites
- `localstack` CLI with a LocalStack Auth Token
- Terraform v1.6.0 or later with the `tflocal` wrapper script
- Docker
- A LocalStack Web Application account
Set up the Lambda
To start, create a new file named `lambda_function.py`. This Lambda function automatically resizes images uploaded to an S3 bucket named `original-images`, ensuring they don't exceed 400x400 pixels while maintaining the aspect ratio. The resized images are then saved to a separate `resized-images` bucket. Add the following code to the file:
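The exact implementation may vary; here is a minimal sketch consistent with the description above (the handler name `lambda_handler` is an assumption and must match the Terraform configuration later):

```python
import io

import boto3
from PIL import Image

s3 = boto3.client("s3")

TARGET_BUCKET = "resized-images"
MAX_SIZE = (400, 400)


def lambda_handler(event, context):
    # Each record corresponds to one object uploaded to original-images
    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Download the original image into memory
        response = s3.get_object(Bucket=source_bucket, Key=key)
        image = Image.open(io.BytesIO(response["Body"].read()))

        # Shrink in place to fit within 400x400 while keeping the aspect ratio
        image.thumbnail(MAX_SIZE)

        # Serialize the resized image and upload it to the target bucket
        buffer = io.BytesIO()
        image.save(buffer, format=image.format or "PNG")
        buffer.seek(0)
        s3.put_object(Bucket=TARGET_BUCKET, Key=key, Body=buffer)

    return {"statusCode": 200}
```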
To deploy the Lambda, we'll use a ZIP archive. Create a text file named `requirements.txt` and add `Pillow` as a dependency. Now, run the following commands to package your Lambda function:
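A sketch of the packaging steps (the build image and the `package/` folder name are illustrative; any Lambda-compatible Python build image works):

```bash
# Install Pillow into a package/ folder using a Lambda-compatible build image
docker run --rm -v "$PWD":/var/task -w /var/task --entrypoint /bin/sh \
  public.ecr.aws/sam/build-python3.11 \
  -c "pip install -r requirements.txt -t package/"

# Zip the dependencies, then add the function code on top
cd package && zip -r ../lambda.zip . && cd ..
zip lambda.zip lambda_function.py
```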
These commands use Docker to install the required Python packages in a Lambda-compatible environment, then create a ZIP file containing both the dependencies and your function code. The final ZIP file, `lambda.zip`, is ready to use when creating the Lambda function.
Set up the Terraform configuration
The next step involves creating a Terraform configuration that accomplishes the following:
- Creates two S3 buckets named `original-images` and `resized-images`.
- Creates a Lambda function named `ImageResizerFunction` to resize images.
- Sets up the S3 bucket notification configuration to trigger the Lambda function when images are uploaded to the `original-images` bucket.

Create a new file named `main.tf` and add the following Terraform configuration:
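A sketch along these lines (the runtime, timeout, and placeholder role ARN are illustrative; the resource names are reused by the tests later):

```hcl
resource "aws_s3_bucket" "original_images" {
  bucket = "original-images"
}

resource "aws_s3_bucket" "resized_images" {
  bucket = "resized-images"
}

resource "aws_lambda_function" "image_resizer" {
  function_name = "ImageResizerFunction"
  filename      = "lambda.zip"
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.11"
  timeout       = 60

  # LocalStack does not enforce IAM, so a placeholder role ARN suffices here
  role = "arn:aws:iam::000000000000:role/lambda-role"
}

resource "aws_s3_bucket_notification" "image_upload" {
  bucket = aws_s3_bucket.original_images.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.image_resizer.arn
    events              = ["s3:ObjectCreated:*"]
  }
}
```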
Note that this Terraform configuration only sets up S3 buckets, a Lambda function, and bucket notifications, without any IAM roles or permissions. LocalStack doesn't enforce IAM strictly, as it is a permit-all system by default. However, you should configure IAM roles and permissions before moving to production.
You're now ready to test your infrastructure deployment with `tflocal`!
Deploy the local infrastructure
Before starting a local deployment with `tflocal`, first start your LocalStack container using your `LOCALSTACK_AUTH_TOKEN`:
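```bash
export LOCALSTACK_AUTH_TOKEN=<your-auth-token>
localstack start
```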
Once the LocalStack container is running, initialize your Terraform configuration with this command:
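```bash
tflocal init
```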
Finally, deploy your Terraform configuration using:
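```bash
tflocal apply
```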
You will be prompted to confirm the resource actions. After confirmation, your entire infrastructure will be deployed locally. The output will look like this:
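An abridged example (exact timings and resource counts depend on your configuration):

```text
aws_s3_bucket.original_images: Creating...
aws_s3_bucket.resized_images: Creating...
aws_lambda_function.image_resizer: Creating...
aws_s3_bucket_notification.image_upload: Creating...
...

Apply complete! Resources: 4 added, 0 changed, 0 destroyed.
```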
If you have an account on the LocalStack Web Application, you can check the Status Page and the Resource Browser to verify that your resources were successfully created with Terraform.
Asserting the resource provisioning
You can now begin writing Terraform tests using HashiCorp Configuration Language (HCL). Terraform identifies test files by their extension: `.tftest.hcl` or `.tftest.json`.
A test file generally includes the following components:
- An optional `provider` block to customize the provider configuration.
- A `variables` block that contains the input variables passed into the module.
- `run` blocks that execute specific test scenarios, run in sequence.
Tests in Terraform serve two main purposes:
- Unit testing focuses on individual components to ensure each part functions correctly.
- Integration testing ensures that the deployed infrastructure operates as expected as a whole.
For unit testing, the framework typically uses `terraform plan` within its `run` blocks. This approach speeds up testing by avoiding the actual provisioning of infrastructure. Assertions are then used to confirm that the configuration produces the expected values.
In contrast, integration tests use `terraform apply` to deploy the infrastructure and then check its functionality, often using data sources to validate expected responses from the deployed resources. Instead of specifying individual commands directly, the `command` attribute is used within the `run` block to indicate whether to execute `plan` or `apply` (the default is `apply`).
To get started, create a new directory named `tests` and, within it, a file called `assert.tftest.hcl`. Here's how to add a test to check the created S3 buckets:
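A sketch of such a test, assuming the resource names from the configuration above:

```hcl
run "verify_s3_buckets" {
  command = plan

  assert {
    condition     = aws_s3_bucket.original_images.bucket == "original-images"
    error_message = "Original images bucket has the wrong name"
  }

  assert {
    condition     = aws_s3_bucket.resized_images.bucket == "resized-images"
    error_message = "Resized images bucket has the wrong name"
  }
}
```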
In this `run` block:
- The label `verify_s3_buckets` names the test.
- The `command` is set to `plan`, which executes the `terraform plan` command.
- The `assert` block contains a `condition` argument whose expression should evaluate to `true` if the test passes and `false` if it fails.
This test ensures that the S3 buckets were created with the names specified for them. Note that you can include multiple `run` blocks in your test file, and each `run` block can contain multiple `assert` blocks. Terraform executes `run` blocks sequentially within the configuration directory.
Let’s run these tests using the Terraform testing framework. Start by restarting your LocalStack container for a fresh state:
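```bash
# Stop the running container and start a fresh one
localstack stop
localstack start
```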
Now, execute the following command to run your tests:
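```bash
tflocal test
```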
The output should resemble this:
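```text
tests/assert.tftest.hcl... in progress
  run "verify_s3_buckets"... pass
tests/assert.tftest.hcl... tearing down
tests/assert.tftest.hcl... pass

Success! 1 passed, 0 failed.
```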
Next, to verify that the Lambda function was created correctly, add a new `run` block to `assert.tftest.hcl`:
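For example, again assuming the resource names used earlier (note the ARN check):

```hcl
run "verify_lambda_function" {
  command = plan

  assert {
    condition     = aws_lambda_function.image_resizer.function_name == "ImageResizerFunction"
    error_message = "Lambda function has the wrong name"
  }

  assert {
    condition     = aws_lambda_function.image_resizer.arn != ""
    error_message = "Lambda function ARN should not be empty"
  }
}
```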
When you run the tests again, you might encounter an error:
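The message will resemble this (abridged):

```text
│ Error: Unknown condition value
│
│ Condition expression could not be evaluated at this time. This means you
│ have executed a `run` block with `command = plan` and one of the values
│ your condition depended on is not known until after the plan has been
│ applied.
```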
This error indicates that instead of `command = plan`, you should use `command = apply`, because the Lambda function ARN can only be retrieved after the Terraform configuration is applied. The `plan` command only simulates changes without creating resources, so these runtime values remain unknown. Make this change and re-run the tests to confirm functionality:
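```text
tests/assert.tftest.hcl... in progress
  run "verify_s3_buckets"... pass
  run "verify_lambda_function"... pass
tests/assert.tftest.hcl... tearing down
tests/assert.tftest.hcl... pass

Success! 2 passed, 0 failed.
```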
As you can see, Terraform processes `run` blocks in the order they appear in the test file, executing them sequentially. Each `run` block can depend on the state changes made by the previous ones.
You can similarly test other resources, such as verifying that the bucket notification is correctly configured with the Lambda function or checking the S3 bucket ARNs.
Integration Testing with Modules
With Terraform tests, you can use modules to design and test workflows:
- Set up infrastructure with a setup module
- Validate secondary infrastructure with a loading module
For example, a setup module deploys the core infrastructure (S3 buckets, Lambda function), and a loading module uploads an image to the `original-images` S3 bucket and verifies the resized image.
To start, create `execute` and `verify` directories inside `tests`, each with a `main.tf` file.
In `tests/execute/main.tf`, add the following configuration:
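A sketch of this module, assuming input variables for the bucket name, local image path, and object key (these names must match the variables defined in the test file below):

```hcl
variable "bucket_name" {
  type = string
}

variable "image_path" {
  type = string
}

variable "image_key" {
  type = string
}

provider "aws" {
  access_key = "test"
  secret_key = "test"
  region     = "us-east-1"

  # Skip the checks that would otherwise reach out to real AWS
  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    s3 = "http://localhost:4566"
  }
}

# Uploading the image triggers the bucket notification and the Lambda
resource "aws_s3_bucket_object" "image" {
  bucket = var.bucket_name
  key    = var.image_key
  source = var.image_path
}
```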
In this file, the `provider` block specifies mock AWS credentials, routes requests to LocalStack, and includes flags to bypass account and credential checks. The S3 bucket object resource uploads an image to the S3 bucket, triggering the Lambda function.
As a note, you can set or override providers using `provider` and `providers` blocks in Terraform test files. Without these, Terraform defaults to initializing providers with their default configuration.
In `tests/verify/main.tf`, add:
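A sketch, assuming the same provider settings plus the `time` provider:

```hcl
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
    time = {
      source = "hashicorp/time"
    }
  }
}

variable "resized_bucket_name" {
  type = string
}

variable "image_key" {
  type = string
}

provider "aws" {
  access_key = "test"
  secret_key = "test"
  region     = "us-east-1"

  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    s3 = "http://localhost:4566"
  }
}

# Give the Lambda time to process and upload the resized image
resource "time_sleep" "wait_10_seconds" {
  create_duration = "10s"
}

# Fetching the object fails if the resized image does not exist
data "aws_s3_bucket_object" "resized_image" {
  bucket     = var.resized_bucket_name
  key        = var.image_key
  depends_on = [time_sleep.wait_10_seconds]
}
```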
This file includes the AWS provider as well as the `time` provider, which adds a 10-second delay before the `aws_s3_bucket_object` data source retrieves the resized image from `resized_bucket_name`. This ensures the Lambda function has time to process and upload the resized image.
Now, in the `tests` directory, create `integration.tftest.hcl` to specify values for the input variables:
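For example (the variable names are illustrative and must match the module inputs above):

```hcl
variables {
  original_bucket_name = "original-images"
  resized_bucket_name  = "resized-images"
  image_path           = "./image.png"
  image_key            = "image.png"
}
```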
Ensure that a PNG image named `image.png` is present in your root directory, where the tests will be executed; you can download one from our GitHub repository. These variables will be passed to the modules defined in this section, specifying the original bucket, resized bucket, image path, and image key for retrieval.
Next, add a `module` block inside a `run` block, specifying the `source` attribute to point to the desired module path. This `source` can be a path to a local module or a registry module reference; only these two options are supported. Here's how you might structure this:
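A sketch of the three `run` blocks (the first applies the root configuration; the other two point at the modules created above):

```hcl
run "setup_infrastructure" {
  # Applies the root main.tf: buckets, Lambda, and bucket notification
}

run "upload_image" {
  module {
    source = "./tests/execute"
  }

  variables {
    bucket_name = var.original_bucket_name
    image_path  = var.image_path
    image_key   = var.image_key
  }
}

run "verify_resized_image" {
  module {
    source = "./tests/verify"
  }

  variables {
    resized_bucket_name = var.resized_bucket_name
    image_key           = var.image_key
  }

  assert {
    condition     = data.aws_s3_bucket_object.resized_image.content_length > 0
    error_message = "Resized image was not found in the resized-images bucket"
  }
}
```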
This configuration:
- Deploys the primary infrastructure using the `main.tf` file in the root directory.
- Uploads `image.png` to the `original-images` S3 bucket.
- Waits 10 seconds, then verifies the resized image exists in the `resized-images` bucket.
Run the tests using `tflocal test`; the output should resemble the following:
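```text
tests/integration.tftest.hcl... in progress
  run "setup_infrastructure"... pass
  run "upload_image"... pass
  run "verify_resized_image"... pass
tests/integration.tftest.hcl... tearing down
tests/integration.tftest.hcl... pass

Success! 3 passed, 0 failed.
```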
The `module` block in the `run` block specifies which module to execute. The inputs and assertions in each `run` block configure the module and verify the expected results.
As you might have noticed, after each `run` block completes, Terraform automatically attempts to destroy all resources created during the test, in the reverse order of their creation as recorded in the state file. With LocalStack, this cleanup process is even simpler: you can stop or restart your container to get a fresh state, ensuring there are no lingering cloud resources that could incur additional costs.
Conclusion
That's the long and the short of how you can use Terraform tests with LocalStack. If you already have test files set up to validate your Terraform deployments, you can get started by swapping the `terraform` command for LocalStack's `tflocal`. This lets you validate Terraform deployments locally, giving you confidence in your configuration by closely emulating real cloud behavior with LocalStack's cloud emulator.
Terraform 1.7 has also introduced test mocking to simulate providers, resources, and data sources, generating fake data for tests without creating infrastructure or requiring credentials. In contrast, LocalStack provides a full replication of real-world behavior as shown in the example above. With LocalStack’s focus on parity with AWS, you can avoid building mock data to simulate specific behaviors, and rely on our high-fidelity, fully local cloud developer experience.
You can find the complete example and a sample GitHub Actions workflow pattern in our repository.