Integration testing for Pulumi programs with LocalStack & Automation API
Learn how to use LocalStack and Pulumi's Automation API to run integration tests for your cloud infrastructure on your local machine.

Testing an AppSync API to DynamoDB workflow
Testing your cloud infrastructure should not feel like summoning a storm deity—you press deploy, cross your fingers, and hope your AWS bill doesn’t strike you down. 😭 ⚡ 🌩️
Pulumi makes Infrastructure as Code (IaC) testing easier by offering different approaches, from fast unit tests (mocking resources) to full integration tests that spin up real infrastructure.
One of Pulumi’s superpowers is the Automation API, which lets you deploy and manage stacks programmatically—no CLI wizardry needed! 🧙
But here’s the problem: running tests in the actual cloud can be painfully slow and expensive. Spinning up resources? Costly. Tearing them down? Time-consuming. Debugging live environments? An exercise in masochism.
That’s why we think LocalStack is cool: a local AWS emulator that lets Pulumi interact with a cloud-like environment without actually deploying to AWS.
In this guide, we’ll show you how to:
- Set up Pulumi integration testing using LocalStack + Automation API
- Deploy an AppSync API with DynamoDB entirely locally
- Automate test execution to verify everything works
Let’s dive right in, shall we? 🌊
Table of contents
- How LocalStack works with Pulumi
- Prerequisites
- Step 1: Spin up a Pulumi project
- Step 2: Structure the Pulumi project
- Step 3: Deploy Pulumi stack
- Step 4: Set up Pulumi integration tests
- Step 5: Run Pulumi integration tests
- Summary
- What to try next
How LocalStack works with Pulumi
LocalStack runs AWS services locally, so Pulumi treats it like the real cloud—minus the cloud. When Pulumi deploys S3 buckets, Lambda functions, or ECS tasks, they go to LocalStack instead of AWS.
Two Ways to Connect Pulumi to LocalStack
Option 1: Use pulumilocal
You can use `pulumilocal`, a wrapper around the Pulumi CLI that auto-configures endpoints for LocalStack.

```shell
pip install pulumi-local
pulumilocal version
```
Option 2: Configure Pulumi Manually
You can configure AWS service endpoints in `Pulumi.localstack.yaml` so that Pulumi communicates with LocalStack instead of AWS.

```yaml
config:
  aws:accessKey: test
  aws:endpoints:
    - s3: http://localhost:4566
    - dynamodb: http://localhost:4566
  aws:region: us-east-1
  aws:secretKey: test
  aws:skipCredentialsValidation: "true"
```
Prerequisites
Step 1: Spin up a Pulumi project
First, create a new Pulumi project with the `aws-python` template. This template includes all the boilerplate code you need.
1.1: Set up project
```shell
mkdir -p appsync-dynamodb
export PULUMI_CONFIG_PASSPHRASE=lsdevtest
export PULUMI_ACCESS_TOKEN=lsdevtest
export PULUMI_BACKEND_URL=file://$(pwd)/appsync-dynamodb
```
Next, initialize Pulumi:
```shell
pulumi new aws-python --cwd appsync-dynamodb -y -s localstack
```
This creates your project inside `appsync-dynamodb`, using `localstack` as the stack name.
1.2 Peek inside new project
```shell
❯ cd appsync-dynamodb
❯ tree -L 1
.
├── Pulumi.localstack.yaml
├── Pulumi.yaml
├── __main__.py
├── requirements.txt
└── venv
```
Now, create `resource_appsync.py`, because it’s time to define some AWS resources with Pulumi.
Step 2: Structure the Pulumi project
We’ll split our logic across two files:
- `__main__.py` → the entry point; imports and calls the API setup.
- `resource_appsync.py` → defines the AppSync API and DynamoDB integration.
2.1: Define the imports
Open `resource_appsync.py` and import the necessary libraries.
```python
import json
import random
import string

import pulumi
from pulumi import Output
from pulumi_aws import appsync, dynamodb, iam
```
2.2: Define the constants
These will be used to export the API details after deployment.
```python
OUTPUT_KEY_ENDPOINT = "endpoint"
OUTPUT_KEY_API_KEY = "api_key"
OUTPUT_KEY_API_ID = "api_id"
```
2.3: Create the function
This function will contain all the logic for creating the AppSync API and DynamoDB table. All the code below will go inside this function.
```python
def create_appsync_api():
    ...
```
2.4: Define the DynamoDB table (╯°□°)╯︵ ┻━┻
This will store our tenant data.
```python
table = dynamodb.Table(
    "tenants",
    hash_key="id",
    attributes=[dynamodb.TableAttributeArgs(name="id", type="S")],
    read_capacity=1,
    write_capacity=1,
)
```
2.5: Set up IAM Permissions
- AppSync needs an IAM role to interact with DynamoDB.
- We give it only the permissions it needs.
```python
# Create an IAM role that AppSync can assume
role = iam.Role(
    "iam-role",
    assume_role_policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Action": "sts:AssumeRole",
            "Principal": {"Service": "appsync.amazonaws.com"},
            "Effect": "Allow",
        }],
    }),
)

# Define a policy that allows specific DynamoDB actions
policy = iam.Policy(
    "iam-policy",
    policy=Output.json_dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Action": ["dynamodb:PutItem", "dynamodb:GetItem"],
            "Effect": "Allow",
            "Resource": [table.arn],
        }],
    }),
)

# Attach the policy to the role
iam.RolePolicyAttachment("iam-policy-attachment", role=role.name, policy_arn=policy.arn)
```
👆 Boom! AppSync can now read from and write to DynamoDB.
2.6: Define the GraphQL schema
This schema defines a simple Tenant type with ID and name fields, along with queries and mutations for retrieving and adding tenants.
```python
schema = """
type Query {
    getTenantById(id: ID!): Tenant
}

type Mutation {
    addTenant(id: ID!, name: String!): Tenant!
}

type Tenant {
    id: ID!
    name: String
}

schema {
    query: Query
    mutation: Mutation
}
"""
```
2.7: Create the AppSync API
This sets up an AppSync API with API Key authentication.
```python
api = appsync.GraphQLApi("key", authentication_type="API_KEY", schema=schema)
api_key = appsync.ApiKey("key", api_id=api.id)
```
2.8: Connect the API to DynamoDB
This creates a data source that connects the AppSync API to the DynamoDB table.
```python
random_datasource_name = "".join(random.choice(string.ascii_letters) for _ in range(15))

data_source = appsync.DataSource(
    "tenants-ds",
    name=random_datasource_name,
    api_id=api.id,
    type="AMAZON_DYNAMODB",
    dynamodb_config=appsync.DataSourceDynamodbConfigArgs(table_name=table.name),
    service_role_arn=role.arn,
)
```
🤔 Why the random name? AWS hates duplicate names, so we generate one on the fly.
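If you prefer something more readable in the console, the one-liner can be tweaked to keep a fixed prefix and randomize only a suffix. This is a hypothetical variant (the `make_datasource_name` helper and its `tenantsds` prefix are ours, not part of the project code); note that AppSync data source names must start with a letter and contain only letters, digits, and underscores, which is why we stick to ASCII letters:

```python
import random
import string

def make_datasource_name(prefix: str = "tenantsds", suffix_len: int = 6) -> str:
    """Hypothetical helper: readable prefix plus a random, letters-only suffix."""
    suffix = "".join(random.choice(string.ascii_letters) for _ in range(suffix_len))
    return prefix + suffix

name = make_datasource_name()
print(name)  # e.g. "tenantsdsQkTwbc"
```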
2.9: Define GraphQL resolvers
Resolvers connect GraphQL queries to DynamoDB operations.
```python
appsync.Resolver(
    "get-resolver",
    api_id=api.id,
    data_source=data_source.name,
    type="Query",
    field="getTenantById",
    request_template="""{
        "version": "2017-02-28",
        "operation": "GetItem",
        "key": {"id": $util.dynamodb.toDynamoDBJson($ctx.args.id)}
    }""",
    response_template="$util.toJson($ctx.result)",
)

appsync.Resolver(
    "add-resolver",
    api_id=api.id,
    data_source=data_source.name,
    type="Mutation",
    field="addTenant",
    request_template="""{
        "version": "2017-02-28",
        "operation": "PutItem",
        "key": {"id": $util.dynamodb.toDynamoDBJson($ctx.args.id)},
        "attributeValues": {"name": $util.dynamodb.toDynamoDBJson($ctx.args.name)}
    }""",
    response_template="$util.toJson($ctx.result)",
)
```
The resolvers above map the GraphQL operations to DynamoDB actions, such as `GetItem` and `PutItem`.
2.10: Export API Details
Finally, we export the API endpoint, API key, and API ID so we can use them later.
```python
pulumi.export(OUTPUT_KEY_ENDPOINT, api.uris["GRAPHQL"])
pulumi.export(OUTPUT_KEY_API_KEY, api_key.key)
pulumi.export(OUTPUT_KEY_API_ID, api.id)

return api, api_key, api.uris["GRAPHQL"]
```
2.11: Update the entry point
Now, let’s update `__main__.py` to remove the boilerplate code and call our function:
```python
from resource_appsync import create_appsync_api

api, api_key, endpoint = create_appsync_api()
```
Step 3: Deploy Pulumi stack
Time to bring this thing to life!
We’ll deploy our Pulumi stack, which spins up all our AWS resources inside LocalStack.
3.1: Start LocalStack
Before deploying, make sure LocalStack is up and running:
```shell
localstack auth set-token <YOUR_LOCALSTACK_AUTH_TOKEN>
localstack start
```
3.2: Install pulumilocal
Since we’re using LocalStack, we’ll install `pulumilocal` to deploy our stack:

```shell
pip install pulumi-local
```
3.3: Select the Pulumi stack
Pulumi organizes deployments into stacks (think: `dev`, `staging`, `production`). We’ll select the `localstack` stack:
```shell
pulumilocal stack select -c localstack
```
This tells Pulumi: “Use the stack we created earlier.”
3.4: Deploy the stack
Now, let’s deploy all the things: ️🌈
```shell
pulumilocal up
```
Pulumi will:
- Ask if you want to overwrite the config file (say yes).
- Preview what it will create (AppSync API, DynamoDB table, IAM roles, resolvers, etc.).
- Deploy everything in the right order.
Example preview:
```
Previewing update (localstack):

     Type                                 Name                         Plan
 +   pulumi:pulumi:Stack                  appsync-dynamodb-localstack  create
 +   ├─ aws:iam:Role                      iam-role                     create
 +   ├─ aws:dynamodb:Table                tenants                      create
 +   ├─ aws:appsync:GraphQLApi            key                          create
 +   ├─ aws:appsync:ApiKey                key                          create
 +   ├─ aws:appsync:DataSource            tenants-ds                   create
 +   ├─ aws:iam:RolePolicyAttachment      iam-policy-attachment        create
 +   ├─ aws:iam:Policy                    iam-policy                   create
 +   ├─ aws:appsync:Resolver              get-resolver                 create
 +   └─ aws:appsync:Resolver              add-resolver                 create

Outputs:
    api_id  : output<string>
    api_key : output<string>
    endpoint: output<string>

Resources:
    + 10 to create
```
After confirming, Pulumi deploys everything in seconds:
```
Outputs:
    api_id  : "ca334cb8d3eb45eb8e25fc0e41"
    api_key : [secret]
    endpoint: "http://localhost.localstack.cloud:4566/graphql/ca334cb8d3eb45eb8e25fc0e41"

Resources:
    + 10 created

Duration: 9s
```
Now you’ve got an AppSync API running locally!
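As a quick sanity check, notice that the API ID also appears at the end of the GraphQL endpoint URL. Here is a small standard-library sketch that extracts it; the `api_id_from_endpoint` helper is ours (not part of the project code) and assumes the LocalStack URL shape shown above:

```python
from urllib.parse import urlparse

def api_id_from_endpoint(endpoint: str) -> str:
    """Extract the trailing API ID from a .../graphql/<api_id> endpoint URL."""
    path = urlparse(endpoint).path  # e.g. "/graphql/ca334cb8d3eb45eb8e25fc0e41"
    return path.rstrip("/").split("/")[-1]

endpoint = "http://localhost.localstack.cloud:4566/graphql/ca334cb8d3eb45eb8e25fc0e41"
print(api_id_from_endpoint(endpoint))  # → ca334cb8d3eb45eb8e25fc0e41
```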
Step 4: Set up Pulumi integration tests
Now that our infrastructure is in place, let’s test it. 🧪
We’ll use Python’s unittest framework and Pulumi’s Automation API to:
- Deploy the stack programmatically.
- Verify that the AppSync API was created.
- Test adding a tenant via GraphQL.
- Confirm that the data is stored in DynamoDB.
- Destroy all resources after testing.
4.1: Create the test file
Create test_appsync.py
and import the necessary libraries:
```python
import os
import unittest
import json
import requests
import boto3
import time
from pulumi import automation as auto

from resource_appsync import OUTPUT_KEY_ENDPOINT, OUTPUT_KEY_API_KEY, OUTPUT_KEY_API_ID
```
4.2: Set Up & Tear Down
We’ll use `setUpClass` and `tearDownClass` to handle the deployment lifecycle:
```python
class TestAppSync(unittest.TestCase):
    @classmethod
    def setUpClass(cls) -> None:
        cls.STACK_NAME = "localstack"
        cls.REGION_NAME = "us-east-1"
        cls.WORK_DIR = os.path.join(os.path.dirname(__file__))
        cls.TENANT_ID = "123"
        cls.TENANT_NAME = "FirstCorp"

        # Configure LocalStack clients
        cls.endpoint_url = "http://localhost:4566"
        cls.appsync_client = boto3.client(
            "appsync", region_name=cls.REGION_NAME, endpoint_url=cls.endpoint_url
        )
        cls.dynamodb_client = boto3.client(
            "dynamodb", region_name=cls.REGION_NAME, endpoint_url=cls.endpoint_url
        )

        # Deploy the stack
        cls.stack = auto.create_or_select_stack(stack_name=cls.STACK_NAME, work_dir=cls.WORK_DIR)
        cls.stack.up(on_output=print)
        cls.outputs = cls.stack.outputs()

    @classmethod
    def tearDownClass(cls) -> None:
        cls.stack.destroy(on_output=print)
        cls.stack.workspace.remove_stack(cls.STACK_NAME)
```
How the Automation API works here:
- `auto.create_or_select_stack()` → selects or creates a Pulumi stack
- `stack.up()` → deploys the infrastructure
- `stack.outputs()` → fetches the deployed API details
- `stack.destroy()` → cleans up resources after testing
4.3: Test the AppSync API Exists
First, we’ll check if the AppSync API was created successfully.
```python
def test_appsync_api_exists(self):
    """Test if the AppSync API was created successfully"""
    api_id = self.outputs.get(OUTPUT_KEY_API_ID).value

    # Get the API using boto3
    response = self.appsync_client.get_graphql_api(apiId=api_id)
    self.assertEqual(response["graphqlApi"]["apiId"], api_id)
```
4.4: Test AppSync API by adding a tenant
Let’s run a quick API request… because who has time to manually poke endpoints?
We’ll grab the endpoint URL and API key from the stack outputs, then fire off a GraphQL mutation to add a tenant.
```python
def test_graphql_add_tenant(self):
    """Test the GraphQL mutation to add a tenant"""
    endpoint = self.outputs.get(OUTPUT_KEY_ENDPOINT).value
    api_key = self.outputs.get(OUTPUT_KEY_API_KEY).value

    query = {
        "query": f"""
        mutation AddTenant {{
            addTenant(id: "{self.TENANT_ID}", name: "{self.TENANT_NAME}") {{
                id
                name
            }}
        }}
        """
    }

    headers = {"Content-Type": "application/json", "x-api-key": api_key}
    response = requests.post(endpoint, json=query, headers=headers)

    self.assertEqual(response.status_code, 200)

    # Check the response data
    response_data = response.json()
    self.assertIn("data", response_data)
    self.assertIn("addTenant", response_data["data"])
    self.assertEqual(response_data["data"]["addTenant"]["id"], self.TENANT_ID)
    self.assertEqual(response_data["data"]["addTenant"]["name"], self.TENANT_NAME)
```
This sends a GraphQL mutation and verifies that it returns the expected result.
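The request body is just plain JSON with a single `query` field. If you end up writing several of these, the string-building can be factored into a tiny helper; `build_add_tenant_mutation` below is a hypothetical sketch, not part of the test file above:

```python
def build_add_tenant_mutation(tenant_id: str, name: str) -> dict:
    """Build the JSON payload for the addTenant mutation (hypothetical helper)."""
    return {
        "query": f"""
        mutation AddTenant {{
            addTenant(id: "{tenant_id}", name: "{name}") {{
                id
                name
            }}
        }}
        """
    }

payload = build_add_tenant_mutation("123", "FirstCorp")
print('addTenant(id: "123", name: "FirstCorp")' in payload["query"])  # → True
```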
4.5: Verify data in DynamoDB
Finally, we’ll check if the tenant was actually stored in DynamoDB.
```python
def test_dynamodb_table_contains_data(self):
    """Test that the tenant was actually stored in DynamoDB"""
    self.test_graphql_add_tenant()  # Ensure the tenant exists

    # Find the actual table name
    tables = self.dynamodb_client.list_tables()
    table_name = next((t for t in tables["TableNames"] if "tenants" in t), "tenants")

    response = self.dynamodb_client.get_item(
        TableName=table_name, Key={"id": {"S": self.TENANT_ID}}
    )

    self.assertIn("Item", response)
    self.assertEqual(response["Item"]["id"]["S"], self.TENANT_ID)
    self.assertEqual(response["Item"]["name"]["S"], self.TENANT_NAME)
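Note the shape of the item: DynamoDB’s low-level API wraps every attribute in a type descriptor, such as `{"S": ...}` for strings. A minimal sketch of that wrapping, assuming string attributes only (which is all this table uses); the `to_dynamodb_item` helper is for illustration and not part of the project code:

```python
def to_dynamodb_item(item: dict) -> dict:
    """Wrap plain string values in DynamoDB's typed attribute format."""
    return {key: {"S": str(value)} for key, value in item.items()}

print(to_dynamodb_item({"id": "123", "name": "FirstCorp"}))
# → {'id': {'S': '123'}, 'name': {'S': 'FirstCorp'}}
```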
4.6: Run the tests
Add this at the bottom of your file:
```python
if __name__ == "__main__":
    unittest.main()
```
Running this script will:
- Deploy the infrastructure.
- Run the tests.
- Tear everything down after execution.
Step 5: Run Pulumi integration tests
All that setup? Worth it. Time to run our integration tests with Pulumi and LocalStack.
5.1: Reset LocalStack
Start fresh by restarting LocalStack:
```shell
localstack restart
```
This ensures a clean test environment, removing leftover resources.
5.2: Check Pulumi Configuration
If you deployed with `pulumilocal`, your stack should already be configured for LocalStack. If not, update your Pulumi config (`Pulumi.localstack.yaml`):
```yaml
config:
  aws:accessKey: test
  aws:endpoints:
    - dynamodb: http://localhost:4566
    - appsync: http://localhost:4566
    - iam: http://localhost:4566
    - sts: http://localhost:4566
  aws:region: us-east-1
  aws:secretKey: test
  aws:skipCredentialsValidation: "true"
```
You still need to pass some AWS credentials, but don’t worry: LocalStack isn’t actually verifying them. Go ahead, put in `accessKey: test` like a rebel. 😎
5.3: Install dependencies
Install the required dependencies for the test script:
```shell
pip install boto3 pulumi pulumi-aws requests
```
5.4: Run the Tests
Execute the test suite:
```shell
python -m unittest test_appsync.py
```
Example output:
```
Updating (localstack):

 +   pulumi:pulumi:Stack     appsync-dynamodb-localstack  creating (0s)...
 +   aws:appsync:Resolver    add-resolver                 creating (0s)
 +   aws:appsync:Resolver    get-resolver                 created (0.01s)
 +   aws:appsync:Resolver    add-resolver                 created (0.01s)
 +   pulumi:pulumi:Stack     appsync-dynamodb-localstack  created (9s)

Outputs:
    api_id  : "2909a32d813c4bd08c6655cc7b"
    api_key : [secret]
    endpoint: "http://localhost.localstack.cloud:4566/graphql/2909a32d813c4bd08c6655cc7b"

Resources:
    + 10 created

Duration: 10s
...
----------------------------------------------------------------------
Ran 3 tests in 17.278s

OK
```
If you run into any issues with Pulumi not being able to find the resources, consider deleting the `.pulumi` folder in your project directory. This resets the state and allows Pulumi to re-create the resources.
Summary
We did it! We built an AppSync API to DynamoDB workflow using Pulumi, deployed it locally with LocalStack, and wrote integration tests using Pulumi’s Automation API. 💪🏻💪🏼💪🏽💪🏾💪🏿
Check out the full code on GitHub.
Now that you’ve seen how Pulumi and LocalStack can revolutionize your integration testing, it’s time to put it into action. Clone the repo, give it a spin, and let us know what you build!
What to try next
- Expand the API with additional queries and mutations.
- Integrate Lambda resolvers for custom business logic.
- Set up CI/CD pipelines to automate infrastructure testing.