Introducing the LocalStack Model Context Protocol (MCP) Server
We're excited to introduce the LocalStack MCP Server, which enables AI agents to manage LocalStack directly through a standardized protocol and a set of purpose-built tools. This allows agents to automate the entire local cloud development lifecycle via a conversational interface.

Introduction
The way cloud apps and infrastructure stacks are developed is changing, as developers increasingly use AI to generate complex setups. This agent-driven approach promises to accelerate software delivery and foster agility, but it introduces a core bottleneck: the need for a fast, isolated, and low-cost sandbox in which to validate the AI's output, so that it can be trusted and used safely. LocalStack addresses these challenges head-on, but what if we could take it a step further? What if AI agents could help you not only write infrastructure code, but also control LocalStack directly to deploy, test, and verify their work in a fully automated and safe feedback loop?
Our new LocalStack Model Context Protocol (MCP) Server, released today as an experimental public preview, makes this possible. This MCP server lets AI agents run a complete, end-to-end development lifecycle on your local machine. It provides several tools that allow you to manage your LocalStack container, deploy and destroy infrastructure code, analyze container logs, generate IAM policies from runtime denials, run chaos tests for resilience, and manage state snapshots with Cloud Pods.
The LocalStack MCP server is an opt-in enhancement that creates a conversational interface for your local cloud environment, where you can let an agent enable fully-local workflows on LocalStack, such as:
- “Deploy the Terraform project in the infra/ directory and return the outputs.”
- “My app is hitting access denied errors; watch it run and generate the IAM policy to fix them.”
- “Inject 503 Service Unavailable errors into all S3 calls so I can test my app’s retry logic.”
Ready to try it? Get started with the LocalStack MCP Server using your preferred MCP client or AI editor.
How did we get here?
At its core, MCP (Model Context Protocol) is a specification built on three pillars:
- Model: The LLM that generates content but cannot act.
- Context: The information, tools, and resources given to the model.
- Protocol: The standardized foundation for this interaction.
MCP is an open standard that defines how AI agents connect to external systems. With a single, reusable MCP server for a service, any compatible AI agent can interact with it to ensure interoperability and remove the need for custom one-off integrations.
So why build an MCP server for LocalStack? LocalStack provides a local sandbox environment that emulates over 100 AWS services for seamless development and testing. However, the developer experience can involve a series of disconnected, manual steps: running verbose CLI commands with specific flags, digging through thousands of raw log lines to find a critical error, and manually handling deployment and teardown of infrastructure code. The result is a slow, error-prone, and fragmented workflow that interrupts the inner development loop.
The LocalStack MCP Server solves this problem. It acts as an automation layer between an AI agent and the LocalStack environment, turning high-level developer intent into the exact steps needed to manage the full local cloud lifecycle. Thus, the workflow shifts from manual tasks to a conversational flow, speeding up local cloud development and testing.
What are the current capabilities?
The LocalStack MCP Server offers a comprehensive set of tools that give AI agents detailed control over the full local cloud development lifecycle. Through the standardized protocol, an agent can move beyond code generation and into automated execution, testing, and validation.
The server currently offers the following capabilities:
- Lifecycle Management (`localstack-management`): Start, stop, restart, and check the status of the LocalStack container, with support for custom environment variables.
- Infrastructure Deployment (`localstack-deployer`): Deploy and destroy CDK, CloudFormation, and Terraform infrastructure code on the LocalStack instance.
- Log Analysis (`localstack-logs-analysis`): Analyze LocalStack logs to summarize activity, find errors, and inspect or filter API requests.
- IAM Policy Analyzer (`localstack-iam-policy-analyzer`): Configure IAM enforcement modes and generate IAM policies by analyzing permission denial errors from logs.
- Chaos Engineering (`localstack-chaos-injector`): Inject service failures and network latency to test infrastructure resilience and fault tolerance.
- State Management (`localstack-cloud-pods`): Save and load snapshots of a LocalStack instance with Cloud Pods, persisting data and resources for reproducible environments.
- AWS Client (`localstack-aws-client`): Run AWS CLI commands directly against the emulated services running inside the LocalStack container.
This toolset is growing, with new features and improvements added regularly. Support for Resources and Prompts is planned next.
How do I get started?
The LocalStack MCP Server is available on the NPM registry and can be run directly using NPX. Once configured, your AI development tool (Cursor, VS Code, Amazon Q, etc.) will manage the server’s lifecycle automatically.
Before you begin, make sure these are installed and in your system's PATH:
- Node.js (v20.x or later) to run the `npx` command.
- LocalStack CLI and Docker to manage the LocalStack container itself.
- `cdklocal` or `tflocal` to use the `localstack-deployer` tool. (optional)
- A LocalStack Auth Token to enable licensed features. (optional)
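A quick way to confirm the core prerequisites is a round of version checks in your terminal. This is a minimal sketch; the exact version output will vary by installation:

```bash
# Verify that the prerequisites are installed and on the PATH
node --version        # should print v20.x or later
npx --version
localstack --version
docker --version
cdklocal --version    # optional, only needed for CDK deployments
tflocal --version     # optional, only needed for Terraform deployments
```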
Installation
To get started, add this configuration to your AI client's `mcp.json` file (e.g., `~/.cursor/mcp.json` for Cursor). It runs the server with `npx`, which downloads and installs the package from the NPM registry on its first run:

```json
{
  "mcpServers": {
    "localstack-mcp-server": {
      "command": "npx",
      "args": ["-y", "@localstack/localstack-mcp-server"]
    }
  }
}
```
Once configured, your AI client can start the server and give agents full access to the LocalStack toolset.
Configuration
You can pass any LocalStack environment variable to the container through the `env` block in `mcp.json`. The MCP server forwards these variables when it starts the LocalStack instance, allowing you to customize its behavior. You can find the full list of variables in the configuration docs.
This is also how you enable licensed tools by providing your LocalStack Auth Token. For example, you can enable debug logging, turn on persistence, and provide an Auth Token by updating your configuration like this:
{ "mcpServers": { "localstack-mcp-server": { "command": "npx", "args": ["-y", "@localstack/localstack-mcp-server"], "env": { "LOCALSTACK_AUTH_TOKEN": "<YOUR_TOKEN>", "DEBUG": 1, "PERSISTENCE": 1 } } }}
Quickstart
With the configuration in place, you can start the server from your AI client. Most MCP-compatible clients, like Cursor, will detect the new server and display the associated tools in the MCP server management section. The client will run the server with NPX in the background, and you will see an indicator that it is running.
Once the server is running, your AI agent has access to the full LocalStack toolset. You can verify the setup by opening up a chat with your agent and asking it to start the LocalStack instance.
The agent will use the `localstack-management` tool to communicate with the server and return the current status, confirming that the container has started correctly.
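For reference, the status check the agent performs corresponds to commands you could also run yourself with the LocalStack CLI. A minimal sketch of the manual equivalent:

```bash
# Start the LocalStack container in the background
localstack start -d

# Check the container status and the health of individual services
localstack status
localstack status services
```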
Deploying and testing a CDK app locally
To demonstrate a complete, end-to-end workflow, we will use the MCP server to deploy a sample serverless CDK app that provisions a Lambda function, an API Gateway, a DynamoDB table, and a CloudFront distribution. The example is taken from the AWS for Frontend Developers repository.
First, clone the repository and install the necessary dependencies:
```bash
git clone https://github.com/localstack-samples/aws-for-frontend-devs.git
cd aws-for-frontend-devs/dynamodb
npm install
```
The prompts in this section are implemented using Cursor, but you can use the same instructions with any MCP-compatible client. Open the project in Cursor to follow along.
Check LocalStack status
Before deploying, ensure your LocalStack container is running. You can ask your agent to check for you:
If it’s not running, the agent will inform you and then start the container.
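If you'd rather verify the container by hand, LocalStack exposes a health endpoint on its default edge port that you can query directly; a quick sketch:

```bash
# Query the health endpoint on LocalStack's default edge port (4566)
curl -s http://localhost:4566/_localstack/health
```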
Deploy the CDK Application
Once LocalStack is running, instruct your agent to deploy the CDK project from the directory you just cloned:
The agent will use the `localstack-deployer` tool, and you will see the `cdklocal` output in your terminal as it bootstraps and deploys the stack.
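Under the hood, this is equivalent to the `cdklocal` commands you would otherwise run manually; a sketch of what the tool automates:

```bash
# Bootstrap the CDK environment against LocalStack (first deployment only)
cdklocal bootstrap

# Synthesize and deploy the stack without interactive approval prompts
cdklocal deploy --require-approval never
```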
Open the URL shown in the MCP client output to view the app in your browser.
Verify the Deployed Resources
After a successful deployment, ask the agent to verify that the key AWS resources were created correctly using the `localstack-aws-client` tool:
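The agent runs standard AWS CLI commands against the emulated endpoint. If you want to spot-check the resources yourself, the `awslocal` wrapper covers the same ground (a sketch; the resource names in your output will depend on the stack):

```bash
# List the resources the CDK app should have created
awslocal dynamodb list-tables
awslocal lambda list-functions
awslocal apigateway get-rest-apis
awslocal cloudfront list-distributions
```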
Analyze for Infrastructure Issues
Finally, perform a quick health check by asking the agent to analyze the logs for any potential issues that may have occurred during the deployment:
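The `localstack-logs-analysis` tool does this for you, but a rough manual counterpart using the LocalStack CLI might look like this:

```bash
# Dump the LocalStack container logs and filter for likely problems
localstack logs | grep -iE "error|warn|exception"
```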
State Management with Cloud Pods
You can also save the exact state of your deployed infrastructure to a snapshot that you can instantly load later, creating a perfect, reproducible test environment.
You can simulate a full teardown and recovery by resetting the state, checking that the resources are gone, and restoring the environment from the snapshot.
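The agent drives this workflow through the `localstack-cloud-pods` tool. The manual equivalent with the LocalStack CLI looks roughly like this (the pod name is illustrative; Cloud Pods require a LocalStack Auth Token):

```bash
# Save the current instance state as a Cloud Pod snapshot
localstack pod save my-cdk-app-snapshot

# Simulate a teardown by wiping the instance state
localstack state reset

# Restore the environment from the snapshot
localstack pod load my-cdk-app-snapshot
```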
Resilience Testing with Chaos Engineering
You can also test your application’s resilience by injecting faults. For instance, you can simulate a failure in the Lambda function to see how your frontend or upstream services react.
After injecting the fault, try accessing the app again via its CloudFront URL. The UI should no longer display its data, confirming that the chaos test is active.
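Behind the scenes, the `localstack-chaos-injector` tool talks to LocalStack's Chaos API. A hand-rolled sketch of a similar fault injection is shown below; treat the payload as illustrative and check the Chaos API docs for the exact rule schema:

```bash
# Add a fault rule that makes Lambda API calls fail with a 503 (illustrative payload)
curl -s -X POST http://localhost:4566/_localstack/chaos/faults \
  -H "Content-Type: application/json" \
  -d '[{"service": "lambda", "error": {"statusCode": 503, "code": "ServiceUnavailable"}}]'

# Inspect the currently active fault rules
curl -s http://localhost:4566/_localstack/chaos/faults
```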
Next Steps
This release is an initial preview intended to gauge interest and feedback, but we hope it's just the beginning. We have many more ideas for making the toolchain faster, smarter, and more integrated, ultimately turning the MCP server into a proactive partner rather than just an assistant. We also plan to address the security and compliance concerns that might hinder adoption of the tool within enterprises or regulated environments.
Here are some ideas we’re currently working on:
- Optimizing the core engine to cut latency in log analysis and command execution.
- Adding support for other popular Infrastructure as Code frameworks, such as SAM and Pulumi.
- Adding tools that analyze your code and suggest least-privilege IAM policies before deployment.
- "Opening the black box" by exposing LocalStack's internal tracing capabilities as tools the agent can use for inspection.
Longer term, we hope to enable the server not only to detect infrastructure errors and configuration drift, but also to propose and execute corrective actions, which can then be tested and verified on LocalStack's core cloud emulator.
We are excited to see how you use the MCP server and to hear what you think. We encourage you to ask questions, share your experiences, and suggest new features by reaching out to us on our Community Slack or by creating an issue on our open-source GitHub repository.
New to LocalStack? Create a free account today and get started!