
What's new in LocalStack for Snowflake?

A few months ago, we launched the public preview of our LocalStack for Snowflake emulator, continuously enhancing it to offer the best Snowflake developer experience. Discover the latest updates, including enhanced function support, a new web interface, and improved compatibility with Snowflake integrations.

Introduction

LocalStack aims to empower development teams by providing a seamless and powerful local cloud development experience. While AWS is a big part of cloud development, it is not the only part, and we are expanding our emulators to cover other popular cloud services.

To that end, a few months ago we launched the public preview of our LocalStack for Snowflake emulator. Since then, we’ve continuously improved it to address the evolving needs of our users. In this blog, we’ll walk through the areas we’ve invested in over the past few months to give our users the best Snowflake developer experience. These updates include increased parity with Snowflake, a new web user interface, tighter integration with our SaaS offerings, and enhancements to developer tools and integrations.

How to upgrade?

To upgrade your LocalStack for Snowflake installation to the latest version, run the following commands:

$ docker pull localstack/snowflake:latest
$ IMAGE_NAME=localstack/snowflake:latest localstack start

Note that installing the Snowflake emulator through the LocalStack Extension mechanism is now deprecated and may not be compatible with all versions of LocalStack. It is recommended to use the custom LocalStack Snowflake Docker image for the best experience.
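
Once the container is up, a quick way to sanity-check the emulator is to open a connection from the Snowflake Python connector and run a trivial query. The snippet below is a minimal sketch: the host and the test credentials reflect typical local defaults for the emulator and may differ in your setup.

import snowflake.connector

# Connect to the local Snowflake emulator instead of the real service.
# Host and credentials are assumed local defaults; adjust them to match your setup.
conn = snowflake.connector.connect(
    user="test",
    password="test",
    account="test",
    host="snowflake.localhost.localstack.cloud",
)

cur = conn.cursor()
cur.execute("SELECT 1")
print(cur.fetchone())  # -> (1,)

cur.close()
conn.close()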

New features

Support for new functions

We have added support for several new SQL functions in LocalStack for Snowflake. The new functions include:

Function | Notes
DATEDIFF | Returns the difference between two dates or timestamps.
TIMEDIFF | Returns the difference between two time values.
LEAD | Accesses the value of a subsequent row in the result set without using a self-join.
LAG | Accesses the value of a previous row in the result set without using a self-join.
INITCAP | Converts the first letter of each word in a string to uppercase and the rest to lowercase.
LAST_QUERY_ID | Returns the query ID of the last executed query in the session.
DESCRIBE FUNCTION | Describes a user-defined function.
SPLIT | Splits a string into an array based on a specified delimiter.
DATE_FROM_PARTS | Constructs a date from individual year, month, and day parts.
FLOOR | Returns the largest integer less than or equal to a specified value.
MIN | Returns the smallest value in a set of values.
MIN_BY | Returns the value of one column from the row with the minimum value in another column.
MAX | Returns the largest value in a set of values.
MAX_BY | Returns the value of one column from the row with the maximum value in another column.
COALESCE | Returns the first non-NULL expression among its arguments.
DIV0 | Performs division, returning zero if the divisor is zero.
LEAST | Returns the smallest value from a list of expressions.
GREATEST | Returns the largest value from a list of expressions.
NVL2 | Returns one value if the first argument is not NULL, otherwise another value.
IS_NULL_VALUE | Returns TRUE if a VARIANT value contains a JSON null.
PARSE_IP | Parses an IP address and returns its component parts.
OBJECT_KEYS | Returns the keys of an object as an array.
OBJECT_CONSTRUCT_KEEP_NULL | Constructs an object, keeping entries with null values.
DATE_TRUNC | Truncates a date, time, or timestamp to the specified precision.
TIMESTAMP_LTZ | Represents a timestamp in the local time zone.
AS_DOUBLE | Casts a VARIANT value to a double-precision floating-point number.
AS_INTEGER | Casts a VARIANT value to an integer.
AS_NUMBER | Casts a VARIANT value to a fixed-point number.
AS_CHAR | Casts a VARIANT value to a string.
HASH | Returns a hash value for the input.
ANY_VALUE | Returns an arbitrary value from a group, without guarantees about which one.
CONTAINS | Returns TRUE if a string contains a given substring.
BETWEEN | Tests whether an expression lies between two other expressions.
MODE | Returns the most frequent value in a group.
AVG | Returns the average (arithmetic mean) of the non-NULL values.
CBRT | Returns the cube root of a number.
CEIL | Returns the smallest integer greater than or equal to a specified value.
ZEROIFNULL | Converts NULL to 0.
ARRAY_UNIQUE_AGG | Returns an array of the distinct values in a group.
DEGREES | Converts radians to degrees.
RADIANS | Converts degrees to radians.
EXP | Returns e raised to the power of the input.
REPEAT | Repeats a string a specified number of times.
REVERSE | Reverses the characters in a string.
SQRT | Returns the square root of a number.
ASCII | Returns the ASCII code of the first character of a string.
LOG | Returns the logarithm of a number to a specified base.
LN | Returns the natural logarithm of a number.
CREATE … CLONE | Creates a copy of an existing object in the system.
COPY INTO | Unloads data from a table (or query) into one or more files in an internal/external stage or an external location.

You can find more in our functions coverage documentation.
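
To illustrate, here is a small example that exercises a few of the newly supported functions through the Snowflake Python connector. The connection parameters are assumed local defaults for the emulator; the SQL itself is standard Snowflake syntax.

import snowflake.connector

# Assumed default connection settings for the local emulator; adjust as needed.
conn = snowflake.connector.connect(
    user="test", password="test", account="test",
    host="snowflake.localhost.localstack.cloud",
)
cur = conn.cursor()

# DATEDIFF: number of days between two dates
cur.execute("SELECT DATEDIFF(day, '2024-01-01', '2024-03-01')")
print(cur.fetchone())  # (60,)

# INITCAP and COALESCE from the new function set
cur.execute("SELECT INITCAP('hello localstack'), COALESCE(NULL, 'fallback')")
print(cur.fetchone())  # ('Hello Localstack', 'fallback')

cur.close()
conn.close()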

New Web User Interface

We have released a new Web User Interface for LocalStack for Snowflake that provides a user-friendly experience for managing your local Snowflake resources. The interface includes a dashboard that allows you to:

  • Run Snowflake SQL queries and view the results using a Query Editor.
  • View detailed request/response traces of API calls made to Snowflake.
  • Forward queries from the Snowflake emulator to a real Snowflake instance using a proxy.

You can access the Web User Interface by navigating to https://snowflake.localhost.localstack.cloud in your web browser, after starting your Snowflake emulator.

LocalStack Snowflake Web UI

The Web User Interface is still in preview, and we are actively working on adding new features and improving the user experience. Learn more about the latest Web User Interface in our documentation.

Support for LocalStack Ephemeral Instances

We have launched Ephemeral Instances, enabling you to run a LocalStack for Snowflake sandbox in the cloud rather than on your local machine. This ephemeral environment is a short-lived, encapsulated deployment of the Snowflake emulator in the cloud. With these sandboxes, you can run tests, preview features in your Snowflake applications, and collaborate asynchronously within and across your team!

LocalStack Snowflake Ephemeral Instance

After launching an ephemeral instance, you can switch the Snowflake host in your application to the ephemeral instance’s hostname. This change enables you to interact with the Snowflake emulator running in the cloud as if it were on your local machine. Additionally, Ephemeral Instances allow you to generate a preview environment from GitHub Pull Request (PR) builds.
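
If your application uses the Snowflake Python connector, switching to the ephemeral instance is just a matter of swapping the host. In the sketch below, the hostname is a hypothetical placeholder; use the hostname shown when you launch your instance, and adjust the credentials to your setup.

import snowflake.connector

# Hypothetical ephemeral instance hostname -- replace it with the hostname
# displayed for your instance. Credentials are assumed test defaults.
EPHEMERAL_HOST = "snowflake.<your-instance>.localstack.cloud"

conn = snowflake.connector.connect(
    user="test",
    password="test",
    account="test",
    host=EPHEMERAL_HOST,
)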

The feature is in preview, and you can learn more about it in our documentation.

Improved parity with Snowflake

We have made several enhancements to LocalStack for Snowflake to improve compatibility with the Snowflake service. These improvements include:

  • Metadata queries and schema lookups can now use fully qualified table names (see the sketch after this list).
  • JDBC prepared statements now support identifiers as query parameters.
  • Functions returning VARIANT values, including those encoding dictionary and list values, are now supported.
  • CREATE STAGE statements now accept alternative location formats such as s3compat://....
  • Schemas can be created and deleted using fully qualified names.
  • Timestamps are now inserted with subsecond precision, matching Snowflake behavior.
  • CREATE TABLE AS SELECT (CTAS) statements with nested subqueries are handled correctly.
  • The MIN_BY/MAX_BY aggregate functions return more accurate results.
  • Database and schema parameters are properly extracted for JDBC connections.
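
Two of these improvements are easy to try from a client session: fully qualified names and CTAS with nested subqueries. The sketch below assumes cur is a cursor on an emulator connection like the one shown earlier; all object names are illustrative.

# Assumes `cur` is a cursor on a connection to the local emulator.

# Fully qualified names in DDL and metadata lookups
cur.execute("CREATE DATABASE IF NOT EXISTS demo_db")
cur.execute("CREATE SCHEMA IF NOT EXISTS demo_db.demo_schema")
cur.execute("CREATE TABLE IF NOT EXISTS demo_db.demo_schema.orders (id INT, amount NUMBER)")

# CTAS with a nested subquery
cur.execute("""
    CREATE TABLE demo_db.demo_schema.big_orders AS
    SELECT * FROM (
        SELECT id, amount FROM demo_db.demo_schema.orders WHERE amount > 100
    )
""")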

Support for Snowpipe

Snowpipe enables data loading into Snowflake tables from files in an external stage, continuously processing files as they become available. It uses a queue to manage this near real-time data loading.

LocalStack for Snowflake now includes support for Snowpipe, allowing you to create and manage Snowpipe objects within the emulator. This functionality lets you load data into Snowflake tables from files stored either in a local directory or an S3 bucket, both locally and remotely.

Learn more about Snowpipe support, including the supported SQL statements, in our documentation.
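
As a rough sketch of the flow, the snippet below creates a table, a stage, and a pipe that copies newly staged CSV files into the table. All object names are illustrative, cur is assumed to be a cursor on an emulator connection (as in the earlier examples), and the statements follow standard Snowflake syntax.

# Assumes `cur` is a cursor on a connection to the local emulator.
cur.execute("CREATE TABLE IF NOT EXISTS events (id INT, payload VARCHAR)")
cur.execute("CREATE STAGE IF NOT EXISTS events_stage")

# A pipe that copies newly staged CSV files into the table
cur.execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AS
    COPY INTO events FROM @events_stage FILE_FORMAT = (TYPE = 'CSV')
""")

cur.execute("SHOW PIPES")
print(cur.fetchall())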

Support for Iceberg tables

Iceberg tables utilize the Apache Iceberg open table format specification, providing an abstraction layer over data files stored in open formats. In Snowflake, Iceberg tables enable schema evolution, partitioning, and snapshot isolation for efficient table data management.

LocalStack for Snowflake now includes support for Iceberg tables, enabling you to create and manage these tables locally. You can use Iceberg tables to query data in Snowflake using the Iceberg format, with data stored in external volumes or local/remote S3 buckets.

For more detailed information on using Iceberg tables, refer to our documentation.
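
As a minimal sketch, the snippet below creates a Snowflake-managed Iceberg table. It assumes an external volume named iceberg_vol has already been configured (for example, backed by a local S3 bucket) and that cur is a cursor on an emulator connection; both are illustrative assumptions, and the syntax follows the standard Snowflake form.

# Assumes `cur` is a cursor on a connection to the local emulator and that an
# external volume named `iceberg_vol` already exists (illustrative assumptions).
cur.execute("""
    CREATE ICEBERG TABLE customers_iceberg (id INT, name STRING)
    CATALOG = 'SNOWFLAKE'
    EXTERNAL_VOLUME = 'iceberg_vol'
    BASE_LOCATION = 'customers/'
""")

cur.execute("INSERT INTO customers_iceberg VALUES (1, 'Alice')")
cur.execute("SELECT * FROM customers_iceberg")
print(cur.fetchall())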

Support for Hybrid tables

Snowflake Hybrid tables, also known as Unistore hybrid tables, facilitate fast, single-row operations by enforcing unique constraints on primary keys and incorporating indexes to expedite data retrieval. These tables are tailored to support both analytical and transactional workloads concurrently, forming the backbone of Snowflake’s Unistore architecture.

LocalStack for Snowflake now includes support for Hybrid tables, enabling the creation and management of these tables locally.

For more detailed information on using Hybrid tables, including the supported SQL statements, refer to our documentation.
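
As a minimal sketch, the snippet below creates a Hybrid table with a primary key and a secondary index and performs a single-row lookup. Object names are illustrative, cur is assumed to be a cursor on an emulator connection, and the syntax follows the standard Snowflake form.

# Assumes `cur` is a cursor on a connection to the local emulator.
cur.execute("""
    CREATE HYBRID TABLE user_profiles (
        id INT PRIMARY KEY,
        email VARCHAR,
        INDEX idx_email (email)
    )
""")

# Fast single-row operations keyed on the primary key
cur.execute("INSERT INTO user_profiles VALUES (1, 'user@example.com')")
cur.execute("SELECT * FROM user_profiles WHERE id = 1")
print(cur.fetchone())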

Support for Dynamic Tables

Snowflake Dynamic Tables allow a background process to continuously load new data from sources into the table, accommodating both delta and full load operations. These tables automatically update to reflect query results, eliminating the need for a separate target table and custom data transformation code. They are regularly updated through scheduled refreshes by an automated process.

LocalStack for Snowflake now supports Dynamic tables, enabling you to create and manage them locally.

For more detailed information on using Dynamic tables, including the supported SQL statements, refer to our documentation.
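
As a minimal sketch, the snippet below defines a Dynamic table that aggregates rows from a source table. The warehouse and table names are illustrative, cur is assumed to be a cursor on an emulator connection, and the syntax follows the standard Snowflake form.

# Assumes `cur` is a cursor on a connection to the local emulator;
# the warehouse and table names are illustrative.
cur.execute("CREATE TABLE IF NOT EXISTS raw_orders (id INT, amount NUMBER)")

cur.execute("""
    CREATE DYNAMIC TABLE order_totals
    TARGET_LAG = '1 minute'
    WAREHOUSE = test_warehouse
    AS SELECT id, SUM(amount) AS total FROM raw_orders GROUP BY id
""")

cur.execute("SELECT * FROM order_totals")
print(cur.fetchall())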

Support for new integrations

We have expanded the integration capabilities, enhancing the Snowflake emulator with new tools and services that streamline development and testing workflows. These integrations focus on improving data quality, schema management, and service compatibility, effectively bridging local development and cloud environments. You can now:

  • Run Soda data quality checks within the Snowflake emulator. You can find our sample app on GitHub.
  • Apply and manage database migrations locally using Flyway, facilitating schema changes efficiently. Find our detailed guide in our documentation.
  • Connect to real or emulated AWS S3 using external volumes for enhanced storage integration and data handling.
  • Broaden your local developer environment with integration for AWS services like MWAA, Glue, and EMR, to facilitate multi-cloud testing.

What’s next?

LocalStack for Snowflake is designed to support every step of your Software Development Life Cycle (SDLC), from developing data applications and testing infrastructure in continuous integration (CI) pipelines to deploying ephemeral application previews for team collaboration, so you can ship your applications to production with confidence!

As we look to the future, we are preparing to enhance our interoperability with the actual Snowflake service, expand integrations with AWS and other providers, and introduce emulation of access control and security policies. Here’s what you can look forward to in the upcoming months:

  • Enhanced support for the emulator’s internal type system, improving parity across different data types.
  • Advanced integration of Cloud Pods to support test data management in CI and allow merging of database table rows.
  • Expanded implementation coverage for additional Snowflake SQL functions.
  • Support for an automated replication mechanism to copy resources from the real Snowflake service into the Snowflake emulator.
  • Further integrations between AWS services and Snowflake in the local emulator, including Kinesis Firehose streaming, EMR/Glue data jobs, and SageMaker for ML training/inference.

We are immensely thankful to our community and users for their insightful suggestions, feedback, and bug reports shared through our Slack community. Your ongoing support has been instrumental, and we are grateful for it!

Learn more

New to LocalStack? Create a free account today and get started!


Harsh Mishra
Engineer at LocalStack
Harsh Mishra is an Engineer at LocalStack and an AWS Community Builder. Harsh has previously worked at HackerRank, Red Hat, and Quansight, and specializes in DevOps, Platform Engineering, and CI/CD pipelines.