Snowflake Summit 2022 Takeaways and How They Impact You

June 28, 2022

Snowflake recently hosted its first in-person Snowflake Summit conference since 2019. Starting with the keynote speech, there was a deluge of announcements. It would be impossible to cover each one, so I’ll highlight my thoughts on the more interesting ones.

I attended the conference with my Onebridge colleague, Andrew Bittermann, VP of Data Analytics Solutions. The sheer quantity of significant announcements and improvements to Snowflake and the dramatic increase in attendance (10,000 people) are testaments to the prominence of Snowflake and the Data Cloud in the market.

Onebridge VP Andrew Bittermann and I hit the ground running at the 2022 Snowflake Summit.

Technical Announcements

Here are some of the interesting developments from a technical standpoint:

Unistore/Hybrid Tables

One of the most impactful announcements was around Unistore. To understand Unistore, you must understand the two broad database workloads: transactional and analytic.

Snowflake started with a focus entirely on analytic workloads. This workload targets questions like “what are the total sales for FY 2021?” and “how many distinct customers bought this product in the last five years?” In more technical terms, you're crunching thousands to billions of rows to get a summarized answer.

Transactional workloads are the opposite. An example would be placing an e-commerce order. The app needs to quickly look up your account information and then insert a row into an orders table. From a technical standpoint, the system needs to make thousands of very fast single-row lookups and inserts.
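To make the contrast concrete, here is a minimal sketch of the two access patterns using Python's built-in sqlite3 module as a stand-in database. This is illustrative only, not Snowflake code; the table and column names are invented for the example.

```python
import sqlite3

# In-memory database standing in for both workload styles.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders ("
    "order_id INTEGER PRIMARY KEY, customer_id INTEGER, "
    "amount REAL, fiscal_year INTEGER)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount, fiscal_year) VALUES (?, ?, ?)",
    [(1, 19.99, 2021), (2, 5.00, 2021), (1, 42.50, 2022)],
)

# Transactional pattern: a fast single-row lookup, then a single-row insert.
row = conn.execute(
    "SELECT customer_id FROM orders WHERE order_id = ?", (1,)
).fetchone()
conn.execute(
    "INSERT INTO orders (customer_id, amount, fiscal_year) VALUES (?, ?, ?)",
    (row[0], 9.99, 2022),
)

# Analytic pattern: scan many rows to produce one summarized answer.
total_2021 = round(
    conn.execute(
        "SELECT SUM(amount) FROM orders WHERE fiscal_year = 2021"
    ).fetchone()[0],
    2,
)
print(total_2021)  # 24.99
```

A transactional engine is optimized to make the first pattern fast under heavy concurrency; an analytic engine like Snowflake is optimized to make the second pattern fast over billions of rows.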

The problem is that tuning a database for one type of workload tends to degrade performance in the other. This is why companies usually maintain two databases: one transactional and one analytic.

Unistore is Snowflake’s attempt to expand into transactional workloads without sacrificing its traditional analytic performance. The goal is to have one database that can handle all workloads. This is an exciting development, but given the complexity of this feature, we are unlikely to see it generally available until sometime during 2023.

Does this mean you can ditch your current transactional systems like Oracle, SQL Server, or PostgreSQL? Probably not right away. These systems have been around for decades.

Unistore will launch with basic functionality, but it will take a long time to catch up to the incumbent players. It will be suitable for moderate use cases where it makes sense to keep all the data consolidated in one spot.

Native Application Framework

Snowflake announced the Snowflake Native Application Framework (NAF) at the conference too. The Snowflake Marketplace was initially conceived as a way to sell or share datasets. With a few button clicks, you could instantly have access to hundreds of datasets. With the Native Application Framework, Snowflake is expanding its marketplace to enable selling or sharing of applications as well.

Programmability will be provided via stored procedures and user-defined functions, and Snowflake's recent acquisition of Streamlit will provide a way to define simple user interfaces. In short, initial applications will be modest. Don't expect hosted applications with complexity on par with Microsoft Excel.

A big selling point of the NAF is that the code is deployed into the customer's account. Customers therefore don't need to transmit data outside their account, and the provider's code is hidden from the customer, protecting the provider's IP.

Python Snowpark

Snowpark is a way to programmatically interact with your Snowflake data using Java, Scala, or Python. Python is the newest addition and was given special attention at the Summit. The goal is to consolidate all your data engineering and data science workloads into Snowflake without having to rely on external systems like Apache Spark.

Snowflake has partnered with Anaconda to provide access to its large library of Python packages, which is good news for data scientists. In addition, Snowflake will support Python user-defined functions (UDFs). This will allow you to seamlessly blend Python into Snowflake SQL.
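As an illustration, a Python UDF pairs an ordinary Python function (the handler) with a SQL registration. The handler below is plain, locally runnable Python; the function name `clean_email` is invented for this sketch, and the `CREATE FUNCTION` wrapper shown in the comment follows the documented shape of Snowflake's Python UDF syntax, so check the current docs before relying on it.

```python
# A plain Python function that could serve as a Python UDF handler.
# In Snowflake it would be registered with SQL along these lines:
#
#   CREATE OR REPLACE FUNCTION clean_email(addr STRING)
#   RETURNS STRING
#   LANGUAGE PYTHON
#   RUNTIME_VERSION = '3.8'
#   HANDLER = 'clean_email'
#   AS $$ ...the function below... $$;
#
# after which it is callable from ordinary Snowflake SQL, e.g.:
#   SELECT clean_email(email_column) FROM customers;

def clean_email(addr):
    """Normalize an email address: trim whitespace and lowercase."""
    if addr is None:
        return None  # SQL NULL passes through as Python None
    return addr.strip().lower()

print(clean_email("  Jane.Doe@Example.COM "))  # jane.doe@example.com
```

The appeal is that the logic lives next to the data: once registered, the function runs inside Snowflake wherever the SQL runs, with no separate Python service to deploy.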

It is important to note that this Python code will run in a strict sandbox. Access to the network, disk, or other I/O will be prohibited. If you need those capabilities, you may be better off using Snowflake's External Functions.

Other Announcements

Here are highlights about some of the other new Snowflake capabilities announced at the conference:

  • Snowflake has made several behind-the-scenes performance improvements, especially in AWS
  • Improved support for geospatial workloads
  • Support for lake house architectures with Apache Iceberg
  • Improved data masking
  • Better ways to manage spend using resource groups and budgets
  • Improved streaming pipelines
  • Ability to replicate all objects across regions, not just tables

How Our Snowflake Partner Benefits Help You

As a Snowflake partner, it's clear to us that Snowflake is committed to ensuring that we can support the unique needs of our clients across different industries. For example, we can now build expertise in new industry domains and vertical-specific solutions, layered on top of our existing technical expertise. Snowflake is providing support for multiple industries, the top three being healthcare, financial services, and manufacturing.

Due to our partnership status, Snowflake has opened all its content to us, giving our team greater knowledge and insight, especially into building solutions for our clients in specific verticals.

We’re excited Snowflake continues to improve and provide industry leading innovation in data infrastructure and data warehouse solutions – and even more excited to pass those solutions on to our clients.

About the Author:

Bradley Nielsen

Senior Tech Specialist

Bradley is a well-rounded developer in the field of data science and analytics. He has been a developer and architect on a wide range of data initiatives in multiple industries. Bradley's primary specialty is in data engineering: developing, deploying, and supporting data pipelines for big data and data science. He is proficient in Python, C#, SQL Server, Apache Spark, Snowflake, Docker, and Azure.

