Learnings from real-world Snowflake customer case studies

Within my Mastering Snowflake program I sprinkle real-world use cases throughout the modules in order to bring to life the capabilities Snowflake offers.

The feedback I get from members indicates that they really like these modules, as they demonstrate how the technical deployment of Snowflake features translates into tangible business value.

In this article I have picked out five interesting Snowflake case studies, as published on the Snowflake website, to walk you through. The aim is to give you an idea of the common pain points that lead customers to select Snowflake (often alongside other modern cloud-native tools), and the kinds of business benefits realized following implementation.

1. Ohio Bureau of Workers’ Compensation (OBWC)

OBWC is a provider of workers’ compensation insurance and serves over 257,000 employers, making it one of the largest state-run insurance systems in the US. In 2015 OBWC found itself hitting the limits of its on-premises SQL database, suffering from various performance and scalability issues such as storage limitations, resource contention, and infrastructure maintenance overhead. Addressing these issues within the existing solution would have been complex, costly, and risky, so OBWC decided that the future of its data platform was in the cloud.

Snowflake was selected due to its native SQL capabilities — meaning that the existing workforce could get up to speed quickly with Snowflake and continue to handle BI requests from the business throughout the cloud migration phase. This is an important point to consider as the business doesn’t stop working whilst the data teams migrate their existing solution to the cloud. Demonstrating value early on from any new cloud solution is also imperative to securing business buy-in and adoption.

Snowflake’s multi-cluster shared data architecture, with virtually unlimited scalability, solved OBWC’s data storage and compute challenges. This freed skilled resources to work on higher-value projects rather than firefighting the performance problems of the old solution.
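
For readers less familiar with this feature, here’s a minimal sketch of a multi-cluster warehouse definition. The warehouse name, size, and cluster counts are illustrative assumptions, not details from the case study:

```sql
-- Illustrative only: a multi-cluster warehouse that scales out
-- automatically as concurrency rises, then suspends when idle.
-- (Multi-cluster warehouses require Enterprise Edition or higher.)
CREATE WAREHOUSE IF NOT EXISTS bi_wh
  WITH WAREHOUSE_SIZE = 'MEDIUM'
       MIN_CLUSTER_COUNT = 1
       MAX_CLUSTER_COUNT = 4      -- add clusters under concurrent load
       SCALING_POLICY = 'STANDARD'
       AUTO_SUSPEND = 60          -- suspend after 60 seconds idle
       AUTO_RESUME = TRUE;
```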

dbt was introduced to handle data transformations whilst leveraging the power and scale of the Snowflake service. This made it easier for the team to redevelop their dimensional models, leading to better decision-making across actuarial, medical, claims, and other departments. The ease of connecting dbt and Tableau to Snowflake opened the door to self-service analytics for the business.
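
To give a flavour of what this looks like in practice, below is a hypothetical dbt model of the kind that might sit in a dimensional build like OBWC’s. The model and column names are invented for illustration:

```sql
-- models/marts/fct_claims.sql (hypothetical dbt model)
-- dbt compiles this Jinja-templated SQL and executes it in Snowflake,
-- materializing the fact table there.
{{ config(materialized='table') }}

select
    c.claim_id,
    c.claim_date,
    p.policy_id,
    c.claim_amount
from {{ ref('stg_claims') }} c
join {{ ref('stg_policies') }} p
  on c.policy_id = p.policy_id
```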

Snowflake’s data sharing feature was also deployed. This allowed for reader accounts to be set up which provided a convenient way to securely share data sets with other agencies. For example, OBWC relies on Snowflake reader accounts to share data from its Safety & Hygiene Department with the Ohio Department of Administrative Services.
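
The pattern behind this is straightforward: create a share over the relevant objects, then a reader account for the consuming agency. The sketch below uses invented object names and a placeholder account locator, not OBWC’s actual setup:

```sql
-- Hypothetical object names throughout; the account locator is a placeholder.
CREATE SHARE safety_hygiene_share;
GRANT USAGE ON DATABASE safety_db TO SHARE safety_hygiene_share;
GRANT USAGE ON SCHEMA safety_db.public TO SHARE safety_hygiene_share;
GRANT SELECT ON TABLE safety_db.public.inspections TO SHARE safety_hygiene_share;

-- A reader account lets an agency without its own Snowflake account consume the share
CREATE MANAGED ACCOUNT das_reader
  ADMIN_NAME = das_admin,
  ADMIN_PASSWORD = '<choose-a-strong-password>',
  TYPE = READER;

ALTER SHARE safety_hygiene_share ADD ACCOUNTS = <reader_account_locator>;
```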

Looking ahead, OBWC is aiming to add ML use cases on top of its dimensional models in Snowflake, targeted at rapidly identifying fraud and pinpointing claim causation. These are really interesting applications of ML within insurance, a sector where I personally have a lot of experience. If you’d like to know more about ML use cases for the insurance sector, please drop me a note in the comments below.

Data from the Snowflake Marketplace can also be leveraged to augment existing internal data and further enhance ML model accuracy. Snowflake’s support for Python through Snowpark (Snowflake’s developer framework for Python, Java, and Scala) allows models to be built without moving data out of Snowflake into another tool, further streamlining the analytical modeling process.
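
To illustrate the “no data movement” point, here’s a hedged sketch of a Python stored procedure that trains a model inside Snowflake via Snowpark. The table, feature columns, and model choice are all assumptions for illustration, not OBWC’s actual pipeline:

```sql
CREATE OR REPLACE PROCEDURE train_fraud_model()
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION = '3.10'
  PACKAGES = ('snowflake-snowpark-python', 'scikit-learn')
  HANDLER = 'run'
AS
$$
def run(session):
    # Hypothetical feature table -- the data never leaves Snowflake
    df = session.table("claims_features").to_pandas()
    from sklearn.linear_model import LogisticRegression
    model = LogisticRegression().fit(
        df[["claim_amount", "days_to_report"]],  # invented feature columns
        df["is_fraud"],                          # invented label column
    )
    return f"trained on {len(df)} rows"
$$;

CALL train_fraud_model();
```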

2. NatWest

NatWest is a long-standing retail and commercial bank in the UK. Its ESG (environmental, social, and corporate governance) team delivers climate and social responsibility insights to help the bank comply with regulations and help customers transition to net zero.

Financial markets need clear, comprehensive, high-quality information on the impacts of climate change. This includes the risks and opportunities presented by rising temperatures, climate-related policy, and emerging technologies in our changing world. In 2019, the Bank of England began stress-testing major financial institutions to understand the impact physical climate risks would have on liabilities and investments. Separately, the Financial Stability Board established the Task Force on Climate-related Financial Disclosures (TCFD) to improve and increase reporting of climate-related financial information.

Due to this increased importance, climate was elevated to NatWest’s 2030 goals, which meant its ESG data team needed to provide insights for a wide spectrum of climate analytics use cases across lending, investment, and compliance. NatWest’s existing on-premises systems weren’t equipped to handle these requirements or the large, varied data sets ESG reporting demands.

As NatWest primarily used a Microsoft-based tech stack, a SQL-based solution formed part of the core criteria for its vendor selection process. The ESG team was also looking for capabilities that would support rapid prototyping, so ideas could be proved quickly to secure buy-in and build momentum.

Many of the data providers used by NatWest, such as FactSet and S&P, were already publishing on Snowflake Marketplace. This meant more traditional methods of transferring data, such as SFTP, could be retired. Ultimately, Snowflake was selected as the next-generation data platform for NatWest’s ESG team.
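
For context, this is roughly what the consumer side of a data share looks like once a listing has been obtained; the provider account, share, and table names below are invented:

```sql
-- Hypothetical consumer-side sketch: the shared data mounts as a
-- read-only database -- no SFTP transfers, no local copies to manage.
CREATE DATABASE esg_factset FROM SHARE <provider_account>.esg_share;

-- Query it like any other database; it stays current with the provider
SELECT company_id, emissions_scope_1
FROM esg_factset.public.company_emissions
LIMIT 10;
```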

Similar to the Ohio use case we discussed earlier, the fact that Snowflake is based on SQL meant that the 40-strong team was productive very quickly. NatWest’s ESG team was also able to quickly decommission legacy on-premises systems, reducing both its costs and carbon footprint by around 80%.

Another similarity with the Ohio use case was that dbt was deployed, here to develop a ‘Data Guardian framework’, which appears to be a custom metadata-driven solution to profile data and manage data quality proactively. The granular access controls Snowflake offers, such as row- and column-level security and masking policies, were another key factor for a bank that no doubt handles a lot of PII data on a daily basis.
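
As a rough illustration of these controls (with hypothetical policy, role, table, and column names), a masking policy and a row access policy might look like this:

```sql
-- Column-level: mask email addresses for everyone except a PII admin role
CREATE MASKING POLICY pii_email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ESG_PII_ADMIN') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY pii_email_mask;

-- Row-level: analysts see only the rows for their own business unit
CREATE ROW ACCESS POLICY unit_filter AS (business_unit STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'ESG_ADMIN'
  OR (business_unit = 'RETAIL' AND CURRENT_ROLE() = 'RETAIL_ANALYST');

ALTER TABLE customers ADD ROW ACCESS POLICY unit_filter ON (business_unit);
```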

Looking ahead, NatWest plans to integrate Snowflake with the bank’s AWS infrastructure, which may help bring new data sitting on AWS into Snowflake, for example. Use of Snowpark is mentioned to support the build of ML models, as well as data clean rooms to securely share data with partners for a range of use cases.

3. Ramp

Ramp is a finance automation platform making it easier for companies to issue corporate charge cards, make bill payments, automate accounting processes, and monitor expenses. I picked this use case as it combines a number of modern cloud-native tools to achieve its goals.

Ramp has seen exponential growth since its start, increasing revenue and cardholders by 10x and 15x, respectively, year over year in 2021. This influx of new users created concurrency issues — resulting in costly, hard-to-debug failures, forcing the data team to spend too much time managing workloads. Whenever too many users tried to access a specific database, Ramp’s entire database would deadlock, and nobody could access the data. Scaling up to meet the demands of the various business teams and internal stakeholders required excessive work.

Ramp’s data team also wanted to give their business users access to the data they cared most about in the tools they use every day. This objective, mixed with the challenges of Ramp’s previous data warehouse, led the company to adopt a new data stack that could not only scale with Ramp’s data team but also provide a way for business teams to access the data sitting in the analytics layer.

Looking for a scalable and flexible solution that could act as a single source of truth, Ramp quickly landed on Snowflake. As a fully managed data platform, the Data Cloud eliminated the bottlenecks Ramp had been facing.

Snowflake automatically manages all of the underlying maintenance in the background, allowing the data team to focus their time on transforming the data and building models that can positively impact Ramp’s bottom line. Snowflake scales automatically as Ramp grows, and the data team no longer has to worry about key tables getting locked or dashboards freezing. With Snowflake, Ramp’s data team can automate all of the manual tasks that admins were forced to manage on the previous platform.

Ramp collects valuable customer data from a variety of different data sources, including Postgres, Salesforce, HubSpot, Zendesk, Outreach, and various ad platforms. Spinning up additional software infrastructure to clean and transform this data in the analytics warehouse was difficult. The team built new data models manually without testing or version control, resulting in variations of core metrics.

Ramp uses dbt to standardize around a core set of metrics and fully automate, schedule, and run every transformation job. With dbt jobs running in Snowflake, Ramp’s data team can aggregate, transform, and enrich all of their customer data in Snowflake to build a complete 360-degree view of the customer. This data is then used to create risk profiles for specific users and improve personalization, whether it’s in the app, on the website, or through automated marketing campaigns.
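
A customer-360 model of this kind might look something like the hypothetical dbt model below; the staging models and columns are invented for illustration:

```sql
-- models/marts/customer_360.sql (hypothetical dbt model)
-- Joins staged sources (app database, Salesforce, HubSpot, Zendesk)
-- into one wide customer view, materialized in Snowflake.
{{ config(materialized='table') }}

select
    u.customer_id,
    u.signup_date,
    sf.account_owner,
    hs.lifecycle_stage,
    zd.open_tickets
from {{ ref('stg_app_users') }} u
left join {{ ref('stg_salesforce_accounts') }} sf using (customer_id)
left join {{ ref('stg_hubspot_contacts') }} hs using (customer_id)
left join {{ ref('stg_zendesk_tickets') }} zd using (customer_id)
```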

Building data models in dbt is one thing, but activating them in downstream sales and marketing channels is another. With so many valuable dbt models living on Snowflake, Ramp wanted to use this data to enrich Salesforce.

Any time Ramp’s data team wanted to move data, they were forced to manually build and maintain various Python scripts. This was a challenge because Ramp’s business teams wanted access to this data in Salesforce and HubSpot to improve personalization. Since adopting Hightouch for reverse ETL, Ramp has created an entirely new outbound automation team that now drives 25% of all sales pipeline. Staffed by data engineers, this team collaborates closely with Ramp’s marketers to identify target prospects and deliver customized emails at scale to the right person at the right time. It has become the single lowest-cost customer acquisition channel for Ramp.

Since adopting a modern data stack, Ramp can go from ideation to validation to iteration, and set up fully functional operational workflows and marketing campaigns in less than a day. For Ramp, the future lies in continuing to find ways to help businesses become better, more profitable versions of themselves through finance automation that maximizes the output of every dollar and hour. The company is looking to build out its automation platform to reach customers across every expense, payment, purchase, application, and insight, from reporting to forecasting.

4. Veradigm

I wanted to include a broad range of workloads and industries in these case studies, and Veradigm, which provides data, insights, and technology to healthcare providers, payers, and life sciences organizations to help improve patient healthcare, does just that. It runs collaboration, data engineering, data warehouse, data lake, data science, and application workloads in Snowflake.

Veradigm wanted to establish a unified platform to bring together multiple fragmented, inconsistent technologies. That environment made it difficult to govern data and apply a consistent approach to data quality. Onboarding a new data source took nine months, which gives you an indication of the complexity and inefficiency buried within the environment. Some tables were so large that Veradigm’s data team had to break columns into smaller pieces, which led to additional complexity and time-consuming development work.

Veradigm successfully migrated data to Snowflake within two months of kickoff. The entire migration project, which involved re-platforming multiple reporting and downstream processes, took less than one year.

Snowflake’s multi-cluster shared data architecture with automatic scaling of storage and compute solved Veradigm’s performance issues at a lower cost. Additionally, the Snowflake Marketplace offered a scalable approach for providing Veradigm’s customers secure, governed access to Veradigm’s data. Also, Snowflake’s ability to provide a single and seamless experience across multiple public clouds aligned with Veradigm’s multi-cloud strategy.

Ongoing cost optimizations, such as granular control over the performance needed to meet the business SLA for each independent workload and refined auto-suspend settings, are also easier with Snowflake. Veradigm can now onboard new data sources in a matter of weeks instead of months.
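
To make those cost levers concrete, here’s an illustrative sketch of the per-workload pattern; the warehouse names and settings are assumptions, not Veradigm’s actual configuration:

```sql
-- Each workload gets its own warehouse sized to its SLA,
-- so tuning one never affects another.
ALTER WAREHOUSE reporting_wh SET
  WAREHOUSE_SIZE = 'LARGE'    -- sized to meet the BI SLA
  AUTO_SUSPEND = 60;          -- suspend quickly between bursty queries

ALTER WAREHOUSE batch_etl_wh SET
  WAREHOUSE_SIZE = 'XLARGE'
  AUTO_SUSPEND = 300;         -- longer timeout suits back-to-back jobs
```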

Snowpark will play an important role in Veradigm’s upcoming machine learning initiatives. “Snowflake is evolving, and we want to do data science inside the platform,” Veradigm’s Vice President of Development, Sue Davis, said. Powering data science workloads with Snowflake will reduce Veradigm’s reliance on data extracts to third-party data science tools.

Publishing additional data listings to Snowflake Marketplace can help Veradigm create awareness with a broader customer base. According to Lucarelli, “We believe Snowflake Marketplace gives us an opportunity to effectively share and showcase our data to healthcare companies and beyond.”

5. EDF

Finally, we bring our attention to the energy sector, a hot topic in the past 12 months due to rapidly increasing prices of electricity and gas. EDF is a leading energy supplier, supplying gas and electricity to homes and businesses across the UK.

EDF wanted to use data science to help support its customers, specifically to help them use energy more efficiently and to identify financially vulnerable customers. It had an in-house solution called the Customer Analytics Zone (CAZ) for developing and deploying these ML models. Unfortunately, deriving tangible insights from it wasn’t particularly easy.

Getting access to the right data to build the models, and getting those models into production, plagued the existing processes. The use cases in focus also contained a lot of sensitive data, resulting in several sign-off stages and requests to the data team to provision extracts from the data lake, all of which added time and complexity. Furthermore, responsibility for the security and governance of that data was passed to the data science team.

In 2022, after spending four months trying to deploy a single ML model in CAZ, EDF decided it needed a new customer data platform and an entirely new approach. The energy provider turned to the Snowflake Data Cloud to provide a central source of easily accessible data for its new Intelligent Customer Engine (ICE). EDF now uses Snowflake and its Snowpark development framework to allow its data scientists to use Python, the language of their choice, and bring ML models into production on AWS SageMaker. EDF is taking advantage of the AWS and Snowflake partner relationship to provision enterprise-grade machine learning operations (MLOps) and data science capabilities.

Snowflake offered benefits unmatched by competitors, including elastic scalability and a language-agnostic processing engine with Snowpark. EDF now uses Snowflake as a central repository for all of its customer data. And through Snowpark’s support for Python and SQL, business users can manipulate that data exactly where it lies, deploying end-to-end machine learning to uncover the insights that make customers’ lives easier. It’s a world away from the complex and cumbersome environment that existed before.

One notable example among these new products is a machine learning algorithm designed to identify customers who are financially vulnerable. The model flags those who have regular periods of disconnection, or those who top up meters irregularly with small amounts of money, giving EDF the opportunity to provide assistance in a time of need.
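
EDF’s actual model isn’t published, but the kind of feature engineering behind a vulnerability flag might start with an aggregation like this purely illustrative query (table and column names invented):

```sql
-- Per-customer features over the last six months: disconnection
-- frequency, typical top-up size, and how regularly the meter is used.
SELECT
    customer_id,
    COUNT_IF(event_type = 'DISCONNECTION')        AS disconnection_count,
    AVG(top_up_amount)                            AS avg_top_up,
    COUNT(DISTINCT DATE_TRUNC('week', event_ts))  AS active_weeks
FROM meter_events
WHERE event_ts >= DATEADD('month', -6, CURRENT_DATE())
GROUP BY customer_id;
```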

EDF feels the platform will also help strengthen that team, attracting the kind of talent that is eager to explore the latest technologies and cloud-based data platforms. “It helps with both attraction and retention,” Rebecca Vickery, Data Science Lead said. “Having a modern platform and providing the opportunity to upskill on Snowflake is a huge pull for people looking to progress their careers. And, for our existing data scientists, not wrestling with data management and governance to get to the interesting part of our job, building models and generating insights, makes EDF a much better place to work.”

With this streamlined platform, EDF’s data scientists are able to focus on their primary initiative — taking care of customers, helping them through difficult times, and providing the bespoke products and services that lead to higher customer satisfaction and retention.

EDF recently released its energy hub analytics to customers, too, enabling them to use their own smart meter data to track their energy usage quickly and accurately. Customers can now access vital insights into how they can reduce consumption, which can greatly reduce both carbon footprints and energy bills.

Summary

You will have noticed some common themes emerging from these case studies. Many clients were suffering from the constraints of their existing infrastructure. We saw instances of on-premises technologies that were costly and hard to scale. This created performance issues, leading business users to try to find their own solutions. A by-product of this fragmented infrastructure is data silos, which make it difficult to manage and govern data.

Deriving insights from data is one thing, but getting AI/ML models into production is a very common hurdle clients face. We certainly saw this to be the case with EDF.

For all these clients, the introduction of Snowflake immediately solved the scale and performance challenges. As Snowflake follows a consumption-based model, the barrier to entry is lower for smaller clients. The fact that it uses native SQL also reduces the learning curve for existing data teams. Additionally, if customers have workloads across different cloud providers, Snowflake offers a way to bring those together in one place, or a true multi-cloud deployment to satisfy InfoSec requirements.

One common theme is that Snowflake is never introduced alone. You need to consider how to get the data in, transformed, and governed appropriately, and this is where complementary tools like Hightouch, dbt, and Alation come into play to form the modern data platform.

Finally, looking to the future, many of the clients we discussed are looking to leverage semi-structured and unstructured data, and to integrate tools such as AWS SageMaker and Snowpark with Python into their environments. Snowflake Native Apps will no doubt break new ground as organisations look to provide better, more relevant products to their customers too.

To stay up to date with the latest business and tech trends in data and analytics, make sure to subscribe to my newsletter, follow me on LinkedIn and YouTube, and, if you’re interested in taking a deeper dive into Snowflake, check out my books ‘Mastering Snowflake Solutions’ and ‘SnowPro Core Certification Study Guide’.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

About Adam Morton

Adam Morton is an experienced data leader and author in the field of data and analytics with a passion for delivering tangible business value. Over the past two decades Adam has accumulated a wealth of valuable, real-world experiences designing and implementing enterprise-wide data strategies, advanced data and analytics solutions as well as building high-performing data teams across the UK, Europe, and Australia.

Adam’s continued commitment to the data and analytics community has seen him formally recognised as an international leader in his field when he was awarded a Global Talent Visa by the Australian Government in 2019.

Today, Adam works in partnership with Intelligen Group, a Snowflake pureplay data and analytics consultancy based in Sydney, Australia. He is dedicated to helping his clients to overcome challenges with data while extracting the most value from their data and analytics implementations.

He has also developed a signature training program that includes an intensive online curriculum, weekly live consulting Q&A calls with Adam, and an exclusive mastermind of supportive data and analytics professionals helping you to become an expert in Snowflake. If you’re interested in finding out more, visit www.masteringsnowflake.com.
