
Snowflake: Say Hello to the Data Cloud!

What is the future of Snowflake, and how have they begun to differentiate themselves?

James Anderson
Published in Slalom Technology
8 min read · Jun 8, 2020


Since 2012, Snowflake has constantly pushed themselves to innovate and enhance their product based on customer feedback, and that initiative remains the same today. Last Tuesday, in lieu of their annual Snowflake Summit (canceled due to the COVID-19 outbreak), Snowflake hosted a 3-hour virtual product launch, the theme of which was “Welcome to the Data Cloud.” The vision is bold, but the message is clear: Snowflake wants to be the first complete end-to-end cloud data solution for any enterprise. And with the announcements that were made, it’s clear that this isn’t just marketing fluff.

One of the most frequent client challenges we encounter is the question of how best to manage customer information. There are many customer-specific tools in the marketplace today, all of which can build a customer data hub that integrates with other applications, but most are tailored to specific functions (marketing, sales, service) and all tend to fall short of the vision of a single customer view. While Salesforce is typically the primary source of customer information, essential data is often scattered across the organization in other systems, making it difficult to complete the portrait of a customer. With the new partnership between Salesforce and Snowflake, Snowflake appears well-positioned to complement Salesforce as a much-needed general-purpose data storage, integration, and processing platform.

So, how do these newly available features help Snowflake validate their claim of being the "Data Cloud"? And specifically, what do they bring to Salesforce customers who want to unlock more insights from their customer data? There are a number of new capabilities that, if leveraged effectively, will give an enterprise a head start on getting value from the eventual integrated flow of data from Salesforce to Snowflake.

External Functions for Customer Analytics

One of the benefits that a Customer Data Platform tool provides is the ability to quickly ingest and enhance a customer record. This is usually accomplished through a basic matching algorithm, though some tools offer more advanced functionality that can run pre-built AI/ML models for predictive scoring. These tools have the compute power built in, but they require platform-specific knowledge to deploy more advanced models, and they lock the use of those models into the tool itself. You may even be limited in which languages your model can be written in.

With the release of External Functions, Snowflake has given users the ability to call on functions outside of the Snowflake ecosystem and build them into their existing ETL/ELT pipelines. For example, if you had an ML model that was deployed as an AWS Lambda function for lead scoring, you would not need to leave Snowflake to run the model against a customer record. As long as the function had an HTTPS endpoint, and could accept and return JSON messages, an external function could be created in Snowflake that points to the Lambda function and provides a score for the lead record as it’s being processed into the Snowflake database. Once that’s processed, you could easily report back on those leads via an Einstein or Tableau dashboard, or even create a workflow to send that data point back to the lead record in Salesforce.
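As a rough sketch of what that wiring could look like, the statements below register an API integration and an external function, then call the function during load, all through the snowflake-connector-python library. The connection settings, integration name, IAM role ARN, endpoint URL, target tables, and the `score_lead` signature are hypothetical placeholders; in practice the Lambda typically sits behind a proxy service such as Amazon API Gateway, and the exact parameters depend on your cloud setup.

```python
# Sketch: registering an AWS-backed external function in Snowflake and calling it
# during load. All names, ARNs, and URLs below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="CUSTOMER_DB",
    schema="STAGING",
)
cur = conn.cursor()

# 1. An API integration tells Snowflake which proxy endpoint (e.g. the API
#    Gateway in front of the Lambda) it may call, and with which IAM role.
cur.execute("""
    CREATE OR REPLACE API INTEGRATION lead_scoring_api
      API_PROVIDER = AWS_API_GATEWAY
      API_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-ext-fn'
      API_ALLOWED_PREFIXES = ('https://example.execute-api.us-east-1.amazonaws.com/prod/')
      ENABLED = TRUE
""")

# 2. The external function maps a SQL signature to that endpoint; Snowflake
#    sends rows as JSON and expects JSON scores back from the Lambda.
cur.execute("""
    CREATE OR REPLACE EXTERNAL FUNCTION score_lead(lead_attributes VARIANT)
      RETURNS VARIANT
      API_INTEGRATION = lead_scoring_api
      AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/score-lead'
""")

# 3. In the ELT pipeline, the model is now just another SQL function call.
cur.execute("""
    INSERT INTO CUSTOMER_DB.CORE.SCORED_LEADS
    SELECT lead_id,
           score_lead(OBJECT_CONSTRUCT(*)) AS lead_score
    FROM STAGING.RAW_LEADS
""")

cur.close()
conn.close()
```

Once the scored rows land, the same table can feed an Einstein or Tableau dashboard, or a workflow that writes the score back to the lead record in Salesforce.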

For a Salesforce customer, this becomes an incredibly powerful way to enhance lead, customer, or even opportunity data without having to orchestrate many different workflows or move your data all over the place to get the insights required. This has traditionally been a barrier to entry for many Salesforce customers, since the level of effort to build, train, and especially deploy these models as a production workload is high. But by getting all customer data loaded into Snowflake, tools like DataRobot and Amazon SageMaker will have all the data required to support building and training ML models, and a data engineer can take advantage of the trained models in production ELT workloads simply by executing an External Function.

Dynamic Data Masking for Customer Privacy & Innovation

One of the most common questions that comes up when dealing with customer data is how best to secure it. PII and PHI are among the most sensitive categories of data an organization holds, and keeping them secure is critical. Data points like name, phone number, email address, and even physical address require an enhanced level of security that makes IT leaders take a hard look at the capabilities of any system that holds customer data. Salesforce has always been very serious about the security of the data they hold: the back-end database keeps the data masked and only unmasks it for a front-end user who is shown to have access to that data.

Snowflake has also always been very security-focused, with their Business Critical tier known to be PCI and HIPAA compliant out of the box. But there was never a masking solution in place that would let the users who need to see data unmasked do so, while keeping it masked for everyone else. This was usually handled through secure views and row-level security, which leaves an organization vulnerable to development mistakes. With the introduction of Dynamic Data Masking, the data loaded into Snowflake can be considered just as secure as the data in Salesforce, while minimizing the risk that someone gains access to the underlying data sets or that security policies are not applied correctly to a new table or view. Salesforce has put a lot of energy into providing services to their customers around GDPR and the California Consumer Privacy Act (CCPA), supporting the security and privacy requirements that come with maintaining customer data in this digital world, and Snowflake is now better equipped to support those requirements.
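As a minimal sketch of what that looks like in practice (the role, table, and column names here are hypothetical), a masking policy is defined once and attached to a column, after which it is enforced for every query against that column:

```python
# Sketch: a column-level masking policy in Snowflake. Role and object names
# are hypothetical; the policy logic would follow your own access model.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="security_admin", password="...",
    database="CUSTOMER_DB", schema="CORE",
)
cur = conn.cursor()

# The policy returns the real value only to approved roles; every other role
# sees a masked placeholder, regardless of which query or tool they use.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
      RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('MARKETING_OPS', 'DATA_STEWARD') THEN val
        ELSE '***MASKED***'
      END
""")

# Attach the policy to the column; enforcement happens at query time.
cur.execute("""
    ALTER TABLE CUSTOMER_DB.CORE.CONTACTS
      MODIFY COLUMN email SET MASKING POLICY email_mask
""")

cur.close()
conn.close()
```

Because enforcement happens at query time on the column itself, views and dashboards built on top of the table see the same masked values unless the querying role is approved.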

So, how can Dynamic Data Masking not only provide better security for your customer data, but also enable more innovation? Well, some customer data can only be used for certain things. For example, a mortgage originator cannot use listing data for marketing purposes, but with data masking, data scientists could potentially use masked data to train models where restricted (PII or other legally protected) data elements are not needed. Data masking allows for greater data innovation by letting people easily use the pieces of data they can see, as opposed to restricting access to the entire data set. Going forward, this would be the preferred way to manage role-based access to data for knowledge workers.

Private Data Exchange for Enablement & Enrichment

One of the largest value propositions that Snowflake has touted for their product has been the ability to centralize an enterprise's data and make it available to everyone within the organization. However, the challenge has been how to govern that data and make it possible for others in the organization to see which data sets they now have access to. Investments in data catalog tools have helped to a certain extent, but a fair amount of process and setup is required to maximize the value of that investment. By using the Private Data Exchange, organizations can proactively improve data awareness and transparency by establishing their own internal data marketplace.
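Listings on an exchange are configured through the Snowflake UI, but the object being published is ultimately a secure share. As a rough sketch of those underlying mechanics (the database, share, and account names below are hypothetical), a team publishing a curated customer data set might do something like this:

```python
# Sketch: the secure-share mechanics that sit underneath an internal data
# exchange listing. Database, share, and account names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="data_owner", password="...",
)
cur = conn.cursor()

# Package the curated customer data the team wants to publish.
cur.execute("CREATE SHARE IF NOT EXISTS customer_360_share")
cur.execute("GRANT USAGE ON DATABASE CUSTOMER_DB TO SHARE customer_360_share")
cur.execute("GRANT USAGE ON SCHEMA CUSTOMER_DB.CURATED TO SHARE customer_360_share")
cur.execute("""
    GRANT SELECT ON TABLE CUSTOMER_DB.CURATED.CUSTOMER_BEHAVIOR
      TO SHARE customer_360_share
""")

# Make the share visible to another Snowflake account in the organization
# (or, via the exchange UI, publish it as a listing other teams can discover).
cur.execute("ALTER SHARE customer_360_share ADD ACCOUNTS = ORG_MARKETING_ACCT")

cur.close()
conn.close()
```

Consumers see the shared objects as a read-only database in their own account; no data is copied or moved.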

For most enterprises, there are many ways that the organization interacts with a customer, whether that's through multiple inbound and outbound sales channels, marketing, e-commerce, or even customer service. For the most invested Salesforce customer, there are products and clouds that support each of these functions, each with its own data silo collecting data on a customer. Even if that data is centralized into a large data lake or data warehouse, it's difficult for these silos to become cross-functional and provide insights on customer behavior outside of their own channel. By setting up a Private Data Exchange in Snowflake, each silo can begin to broadcast the data sets it has, making the data available for other groups to leverage, develop new insights and innovations about their shared customer base, and publish those back to the Exchange for the other groups to see. This is the definition of a strong federated operating model, and it is easily enabled by the Private Data Exchange. But why limit this to only internal use cases?

Since the Data Marketplace was launched at last June's Snowflake Summit, there has been a focus on getting more organizations to monetize their data assets. And while many companies have begun to leverage the Marketplace, not many of the larger enterprises have been part of that wave. Yet some of the largest companies in the world are sitting on vast amounts of data that could easily be monetized, especially data related to customer behavior and buying patterns. So why are they not pushing their data to the Snowflake Data Marketplace? Because they don't want everyone to see what they're selling. Oftentimes the privacy constraints and contract legalities of commercial data-sharing agreements are too sensitive to be broadcast in a public forum. There is also the concern of revealing sensitive intellectual property via metadata or data set descriptions alone.

With the launch of the Private Data Exchange, the Fortune 100 can begin to more securely explore the idea of sharing their data with their partners and customers. Rather than any Snowflake customer being able to see what data sets and data categories are being shared, an enterprise can enter into more formal agreements with its customers, maintaining security and compliance before the customer gets anywhere close to the contents of the data sets that are available.

What’s Next for Snowflake?

Many of the new features are available in either Public Preview or Private Preview today, but the most groundbreaking one, the Einstein Analytics Outbound Connector for Snowflake (more details on that here), will not be available until the fall at the earliest. And while there were not many features specific to the Salesforce and Snowflake integration in this release, there were many capabilities that set the foundation for customers to truly harness the power of their Salesforce data. With the upcoming Einstein Analytics Outbound Connector for Snowflake providing near real-time sync of data from Salesforce to Snowflake, organizations that want to rapidly accelerate their customer analytics, and lay the foundations of their enterprise data cloud, should be taking a long, hard look at Snowflake.

As experts in both Snowflake and Salesforce, Slalom is uniquely positioned to provide our clients not only immediate value from their customer data, but also a strategic vision and roadmap for how best to leverage this new technology partnership. With new capabilities like External Functions, Dynamic Data Masking, and the Private Data Exchange, large Salesforce customers should be able to maximize the value of their customer data and reap the rewards. By partnering with Slalom, you and your company can build a complete picture of your customers, ultimately helping you serve them even better than you do today.


James Anderson
Slalom Technology

Sales Engineering Leader @ Snowflake. All opinions expressed are my own.