One Thousand and One Days in the Snowflake Cloud

Genius is one percent inspiration and ninety-nine percent perspiration. – Thomas Edison

Did you know perspiration can be automated? Or was that precipitation? I feel the snow coming on.

Hi, I’ve been flying in the precipitous Snowflake clouds for over 1,000 days now, and I’m loving it!

As a former on-premises enterprise architect turned cloud data platform architect, I’ve got perspective. Here are some of my takeaways from 1,000 days in the cloud. As you read, keep your feet on the ground and reach for the Snowflake Cloud!

Compact

Companies and institutions are transitioning, in fits and starts, from legacy on-premises systems of storage and computation to newer, more compact, hosted machines that are easier to access, easier to use, and more abstracted, getting work done automatically.

Gains in process and storage automation, energy consumption, resource allocation, and operations as a service, together with a move towards operational expenditure, have incentivised businesses to pool their resources and shift massively to shared infrastructure, leading to economies-of-scale savings.

What to do? Augment, refurbish, and replace where it makes sense; don’t just bolt on yet another solution. Open your mind and institutions to sharing, with controls. Consider what durability and high availability are built in from day one. Build robust disaster recovery across clouds and regions. Reduce your operational overhead. Better manage mergers and divestitures through sharing and segmentation.

Automation advice. Decide what you hate doing the most, and then find something to automate it. A colleague threw some ABAP at a chatbot the other day to get some SQL back.

Share

The positive is greater transparency on costs; the negative is that those costs are now itemised and distributed. Business consumers, who before were running machines 24 hours a day, seven days a week on the global IT budget, now have to think about when to turn the power off, on, and up.

Change management advice. Baseline before you start your project: understand and communicate current costs, and build metrics to judge success. I had a client who complained Snowflake was costing 10K per month. Compared to what, I asked. The 4M-per-year license cost they were paying before?

When comparing, compare end-to-end. Take a look at startup and shutdown times if you’re paying by the second; it makes a difference. Snowflake, by managing the pool of compute clusters, allocates computing resources in 1–2 seconds, not minutes or hours like other players. Deallocation happens in a second. Some other players cannot even scale down the size of their installations.
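As a sketch of what that per-second cost control looks like in practice (the warehouse name and settings here are illustrative, not from any real deployment), a Snowflake warehouse can be told to suspend itself when idle and wake up on the next query:

```sql
-- Illustrative only: a small warehouse billed per second that
-- suspends after 60 seconds of inactivity and resumes on demand.
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60           -- idle seconds before suspending
  AUTO_RESUME = TRUE          -- wake automatically on the next query
  INITIALLY_SUSPENDED = TRUE; -- don't bill until first use
```

With settings like these, you pay only while queries are actually running, which is exactly why startup and shutdown latency matters in an end-to-end comparison.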

Scale Wide

Legacy software vendors have fundamentally limiting architectures that only scale horizontally so far. They prefer to fall back on one big, vertically scaled machine, which means expensive. This exposes a fundamental architectural difference from the cloud platforms, which were built to scale massively on cheaper machines.

Bubbles advice. When I came to France in 1995, Minitels were still around. Someone had forgotten to mention personal computers to them. Get out of your bubble, and don’t trust your search engine to give you unbiased advice; the algorithm suggests based on your previous selections.

Gravitate

As the hyperscalers have had better mastery of energy, resources, and physical machines than of services, new ecosystems have grown up: native to complex governance and security environments, with separation of compute and storage, massively parallel processing, and a focus on ease of use.

There is a huge gravitational pull from the data housed there, so much so that it has now become more economical to bring the computational resources and higher-level applications to the data.

Far from being black holes, marketplaces of data and applications become available at the click of a button.

Snowflake is massively extending its reach with its Native Applications Framework and container services as well.

Gravity advice. Gravity doesn’t exist. Einstein replaced Newton’s explanation with a better one. I didn’t understand that when I was in school. Check out David Deutsch’s The Beginning of Infinity.

Errrrp!

Of course, keep your ERP! But these systems need to be kept lean and mean, and not conflated with data that doesn’t fit their purpose. Why? Under too much load, they slow down and fall over. On the other hand, lots of other applications can and should be moved to the cloud, to handle demand and take advantage of all that information out there.

Transactional systems need to be augmented with knowledge-actioning services that effect change: automated systems that are no longer resource-constrained, and knowledge services that can easily and effortlessly share and mine internal data and the data of others.

Don’t drink the Kool-Aid. I almost had a fit yesterday: a customer is deploying a brand new ERP system but keeping 20-year-old OLAP technology to run the analytics. They are missing out on the sharing economy, not to mention open standards and infinite reach.

Hal?

You want a knowledge service that is easy to use and that easily meshes with new technologies as they come online, such as Large Language Models. A knowledge service that brings functions online quickly and seamlessly. No downtime. No stability issues. No little dials.

Look for a service into which you can put all your data natively, whether it’s coming from your transactional systems or customer satisfaction forms, from dental X-rays or geographical data.

People still have the creative impetus over machines. Enlist the creativity and responsibility of your team members. People tend to support what they help create. Who knows the business value of the data best? Consider data mesh principles to structure your team and flows. Enable natural language processing.

Hop

Technology compacts; make sure you are reducing the hops your data takes to its destination, not adding hops or copies. Zero-copy cloning and Time Travel come to mind as great space and time savers.
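As a sketch of those two features (the table names are hypothetical), a clone shares the underlying storage with its source until one of them changes, and Time Travel queries a past state without a restore:

```sql
-- Illustrative only: clone a table without duplicating storage.
-- The clone and the original share data until either is modified.
CREATE TABLE orders_dev CLONE orders;

-- Time Travel: query the table as it was one hour ago,
-- no backup restore or extra copy required.
SELECT * FROM orders AT (OFFSET => -3600);
```

Both operations avoid the extra hops and copies that the paragraph above warns about: no export, no reload, no second storage bill for unchanged data.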

Companies driven by quarterly results often carry the biggest technical debt, because managers are incentivised to squeeze, squeeze, squeeze. They never have time to decommission older systems. Consider making digital twins of your older systems in Snowflake, then decommissioning them. If people need access to the data, they still have it, cheaply.

Create

Provide your users with a system where they supply the one percent of inspiration, and then automate the ninety-nine percent of perspiration!

They will love you for it.


David Richert
Snowflake Builders Blog: Data Engineers, App Developers, AI/ML, & Data Science

Before joining Snowflake, I worked at SAP for 18 years in technical sales for their analytics portfolio. Snowflake fills the big data gap.