The competition in the cloud data warehouse market is certainly heating up! Amazon remains the dominant force with AWS Redshift, but Snowflake is hot on its heels.

With Snowflake’s innovative approach to scaling, performance and cost, we’ve started moving all of our KNIME cloud data warehouse workflows from Redshift to Snowflake. Using KNIME’s database nodes, we can automatically create and launch Snowflake warehouses and databases and configure data sources.

Luckily, just like the AWS Redshift drivers, the Snowflake drivers are easy to install and configure. Find out how below:

Configuring the KNIME JDBC Driver

1. Download the JDBC driver from the Snowflake website via Maven and save it in a permanent directory that you’re not going to delete. (You’ll want the jar file, not the javadoc.jar.)

2. Open KNIME Analytics Platform and navigate to your preferences. I’m using an Apple Mac so your screen may vary slightly.

3. Expand the KNIME section and select Databases. Then select Add file and select the .jar file you downloaded in Step 1.

KNIME JDBC Driver Interface

4. Once you’ve hit Apply and Close, validate that the JDBC Driver is correctly installed by adding a KNIME Database Connector node to a KNIME workflow.

Database Connector Node

5. Configure the Database Connector node by selecting the newly installed net.snowflake.client.jdbc.SnowflakeDriver Database driver in the dropdown.


6. Replace the Database URL with your Snowflake instance details:

jdbc:snowflake://<account_identifier>.snowflakecomputing.com/
You can add the name of the warehouse or database to this connection string, for example:

jdbc:snowflake://<account_identifier>.snowflakecomputing.com/?warehouse=sf_tuts_wh&db=sf_tuts
One thing I would recommend adding to your connection URL is:

jdbc:snowflake://<account_identifier>.snowflakecomputing.com/?CLIENT_SESSION_KEEP_ALIVE=true

This prevents nodes further down the workflow from timing out and avoids token/authentication errors.
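To make the shape of the connection string clearer, here's a small Python sketch that assembles a Snowflake JDBC URL from an account identifier plus optional query parameters. The account name "myaccount" is a placeholder, and the helper itself is illustrative, not part of KNIME or the Snowflake driver:

```python
def snowflake_jdbc_url(account, **params):
    """Assemble a Snowflake JDBC URL from an account identifier and
    optional query parameters (warehouse, db, CLIENT_SESSION_KEEP_ALIVE...)."""
    base = f"jdbc:snowflake://{account}.snowflakecomputing.com/"
    if params:
        query = "&".join(f"{key}={value}" for key, value in params.items())
        return f"{base}?{query}"
    return base

# "myaccount" is a placeholder -- substitute your own account identifier.
url = snowflake_jdbc_url("myaccount", warehouse="sf_tuts_wh",
                         db="sf_tuts", CLIENT_SESSION_KEEP_ALIVE="true")
```

Building the URL this way makes it easy to toggle parameters like CLIENT_SESSION_KEEP_ALIVE without hand-editing the string in each workflow.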

7. Finally, enter your username and password, hit Apply and run the node. If it succeeds, you’ll get a green indicator light.

Dynamically Creating / Scaling Snowflake Instances

KNIME Workflow to create Snowflake objects

One of the great benefits of KNIME Analytics Platform is being able to create a workflow that dynamically adjusts your cloud data warehouse to optimise for ETL/ELT or data science processes. By leveraging the Database SQL Executor node, we can create a workflow that creates and launches a Snowflake warehouse and database before uploading and processing data.

Following Snowflake’s example:

Create database

create or replace database sf_tuts;

Create warehouse

create or replace warehouse sf_tuts_wh with
auto_suspend = 180
auto_resume = true;

Create CSV Format

create or replace file format my_csv_format
type = csv
field_delimiter = '|'
null_if = ('NULL', 'null')
empty_field_as_null = true
compression = gzip;
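To see what data matching that file format looks like, here's a small Python sketch that writes pipe-delimited rows (with None rendered as the literal NULL, per null_if) and gzip-compresses them. The rows and column layout are made up for illustration:

```python
import csv
import gzip
import io

# Sample rows -- purely illustrative. None becomes the literal string
# "NULL", which the file format's null_if clause maps back to SQL NULL.
rows = [("1", "Amy", None), ("2", "Bob", "Perth")]

buf = io.StringIO()
writer = csv.writer(buf, delimiter="|", lineterminator="\n")
for row in rows:
    writer.writerow("NULL" if value is None else value for value in row)

# Compress with gzip, matching the file format's compression = gzip.
payload = gzip.compress(buf.getvalue().encode("utf-8"))
```

A file produced this way lines up field-for-field with my_csv_format, so Snowflake's COPY INTO can load it directly.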

When you run the workflow, it’ll automagically create your warehouse, database and file format. You can chain together as many initiation SQL statements as required. To drop it all at the end, just add a node with:

drop database if exists sf_tuts;
drop warehouse if exists sf_tuts_wh;
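One way to keep the setup and teardown statements in sync is to hold them as paired lists and generate the script text for each SQL Executor node from them. This is a small Python sketch using the statements from this walkthrough, not anything KNIME requires:

```python
# Statement pairs from this walkthrough, kept together so every object
# created at the start has a matching drop at the end.
SETUP = [
    "create or replace database sf_tuts",
    "create or replace warehouse sf_tuts_wh with auto_suspend = 180 auto_resume = true",
]
TEARDOWN = [
    "drop database if exists sf_tuts",
    "drop warehouse if exists sf_tuts_wh",
]

def as_script(statements):
    """Join individual statements into one semicolon-terminated script,
    ready to paste into a Database SQL Executor node."""
    return "\n".join(statement + ";" for statement in statements)
```

Generating both scripts from the same source makes it harder to forget a drop statement and leave a warehouse running (and billing) after the workflow finishes.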

Automate your Data Warehouse

A complete KNIME workflow that takes Excel and JSON data, uploads it to S3 and automatically creates tables in Snowflake based on source data metadata.

Once you have everything in place, you can use the KNIME Database Table Creator node with dynamic settings to create tables based on the data supplied. This means that, as new columns are added to sources, your data warehouse tables can automatically be rebuilt with the correct column types and sizes.
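The idea behind that metadata-driven table creation can be sketched in a few lines of Python: infer a Snowflake column type from each sample value and emit the DDL. The type mapping and the sample "emp" columns are hypothetical (KNIME's own type mapping is far richer):

```python
# Illustrative mapping from Python value types to Snowflake column types.
# bool must precede int conceptually, since bool subclasses int; an exact
# type() lookup keeps the two distinct.
TYPE_MAP = {bool: "BOOLEAN", int: "NUMBER", float: "FLOAT", str: "VARCHAR"}

def create_table_ddl(table, sample_row):
    """Build CREATE OR REPLACE TABLE DDL from a dict of
    column name -> sample value, inferring column types."""
    columns = ", ".join(
        f"{name} {TYPE_MAP[type(value)]}" for name, value in sample_row.items()
    )
    return f"create or replace table {table} ({columns});"

# "emp" and its columns are made-up sample metadata.
ddl = create_table_ddl("emp", {"id": 1, "name": "Amy", "salary": 72000.0})
```

Because the DDL is regenerated from the source data each run, a new column in the Excel or JSON input simply becomes a new column in the rebuilt table.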

Update May 2019 — Thanks to Aaron Powers for pointing out that com.snowflake.client.jdbc.SnowflakeDriver is deprecated and net.snowflake.client.jdbc.SnowflakeDriver should be used. I’ve updated the above guide to reflect this.

If you want to find out more about Forest Grove Technology and our BI as a Service offerings, click the link.

You can find more information about the Snowflake JDBC drivers in Snowflake’s documentation.

Snowflake also has great SQL support for manipulating Snowflake objects.

