External Network Access On Snowflake Snowpark — Unlocking New Capabilities
This article covers one of the most awaited and coolest features: External Network Access, which is now available in Public Preview on AWS, except in Gov regions.
You can create secure access to specific network locations external to Snowflake, and then use that access from within the handler code for user-defined functions (UDFs) and stored procedures. You can enable this access through an external access integration.
How to get started?
To enable access to specific external network locations, you create an external access integration that specifies a list of external locations and a list of secrets you are allowed to use.
By using the EXTERNAL_ACCESS_INTEGRATIONS clause to refer to this integration when creating the UDF or procedure with CREATE FUNCTION or CREATE PROCEDURE, you allow the handler code to use the secret to authenticate with the external location.
Follow these steps to set up access to an external network location from a UDF or procedure:
Create Network Rule:
You can use the CREATE NETWORK RULE command to create a network rule that represents the external network’s location and restrictions for access. For example, a network rule specifies network identifiers such as a hostname and the direction of communication with the network (ingress or egress).
-- Allow outbound (egress) traffic to the OpenAI API host; port 443 is assumed by default.
CREATE OR REPLACE NETWORK RULE openai_network_rule
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('api.openai.com');
Create Secret For Credentials:
You can use CREATE SECRET to create a secret that represents credentials required to authenticate with the external network location. For example, the secret can contain credentials such as a username and password.
-- Store the OpenAI API key as a generic-string secret (replace the placeholder with your key).
CREATE OR REPLACE SECRET openai_api_key
TYPE = GENERIC_STRING
SECRET_STRING = '<my-openai-api-key>';
Create External Access Integration:
You can use the CREATE EXTERNAL ACCESS INTEGRATION command to create an external access integration that aggregates allowed network rules (representing external network locations) and allowed secrets (representing credentials for authenticating) for use with UDFs and procedures.
-- Bundle the allowed network rule and secret so UDFs and procedures can reference them.
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION openai_access_integration
ALLOWED_NETWORK_RULES = (openai_network_rule)
ALLOWED_AUTHENTICATION_SECRETS = (openai_api_key)
ENABLED = true;
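Note that creating an integration requires the CREATE INTEGRATION privilege, which by default only the ACCOUNTADMIN role has. To sanity-check the setup, you can list your integrations and inspect the one you just created:
SHOW INTEGRATIONS;
DESCRIBE INTEGRATION openai_access_integration;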
Voila! We're now ready to use external access inside our UDFs and stored procedures.
When using the CREATE FUNCTION or CREATE PROCEDURE command to create a UDF or procedure, you can enable access to external network locations as follows:
- Include the EXTERNAL_ACCESS_INTEGRATIONS parameter, setting its value to one or more integrations. Each integration you specify allows access to the external network locations and secrets the integration specifies.
- Include the SECRETS parameter, setting its value to one or more secrets and the names you’ll use to access them from handler code. The secrets you specify as values must also be specified in the external access integration.
- In handler code, access the secret to retrieve credentials for authenticating with the external network location, as shown in the sketch after this list.
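Putting those three pieces together, here is a minimal sketch of a Python UDF that uses the integration and secret created above (the function name and the endpoint it checks are illustrative):
CREATE OR REPLACE FUNCTION check_openai_access()
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = 3.8
HANDLER = 'run'
PACKAGES = ('requests')
-- 1. Allow the external locations and secrets the integration specifies.
EXTERNAL_ACCESS_INTEGRATIONS = (openai_access_integration)
-- 2. Expose the secret to handler code under the name 'openai_key'.
SECRETS = ('openai_key' = openai_api_key)
AS
$$
import _snowflake
import requests

def run():
    # 3. Retrieve the credential from the secret inside the handler.
    api_key = _snowflake.get_generic_secret_string('openai_key')
    # Call an endpoint allowed by the network rule.
    response = requests.get(
        'https://api.openai.com/v1/models',
        headers={'Authorization': f'Bearer {api_key}'}
    )
    return str(response.status_code)
$$;
If everything is wired up correctly, SELECT check_openai_access(); should return 200.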
Let’s Try This Out
Here’s my example of how we can use OpenAI to get data insights from Snowflake data.
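Below is a minimal sketch of such a procedure, assuming the integration and secret created above; the CUSTOMER_REVIEWS table, its REVIEW_TEXT column, and the model choice are hypothetical:
CREATE OR REPLACE PROCEDURE get_data_insights(prompt STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = 3.8
HANDLER = 'run'
PACKAGES = ('snowflake-snowpark-python', 'requests')
EXTERNAL_ACCESS_INTEGRATIONS = (openai_access_integration)
SECRETS = ('openai_key' = openai_api_key)
AS
$$
import _snowflake
import requests

def run(session, prompt):
    api_key = _snowflake.get_generic_secret_string('openai_key')
    # Pull a small sample of data to ground the prompt (hypothetical table).
    rows = session.sql(
        "SELECT review_text FROM customer_reviews LIMIT 10"
    ).collect()
    context = "\n".join(row['REVIEW_TEXT'] for row in rows)
    # Send the prompt plus the data context to the OpenAI chat completions API.
    response = requests.post(
        'https://api.openai.com/v1/chat/completions',
        headers={'Authorization': f'Bearer {api_key}'},
        json={
            'model': 'gpt-3.5-turbo',
            'messages': [
                {'role': 'user', 'content': f"{prompt}\n\nData:\n{context}"}
            ]
        }
    )
    return response.json()['choices'][0]['message']['content']
$$;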
And this is how I call the OpenAI API to get prompt responses from my stored procedure.
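For example (the prompt is illustrative):
CALL get_data_insights('Summarize the overall sentiment of these customer reviews in two sentences.');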
Additional Content you may also like:
- Snowflake now allows you to record the activity of your Snowflake functions and stored procedures by capturing log messages and trace events directly from the code whenever it executes. Read more below:
- A dynamic table is a table that materializes the results of a query that you specify. Rather than creating a separate target table and writing code to transform and update the data in that table, you can define the target table as a dynamic table, and you can specify the SQL statement that performs the transformation. Learn More below:
- Your Cheatsheet to Snowflake Snowpark Dataframes Using Python
About Me:
Hi there! I am Divyansh Saxena
I am an experienced Data Engineer with a proven track record of success in Snowflake Data Cloud technology, highly skilled in designing, implementing, and maintaining data pipelines, ETL workflows, and data warehousing solutions. With advanced knowledge of Snowflake’s features and functionality, I am a Snowflake Data Superhero and SnowPro Core SME. Having spent the major part of my career on the Snowflake Data Cloud, I have a deep understanding of cloud-native data architecture and can leverage it to deliver high-performing, scalable, and secure data solutions.
Follow me on Medium for regular updates on Snowflake Best Practices and other trending topics:
Also, I am open to connecting with data enthusiasts across the globe on LinkedIn: