Unlocking Snowflake Native App Potential with External API Calls

Snowflake Native Apps are self-contained data applications that live within the Snowflake data platform. Instead of having to bring the data to your app, you bring your app to the data. Developers can take advantage of Snowflake’s security, scalability, and performance without managing their own infrastructure. Additionally, the Snowflake Marketplace facilitates application distribution and monetization.

Snowflake users benefit from the ability to install and utilize apps to extend Snowflake beyond its native capabilities. Installing and using apps from the marketplace is a straightforward process, requiring just a few clicks.

Snowflake recently introduced, in public preview, support for native external API calls, a significant improvement over the previous approach, which relied on external functions. Calling external APIs from Snowflake offers another means to expand Snowflake's default experience.

This blog post explores how to combine these two worlds — Native Apps and external API calls — empowering you with the tools to create any type of data application you can imagine.

Introduction to what we will build

There are many reasons you might want to call an external API from a Snowflake Native App. For example, you might want to import all your pull requests from GitHub, tokenize and encrypt PII and PHI using Skyflow, or pull your message history from Twilio for logging and analysis.

In this blog post, we’ll create a translation Native App that uses DeepL’s translation API. With the app installed, you can easily translate data in your warehouse from one language to another.

The image below shows the information flow we’ll be creating.

Translation app data flow to call an external API to translate input text.

The translation app is installed in an app consumer’s account, giving the consumer access to call the translate function. Behind the scenes, the translate function within the app calls the DeepL translate API.

If you want to jump directly into the source code, you can check out the GitHub repo here.

Calling a third-party API

As described in my prior article, in order to call a third-party API we need to grant permission for our Snowflake account to access the domain of the API. This is done by creating a network rule where we list the domains of the third-party APIs that we need access to.

CREATE OR REPLACE NETWORK RULE my_apis_network_rule
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('<REPLACE_WITH_API_DOMAIN>');

Next, to call most third-party APIs, we need to authenticate the call via an API key or auth bearer token of some sort. Snowflake has native support for managing secrets, which is the secure, recommended way to store API credentials.

Snowflake supports several types of secrets including OAuth and basic authentication. The secret below stores an auth key as a string value.

CREATE OR REPLACE SECRET my_auth_key
TYPE = GENERIC_STRING
SECRET_STRING = '<REPLACE_WITH_AUTH_KEY>';

Finally, we tie the network rule and secret together by defining an external access integration.

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION my_external_access_integration
ALLOWED_NETWORK_RULES = (my_apis_network_rule)
ALLOWED_AUTHENTICATION_SECRETS = (my_auth_key)
ENABLED = true;

With a standalone UDF, the external access integration and secret are attached directly to the function definition so that it can call the API. Using a third-party API from a Native App follows a similar flow, except instead of setting these values on the UDF at creation time, we need to pass them into the Native App during an app initialization procedure.
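For reference, here's roughly what that looks like for a standalone UDF, reusing the network rule, secret, and integration created above. The function name, endpoint path, and handler body are illustrative and not part of the translation app.

-- Illustrative standalone UDF (not part of the Native App): the integration and
-- secret are attached directly to the function definition.
CREATE OR REPLACE FUNCTION call_my_api(input STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('requests')
HANDLER = 'handler'
EXTERNAL_ACCESS_INTEGRATIONS = (my_external_access_integration)
SECRETS = ('token' = my_auth_key)
AS
$$
import _snowflake
import requests

def handler(input):
    # 'token' maps to the my_auth_key secret declared in the SECRETS clause above.
    auth_key = _snowflake.get_generic_secret_string('token')
    response = requests.get('https://<REPLACE_WITH_API_DOMAIN>/some/endpoint',
                            headers={'Authorization': 'Bearer ' + auth_key})
    return response.text
$$;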

The Native App setup

Snowflake has a great tutorial that walks you through all the steps to create and test a Native App. In this post, I won't cover every step in detail; instead, I'll focus on the external API functionality.

The app we’re creating has the following file structure:

/snowflake-translation-app
  manifest.yml
  README.md
  /scripts
    setup.sql
  /src
    udf.py

The manifest.yml declares the app's metadata and points to the setup script. The setup.sql script runs automatically when a consumer of the application installs the app in their account, and udf.py contains the Python code that defines the application logic.

The app setup

The setup.sql file contains the definitions of the init_app procedure and the translate function.

CREATE APPLICATION ROLE app_public;
CREATE SCHEMA IF NOT EXISTS core;
GRANT USAGE ON SCHEMA core TO APPLICATION ROLE app_public;

CREATE OR ALTER VERSIONED SCHEMA code_schema;
GRANT USAGE ON SCHEMA code_schema TO APPLICATION ROLE app_public;

CREATE OR REPLACE PROCEDURE code_schema.init_app(config variant)
RETURNS string
LANGUAGE python
runtime_version = '3.8'
packages = ('snowflake-snowpark-python', 'requests', 'simplejson')
imports = ('/src/udf.py')
handler = 'udf.init_app';

GRANT USAGE ON PROCEDURE code_schema.init_app(variant) TO APPLICATION ROLE app_public;

CREATE OR REPLACE FUNCTION code_schema.translate(text string, target_language string)
RETURNS string
LANGUAGE python
runtime_version = '3.8'
packages = ('snowflake-snowpark-python', 'requests', 'simplejson')
imports = ('/src/udf.py')
handler = 'udf.translate';

GRANT USAGE ON FUNCTION code_schema.translate(string, string) TO APPLICATION ROLE app_public;

The init_app procedure is what makes calling the external API from the translate function possible.

It takes a config object containing two key/value pairs: the name of the secret that holds the DeepL API auth key and the name of the external access integration.

Let’s take a look at the code of the procedure.

Granting access to call the API

The code to initialize the translation app is shown below. All the procedure does is alter the translate function's SECRETS and EXTERNAL_ACCESS_INTEGRATIONS parameters, which is what grants the function access to the secret and the external API domain.

from snowflake.snowpark import Session


def init_app(session: Session, config) -> str:
    """
    Initializes function API endpoints with access to the secret and API integration.

    Args:
        session (Session): An active session object for authentication and communication.
        config (Any): The configuration settings for the connector.

    Returns:
        str: A status message indicating the result of the provisioning process.
    """
    secret_name = config['secret_name']
    external_access_integration_name = config['external_access_integration_name']

    alter_function_sql = f'''
        ALTER FUNCTION code_schema.translate(string, string) SET
        SECRETS = ('token' = {secret_name})
        EXTERNAL_ACCESS_INTEGRATIONS = ({external_access_integration_name})'''

    session.sql(alter_function_sql).collect()

    return 'Snowflake translation app initialized'

It's not needed for this particular application, but this is also where you might accept other configuration settings from the application consumer and store them in a table for your application to use, along the lines of the sketch below.
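As a rough sketch, the extra statements run through session.sql might look something like this; the core.app_config table and the setting name are hypothetical, not part of the translation app.

-- Hypothetical: persist additional consumer-supplied settings inside the app.
CREATE TABLE IF NOT EXISTS core.app_config (key STRING, value STRING);

INSERT INTO core.app_config (key, value)
VALUES ('default_target_language', 'FR');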

Calling the translate API

The last piece of the application is the translate function itself.

The code below translates the passed-in text into the target language using the DeepL translate API endpoint. The app consumer's DeepL API auth key is retrieved from the secret. If the init_app procedure isn't called before translate, the function has no external access and can't reach the api-free.deepl.com domain.

import json

import _snowflake
import requests


def translate(text, target_language):
    """
    Translate the text value to the target language and return the translated text.

    Args:
        text (str): The text to translate.
        target_language (str): The two-letter code representing the language to translate the text into.

    Returns:
        str: The translated text.
    """
    auth_key = _snowflake.get_generic_secret_string('token')

    body = {
        'text': [
            text
        ],
        'target_lang': target_language
    }

    url = 'https://api-free.deepl.com/v2/translate'
    headers = {
        'Authorization': 'DeepL-Auth-Key ' + auth_key
    }

    session = requests.Session()
    response = session.post(url, json=body, headers=headers)
    response_as_json = json.loads(response.text)

    return response_as_json['translations'][0]['text']

Testing the app

After following the Snowflake instructions to publish and install your app, calling the application is straightforward. Just as with calling an external API from a UDF, you need to define the secret, network rule, and external access integration for the API. In our example, we’ll do that for the DeepL API.

CREATE OR REPLACE SECRET deepl_auth_key
TYPE = GENERIC_STRING
SECRET_STRING = '<REPLACE_WITH_DEEPL_AUTH_KEY>';

CREATE OR REPLACE NETWORK RULE deepl_apis_network_rule
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('api-free.deepl.com');

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION deepl_external_access_integration
ALLOWED_NETWORK_RULES = (deepl_apis_network_rule)
ALLOWED_AUTHENTICATION_SECRETS = (deepl_auth_key)
ENABLED = true;
Creating the secret, network rule, and external access integration for the DeepL API.

Next, we need to grant usage of the external access integration and secret to the installed app. Since secrets are schema-level objects, we also have to grant the app usage on the database (TEST in this example) and schema (TEST_SCHEMA in this example) and reference the secret by its fully qualified name, TEST.TEST_SCHEMA.deepl_auth_key.

GRANT USAGE ON DATABASE TEST TO APPLICATION snowflake_translation_app;
GRANT USAGE ON SCHEMA TEST.TEST_SCHEMA TO APPLICATION snowflake_translation_app;
GRANT USAGE ON INTEGRATION deepl_external_access_integration TO APPLICATION snowflake_translation_app;
GRANT READ ON SECRET TEST.TEST_SCHEMA.deepl_auth_key TO APPLICATION snowflake_translation_app;

With the configuration out of the way, we can call the application’s init_app procedure, passing in the external access integration and secret names. Again, note that I’m using the fully qualified secret name.

CALL snowflake_translation_app.code_schema.init_app(PARSE_JSON('{
  "secret_name": "TEST.TEST_SCHEMA.deepl_auth_key",
  "external_access_integration_name": "deepl_external_access_integration"
}'));
Initializing the app, passing in the secret and external access integration names.

Finally, we can call the app’s translate function, passing in the text we want to be translated.

SELECT snowflake_translation_app.code_schema.translate('Hello, World!', 'FR')
UNION
SELECT snowflake_translation_app.code_schema.translate('Hello, World!', 'DE')
UNION
SELECT snowflake_translation_app.code_schema.translate('Hello, World!', 'ZH');
Executing the translation app to translate into three different languages via DeepL’s API.
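Because translate is just a function in the app's code_schema, you can also apply it to data already in your warehouse, for example against a hypothetical product_reviews table:

SELECT
  review_text,
  snowflake_translation_app.code_schema.translate(review_text, 'EN') AS review_text_en
FROM TEST.TEST_SCHEMA.product_reviews;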

Final thoughts

As demonstrated in this article, Snowflake now lets your data applications call third-party APIs directly, including APIs that require the consumer's own authorization credentials, making it straightforward to enrich and modify data without leaving Snowflake.

API providers can develop and distribute Native Apps that act as Snowflake-specific SDKs, significantly reducing the client development burden and creating a more integrated, Snowflake-native experience.

Hopefully this article and the code repository help you successfully develop and distribute your app. Happy building!
