Snowflake GenAI Capabilities 1: Inference Using Streamlit App

Feng Li
3 min read · Jan 28, 2024


Woodland Trail in Rouge Park. Toronto, ON, Jan 27, 2024

Snowflake makes it easy to run inference against an LLM via its API. An external access integration allows a UDF or stored procedure to call external endpoints such as the OpenAI API. That UDF can then be called from a Streamlit app. Let’s see how to do it with just a handful of lines of code.
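Before diving into the SQL, it helps to see what the UDF will ultimately send: a standard chat-completion request. Here is a minimal plain-Python sketch of that request body; the model name and temperature mirror the UDF below, and `build_chat_request` is just an illustrative helper, not part of any API:

```python
# Sketch of the chat-completion request body the UDF assembles.
# build_chat_request is a hypothetical helper for illustration only.
def build_chat_request(query: str, model: str = "gpt-4", temperature: float = 0) -> dict:
    return {
        "model": model,
        "temperature": temperature,
        # A single-turn conversation: one user message carrying the prompt
        "messages": [{"role": "user", "content": query}],
    }

request = build_chat_request("What is Generative AI?")
print(request["messages"][0]["role"])  # user
```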

1 Create UDF to call OpenAI APIs

use role sysadmin;
use feng_database;

-- Create secret for openai api key
CREATE OR REPLACE SECRET secret_open_ai_api
TYPE = GENERIC_STRING
SECRET_STRING = 'sk-Hxxxxy';

-- Create network rule to allow outbound access to api.openai.com
CREATE OR REPLACE NETWORK RULE network_rule_openai
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('api.openai.com');

-- Create external access integration which will be used by next UDF to
-- call Openai API
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION external_access_int_openai
ALLOWED_NETWORK_RULES = (network_rule_openai)
ALLOWED_AUTHENTICATION_SECRETS = (secret_open_ai_api)
ENABLED = true;

-- Create UDF to call chat completion api for given text
CREATE OR REPLACE FUNCTION UDF_CHAT_COMPLETE(query varchar)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.10'
HANDLER = 'chat_completions'
EXTERNAL_ACCESS_INTEGRATIONS = (external_access_int_openai)
SECRETS = ('openai_api_key' = secret_open_ai_api)
PACKAGES = ('openai')
AS
$$
import _snowflake
import openai

def chat_completions(query):
    # Read the API key from the Snowflake secret attached to this UDF
    openai.api_key = _snowflake.get_generic_secret_string('openai_api_key')
    messages = [{"role": "user", "content": query}]
    model = "gpt-4"
    response = openai.chat.completions.create(model=model, messages=messages, temperature=0)
    return response.choices[0].message.content
$$;

-- Test
SELECT UDF_CHAT_COMPLETE('Describe Snowflake external access to a non-technical user') as gpt_4_response;
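For reference, the object the handler unpacks follows the OpenAI chat-completion response shape: the generated text lives at `choices[0].message.content`. A small sketch with stand-in dataclasses (these classes are illustrative, not the real client types, but the access path is the same):

```python
from dataclasses import dataclass
from typing import List

# Stand-in classes mimicking the shape of a chat-completion response;
# the real OpenAI client returns richer objects with the same structure.
@dataclass
class Message:
    content: str

@dataclass
class Choice:
    message: Message

@dataclass
class ChatCompletion:
    choices: List[Choice]

response = ChatCompletion(choices=[Choice(message=Message(content="Hello from GPT-4"))])
# Same extraction the UDF handler performs:
answer = response.choices[0].message.content
print(answer)  # Hello from GPT-4
```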

2 Create Streamlit App

import streamlit as st
from snowflake.snowpark.context import get_active_session

st.title("Chat with OpenAI")

session = get_active_session()

def get_responses(prompt):
    # Bind the prompt as a query parameter instead of interpolating it,
    # so quotes in the user input cannot break the SQL statement
    df_res = session.sql("SELECT UDF_CHAT_COMPLETE(?)", params=[prompt]).to_pandas()
    return df_res.iloc[0, 0]

my_chat = st.text_input("Enter a chat message", placeholder='For example: what is Generative AI?')
if my_chat:
    st.caption("OpenAI Response")
    st.write(get_responses(f"Answer this question: {my_chat}"))
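One detail worth noting: interpolating user text straight into a SQL string literal breaks as soon as the input contains a single quote. Standard SQL escapes a quote inside a literal by doubling it; here is a minimal sketch of that escaping (`sql_literal` is an illustrative helper — in Snowpark, passing the prompt via query parameters is the cleaner route):

```python
# Illustrative helper: turn arbitrary text into a SQL string literal
# by doubling embedded single quotes (standard SQL escaping).
def sql_literal(text: str) -> str:
    return "'" + text.replace("'", "''") + "'"

prompt = "What's Generative AI?"
query = f"SELECT UDF_CHAT_COMPLETE({sql_literal(prompt)})"
print(query)  # SELECT UDF_CHAT_COMPLETE('What''s Generative AI?')
```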

Now we have a front-end UI where we can type chat messages and get responses from the OpenAI base model GPT-4. Great!

Further, we can convert this Streamlit app into a Snowflake Native App so it can be distributed via the Marketplace.

Note that we currently cannot use the OpenAI client library in Streamlit code to talk to the OpenAI server directly; doing so fails with “APIConnectionError: Connection error.” This is due to the network security policy of the Snowpark engine where our Streamlit Python code runs, which is why the external access integration and the UDF are needed.

Finally, there is a lot more to be done from here. For example, the base model only predicts the next tokens and has no domain-specific knowledge. That can be addressed with prompt engineering, RAG, and fine-tuning. We’ll look at other GenAI capabilities in Snowflake next.

Happy Reading!


Feng Li

Software Engineer, playing with Snowflake, AWS and Azure. Snowflake Data Superhero 2024. SnowPro SME, Jogger, Hiker. LinkedIn: https://www.linkedin.com/in/fli01