Connecting Azure Postgres DB in Databricks Notebook
TL;DR version
Since Azure is catching up in the market, developers will increasingly need to work with Azure components, for example Postgres on Azure.
It is assumed that you already have a Postgres database running in Azure.
Check whether psycopg2 is already installed by using this command:
!pip show psycopg2
If it is not installed, you will get a warning like the one below:
WARNING: Package(s) not found: psycopg2
If psycopg2 is not installed, install it with the command below (if the build fails because the PostgreSQL client headers are missing on the cluster, pip install psycopg2-binary installs a prebuilt wheel instead):
!pip install psycopg2
Let’s keep all credentials in one place, so we can read them from the environment later. For now, we keep them in the notebook for testing; it is not recommended to keep credentials in a notebook.
username = '<db_username>'
password = '<db_password>'
dbname = '<db_name>'
hostname = '<host_name>'
port = <port_integer>
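Once you move past testing, the same values can be read from environment variables (or a Databricks secret scope) instead of being hard-coded. A minimal sketch; the variable names PGUSER, PGPASSWORD, PGDATABASE, PGHOST, and PGPORT are illustrative assumptions, so match them to your own setup:

```python
import os

# Read credentials from the environment instead of hard-coding them.
# The variable names below are assumptions; adjust them to your setup.
username = os.environ.get("PGUSER", "<db_username>")
password = os.environ.get("PGPASSWORD", "<db_password>")
dbname = os.environ.get("PGDATABASE", "<db_name>")
hostname = os.environ.get("PGHOST", "<host_name>")
port = int(os.environ.get("PGPORT", "5432"))  # Postgres default port
```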
Import the necessary library as below:
import psycopg2
Connect to the database by using the psycopg2 library as below (Azure Database for PostgreSQL requires SSL by default, hence sslmode=require):
conn_string = f"postgresql://{username}:{password}@{hostname}:{port}/{dbname}?sslmode=require"
conn = psycopg2.connect(conn_string)
conn.autocommit = True
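One gotcha with the URI form: if the username or password contains characters such as @ or /, the string above breaks, and Azure usernames often take the user@servername form. It is safer to percent-encode the credentials. A small helper along those lines, a sketch rather than anything provided by psycopg2 itself:

```python
from urllib.parse import quote

def make_conn_string(username, password, hostname, port, dbname):
    """Build a libpq connection URI, percent-encoding the credentials."""
    user = quote(username, safe="")  # e.g. "user@server" -> "user%40server"
    pwd = quote(password, safe="")
    return f"postgresql://{user}:{pwd}@{hostname}:{port}/{dbname}?sslmode=require"
```

Then conn = psycopg2.connect(make_conn_string(username, password, hostname, port, dbname)) works even when the credentials contain URI-reserved characters.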
Now, get a cursor as below:
cur = conn.cursor()
Finally, we can execute a simple query and print the rows as below:
cur.execute("SELECT * FROM city;")
rows = cur.fetchall()
print(f"rows count : {len(rows)}")
if not rows:
    print("No Data available")
for row in rows:
    print(row)
You can change the table name if your database has a table other than “city”.
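If you query several tables, the fetch-and-print logic above can be factored into a small helper. The function below is an illustrative sketch, not part of psycopg2:

```python
def summarize_rows(rows):
    """Format a fetchall() result (a list of tuples) as printable lines."""
    lines = [f"rows count : {len(rows)}"]
    if not rows:
        lines.append("No Data available")
    lines.extend(str(row) for row in rows)
    return lines

# Usage with a live cursor:
# print("\n".join(summarize_rows(cur.fetchall())))
```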
Comment below if you have any problems.