Part 2: Create an API from BigQuery to PostgreSQL with Node.js

Rayhan Rafiud Darojat
4 min read · Feb 21, 2020

--

Hello! In this part we will learn how to insert data from a BigQuery public dataset into PostgreSQL. I hope you have already read the first part of this series.

Photo by NASA on Unsplash

What will we do?

  1. Create a project in Google Cloud Platform
  2. Set up Docker for PostgreSQL
  3. Code and code
  4. Check the data with Postman

Let's get started.

Create a Project in Google Cloud Platform

  1. Create a project
console.cloud.google.com/project/create
Set up a new project

2. Set up authentication

To run the client library, you must first set up authentication by creating a service account and setting an environment variable. Complete the following steps to set up authentication. For other ways to authenticate, see the GCP authentication documentation.

  • In the Cloud Console, go to the Create service account key page.
  • From the Service account list, select New service account.
  • In the Service account name field, enter a name.
  • From the Role list, select Project > Owner.
  • Click Create. A JSON file that contains your key downloads to your computer.
Generate key.json
  • The key.json file is downloaded automatically and is used to authenticate to BigQuery (a command-line alternative is sketched below)
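If you prefer the command line, the same service account and key can be created with gcloud. This is only a sketch: my-service-account and MY_PROJECT are placeholders you should replace with your own names.

# create the service account
gcloud iam service-accounts create my-service-account

# grant it the Owner role on the project
gcloud projects add-iam-policy-binding MY_PROJECT \
  --member="serviceAccount:my-service-account@MY_PROJECT.iam.gserviceaccount.com" \
  --role="roles/owner"

# generate key.json for the service account
gcloud iam service-accounts keys create key.json \
  --iam-account=my-service-account@MY_PROJECT.iam.gserviceaccount.com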

3. Set the credentials path

export GOOGLE_APPLICATION_CREDENTIALS="[PATH]"

Example:

export GOOGLE_APPLICATION_CREDENTIALS="/home/bigquery-api-nodejs/key.json"

4. Get a public dataset and choose the Google Analytics sample

Public Dataset in Bigquery
Google Analytics Sample Dataset
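The Google Analytics sample lives in the bigquery-public-data project as the google_analytics_sample dataset, with tables named ga_sessions_YYYYMMDD. If you have the bq CLI installed, you can check that you can read it; the columns selected here are just a small illustrative subset.

bq query --use_legacy_sql=false \
  'SELECT fullVisitorId, visitId, visitStartTime
   FROM `bigquery-public-data.google_analytics_sample.ga_sessions_20170801`
   LIMIT 10'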

5. Generate an API key to access the dataset

https://console.cloud.google.com/flows/enableapi
Generate API Key for Access Public Dataset
  • Add the key to your key.json

Okay, the setup in Google Cloud is done.

Set up Docker for PostgreSQL

  1. Create a folder named bigquery-api-nodejs
  2. cd into the folder and initialize the Node.js project with
npm init -y

Create a docker-compose.yml file

  • You can change the environment variables to whatever you like
  • There are two services when you run it, PostgreSQL and Grafana (a minimal sketch follows this list)
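Here is a minimal sketch of such a docker-compose.yml. The image tags, credentials, database name, and ports are placeholders, so adjust them to match your own setup:

version: "3"
services:
  postgres:
    image: postgres:12
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: bigquery
    ports:
      - "5432:5432"
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"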

Then run it with

sudo docker-compose up

# or, if you want to run it in the background
sudo docker-compose up -d

docker-compose up

If you want to check which containers are running in Docker

sudo docker ps

Last but not least, open a shell inside the PostgreSQL container

sudo docker exec -it CONTAINERID bash
# then, inside the container
psql postgres -U user -p 5432

  • -U means username
  • -p means port

Yay, we have successfully run PostgreSQL in Docker!

Let’s Code

Install the needed packages

npm install @google-cloud/bigquery --save
npm install express --save
npm install pg --save

Create a file index.js

  • In this file, we run the Node server on port 8080
  • and create the routes for the API (a minimal sketch follows this list)
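A minimal sketch of index.js could look like the following. The route paths and the exported handler names other than getFromBigQuery are assumptions of this sketch, so adjust them to match your own command.js:

const express = require('express');
const command = require('./command');

const app = express();

// API routes -- each handler is implemented in command.js
app.get('/bigquery', command.getFromBigQuery);   // fetch from BigQuery and insert into Postgres
app.get('/postgres', command.getFromPostgres);   // read the data back from Postgres
app.post('/table', command.createTable);         // create the destination table
app.delete('/table', command.dropTable);         // drop the destination table

// Run the Node server on port 8080
app.listen(8080, () => console.log('API listening on port 8080'));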

Create a file command.js

  • Require the postgres and bigquery packages
  • Make sure the PostgreSQL config in const pool matches the one in your docker-compose.yml file
  • We are going to create 4 functions: get from BigQuery, get from Postgres, create table, and drop table (a sketch follows this section)
  • In the getFromBigQuery function, SELECT * FROM the table like in an ordinary SQL database
  • Then set a limit; I set it to 1000, but you can change it as you like
  • After getting the data from BigQuery, map it into a model
  • Last, take just the values from the model; PostgreSQL doesn't insert a JSON value, so change it to an array

from JSON

{
  name: "test", phone: 101010
}

to an array

["test", 101010]

Run the code with node index.js or with nodemon
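With the server running, you can check the data with Postman or with curl; the paths below match the routes assumed in the index.js sketch above:

# create the table, pull 1000 rows from BigQuery into Postgres, then read them back
curl -X POST http://localhost:8080/table
curl http://localhost:8080/bigquery
curl http://localhost:8080/postgres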
