How to set up the GenAI x Looker extension in under 15 minutes | Dashboard summary

Chi Bui
Joon Solutions Global
8 min read · Jun 24, 2024

This blog presents a step-by-step guide to integrating GenAI technology seamlessly into your analytics workflow on Looker.

1. Introduction to the Dashboard Summary extension

The Dashboard Summary extension is an open-source solution, powered by Google Vertex AI and the Google Gemini model, that generates summaries of your organization's dashboards at scale. The extension integrates easily with Looker to enhance your organization's analytics workflow.

Dashboard summary demo

2. Extension deep dive

2.1. Tech stack

  • Frontend: a Looker extension that can be added to dashboards to collect LookML metadata and dashboard context, which serve as input for the prompts. Read more about Looker extensions
  • Backend: a websocket service deployed on Cloud Run, GCP's serverless container platform. This service receives traffic (dashboard context) from the Looker frontend via the extension, then calls the Vertex AI API and the Looker Query API to generate content (the dashboard summaries).

2.2. Repository structure

Github repo: looker-open-source/dashboard-summarization

The repository contains 2 parts:

  • A set of files at the repository root plus a src folder → the Node.js frontend of the extension
  • A folder named websocket-service → the backend of the project: a src folder containing the service code and a Dockerfile, and a terraform folder used to deploy the service on Cloud Run
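
Simplified, the layout looks roughly like this (only the parts relevant to this guide are shown):

dashboard-summarization/
├── src/                  # extension frontend code (Node.js)
├── package.json          # frontend dependencies and build scripts
└── websocket-service/
    ├── src/              # backend code and Dockerfile for the websocket service
    └── terraform/        # Terraform config to deploy the service on Cloud Run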

2.3. How does the extension work under the hood?

The extension queries the chart data of each tile on a Looker dashboard and calls the Vertex AI SDK to generate a summary.

  • Generative model used: gemini-1.0-pro-001
  • Initial model configuration: {max_output_tokens: 2500, temperature: 0.4, candidate_count: 1}
  • Each prompt sent to the Vertex AI API has three parts: a fixed context, the chart's query, and the chart data in JSON format

The model configuration lives in websocket-service/src/index.js. If you want to experiment with the model parameters or prompts, edit the corresponding variables in that file. Note that after changing it, you need to deploy a new version of the websocket server to Cloud Run (re-run steps 4 and 5 of the backend setup below). A simplified sketch of how this configuration fits together follows.
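
The sketch below is not the repo's exact code (the real service streams tokens back over the websocket); it only illustrates, under the assumptions noted in the comments, how the model name, parameters, and three-part prompt described above could be wired up with the Vertex AI Node.js SDK:

const { VertexAI } = require('@google-cloud/vertexai');

// Assumptions: project and region come from environment variables;
// the repo's index.js may read them differently.
const vertexAI = new VertexAI({
  project: process.env.PROJECT,
  location: process.env.REGION || 'us-central1',
});

// Model and parameters as described above. Newer versions of the SDK
// use camelCase fields (generationConfig, maxOutputTokens, ...).
const generativeModel = vertexAI.preview.getGenerativeModel({
  model: 'gemini-1.0-pro-001',
  generation_config: { max_output_tokens: 2500, temperature: 0.4, candidate_count: 1 },
});

// Prompt = fixed context + the chart's query + the chart data as JSON.
// Function name and parameters are hypothetical, for illustration only.
async function summarizeTile(fixedContext, chartQuery, chartData) {
  const prompt = `${fixedContext}\n\nQuery:\n${JSON.stringify(chartQuery)}\n\nData:\n${JSON.stringify(chartData)}`;
  const result = await generativeModel.generateContent({
    contents: [{ role: 'user', parts: [{ text: prompt }] }],
  });
  const response = await result.response;
  return response.candidates[0].content.parts[0].text;
}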

3. Setup guide

For a smoother development experience, I recommend using the GCP Cloud Shell Editor during the setup process.

3.1. Requirements and permissions

3.1.1. GCP Permissions and Roles

3.1.2. Tools & Frameworks to be installed

Please note that when you use the GCP Cloud Shell Editor, these tools are installed by default, so this step can be skipped.
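
If you are running the setup on your own machine instead, the tools used throughout this guide are Node.js with npm, git, Docker, the gcloud CLI, and Terraform (this list is inferred from the steps below). You can check that they are available with:

# tool list inferred from the steps in this guide
node -v && npm -v && git --version && docker --version && gcloud --version && terraform --version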

3.2. Step-by-step setup

3.2.1. Backend setup

Step 1: Open Cloud Shell Editor

  • Clone the GitHub repository into your home folder to start the setup process:

git clone https://github.com/looker-open-source/dashboard-summarization

  • Change into the working directory used for steps 1 through 4:

cd dashboard-summarization/websocket-service/src

Create Looker API keys (client ID and client secret) following the Looker documentation:

  • Log into Looker UI of your organization
  • Go to Admin → Users → Select your user name.
  • Click Edit Keys to create a new client ID and client secret, or copy an existing pair

Create a looker.ini file from the template provided in looker-example.ini, filling in the Looker client ID and client secret you just copied so that the Looker SDK can authenticate.

IMPORTANT: use a section header that matches the host of your Looker instance. For example, if your organization's Looker base URL is mycompany.cloud.looker.com, the header will be [mycompany]
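
Putting it together, looker.ini might look like this (all values are placeholders; the section header must match your Looker host as noted above):

# placeholder values; adjust to your own instance and API keys
[mycompany]
base_url=https://mycompany.cloud.looker.com
client_id=YOUR_CLIENT_ID
client_secret=YOUR_CLIENT_SECRET
verify_ssl=True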

Step 2: Install the dependencies with NPM and NodeJS

  • To ensure your system has the latest information on available packages and their versions before installing or upgrading, run:

sudo apt update

  • Next, install the project dependencies by typing the following into the terminal:

npm install

Example output for npm install step

After running this command, a folder named node_modules will appear

  • (Optional) We can confirm the installation and check the npm and Node.js versions with this command:

npm -v && node -v

This is the expected output:

Step 3: Start the development server

  • Start the development server with the following command. It is essentially equivalent to running node index.js, as specified in package.json

npm run start

  • After running the command, your terminal should look as follows. When you are done with this step, you can close the terminal and open a new window for the next steps.
  • (Optional) To check whether the server is running as expected, open a new terminal tab and run:

curl http://localhost:5000

This is the expected result

Step 4: For production deployment we use Docker. Steps 1 to 3 above are already containerized in the Dockerfile, and we need to push the image to Artifact Registry for cloud deployment.

  1. Set up Artifact Registry for deployment
  • Create a new Artifact Registry repo following this guide on Google Cloud Console.
  • Fill in your repo name and region, and choose Docker as the repository format. Leave the other settings at their defaults.
  • After the repo is created successfully, copy the URL of that repo.

2. Push Docker image to Artifact Registry

  • Log in to your Google Cloud account and authorize gcloud by running:

gcloud auth login && gcloud auth application-default login

  • Build and push the Docker image to your newly created Artifact Registry repo:

docker build . -t {Artifact_URL}/{image_name}

docker push {Artifact_URL}/{image_name}

Here is an example:

docker build . -t asia-southeast1-docker.pkg.dev/joon-sandbox/dashboard-summary/dashboard-summary-image

docker push asia-southeast1-docker.pkg.dev/joon-sandbox/dashboard-summary/dashboard-summary-image

After running these commands, the image will be available in that Artifact Registry repo in the Google Cloud console.
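
If docker push is rejected with a permission or authentication error, you may also need to configure Docker credentials for your registry region first (a standard gcloud command; replace the region with your own):

# replace the region with the one used for your Artifact Registry repo
gcloud auth configure-docker asia-southeast1-docker.pkg.dev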

Step 5: Running Terraform to deploy the service on Cloud Run

  • Navigate to the terraform directory on your system:

cd .. && cd terraform

  • Replace "YOUR_PROJECT_ID", "YOUR_REGION", and "YOUR_DOCKER_IMAGE_URL" in the variables.tf file. Note that "YOUR_DOCKER_IMAGE_URL" is the image URL built in the previous step, with the :latest tag appended.
  • After that, run the Terraform commands below to deploy the service. This step may take a few minutes. Make sure you have authorized the gcloud CLI with your Google account.
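
As a rough guide, the variables in variables.tf look something like this (the variable names here are illustrative; use the ones that actually appear in the repo's file):

# variable names below are illustrative; match them to the repo's variables.tf
variable "project_id" {
  default = "YOUR_PROJECT_ID"
}

variable "region" {
  default = "YOUR_REGION"
}

variable "docker_image" {
  default = "YOUR_DOCKER_IMAGE_URL:latest"
}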

terraform init

terraform plan

terraform apply

The backend setup is now complete. Save the Cloud Run service URL endpoint; we will need it in later stages.

3.2.2. Frontend setup

Now we set up the Looker extension, which is built with Node.js. You will notice that several steps are similar to the websocket service setup in the instructions above.

Step 1: At this point the working directory is the terraform folder, so navigate back to the root of the repo to set up the extension frontend:

cd ../..

Step 2: Add a .env file to the repo, modeled on .env.example, and set "WEBSOCKET_SERVICE" to the URL endpoint of the Cloud Run service we deployed in Step 5 of the backend setup. The other variables in .env.example can be left blank.
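
A minimal sketch of the resulting .env (the value is a placeholder; use your own Cloud Run URL):

# placeholder value; use the Cloud Run URL saved from the backend setup
WEBSOCKET_SERVICE=https://your-cloud-run-service-url.a.run.app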

Step 3: Build the bundle.js file with npm:

npm install

npm run build

You may encounter errors due to package version conflicts. In that case, run these commands instead:

npm install --legacy-peer-deps

npm run build

Once you see the build succeed in the terminal, bundle.js will appear at the path ./dist/bundle.js

Step 4: Now, on the Looker side in development mode, we first need to create a separate project for the extension

On the Looker UI, go to: Develop → Projects → New LookML Project. We can name the project dashboard-summarization-extension and create it as a blank project

Step 5: Configuring and modifying files for Looker project

In this step, we add files to the empty project we just created in step 4. Read more about adding files in the Looker UI: Managing LookML files and folders | Looker | Google Cloud

Add a manifest.lkml file following the template provided in the README.md and make the minor changes below.
Please note that the manifest.lkml template in the repo is slightly different and outdated; for consistency, we use the one provided in the README.

  • Replace YOUR_CLOUD_RUN_URL with the websocket service URL we saved after running Terraform in step 5 of the backend setup
  • Uncomment the line # file: "bundle.js" and comment out the line url: "http://localhost:8080/bundle.js"
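
After these changes, the manifest should look roughly like the sketch below. The application name and label are placeholders, and the entitlements block is abbreviated here; copy the full entitlement list from the README template.

# application name and label are placeholders
application: dashboard_summarization {
  label: "Dashboard Summarization"
  file: "bundle.js"
  # url: "http://localhost:8080/bundle.js"
  entitlements: {
    # paste the entitlement list from the README template here,
    # replacing YOUR_CLOUD_RUN_URL with your Cloud Run service URL
  }
}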

Add the bundle.js file generated in Step 3 to the project by dragging and dropping it

Add a .model file with a connection parameter; any connection available in your instance will do
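
For example, a minimal .model file can contain a single line (the connection name is a placeholder):

connection: "your_connection_name"  # placeholder; use any available connection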

Integrate Git for the Looker project. The Looker documentation can be found here.

  • Create a new GitHub repo
  • Connect the LookML project to GitHub

After this step, your LookML project should contain these three files and pass the Git connection test

Step 6: Deploy extension

Commit the project changes and deploy them to production

The extension will now appear under Looker → Applications and is ready to be tested

Conclusion

Congratulations 🎉🎉🎉 You have come a long way to set up this extension, and you can now enjoy an enhanced Looker user experience. We are aware that the process is tedious and complicated and can take some effort, so if this blog seems daunting to you, feel free to reach out to Joon Solutions via our website: GenAI Dashboard Summary for Looker

For additional GenAI-powered tools to enhance your Looker user experience, explore more of our blogs and offerings here: VertexAI Actions and Looker Assistant.
