Implementing 18F’s Digital Analytics Program (A Guide) — Part 2

Jeremy Molayem · Published in DataLA · Dec 2, 2016 · 4 min read

In the previous post, we outlined the dashboard and key insights. In this post, we provide a tutorial on how to build it. The process follows these steps:

A) Google Analytics Account feeds information to Analytics Reporter, an AWS instance that runs a cron job every 3 minutes (a sample cron entry is sketched after this list).

B) Analytics Reporter populates a specified S3 bucket with .json files. An example file is here.

C) The S3 .json files then populate the visualizations displayed on the static Jekyll webpage.
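For reference, the recurring job in step A is typically scheduled with cron. The entry below is only a hypothetical sketch: the script path and log path are placeholders, not the project's actual configuration.

# hypothetical crontab entry: run the realtime report script every 3 minutes
# (paths are placeholders; point them at wherever analytics-reporter lives on your instance)
*/3 * * * * /home/ubuntu/analytics-reporter/deploy/realtime.sh >> /var/log/analytics-reporter.log 2>&1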

Google Analytics Account

Before anything, we need to create a Google Developer Account and a project.

Google Developer Console

A. Create a Project on Google Developer

Enable the Analytics API for this project by searching for “analytics.” Select the first Analytics API option, click “Enable,” and then click the “Credentials” tab below “Library.”

When you open this tab, you will have the option to “Create Credentials” and then “Service Account Key.” Proceed with this step. You will then see a dropdown; choose the option called “App Engine default service account” and download the recommended .json file. You will use the contents of this file later.
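Two fields in that downloaded key file matter later: the service account email and the private key. As a quick way to inspect them, assuming the file was saved as credentials.json (a placeholder name) and jq is installed:

# print the service account email and the private key from the downloaded key file
# ("credentials.json" is a placeholder filename)
jq -r '.client_email' credentials.json
jq -r '.private_key' credentials.json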

Setting Up Google Developer Permissions

Next, go to the Analytics Admin panel, which is located on a separate Google domain, https://analytics.google.com. On the top navigation bar, select “Admin” and you will see three columns: Account, Property, and View. Select “User Management,” and you will see an “Add Permissions” box. Open the .json file you downloaded earlier. You will notice that Google generated a compute@developer.gserviceaccount.com username. Copy and paste this generated account name into the “Add Permissions” box and give it “Collaborate” and “Read & Analyze” permissions.

This concludes work setting up Google accounts and access. If you had trouble with the above paragraph, video 2 in the next section may help.

Google Analytics — Account, Property, and View Columns

B. Analytics Reporter

Next, we will want to download the reporter script and prep it for our account information.

In the root directory of this project is a file called “env.example.” While this file isn’t actually used in the project, it lets you test that you are able to receive a response from the Google Analytics API for a single Google account. Open this file and insert your relevant credentials.

# from .json file downloaded from Google Developer Console
export ANALYTICS_REPORT_EMAIL="xxxxxxxxx-compute@developer.gserviceaccount.com"
# from analytics.google.com
export ANALYTICS_REPORT_IDS="ga:xxxxxxxx"
# from .json file downloaded from Google Developer Console
export ANALYTICS_KEY="-----BEGIN PRIVATE KEY-----
[contents of key]
-----END PRIVATE KEY-----"
Inputting Google Analytics Account Keys into Analytics Reporter

At this point, your analytics reporter is set up and ready to be tested on your local computer.

C. Web Application

To test it, copy and paste the following command to the bottom of the file.

./bin/analytics --frequency=daily

You can save “env.example” as a .sh file; I saved it as run-reports.sh. To execute .sh files on Linux, go to your command line and type chmod +x name_of_file.sh to make the file executable, then run it with sh ./name_of_file.sh. You should see .json output. If you do, you are ready for the next step, which is to run reports on many “ga:xxxxxxx” accounts at once.
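Concretely, assuming you saved the file as run-reports.sh as above, the test looks like this:

# make the script executable, then run it; .json report output should print to the terminal
chmod +x run-reports.sh
sh ./run-reports.sh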

Including all Google Analytics Accounts and setting up AWS S3 Buckets

You will see in the deploy/envs folder that each government website has its own GA ID number. The three .sh files will execute these environments all at once (see video above).

It is important to note that you will want to copy the Google credentials that were set up in the previous example .sh file and paste them into each of the three new .sh files: daily.sh, hourly.sh, and realtime.sh. You will also need to migrate your AWS bucket information and keys from the example in a similar manner. Be sure your bucket is in the US East region.
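For orientation, the AWS settings generally take the form of environment variables like the ones below. Treat the exact variable names as an assumption and confirm them against env.example and the analytics-reporter README; all values here are placeholders.

# placeholders only; confirm the variable names against env.example / the project README
export AWS_REGION="us-east-1"
export AWS_BUCKET="your-bucket-name"
export AWS_ACCESS_KEY_ID="xxxxxxxxxxxxxxxxxxxx"
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"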

Note: this tutorial assumes you have set up an S3 bucket and have pushed your analytics reporter to an instance.
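If you still need to create the bucket, a minimal AWS CLI sketch (the bucket name is a placeholder) is:

# create an S3 bucket in the US East (us-east-1) region; substitute your own bucket name
aws s3 mb s3://your-bucket-name --region us-east-1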

Setting up the Front-end Analytics Jekyll Site

From here, you can enter “make deploy” into the command line and the Jekyll site will be deployed via the S3 command-line utility. If you would like to contribute, have suggestions, or would like help deploying this tool for your organization, view our GitHub account or email us at mayor.opendata@lacity.org.
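As a rough idea of what a deploy step like this does (this is not the project’s actual Makefile), it builds the Jekyll site and syncs the generated files to the S3 bucket:

# rough sketch of a Jekyll-to-S3 deploy; the bucket name is a placeholder
bundle exec jekyll build
aws s3 sync _site/ s3://your-bucket-name/ --delete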
