Getting Started w/ Node.js on GCP

The “Missing Tutorials” series

Daz Wilkin
8 min read · Aug 30, 2017

I’m writing a short series of ‘getting started’ posts for those of you who, like me, reach the point of wanting to write code against a Google service, have a language chosen, but then, having not written code for a week or two, are stalled by “How exactly do I get started?”

Node.js

Or should that be JavaScript? Or is it ECMAScript?

I won’t bikeshed the distinction.

You likely know what I mean. Node.js is a relatively recent addition to the portfolio of languages for which Google provides Libraries. It is not available with App Engine standard (languages and runtimes) but you may, of course, use it with Compute Engine, Container Engine, App Engine flexible and Cloud Functions, where it is the only runtime. Cloud Functions is where I’ve spent most of my recent Node.js time, so I’ll use that as the basis of this post.

It’s common for professional developers to denigrate JavaScript (though Node.js has avoided some of this criticism). I’m not a professional developer and I really enjoy hacking around in JavaScript.

Setup

To avoid supporting the myriad ways you may arrive at this point, I’m just going to tell you what I’m running: Linux, Node.js v8.4.0 and npm 5.3.0.

PROJECT_ID=[[YOUR-PROJECT-ID]]
LANG=node.js
mkdir -p ${HOME}/${PROJECT_ID}/${LANG}
cd ${HOME}/${PROJECT_ID}/${LANG}

Google Node.js Packages

We’ll use both the API Client Libraries and Cloud Client Libraries. The first post in this series attempts to summarize the distinction between these Libraries and provide guidance on choosing your path. Google also publishes this explanation. npm is an (the?) excellent package manager for Node.js.

API Client Libraries

Please ensure you use the Node.js (alpha) libraries, not the JavaScript libraries. As I write, the current version of the Node.js libraries is 21.3.0.

Google

https://developers.google.com/api-client-library/
http://google.github.io/google-api-nodejs-client/
http://google.github.io/google-api-nodejs-client/21.3.0/index.html

npm

https://www.npmjs.com/package/googleapis

GitHub

https://github.com/google/google-api-nodejs-client

package.json

"googleapis": "^21.3.0"

Cloud Client Libraries

NB #1 Google has decided to decompose the Cloud Client Libraries from one package containing all the Google Cloud services into one package for each Google Cloud service. To minimize the distribution size of your solutions and to potentially manage package versions by service, you are encouraged to install only those packages that you need for the services you will use.

NB #2 Cloud Client Libraries are released on a cycle that lags the underlying service. A service may be GA but the Cloud Client Library *may* be Alpha.
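
For example (a sketch illustrating NB #1; the package name and version numbers below are placeholders — check npm for the current versions), a package.json that depends only on Storage and Datastore might look like this:

{
  "name": "my-gcp-app",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/storage": "^1.2.1",
    "@google-cloud/datastore": "^1.1.0"
  }
}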

Google

https://cloud.google.com/apis/docs/cloud-client-libraries

npm

https://www.npmjs.com/package/google-cloud
https://www.npmjs.com/package/@google-cloud/storage
https://www.npmjs.com/package/@google-cloud/datastore

GitHub

https://googlecloudplatform.github.io/google-cloud-node/#/

There are 2 dropdowns at the top of this page. The first switches between languages. The second switches between the bundled (all-in-one) package (see NB #1 above) and the individual service packages. Please ensure you use this dropdown and choose one of the service-specific libraries as this affects the documentation that is presented to you. I know, it’s confusing :-(

Choose packages for specific services e.g. @google-cloud/storage

Google Cloud Storage (GCS)

I’m going to use GCS two ways in this post: first using the API Client Libraries, then using the Cloud Client Libraries. GCS is a BLOB storage service. It includes 2 resource types: Buckets, which contain Objects. Let’s create a bucket, upload some files (objects) and enumerate the results. The GCS documentation is complete and includes code samples for creating buckets using the Cloud Client Libraries.

https://cloud.google.com/storage/
https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-nodejs

Solution #1: Using API Client Libraries

For the API Client Libraries for Node.js there is only one package to be installed. While you can install this package directly, for reproducibility, it’s a good practice to reference this as a dependency in package.json.

package.json:

{
  "name": "google-cloud-storage",
  "version": "0.0.1",
  "dependencies": {
    "googleapis": "^21.3.0"
  }
}

server.js:

/* jshint strict: true */
/* jshint esversion: 6 */
/* jslint vars: true */
/* jslint node: true */
"use strict";

(() => {
  console.log("Done");
})();

This should yield “Done”.

npm install

npm WARN google-cloud-storage@0.0.1 No description
npm WARN google-cloud-storage@0.0.1 No repository field.
npm WARN google-cloud-storage@0.0.1 No license field.
up to date in 0.485s

Then:

npm start

> google-cloud-storage@0.0.1 start /$HOME/$PROJECT/medium/node.js
> node server.js
Done

So far, so good!

Let’s add in the code for auth using Application Default Credentials (ADCs). This is explained here:

https://developers.google.com/identity/protocols/application-default-credentials#callingnode

and here:

https://www.npmjs.com/package/googleapis#choosing-the-correct-credential-type-automatically

Before we can use the Google Library, we must ‘require’ it. Add this to the top of your code:

const google = require("googleapis");

In an attempt to keep the code more understandable, I’m going to wrap the getApplicationDefault() function in a Promise and use Promises for the remainder of the code:

const auth = new Promise((resolve, reject) =>
  google.auth.getApplicationDefault(
    (err, authClient, projectId) => {
      if (err) {
        reject(err);
        return;
      }

      if (authClient.createScopedRequired &&
          authClient.createScopedRequired()) {
        authClient = authClient.createScoped([
          "https://www.googleapis.com/auth/devstorage.full_control"
        ]);
      }
      resolve(authClient);
      return;
    }));

NB ADC gives us write-once, run-everywhere magic. When our code runs on GCE, the appropriate scopes are provided by the metadata service. When running locally, we need to provide these ourselves. Google provides a list of scopes for all its services; I found “Storage” and chose the full R/W scope:

https://developers.google.com/identity/protocols/googlescopes#storagev1
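
If your code only needs to list buckets and objects, a narrower scope from that list works too; a minimal variation on the createScoped call above:

authClient = authClient.createScoped([
  "https://www.googleapis.com/auth/devstorage.read_only"
]);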

This auth promise gives us the ability to proceed using:

auth.then(...).catch(...)

Aside: I find I get a little obsessive with JavaScript linters (e.g. jshint, jslint) and Google’s impressive Closure Compiler (our JavaScript → better JavaScript ;-)). It’s good practice to check occasionally; just try to keep it to that.

Let’s now create a Bucket and upload some files (using the Cloud SDK) and then enumerate the Bucket(s) and Object(s) in code.

PROJECT_ID=[[YOUR-PROJECT-ID]]
BUCKET=$(whoami)-$(date +%y%m%d%H%M)
gsutil mb -p ${PROJECT_ID} gs://${BUCKET}
Creating gs://${BUCKET}/...

Now, identify an arbitrary (and preferably small) file on your local workstation. Assume it’s $FILE, then run the following command to make 10 copies of the file in your bucket:

for i in $(seq -f "%02g" 1 10)
do
gsutil cp $FILE gs://${BUCKET}/${i}
done

This will create Objects in the bucket named “01”, “02”… “10”

gsutil ls gs://${BUCKET}
gs://${BUCKET}/01
gs://${BUCKET}/02
gs://${BUCKET}/03
gs://${BUCKET}/04
gs://${BUCKET}/05
gs://${BUCKET}/06
gs://${BUCKET}/07
gs://${BUCKET}/08
gs://${BUCKET}/09
gs://${BUCKET}/10

Or, if you’d prefer to use Cloud Console:

Cloud Console’s Storage Browser

Back to the code… We need to create a client for the GCS service. The pattern with the Node.js API Client Library is:

client = google.[service-name]("[version]");
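
For example (the service names and version strings below are illustrative; `google` is the require("googleapis") module from earlier):

// other services follow the same pattern
const compute = google.compute("v1");
const bigquery = google.bigquery("v2");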

If we use Google API Explorer, we can determine that we’ll be using the JSON API v1 for GCS:

Cloud Storage JSON API v1

So, we’ll use:

const gcs = google.storage("v1");

There’s no (to my knowledge) published API documentation for Node.js. We can use one of the other languages as a guide, knowing that the Libraries are consistent across languages. Or, we can use API Explorer to get a sense of the REST API call that needs to be made. Here’s API Explorer for storage.buckets.list:

https://developers.google.com/apis-explorer/#p/storage/v1/storage.buckets.list
API Explorer: storage.buckets.list

Let’s see what we can construct:

const err = (e) => console.log(e);

// Set this to your Project ID (the PROJECT_ID used above)
const projectID = "[[YOUR-PROJECT-ID]]";

const promiseBuckets = auth
  .then((authClient) => {
    return new Promise((resolve, reject) =>
      gcs.buckets.list({
        auth: authClient,
        project: projectID
      }, (err, buckets) => {
        err ? reject(err) : resolve(buckets);
      })
    );
  })
  .catch(err);

promiseBuckets
  .then((buckets) => {
    buckets.items.forEach((bucket) => console.log(bucket.name));
  })
  .catch(err);

OK, there are several undocumented ‘tricks’ here that I will explain.

The arrow function at the top is there mostly to simplify the code. In reality, it’s unlikely you’d have such a reusable (and trivial) catch function.

Using the “auth” promise created previously, if it resolves successfully (“then”), we have an authorized client that we can use to make calls. The first call we’ll make is to storage.buckets.list. As API Explorer helped us understand, we need to provide a “project” (actually a Project ID). What’s less obvious is that we must also provide our authClient value.

The call is async, so we wrap it in a Promise too and this is what we return to our next step. If this resolves successfully, we have an object (!) of Buckets resources. However, and this is less obvious, the “items” property is the array of returned buckets. So we iterate over buckets.items.
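
If you want more than the name, each entry in items is a full Buckets resource; a small variation using the location and timeCreated fields from that resource:

promiseBuckets
  .then((buckets) => {
    buckets.items.forEach((bucket) =>
      console.log(`${bucket.name} [${bucket.location}] created ${bucket.timeCreated}`));
  })
  .catch(err);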

The code is almost the same to enumerate the (10) objects that we copied to the bucket. API Explorer helps us find the method storage.objects.list:

https://developers.google.com/apis-explorer/#p/storage/v1/storage.objects.list
API Explorer: storage.objects.list

This method requires a bucket name ($BUCKET) *and*, as before, we’ll need to provide it with the authClient object:

// Set this to the bucket created earlier ($BUCKET)
const bucketName = "[[YOUR-BUCKET]]";

const promiseObjects = auth
  .then((authClient) => {
    return new Promise((resolve, reject) =>
      gcs.objects.list({
        auth: authClient,
        bucket: bucketName
      }, (err, objects) => {
        err ? reject(err) : resolve(objects);
      })
    );
  })
  .catch(err);

promiseObjects
  .then((objects) => {
    objects.items.forEach((object) => console.log(object.name));
  })
  .catch(err);
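
The same pattern extends to the other methods listed in API Explorer. For example, a sketch of fetching a single object’s metadata with storage.objects.get (“01” is one of the objects copied earlier; as before, we pass auth and the bucket):

const promiseObject = auth
  .then((authClient) => {
    return new Promise((resolve, reject) =>
      gcs.objects.get({
        auth: authClient,
        bucket: bucketName,
        object: "01"
      }, (err, object) => {
        err ? reject(err) : resolve(object);
      })
    );
  })
  .catch(err);

promiseObject
  .then((object) => console.log(`${object.name}: ${object.size} bytes`))
  .catch(err);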

See the section “Testing” below to run your code.

Solution #2: Using Cloud Client Libraries

See this sample:

https://cloud.google.com/storage/docs/reference/libraries#client-libraries-usage-nodejs

The Cloud Client Libraries equivalent code is very straightforward particularly using the Google-provided sample as a guide.

/* jshint strict: true */
/* jshint esversion: 6 */
/* jslint vars: true */
/* jslint node: true */
"use strict";

(() => {
  const bucketName = "dazwilkin-1708291403";
  const gcs = require("@google-cloud/storage")();

  const err = (e) => console.log(e);

  gcs.getBuckets()
    .then((data) => {
      const buckets = data[0];
      buckets.forEach((bucket) => console.log(bucket.id));
    })
    .catch(err);

  const bucket = gcs.bucket(bucketName);
  bucket.getFiles()
    .then((data) => {
      const files = data[0];
      files.forEach((file) => console.log(file.id));
    })
    .catch(err);
})();
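
Uploads are similarly straightforward with the Cloud Client Libraries. A minimal sketch, assuming a bucket name and a small local file of your choosing (bucket.upload defaults the destination object name to the file’s basename):

"use strict";

const gcs = require("@google-cloud/storage")();

const bucketName = "[[YOUR-BUCKET]]";  // e.g. the $BUCKET created earlier
const localPath = "/tmp/example.txt";  // any small local file

gcs.bucket(bucketName)
  .upload(localPath)
  .then((data) => console.log(`Uploaded ${data[0].name} to ${bucketName}`))
  .catch((e) => console.error(e));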

See the section “Testing” below to run your code.

Testing

You may authenticate using your personal credentials, i.e. those you used to authenticate gcloud. If you wish to do so, you must also run:

gcloud auth application-default login

You may also (or instead) authenticate with a service account. To do so, you must create a service account with sufficient permissions (the role ‘roles/storage.admin’ is sufficient to make the buckets.list and objects.list calls). You must then also set the environment variable:

GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/key.json

Then:

npm start

You may delete the bucket (and all of its objects) when you’re done. Be very careful to specify the correct bucket name as the operation is irrevocable.

gsutil rm -r gs://${BUCKET}

Done!
