Errors are the best, especially when they are written in a way that turns you into a decipherer. I remember the good old days when the error codes I got were just numbers, with maybe a few letters mixed in, and there was no online search to easily look up what they meant.

I’ve been working with Google Cloud products and connecting to services like Storage and BigQuery from my laptop. …


Another brief ‘how to’ for those who need to pull and push to someone else’s upstream GitHub pull request (PR).

I was working on a teammate’s GitHub pull request for Project OCEAN (a current focus of mine) to help close it out. Since her pull request came from her own personal account, there were key commands I needed in order to pull and push directly to it.

Pull Existing PR

To pull an existing PR from an upstream project:

git fetch [NAME REMOTE PROJECT] pull/[PR NUMBER]/head:[PR BRANCH NAME]

Example:
git fetch upstream pull/10/head:pr_change

The command includes the pull request number, as in the example above…
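For the push half mentioned above, here is a rough sketch of sending local commits back to the teammate's PR branch. It assumes you have push access to their fork (for example, they enabled "Allow edits by maintainers"), and the remote name, URL, and branch name are placeholders:

git remote add [TEAMMATE REMOTE NAME] https://github.com/[TEAMMATE]/[REPO].git
git push [TEAMMATE REMOTE NAME] pr_change:[PR BRANCH NAME ON THEIR FORK]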


Are you trying to use Go to get files into Google Cloud Storage without pulling them onto the computer that is running the code or opening them and reading the contents into a new file? If yes, so was I not long ago. Here is a quick PSA to share the code I put together to solve this.

Copy Files to GCS

The following code example gets a file from a URL into GCS without downloading it onto the server running the code.

import (
    "context"
    "fmt"
    "io"
    "net/http"

    "cloud.google.com/go/storage"
)

func storeGCS(url, bucketName, fileName string) error {
    // Create GCS connection
    ctx := …
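A minimal sketch of the full flow, assuming standard use of the cloud.google.com/go/storage client; the error handling and comments are my own:

// storeGCS streams the file at url straight into a GCS object,
// so nothing is written to local disk on the way through.
func storeGCS(url, bucketName, fileName string) error {
    // Create GCS connection
    ctx := context.Background()
    client, err := storage.NewClient(ctx)
    if err != nil {
        return fmt.Errorf("storage.NewClient: %v", err)
    }
    defer client.Close()

    // Request the source file
    resp, err := http.Get(url)
    if err != nil {
        return fmt.Errorf("http.Get: %v", err)
    }
    defer resp.Body.Close()

    // Copy the response body directly into the GCS object writer
    w := client.Bucket(bucketName).Object(fileName).NewWriter(ctx)
    if _, err := io.Copy(w, resp.Body); err != nil {
        return fmt.Errorf("io.Copy: %v", err)
    }
    if err := w.Close(); err != nil {
        return fmt.Errorf("Writer.Close: %v", err)
    }
    return nil
}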


Let’s talk about end user authentication. I’ve been digging into the authentication space a bit and have some takeaways to share.

When you access a GCP service, there is authentication to determine who you are, authorization to determine what you can do, and auditing that logs what you did. IAM is where you set up roles for authorization, that is, what you can do in a project.

In the land of GCP, there are a couple of key ways to use credentials to access/log in/authenticate to different services on the platform.
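As a concrete sketch, with the Cloud SDK installed, user credentials and Application Default Credentials (the credentials client libraries pick up locally) come from two different commands:

gcloud auth login                       # authenticates you for gcloud commands
gcloud auth application-default login   # creates credentials your code and client libraries use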


Hello again. I am sharing a couple of things I learned while working with Docker recently. The main points are how to pass in environment variables, set up the Google Cloud SDK in Docker, and turn on debugging.

Since I haven’t written about Docker before, I’ve provided a brief overview of Docker to give context and grounding for my pointers. I recommend digging deeper elsewhere, especially since there are many other resources out there, like this Docker overview.

What is Docker

It’s a way to build and deploy software and applications in an isolated environment. You can package the code to build…
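To give a flavor of the first pointer, passing environment variables, here is a minimal sketch; the image name my-image and the variable names are placeholders:

docker run -e PROJECT_ID=my-project -e DEBUG=true my-image
docker run --env-file ./app.env my-image     # or load a set of variables from a file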


Oh, the fun I’ve had learning Cloud Functions. It has actually been fun, while also frustrating and definitely enlightening, to explore serverless services that are supposed to make things easy. Granted, I’ve been around long enough to know there is always ramp-up, even when it is simple.

Cloud Functions is a service that lets you run code on Google Cloud servers without needing to deal with server configuration or scaling. It is a pay-as-you-go approach, so you only pay for what you use, which can help optimize costs. I’ve started using it for a…
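To show how little scaffolding is involved, a minimal HTTP-triggered Cloud Function in Go can look like the sketch below; the package and function names are placeholders:

package function

import (
    "fmt"
    "net/http"
)

// HelloHTTP responds to an HTTP request routed to the function.
func HelloHTTP(w http.ResponseWriter, r *http.Request) {
    fmt.Fprintln(w, "Hello from Cloud Functions")
}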


When working with tables in BigQuery, you need an understanding of the dataset structure, whether it is a public dataset or one you set up yourself and want to review. This is a quick bit to share queries you can use to pull metadata on your datasets and tables.

In the following examples, I’m using the BigQuery public Stack Overflow database to demonstrate these commands. Change out the names as needed for the dataset and tables you are working with.

Dataset Metadata

Get a list of all tables in the dataset and the corresponding information.

SELECT *
FROM `bigquery-public-data.stackoverflow`.INFORMATION_SCHEMA.TABLES

Query processed 10MB when run…
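For column-level details on a single table, a similar query against INFORMATION_SCHEMA.COLUMNS works; posts_questions below is one of the Stack Overflow tables and is just an example:

SELECT table_name, column_name, data_type, is_nullable
FROM `bigquery-public-data.stackoverflow`.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'posts_questions'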


Here is a quick overview of how to set up and switch between Google Cloud projects with the SDK on a single machine. This is helpful when you are working with multiple projects (especially when collaborating) and using the Cloud SDK.

Setup & Authenticate

The following steps are needed whether you are creating a project for the first time or logging in to an existing project from a computer where you will do local development. This assumes you will run all commands from a terminal.

Create a new gcloud configuration for your project on the machine you will use to access it.

gcloud config…
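A sketch of the typical flow, where my-config and my-project are placeholder names:

gcloud config configurations create my-config
gcloud config configurations activate my-config
gcloud config set project my-project
gcloud auth login

# Switch to another project later by activating its configuration
gcloud config configurations activate other-config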


When training a neural net model, time is of the essence. This is why different machine configurations including GPUs, TPUs and multiple servers are utilized.

I’ve been exploring the YouTube-8M project for the last couple of months, and there are previous posts about the project, the video dataset, the algorithms, and how to run them in the cloud. For this post, I trained the two algorithms from the getting-started code on different AI Platform standard machine configurations to see how they compared. …


Continuing the YouTube-8M exploration and blog series, this post walks through how to use AI Platform to train, evaluate, and run predictions for this dataset. Not surprisingly, it sets up servers faster than the server I manually configured.

The posts prior to this one provide an overview of the YouTube-8M project, data and computer vision modeling. This research has been used to further computer vision in relation to video datasets over the last several years.

Below, I step through how to set up and run the example code provided by the project on Cloud AI Platform, as well as how to…
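For orientation, submitting a training job to AI Platform generally takes the shape below; the job name, bucket, package path, and module name are placeholders rather than the project's actual values:

gcloud ai-platform jobs submit training yt8m_train_example \
    --package-path=./trainer \
    --module-name=trainer.train \
    --region=us-central1 \
    --job-dir=gs://my-bucket/yt8m-train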

Warrick

Google DRE, feminist, software engineer, ML & DS engineer, filmmaker, public speaker, continuous learner, Alzheimer’s caregiver, D&I advocate
