
When we use the GCP product called Cloud Functions, we are supplying the body of a function that contains the code logic we wish to have executed. By doing this, we are separating ourselves from any concern or implementation of how that function is invoked. It is Cloud Functions that causes the execution of our code when incoming requests arrive. We do not have to develop any form of serving scaffolding. Cloud Functions also takes care of starting up as many instances as we need based on load and scaling down to zero when no calls are in-flight.
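As a minimal sketch (the function name and query parameter are illustrative, not taken from the article), an HTTP-triggered Cloud Function in Python is nothing more than a handler body; the platform supplies the serving scaffolding:

    # main.py - a minimal HTTP-triggered Cloud Function (Python runtime).
    # Cloud Functions passes in a Flask request object and serves the response;
    # we supply only the handler body and name it at deploy time
    # (e.g. --entry-point=handle_request).
    def handle_request(request):
        name = request.args.get("name", "world")
        return f"Hello, {name}!"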

An alternative to Cloud Functions is the service known as Cloud Run. Cloud Run has similarities to Cloud Functions in that it scales to zero and takes care of the startup and shutdown of instances. However, the development model changes considerably. With Cloud Run, it becomes the developer's responsibility to construct a Docker image that will run as a container started by Cloud Run. The container is responsible for running a full REST server, receiving the incoming request and then passing control to the business logic code. …
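By way of contrast, here is a hedged sketch of the Cloud Run side: a small Flask server, packaged into a Docker image, that listens on the port Cloud Run injects through the PORT environment variable (the route and message are made up for illustration):

    # app.py - a minimal web server suitable for packaging into a Docker image
    # and deploying to Cloud Run. The container owns the serving loop and must
    # listen on the port supplied in the PORT environment variable.
    import os

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        # The business logic is invoked from handlers such as this one.
        return "Hello from Cloud Run!"

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))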



Google Cloud Storage (GCS) provides blob storage for data. Files can be uploaded to GCS and subsequently retrieved. The storage is cheap and provides excellent availability and durability. GCS provides APIs in a variety of programming languages that can be used by custom applications, and many of Google's products are pre-built to produce and consume data to and from GCS. Command-line tools such as gsutil also provide scripting access. Web-addressable data can be ingested automatically using the Storage Transfer Service.
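As a quick illustration of those programming APIs, a Python application can read and write GCS objects roughly as follows (the bucket and object names are placeholders):

    # Requires the google-cloud-storage package and application default credentials.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-example-bucket")        # placeholder bucket name

    # Upload a local file as an object (blob) ...
    blob = bucket.blob("reports/2020/summary.csv")
    blob.upload_from_filename("summary.csv")

    # ... and download it again.
    blob.download_to_filename("summary-copy.csv")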

What is not provided by Google in the out-of-the-box GCS story is the ability to access GCS data through any of the file transfer protocols (such as FTP). In this article we describe access to GCS via the Secure File Transfer Protocol (SFTP) and the corresponding SFTP client tools. …



Geocoding is the notion of taking an address and determining information about its location on the Earth, typically the geographic position given by latitude and longitude coordinates. Recently, a CDAP/Data Fusion user had a list of addresses as input and needed to enrich the data with geocoding information. While Google Maps provides an easy-to-use API to perform this task, there was nothing baked into CDAP that would allow us to leverage this API. …
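The article goes on to wrap this in a CDAP plugin; as a standalone, hedged sketch of the underlying call (the API key and address are placeholders), the Maps Geocoding API can be exercised from Python with the googlemaps client library:

    # Requires the googlemaps package and a Maps Platform API key.
    import googlemaps

    gmaps = googlemaps.Client(key="YOUR_API_KEY")      # placeholder key

    # Geocode a street address to obtain its latitude/longitude.
    results = gmaps.geocode("1600 Amphitheatre Parkway, Mountain View, CA")
    if results:
        location = results[0]["geometry"]["location"]
        print(location["lat"], location["lng"])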



When we write a CDAP pipeline, we commonly wish to transform the data received at the source to the data that we wish to expose at the sink. CDAP provides a rich set of features to achieve this task. A common data transformation we find is the need to map data values by lookup. Consider a table that contains US state names and their corresponding two-character code:

  • Alabama -> AL
  • Alaska -> AK
  • Wyoming -> WY

Now imagine we have a source of data in CDAP whose records carry the full state names and that we wish to enrich each record with the corresponding two-character code.
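As a conceptual sketch (the records below are made up, and in CDAP this lookup is configured as a pipeline transform rather than hand-written code), the transformation amounts to:

    # Map a full state name to its two-character code for each incoming record.
    STATE_CODES = {"Alabama": "AL", "Alaska": "AK", "Wyoming": "WY"}

    records = [
        {"name": "Acme Corp", "state": "Alabama"},
        {"name": "Widget Inc", "state": "Wyoming"},
    ]

    for record in records:
        record["state_code"] = STATE_CODES.get(record["state"], "UNKNOWN")

    print(records)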



Cloud Run allows us to expose REST-based services implemented within Docker containers. When we consider a REST service, we find that it can consist of multiple callable operations. This article illustrates how we can secure each operation individually.

Let us work with an example service that handles social media posts.

We might have operations such as:

  • /list — List submissions
  • /get — Get a submission
  • /submit — Submit a new submission
  • /delete — Delete a submission

We quickly see that not all operations should be permitted for all users. For example, we may want to allow everyone to list and get submissions, authors to submit new submissions, and moderators to delete submissions. …
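As one hedged illustration of securing operations individually (not necessarily the mechanism the article settles on), the container itself could verify the caller's Google-signed ID token and apply a role check per route; the roles, email addresses and audience URL below are assumptions:

    # Verify the caller's ID token and gate each operation on a role.
    from flask import Flask, abort, request
    from google.auth.transport import requests as google_requests
    from google.oauth2 import id_token

    app = Flask(__name__)
    AUDIENCE = "https://my-service-xyz-uc.a.run.app"   # placeholder Cloud Run URL

    # Hypothetical role assignments.
    ROLES = {"author@example.com": {"author"}, "mod@example.com": {"moderator"}}

    def caller_roles():
        header = request.headers.get("Authorization", "")
        if not header.startswith("Bearer "):
            return set()
        try:
            claims = id_token.verify_oauth2_token(
                header.split(" ", 1)[1], google_requests.Request(), AUDIENCE)
        except ValueError:
            return set()
        return ROLES.get(claims.get("email"), set())

    @app.route("/list")
    def list_submissions():
        return {"submissions": []}                     # open to everyone

    @app.route("/submit", methods=["POST"])
    def submit():
        if "author" not in caller_roles():
            abort(403)
        return {"status": "submitted"}

    @app.route("/delete", methods=["POST"])
    def delete():
        if "moderator" not in caller_roles():
            abort(403)
        return {"status": "deleted"}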



Google’s Memorystore service provides a fully managed Redis environment. Cloud Run provides a fully managed container hosting environment with automatic scaling. This opens up the opportunity to design solutions hosted by Cloud Run that leverage the services of Memorystore. In this article we will examine what is involved to set that up.

Whenever we create a Memorystore instance, it will be given a private IP address that will be visible to a single VPC network. Further, Memorystore only permits connections from resources that are contained within the same region. …
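As a small sketch of the application side (the environment variable names are assumptions), once the Cloud Run service is attached to a Serverless VPC Access connector in the same region, the container reaches Redis through the Memorystore instance's private IP:

    # Assumes the private IP of the Memorystore instance is passed in at deploy
    # time via an environment variable, and that a Serverless VPC Access
    # connector gives the Cloud Run service a route onto the VPC network.
    import os

    import redis

    r = redis.Redis(
        host=os.environ["REDIS_HOST"],                 # e.g. the 10.x.x.x private IP
        port=int(os.environ.get("REDIS_PORT", "6379")),
    )

    r.set("greeting", "hello from Cloud Run")
    print(r.get("greeting"))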



The Google Cloud Platform provides a service called Cloud Run. This service allows you to supply your own Docker container that listens for incoming REST requests and processes those requests as they arrive. Cloud Run owns the management and scaling of your container instances by increasing or decreasing the number of containers over time as a function of the number of concurrent requests arriving. It will even scale to zero if there have been no requests for a while.

You incur a charge for Cloud Run only while there are active REST requests being processed. If there are no current requests, there is no charge. To achieve this, Google throttles the CPU available to the container when there are no in-flight requests. …
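A hedged sketch of the practical consequence (the Flask route and the background work are illustrative): anything kicked off outside the request path cannot rely on getting CPU once the response has been returned.

    # With the default request-driven CPU allocation, the thread below may make
    # little or no progress after the response is sent and no other requests
    # are in flight, because the container's CPU is throttled at that point.
    import threading

    from flask import Flask

    app = Flask(__name__)

    def slow_cleanup():
        # Any CPU-bound or long-running work here can stall until the next
        # request arrives and CPU is allocated again.
        total = sum(i * i for i in range(10_000_000))
        print("cleanup finished:", total)

    @app.route("/work")
    def work():
        threading.Thread(target=slow_cleanup).start()
        return "response sent; background cleanup is not guaranteed to run promptly"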



We can use Firebase Hosting to host a website. Within the deployed site, we can specify that some or all desired URL paths be directed to Cloud Run for processing. This allows us to generate dynamic, server-side rendered content through applications running in containers. Firebase Hosting is also intrinsically linked with Google's Content Delivery Network (CDN) to provide efficient edge caching. When a browser request arrives at Google, the CDN is consulted to determine whether the content can be served from the edge. If it is not available in the cache, the request is routed to Firebase for serving. …
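As a rough sketch of the shape of that Hosting configuration (the path, service name and region are placeholders), the rewrites entry in firebase.json points a URL pattern at a Cloud Run service; the snippet below simply emits that JSON from Python to show the structure:

    # Write a firebase.json whose "rewrites" section routes /api/** requests
    # that miss the CDN cache to a Cloud Run service.
    import json

    hosting_config = {
        "hosting": {
            "public": "public",
            "rewrites": [
                {
                    "source": "/api/**",
                    "run": {"serviceId": "my-backend", "region": "us-central1"},
                }
            ],
        }
    }

    with open("firebase.json", "w") as f:
        json.dump(hosting_config, f, indent=2)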



Imagine that we wish a managed Memorystore for Redis instance to be available to a number of our projects. How might we achieve this?

One answer is to leverage Shared VPC and Private Service Access. Let us now look at a high-level topology diagram; we will then explain its parts, followed by the recipe to create it.



Let us imagine that we want to perform a GitHub build every night at 2:00am … how might we do this?

One recipe is to create a Cloud Build trigger associated with your GitHub project and then initiate that build on a schedule. We will now look at this in more depth.
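As a hedged sketch of the moving parts (the project, trigger ID and branch are placeholders), the scheduled job ultimately issues an authenticated POST to the Cloud Build trigger's run endpoint; the same call can be exercised directly from Python:

    # Invoke a Cloud Build trigger programmatically. In the recipe, Cloud
    # Scheduler is configured with the cron expression "0 2 * * *" (02:00
    # daily) to POST to this same endpoint using an OAuth token for a service
    # account that has permission to run builds.
    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    PROJECT_ID = "my-project"
    TRIGGER_ID = "my-trigger-id"

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"])
    session = AuthorizedSession(credentials)

    url = (f"https://cloudbuild.googleapis.com/v1/projects/{PROJECT_ID}"
           f"/triggers/{TRIGGER_ID}:run")
    response = session.post(url, json={"branchName": "master"})
    response.raise_for_status()
    print(response.json())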

About

Neil Kolban

IT specialist with 30+ years of industry experience. I am also a Google Customer Engineer assisting users to get the most out of Google Cloud Platform.
