Build a blog application on Google App Engine: Architecture (part 2)

Sébastien Loix
Google Cloud - Community
8 min read · Nov 27, 2018

This is the second part of a multipart tutorial on how to build a small Blog application in Node.js using the Google Datastore and deploy it to Google App Engine. If you haven’t read it yet, jump to the first part where I explain how to set up the project.

In this section, I will explain the architecture of the application and the different modules that will compose it.

Hexagonal Architecture

To build the application, we will put in place what is called the Hexagonal Architecture (also known as the Ports and Adapters architecture). You can find a lot of information out there about the Hexagonal architecture, but I like the definition below from Apiumhub:

Hexagonal Architecture promotes the separation of concerns by encapsulating logic in different layers of the application. This enables a higher level of isolation, testability and control over your business specific code. Each layer of the application has a strict set of responsibilities and requirements. This creates clear boundaries as to where certain logic or functionality should sit, and how those layers should interact with each other.

What this means in our Node.js application is that each layer will follow this pattern:

interface BlogPostDomain {
  someMethod(): boolean;
}

/**
 * Export a function that
 * - accepts arguments (Input)
 * - returns an interface (Output)
 */
export default (ctx: Context, modules: Modules): BlogPostDomain => {
  return {
    someMethod() {
      return true;
    },
  };
};

We can see that our BlogPost Domain layer takes 2 arguments as Input (Context and Modules) and returns (Output) the BlogPostDomain interface containing 1 method (someMethod()). With this approach, it will be very easy for us to test this layer in isolation by providing mock values for our context and modules, as long as they honour the same contract. It will also be very easy to modify or replace this layer, as long as we don’t break the signature defined (the Input and the Output).

Furthermore, if we use Typescript along with this pattern, it will be extremely safe to refactor our application, as Typescript will automatically tell us which part of our code is broken because of a contract that is not respected.
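To illustrate the testability claim, here is a minimal sketch of exercising such a layer with mock dependencies. The Context and Modules shapes below (and the createSlug/slugify names) are simplified placeholders for illustration, not the final application types:

```typescript
// Placeholder shapes, simplified for this example.
interface Context { config: { apiBase: string } }
interface Modules { utils: { slugify(s: string): string } }

interface BlogPostDomain {
  createSlug(title: string): string;
}

// The layer under test: Input (ctx, modules) -> Output (BlogPostDomain)
const initBlogPostDomain = (ctx: Context, modules: Modules): BlogPostDomain => ({
  createSlug(title) {
    return `${ctx.config.apiBase}/${modules.utils.slugify(title)}`;
  },
});

// In a test, we only need objects that honour the same contract:
const mockCtx: Context = { config: { apiBase: "/api/v1" } };
const mockModules: Modules = {
  utils: { slugify: (s) => s.toLowerCase().replace(/\s+/g, "-") },
};

const domain = initBlogPostDomain(mockCtx, mockModules);
console.log(domain.createSlug("Hello World")); // "/api/v1/hello-world"
```

Because the layer never imports its dependencies directly, swapping the real context for these mocks requires no test framework magic at all.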

Application Modules

We will split our application into 4 modules:

  • Blog Module: read, create, edit, delete BlogPosts and Comments.
  • Admin Module: small CMS to list our posts and edit or delete them. It is mainly an interface to manage the entities of our Blog Module.
  • Images Module: upload featured images to Google Cloud Storage or delete them.
  • Utils Module: utility functions. By definition, this module does not depend on any other module, so that all the other modules can depend on it. To simplify the tutorial, this module is already included in the starting branch; feel free to reach out in the comments if you don’t understand something.

Let’s now go into the “modules” folder (remember, all our Typescript code is in the “src/server” folder) and create 3 folders: admin, blog, images. Inside each of those folders, create an “index.ts” file and add the following:

export default () => {
  return {};
};

With that, we have defined the entry point of each module. Let’s now export them from our main “modules.ts” file. Open the “modules.ts” file at the root and make the following modifications:

// modules.ts
import initBlogModule from './modules/blog'; // Add
import initAdminModule from './modules/admin'; // Add
import initImagesModule from './modules/images'; // Add
import initUtilsModule from './modules/utils';

export default () => {
  const utils = initUtilsModule();
  const images = initImagesModule(); // Add
  const blog = initBlogModule(); // Add
  const admin = initAdminModule(); // Add

  return {
    blog, // Add
    admin, // Add
    images, // Add
    utils,
  };
};

Great! …but there isn’t much typing in here yet. Let’s define an interface for each one of the modules and export it. We will then be able to declare a global AppModules Type that will contain all of our 4 modules.

Open the “modules/admin/index.ts” file and make the following modification:

// modules/admin/index.ts
export interface AdminModule {} // Add this line

export default () => {
  return {};
};

Do the same with the 2 other modules.
In “modules/blog/index.ts”, add export interface BlogModule {}.
In “modules/images/index.ts”, add export interface ImagesModule {}.

Now that we have defined an interface for each one of our modules, we can create an AppModules Type that will contain each of them. For that, create a “models.ts” file at the root and add the following:

// models.ts
import { BlogModule } from './modules/blog';
import { AdminModule } from './modules/admin';
import { ImagesModule } from './modules/images';
import { UtilsModule } from './modules/utils';

export type AppModules = {
  blog: BlogModule;
  admin: AdminModule;
  images: ImagesModule;
  utils: UtilsModule;
};

We can now import our AppModules Type in our “modules.ts” file and define it as the return type (the Output).

// modules.ts
...
import initUtilsModule from './modules/utils';
import { AppModules } from './models'; // Add this line
export default (): AppModules => { // Add the return Type
...

Great, we now have our 4 modules declared with their interface exported. We will see each module in detail in a separate blog post. For now, let’s keep building the skeleton of our application.

Application Configuration

Our application will need some configuration to be passed to the layers and modules. We will follow the same approach as for the modules: we will have multiple files for different configurations and then one file that will export them from a single place. First, let’s create a config folder at the root of our server folder and add an “index.ts” file in it. Add the following into the file:

// config/index.ts
export type Config = {};

const config: Config = {};

export default config;

Nothing fancy here: we declare our application Config Type and then export a config object of that Type as the default export. Let’s now create the different configuration files. Inside the “config” folder, create a “common.ts” file and add the following:

// config/common.ts
import joi from "joi";

const envVarsSchema = joi
  .object({
    NODE_ENV: joi
      .string()
      .valid(["development", "production", "test"])
      .required()
  })
  .unknown();

const { error, value: envVars } = joi.validate(process.env, envVarsSchema);

if (error) {
  throw new Error(`Config validation error: ${error.message}`);
}

export type CommonConfig = {
  env: string,
  isTest: boolean,
  isDevelopment: boolean,
  apiBase: string
};

export const config: CommonConfig = {
  env: envVars.NODE_ENV,
  isTest: envVars.NODE_ENV === "test",
  isDevelopment: envVars.NODE_ENV === "development",
  apiBase: "/api/v1"
};

Let’s see in detail what happens here. First, we import Joi, a validator for Javascript objects. Joi validates objects with the help of Schemas. Here we have defined an envVarsSchema with one property in it: NODE_ENV. We specify that NODE_ENV has to be a string and that its valid values are “development”, “production” and “test”. We also marked it as required.

We then ask Joi to validate the process.env global variable against this Schema. If process.env does not contain a NODE_ENV property or if its value is not valid, an error will be thrown at boot time and the program will exit (this is what is called the “Fail-Fast principle”).
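Stripped of Joi, the fail-fast idea boils down to a single check at boot that throws on an invalid environment. The sketch below is a simplified illustration of that principle, not the code used in the project:

```typescript
// Fail-fast: validate the environment once at startup and throw on
// any problem, so the process never runs with an invalid configuration.
const VALID_ENVS = ["development", "production", "test"];

function validateEnv(env: Record<string, string | undefined>): string {
  const nodeEnv = env.NODE_ENV;
  if (!nodeEnv || !VALID_ENVS.includes(nodeEnv)) {
    // Throwing here exits the program at boot time.
    throw new Error(
      `Config validation error: "NODE_ENV" must be one of [${VALID_ENVS.join(", ")}]`
    );
  }
  return nodeEnv;
}

console.log(validateEnv({ NODE_ENV: "development" })); // "development"
```

Joi gives us the same behaviour declaratively, plus defaults, coercion and clear error messages for free.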

Finally, we declare a Type for the common configuration and export a config object of that Type.

Let’s create the other configuration objects for gcloud, logger and server. They will all follow the same pattern so I won’t go into details.

// config/gcloud.ts
import joi from "joi";

const envVarsSchema = joi
  .object({
    GOOGLE_CLOUD_PROJECT: joi.string().required(),
    GCLOUD_BUCKET: joi.string().required(),
    DATASTORE_NAMESPACE: joi.string()
  })
  .unknown();

const { error, value: envVars } = joi.validate(process.env, envVarsSchema);

if (error) {
  throw new Error(`Config validation error: ${error.message}`);
}

export type GcloudConfig = {
  projectId: string,
  datastore: {
    namespace: string
  },
  storage: {
    bucket: string
  }
};

export const config: GcloudConfig = {
  projectId: envVars.GOOGLE_CLOUD_PROJECT,
  datastore: {
    namespace: envVars.DATASTORE_NAMESPACE
  },
  storage: {
    bucket: envVars.GCLOUD_BUCKET
  }
};

And the config for the server:

// config/server.ts
import joi from "joi";

const envVarsSchema = joi
  .object({
    PORT: joi.number().default(8080)
  })
  .unknown();

const { error, value: envVars } = joi.validate(process.env, envVarsSchema);

if (error) {
  throw new Error(`Config validation error: ${error.message}`);
}

export type ServerConfig = {
  port: number
};

export const config: ServerConfig = {
  port: Number(envVars.PORT)
};

…and for the logger:

// config/logger.ts
import joi from "joi";

const envVarsSchema = joi
  .object({
    LOGGER_LEVEL: joi
      .string()
      .allow(["error", "warn", "info", "verbose", "debug", "silly"])
      .default("info"),
    LOGGER_ENABLED: joi
      .boolean()
      .truthy("TRUE")
      .truthy("true")
      .falsy("FALSE")
      .falsy("false")
      .default(true)
  })
  .unknown();

const { error, value: envVars } = joi.validate(process.env, envVarsSchema);

if (error) {
  throw new Error(`Config validation error: ${error.message}`);
}

export type LoggerConfig = {
  level: string,
  enabled: boolean
};

export const config: LoggerConfig = {
  level: envVars.LOGGER_LEVEL,
  enabled: !!envVars.LOGGER_ENABLED
};

Now that we have our 4 configuration files defined, let’s import them in our “config/index.ts” and export them as a single config object.

// config/index.ts
import { config as common, CommonConfig } from "./common";
import { config as gcloud, GcloudConfig } from "./gcloud";
import { config as server, ServerConfig } from "./server";
import { config as logger, LoggerConfig } from "./logger";

export type Config = {
  common: CommonConfig,
  gcloud: GcloudConfig,
  server: ServerConfig,
  logger: LoggerConfig
};

const config: Config = {
  common,
  gcloud,
  server,
  logger
};

export default config;

Environment variables

As we have seen, all the validation for our application configuration is done against the process.env global variable, which is where Node stores the environment variables. Having the application configuration defined this way is one of the principles of the Twelve-Factor App. It allows us to have different configurations according to where the application runs.
When we deploy our application to Google App Engine, we will define the environment variables in the app.yaml descriptor file, but we still have a long way to go before having to worry about that... :)

During development, we will use the very useful dotenv npm package to set the required environment variables. This package looks for a “.env” file and adds its content to the process.env global object at runtime.

You will find an “.example.env” file at the root of the repository; rename it to “.env” and update its variable values (mainly GOOGLE_CLOUD_PROJECT and GCLOUD_BUCKET).

Now create an “env.ts” file in the “config” folder and add the following:

// config/env.ts
import dotenv from "dotenv";

if (process.env.NODE_ENV === "development") {
  /**
   * In development, read the environment variables from the .env file
   */
  dotenv.config();
}

…and import it in “config/index.ts”:

// config/index.ts

// Make sure to import this first!
import "./env";
import { config as common, CommonConfig } from "./common";
...

Great! We now have our application configuration defined. In a normal development process, we would not know in advance all the properties needed in the config object; we would add them incrementally as they are required. Even so, I always follow the same approach when creating the skeleton of my apps: export a single configuration object from a config/index.ts file and inject it into the different layers of the application.
For the purpose of this tutorial, and for simplicity, we added all the configuration in one block.

Datastore emulator

You might have seen in the “.env” file the following variable defined:

DATASTORE_EMULATOR_HOST=localhost:8081

When this variable is set, the datastore instance (from the @google-cloud/datastore library) will not connect to the live Datastore but to the emulator running locally on our machine, allowing us to develop offline. The project has an npm script to launch the local Datastore emulator, which will save the entity data inside our project folder. In a separate terminal window, run the following:

npm run local-datastore
# or
yarn local-datastore

If you have never run the datastore emulator, it will first ask you to install it. After that, you will have a local Datastore emulator to develop against.
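Conceptually, the client library simply checks that environment variable to decide where to connect. The sketch below mimics that behaviour for illustration; it is not the actual library code, and the endpoint URL for the live service is given here only as an example:

```typescript
// Illustration of how an emulator host variable switches the endpoint.
// This mimics the behaviour of @google-cloud/datastore; it is not its code.
function resolveDatastoreEndpoint(env: Record<string, string | undefined>): string {
  if (env.DATASTORE_EMULATOR_HOST) {
    // Variable set: talk to the local emulator over plain HTTP.
    return `http://${env.DATASTORE_EMULATOR_HOST}`;
  }
  // Variable not set: talk to the live Datastore API.
  return "https://datastore.googleapis.com";
}

console.log(resolveDatastoreEndpoint({ DATASTORE_EMULATOR_HOST: "localhost:8081" }));
// "http://localhost:8081"
```

This is why no code change is needed to switch between the emulator and the live Datastore: the decision is driven entirely by the environment.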

And with that, we are done with the application architecture and configuration. I hope you are following along, please let me know of any issue in the comments below.

In the next post, we will create the application context: an object containing application-wide dependencies like the database connection (gstore) or the Google Storage instance. Let’s jump right into it!
