Building a Minimal Static Website Generator and Development Environment with Node.js

Jascha Ephraim
8 min read · Jun 11, 2014

Note: This project was built in 2013, so some of the tools and technologies have changed.

In certain circumstances, static website generators offer clear advantages over dynamic CMSs and blogging platforms. This is often the case for the personal sites of individual developers for whom it is more user-friendly (and efficient, and maintainable, etc.) to manage content in text files rather than nice-looking web forms.

I’ll describe my own minimal static site generator and development environment built using Node.js (inspired by Jeff Escalante’s Roots), specifically for the management of my personal site.

This article will refer to the code in the GitHub repository, which may also be helpful to follow along with.

My Requirements

  • Generate HTML using Jade
  • Generate CSS using Stylus
  • Add content using Markdown files with YAML metadata
  • Write all client-side JavaScript in Node-like CommonJS modules
  • Use Bower for managing third-party client-side JavaScript libraries
  • Everything will eventually be compiled down to static files
  • Include a development server with intelligent LiveReload
  • The development server should serve the compiled code directly from memory without having to actually save it to files


I’ll try to avoid confusion by using the following terminology:

  • the application refers to this Node.js application, which serves and compiles my static site
  • project refers to the set of directories and files—including Jade, Stylus, Markdown, images, and JSON—that the application will be reading and compiling into static files
  • the static site is the end product, generated by the application from the project files

Website structure

My website is a simple portfolio, consisting of a single page that groups short project descriptions with images under a few tags (so that a project can appear under multiple tags). In the end there will be one CSS file and one JavaScript file.

Command line interface

I will use the command stat-gen (for ‘static generator’) to create the default files and directories for starting a new project, to start up the development server, and to eventually export the static files.

Express server

A simple Node.js web server built with Express will serve the site locally during development.


While the server is running, a file watcher will trigger LiveReload upon file changes. Changes to content or JavaScript will cause the browser to refresh, and changes to CSS will be injected into the browser without refreshing.

Creating the Application

After creating a new directory for the application—which I call ‘static-generator’—the package.json file can be generated using npm init, or written manually. It lists the application’s dependencies, which can be installed by running npm install.

It also specifies:

"preferGlobal": "true",
"bin": {
  "stat-gen": "bin/stat-gen.js"
}
This tells npm that this package is intended to be installed globally, and that when it is installed, a symlink should be created (in my case at /usr/local/bin/stat-gen) to bin/stat-gen.js, making it globally executable.

Command Line Interface

Since I work on a Mac, bin/stat-gen.js is written for Unix-like environments. The ‘shebang’ at the top indicates that this script should be interpreted by node. This file must have executable permissions.

The global process.argv is an array of the arguments that launched the node process. So once the application is installed globally, typing stat-gen into the terminal would result in process.argv = [ ‘node’, ‘stat-gen’ ]. As indicated in the code, a particular command can be passed to stat-gen (stored in process.argv[ 2 ], defaulting to ‘start’) which will then require a script with a corresponding name in the commands/ directory (and will pass any remaining arguments to that script).
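That dispatch logic can be sketched as follows; parseCommand is a hypothetical helper name I’m using for illustration, not something from the repository:

```javascript
// Sketch of the dispatch in bin/stat-gen.js (the shebang line
// `#!/usr/bin/env node` sits at the very top of the real file).
function parseCommand( argv ) {
  // argv[ 0 ] is the node binary, argv[ 1 ] is the stat-gen script
  var command = argv[ 2 ] || 'start';   // default command
  var args = argv.slice( 3 );           // remaining arguments
  return { command: command, args: args };
}

var parsed = parseCommand( process.argv );
// The real script then does something like:
// require( '../commands/' + parsed.command )( parsed.args );
```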

‘New’ command

stat-gen new project-name

This command will run commands/new.js, passing the given project name. This uses the ncp package to copy the template/ directory to a given path in order to begin a new project.

‘Start’ command

stat-gen [start [project-dir]]

This command (commands/start.js) compiles the project, starts up the development server, and begins watching the project’s files for changes (all by requiring server.js). If the project directory is the current working directory, one only has to type stat-gen into the terminal.

‘Export’ command

stat-gen export [project-dir] static-export-dir

This command (commands/export.js) compiles the given project just like stat-gen start, but writes the entire site to its final static form instead of starting up a local web server.

This application has a module called compilers (covered next), which manages each of the compiler modules for HTML, CSS, and JavaScript. It has a pipe function which takes as arguments a type string (‘html’, ‘css’, or ‘js’) and a writable destination stream. So the exporting of each file (aside from static files like images, which are just copied) is performed by the asynchronous function (out of context, but should be understandable):

function( next ) {
  var dest_path = dest_dir + '/' + compilers[ file_type ].filename;
  var write_stream = fs.createWriteStream( dest_path );
  write_stream.on( 'finish', next );
  compilers.pipe( file_type, write_stream );
}

Compiler module

As mentioned above, the compilers module manages the compilers for HTML, CSS, and JavaScript. It exposes two functions: the pipe function mentioned above, and compileAll, which calls each compiler’s compile function in parallel.

HTML compiler

The HTML compiler (compilers/html.js) first builds the locals object that will be passed to Jade, making its contents available to the project’s templates. locals.pretty tells Jade whether to generate prettier HTML with whitespace, and it depends on process.env.NODE_ENV—which is set to ‘development’ by stat-gen start and ‘production’ by stat-gen export. locals.imgUrl is a convenience function for templates that takes an image file name and returns the full path to the image after export. locals.livereload_script is either the inline LiveReload JavaScript or an empty string, again depending on process.env.NODE_ENV.
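Building that locals object might look something like this; the '/img/' prefix and the script tag (standing in for the inline LiveReload snippet described above) are assumptions for illustration, though 35729 is LiveReload’s conventional port:

```javascript
// Sketch of how compilers/html.js might assemble locals for Jade.
function buildLocals( env ) {
  return {
    // whitespace-formatted HTML only during development
    pretty: env !== 'production',
    // convenience helper: image file name -> full path after export
    imgUrl: function( filename ) {
      return '/img/' + filename;
    },
    // empty string in production, LiveReload snippet otherwise
    livereload_script: env === 'production'
      ? ''
      : '<script src="http://localhost:35729/livereload.js"></script>'
  };
}

var locals = buildLocals( process.env.NODE_ENV );
```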

The compile function requires the project’s config.json file, which includes the array of tags to be used in the site (this allows me to define their order):

var config_path = process.cwd() + '/config.json';
delete require.cache[ config_path ];
locals.config = require( config_path );

I first delete the cached config in case this compilation is being performed after the watcher detects a change to the config.json file.

Looping through the config’s tags array, the data for each tag is read from the markdown-loader module and added to locals.tags. The data for each “article” (the descriptions with images listed under each tag) is likewise assigned to locals.content from markdown-loader.

Now that all of the site’s content is assigned to locals, the project’s Jade templates can take care of the rest. Since this site is a one-pager, the final compilation is done by:

function( err, html ) {
  if ( err ) throw err;
  module.exports.result = html;
}

CSS compiler

The CSS compiler (compilers/css.js) is much simpler. It reads the project’s styl/app.styl, and passes the contents to the stylus package, which is also set to compress or not based on process.env.NODE_ENV:

stylus( styl )
  .set( 'filename', module.exports.filename )
  .set( 'compress', compress )
  .use( require( 'axis-css' )() )
  .render( function( err, css ) {
    if ( err ) throw err;
    module.exports.result = css;
  } );

JavaScript compiler

As mentioned in the requirements section above, all of the project’s JavaScript will be written as CommonJS modules, necessitating the use of the browserify package. By also using the debowerify package, I can require third-party JavaScript libraries from my project’s bower_components/ directory (as long as they fulfill the necessary requirements). I also use uglify-js to compress the resulting JavaScript.

Once everything is configured in compilers/js.js, the actual compilation is done using a stream module (covered next) that I use for convenience, which creates a writable stream that receives the bundled JavaScript from Browserify and—like the other two compilers—assigns it to module.exports.result:

var compiled_js = '';
var save_stream = stream.writable( function( js_buf, enc, next ) {
  compiled_js += js_buf.toString();
  next();
}, function( err ) {
  if ( err ) throw err;
  var js = process.env.NODE_ENV === 'production'
    ? uglify.minify( compiled_js, { fromString: true } ).code
    : compiled_js;
  module.exports.result = js;
} );
browserify.bundle().pipe( save_stream );

Stream2 Convenience Module

The stream module mentioned above isn’t super relevant to this particular project, but it’s a simple, convenient module that I end up using often. It allows me to create stream2 streams in one function call.

For instance, in the excerpt from compilers/js.js above, a writable stream is created by calling the function stream.writable. The first argument is a function that handles each streamed chunk, and the second argument is a function that is called when the stream is finished.

compilers/index.js also makes use of this module’s function readString, which takes a string as its argument and returns a readable stream that pumps it out.

Markdown Loader

markdown-loader.js reads the Markdown files (in parallel) that define the site’s tags and content items, parses them, and stores the resulting data for other modules to use.

It also exports a single function, load, which reads and parses everything in the project if no arguments are passed, or reloads a single file if passed an array identifying it: for example, markdown_loader.load( [ 'tags', 'example' ] ). This way, when the watcher detects a change to a particular file, only that file is reread before the templates are recompiled and the browser is refreshed.
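The front-matter splitting at the heart of the loader can be sketched without the real YAML and Markdown dependencies; the parsing here is deliberately naive (flat string values only), and parseSource is my own illustrative name:

```javascript
// Split a Markdown source string into its YAML metadata block
// (delimited by --- lines) and its Markdown body.
function parseSource( src ) {
  var match = /^---\n([\s\S]*?)\n---\n?([\s\S]*)$/.exec( src );
  if ( !match ) return { meta: {}, body: src };
  var meta = {};
  match[ 1 ].split( '\n' ).forEach( function( line ) {
    var idx = line.indexOf( ':' );
    if ( idx > -1 ) {
      meta[ line.slice( 0, idx ).trim() ] = line.slice( idx + 1 ).trim();
    }
  } );
  return { meta: meta, body: match[ 2 ] };
}
```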

Express Server

server.js calls compilers.compileAll, and in the callback starts up a simple Express server, begins listening for LiveReload using the tiny-lr package, and starts up the file watcher.

The routes for the server are very simple, since this will never be used in production. They are set up to mimic the simple structure of the exported site. Static files in the project’s static/ directory (static/img/, static/fonts/, etc.) are served directly (at /img/, /fonts/, etc.):

server.use( express.static( 'static' ) );

The compiled and compressed CSS and JavaScript are served at /app.min.css and /app.min.js, and anything else routes to the single HTML page:

server.get( '/' + compilers.css.filename, function( req, res ) {
  respond( 'css', res );
} );
server.get( '/' + compilers.js.filename, function( req, res ) {
  respond( 'js', res );
} );
server.get( '*', function( req, res ) {
  respond( 'html', res );
} );

function respond( file_type, res ) {
  res.type( file_type );
  compilers.pipe( file_type, res );
}
File Watcher

The last module in the application is the file watcher (watcher.js), which uses the gaze package. At the time of this writing, Gaze is at version 0.6.4, which behaves erratically on my operating system for whatever reason. This application uses version 0.5.1, which works flawlessly for me.

When Gaze detects a change, the changed file’s extension is passed to a switch statement. If a Markdown file was changed, markdown-loader rereads and re-parses the file, the project’s Jade is recompiled into HTML, and a refresh request is sent to the browser by tiny-lr. If a Stylus or JavaScript file is changed, the appropriate files are recompiled and a refresh request is sent. The default condition—meaning that an image, font, etc. has been added or changed—simply sends a refresh request.


If you’re just getting into Node.js, I hope that this article helps by covering a simple application that involves a number of components—creating a command line tool, a simple web server, etc.

Feel free to check out the code from GitHub and use it if you are building a website that fits within the narrow constraints of this application.

If you have suggestions for correcting or improving the code, feel free to let me know on GitHub, and if you have suggestions for correcting or improving this article, please leave a comment.