How I Started a JavaScript Project in 2016 With Bare Minimum of Tools

The JavaScript environment seems like a scary place these days. People with a bare-to-moderate understanding of JavaScript and front-end engineering share articles just to make fun of it. If you take them seriously, you will likely give up on starting a new application from scratch before even trying. However, it doesn’t have to be so complicated.

This year I had the opportunity to build a custom single-page app from scratch. With my “Stick to basics” approach (similar to the one Addy Osmani shares) I was able to set up something that boosts productivity, allows for solid quality of the code base, is future-proof, and is fun to work on. Let’s review this solution.

Disclaimer: We are talking here about a custom application, not a standard web site. I intentionally do not include any frameworks or libraries, but it is perfectly possible to add them later on, although the architecture will not allow you to manage library versions with NPM. I am focused on the latest versions of all major browsers, as well as IE11, but there is no technical reason to drop support for older browsers should that be required.

A Couple of Principles Beforehand

I decided to stick to the following principles before taking decisions about the configuration and the architecture of the application:

  • “Only tools with great value” principle: Use a transpiler only if it is really worth it and it is possible to debug the original code in the browser. Do not include one just because it is cool to write in EcmaScript 6.
  • Productivity principle: If using a transpiler, make sure the build time is less than 0.1 seconds. I love the productivity front-end engineering brings — write code, see the changes in the browser immediately, repeat. I want to keep that instead of watching cat videos on YouTube between builds.
  • “Simple configuration” principle: If possible, do not include one of the big and modern front-end build tools. Avoid using a ton of plugins that lead to configuration hell. Stick to pure NPM instead.
  • “Stick to standards” principle: Use whatever is part of the standard. Add polyfills if necessary, but avoid custom libraries.

With these principles in mind, let’s see the decisions I needed to make one at a time.

On EcmaScript 6

The first decision to be taken when starting a new JavaScript project is whether to support EcmaScript 6, to what extent, and how. (On a more abstract level, I think the main question will remain the same in the years to come. We should only replace EcmaScript 6 with whatever kind of JavaScript is trendy.)

I had the opportunity to work with Babel and full-blown EcmaScript 6 support on a project before starting the new one. My conclusion is that there is one game-changing feature in EcmaScript 6: modules. They allow you to scale your app without making a mess out of your dependencies. Require.js was a great idea in the past, but its time is almost over. I don’t want to wrap my code in closures only for the sake of dependency management. Also, EcmaScript 6 modules are a standard, and they are here to stay, so I don’t really need to discuss any other benefits. This aligns with my “Stick to standards” principle.
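To make the contrast concrete, here is a sketch of the closure-based wrapping the author wants to leave behind, next to its EcmaScript 6 module equivalent. The `cart` module is a hypothetical example, not code from the article’s project:

```javascript
// Before ES6 modules: wrapping code in a closure (an IIFE) just to get
// dependency isolation. `cart` is a hypothetical module name.
var cart = (function () {
  var items = []; // hidden from the outside

  function add(item) {
    items.push(item);
  }

  function count() {
    return items.length;
  }

  return { add: add, count: count };
})();

cart.add('book');
console.log(cart.count()); // 1

// With ES6 modules, the file boundary is the closure. The same module
// becomes (in a file cart.js):
//
//   let items = [];
//   export function add(item) { items.push(item); }
//   export function count() { return items.length; }
//
// and a consumer simply writes: import { add, count } from './cart.js';
```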

There are a couple of other great features in EcmaScript 6, including arrow functions, promises, multi-line string support (finally!), the for-of loop, and constants. But none of them is a game-changer (well, maybe promises are, but they can be supported with a polyfill instead of a transpiler). I will use them when all browsers support them, but do I need to include a transpiler just for the sake of writing =>? I have written function() {} almost all my career, so I don’t mind doing it again in 2016.
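For most call sites the two spellings really are interchangeable, which is the author’s point (the one real semantic difference, lexical `this` binding in arrows, rarely matters for plain callbacks). A small illustration:

```javascript
// The ES6 arrow version would be:
//   var double = (n) => n * 2;
// The ES5 version the author is happy to keep writing:
var double = function (n) {
  return n * 2;
};

// Same story for callbacks:
var doubled = [1, 2, 3].map(function (n) {
  return n * 2;
});
console.log(doubled); // [2, 4, 6]
```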

There are some controversial features as well. The let keyword is a good addition for someone who expects JavaScript to have block scope, but is otherwise unnecessary. We can all stick to function scope instead.
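The classic case where `let` helps is closing over a loop variable; with function scope only, the established ES5 workaround is an IIFE that captures each iteration’s value:

```javascript
// With `let`, each loop iteration would get its own binding. The ES5
// workaround is an immediately invoked function that captures `i`:
var callbacks = [];
for (var i = 0; i < 3; i++) {
  (function (n) {
    callbacks.push(function () { return n; });
  })(i);
}
console.log(callbacks.map(function (cb) { return cb(); })); // [0, 1, 2]
// Without the IIFE, every callback would close over the same `i`
// and each would return 3.
```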

There is one really bad feature in EcmaScript 6, however: classes. Classes in JavaScript are even worse than classes in other languages due to their lack of private members. Also, JavaScript, being a language that supports encapsulation through closures, first-class functions, object literals, and constructor functions, seems too powerful to downgrade with classes. This topic, however, is out of the scope of this article. The bottom line is that I wanted to avoid supporting classes in my app.
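What closures give you that 2016-era classes do not is genuinely private state. A factory-function sketch (the `createCounter` name is a hypothetical example, not from the article’s code base):

```javascript
// A factory function with closure-private state: nothing outside can
// touch `count` except through the returned methods.
function createCounter() {
  var count = 0; // truly private, not a property on the object

  return {
    increment: function () { count++; },
    value: function () { return count; }
  };
}

var counter = createCounter();
counter.increment();
counter.increment();
console.log(counter.value()); // 2
console.log(counter.count);   // undefined: the internals are unreachable
```

A `class` with a `this.count` field would expose the same state as a plain, writable property.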

Welcome to Sourcemaps

I’ve always been against using transpilers, or any kind of language that compiles to JavaScript, because you have to debug different, poorly formatted code in the browser. Enter the world of sourcemaps, though, and this is not an issue anymore. So I was free to add any transpiler that supports EcmaScript 6 modules and emits sourcemaps.

On Using Babel

Initially, I was fine with using Babel. However, when I added it to the project, it failed with the following error: Uncaught ReferenceError: require is not defined. It turns out that Babel only transpiles the syntax and requires you to add a separate module loader or bundler. You can refer to this great answer on StackOverflow for more information. Sticking to my “Simple configuration” principle, I decided against running both Babel and an additional module loader just to support EcmaScript 6 modules. I wanted a single solution. In the same StackOverflow answer, one of the options provided for supporting EcmaScript 6 modules is Rollup.js + its Babel plugin.


It turned out that Rollup.js is just what I needed — a small transpiler that supports only EcmaScript 6 modules. You can add Babel plugins if you need to support more features, but I didn’t really find any motivation to do so. Rollup.js does not have as large a community as Babel, so I expected to have some problems with that. Google searches do not show many results. However, it turned out that Rollup works really well, because it does only one single thing — bundling EcmaScript 6 modules into EcmaScript 5 code. So I decided to use it, and I am happy with that decision.
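If you do later decide you want more EcmaScript 6 syntax, the escape hatch is wiring Babel into the Rollup config. A sketch, assuming the rollup-plugin-babel package and the pre-1.0 Rollup config format (entry/dest) that the article uses; option names may differ in your versions:

```javascript
// rollup.config.js — sketch only; assumes rollup-plugin-babel is installed.
import babel from 'rollup-plugin-babel';

export default {
  entry: 'src/js/main.js',
  dest: 'dist/bundle.js',
  plugins: [
    babel({
      exclude: 'node_modules/**' // only transpile our own source files
    })
  ]
};
```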

On Building Tools, And Using Pure NPM

Now that I am using a transpiler, the next question is obvious — how to build the project.

Rollup comes with its own command-line interface, but of course I needed to add a couple more things, like SASS, browsersync for updating the CSS without refreshing the browser, tools for copying static resources from src/ to dist/, etc. I guess that’s why people use Webpack, Gulp or Browserify instead of searching and adding NPM packages one by one. However, I wanted to stick to the basics. What opened my eyes regarding NPM were several articles showcasing the usage of NPM as a build tool, and the scripts functionality in particular.

Using Scripts in package.json

Basically, you can define any script in your package.json:

"scripts": {
  "rollup": "rollup --config --watch --sourcemap"
}

And call it from the terminal with npm run rollup, which is equivalent to calling rollup --config --watch --sourcemap.

This is the way I transpile the JavaScript files in src/ to dist/bundle.js. By the way, if you are interested in the rollup.config.js, here it is:

export default {
  entry: 'src/js/main.js',
  dest: 'dist/bundle.js'
};

Having understood the scripts feature of NPM, everything becomes pretty straightforward. Like copying the index.html from src/ to dist/ with npm run copy-html:

"scripts": {
  "copy-html": "copyfiles -u 1 src/index.html dist"
}

Or building and watching SASS with npm run build-sass and npm run watch-sass:

"scripts": {
  "build-sass": "node-sass --include-path scss src/sass/main.scss dist/styles.css",
  "watch-sass": "nodemon -e scss -x \"npm run build-sass\""
}

Or starting browsersync with npm run browsersync:

"scripts": {
  "browsersync": "browser-sync start --files dist/styles.css --server dist --port 3001"
}

It turns out that browsersync starts its own HTTP server, so I didn’t even need to add http-server, although I added it later to test CORS.

So, in the end, I had a handful of scripts in my package.json — eight, to be precise:

"scripts": {
  "rollup": "rollup --config --watch --sourcemap",
  "copy-html": "copyfiles -u 1 src/index.html dist",
  "copy-static-resources": "copyfiles -u 1 src/index.html src/img/*/* dist",
  "delete-images": "rm -R dist/img",
  "build-sass": "node-sass --include-path scss src/sass/main.scss dist/styles.css",
  "watch-sass": "nodemon -e scss -x \"npm run build-sass\"",
  "watch-html": "nodemon --watch src/index.html -x \"npm run copy-html\"",
  "browsersync": "browser-sync start --files dist/styles.css --server dist --port 3001"
}

The only task that remained was to trigger them all at once.

Triggering All Scripts Concurrently

That was the moment I created the start script:

"scripts": {
  "start": "npm run rollup | npm run delete-images | npm run copy-static-resources | npm run build-sass | npm run watch-sass | npm run watch-html | npm run browsersync"
}

npm run start, or its shortcut, npm start, builds the sources from src/ to dist/ and triggers browsersync, so the application is available on http://localhost:3001. In addition, it watches the index.html, all SASS files, and all JavaScript files and rebuilds them automatically on change. It also detects changes in dist/styles.css, so I have live updates for styles. Basically, after running npm start I don’t need to do anything else in this terminal, unless rollup fails.

On OS X and Linux, you can (ab)use the pipe operator | to run tasks concurrently — every command in a shell pipeline is started at the same time. On Windows, you will need a different solution. See this: How can I run multiple NPM scripts in parallel?
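One cross-platform option (an assumption on my part — the article’s skeleton does not use it) is the npm-run-all package, whose --parallel flag runs scripts concurrently on Windows as well:

```json
"scripts": {
  "start": "npm-run-all --parallel rollup watch-sass watch-html browsersync"
}
```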

On Using a Template Engine

The third and final important decision all developers have to make when creating a 2016-ish JavaScript application is which template engine to use.

Unfortunately, I was not able to make this decision. Whether I was overwhelmed by the huge number of choices, or I just didn’t need a template engine in the initial phase of the project, the fact is that, to this day, all my HTML lives in index.html.

Initially, I was impressed by the opportunity to use HTML imports. However, they have been dropped from the standards track, so they were a no-go for me.

React.js makes the HTML look really sweet, but adding React only for the sake of separating HTML into different logical parts is definitely overengineering.

There is a template HTML tag in the new standards, which sounds great. IE11 can be supported with a polyfill, so there is no reason not to use it. However, the functionality this tag provides is very basic. It does not eliminate the need to write DOM code yourself to handle component logic. So it does not really replace a full-blown template engine.
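To see what the template tag leaves on your plate, here is a toy sketch of the data-binding half that a real engine provides. The `render` helper and its `{{key}}` syntax are hypothetical illustrations, not part of the article’s project:

```javascript
// A minimal string-interpolation function — the part of a template
// engine that the <template> tag does not give you. Toy example only:
// no escaping, no loops, no conditionals.
function render(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
    return data[key] !== undefined ? data[key] : '';
  });
}

var html = render('<li>{{name}}: {{price}}</li>', { name: 'Book', price: 10 });
console.log(html); // <li>Book: 10</li>
```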

Whether I will add a template engine as part of this project skeleton in the future is yet to be seen. I hope some kind of a standard is in the making, so it will align with my principles.

Going Further

You can take a look at the entire skeleton of the project on GitHub. I think it will be relevant in the years to come, at least in 2017. One drawback is that you cannot easily manage external dependencies with the given configuration and tools. I don’t find this disappointing, since I don’t use a lot of libraries in general. I added the Promise polyfill and axios directly into the index.html (axios will most probably be replaced with the Fetch API in the near future). You don’t have to be ashamed of using the good old script tag from time to time.
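For what the “good old script tag” approach looks like, here is a sketch of the tail of index.html. The CDN paths are placeholders, not the actual URLs from the project — pin whichever versions you use:

```html
<!-- Placeholder paths; substitute real, version-pinned CDN URLs. -->
<script src="https://cdn.example.com/es6-promise/es6-promise.auto.min.js"></script>
<script src="https://cdn.example.com/axios/axios.min.js"></script>
<script src="bundle.js"></script>
```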

So there it is: a beautiful boilerplate for starting a new JavaScript application, without overwhelming and complicated tools and configuration. If this is similar to your way of doing JavaScript development, feel free to take some ideas for yourself. You can also find me on Twitter for further discussions.
