One year ago, we embarked on a daring journey: developing a Node.js application for the very enterprise-y field of digital signatures.
The system is basically a server exposing RESTful APIs that allow users to create and manage signature workflows. Long story short, we succeeded (http://www.validsign.com/), but it wasn’t always a smooth ride, and we learned some lessons along the way that we think will be helpful to anyone looking to use Node.js in enterprise environments.
Lesson 1: Security
As our application domain involves the secure signature of documents, security requirements are a big deal.
For instance, we had to comply with the OWASP Secure Coding Practices Quick Reference Guide: https://www.owasp.org/images/0/08/OWASP_SCP_Quick_Reference_Guide_v2.pdf. Despite being a “quick” reference, it contains a LOT of rules, some of which are quite vague or not really relevant to us. For our purposes with Node.js, we will discuss just two aspects: input sanitization and dependency security.
For input sanitization, we used the Joi module to write validation rules for each and every endpoint of our REST APIs, validating, sanitizing, trimming and converting everything that comes from the network into a set of predefined types. One cool side effect of such a formal definition of our input types is that we were able to generate a detailed Swagger file with the help of the hapi-swagger module. Swagger (a.k.a. OpenAPI) files are a simple yet powerful way to formally describe RESTful APIs, and they gave us the ability to generate a documented API explorer page. Moreover, we used a query builder, knex, to avoid SQL injection as much as possible.
For the record, we used hapi.js to implement our REST routes.
Dependency security was addressed with Snyk, an excellent auditing tool that also gives you the ability to patch vulnerable modules instead of upgrading them. This is really useful if you have shrinkwrapped dependencies and prefer not to (or cannot) upgrade. This also brings us to the second point.
Lesson 2: Developing and testing against a very specific version of everything
Another thing I discovered about the enterprise world is that, more often than not, you have to ensure that the product will work on very specific versions of everything, as enterprise software needs guarantees about stability at each release (think Red Hat).
Creating this very specific development environment was very expensive: when a new team member joined the project, at least one day was spent setting up the environment. We solved this with docker-compose: we created our tailored Dockerfiles and described how they interact in a simple YAML file. This saved us when our customer ran the full end-to-end test suite on their Galera infrastructure and it managed to bring down the *entire* cluster.
We could not reproduce the issue in our environment, but with docker-compose we were able to set up a simple Galera cluster, run the tests and ultimately find the bug (a query running outside the transaction it was supposed to run in).
Essentially, a developer types “docker-compose up” and everything is set up and ready to go. It wasn’t always this smooth: we ran into issues in the beginning because we treated containers like virtual machines and did not understand how volume mapping worked, but the final result was satisfactory.
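For flavor, a compose file along these lines is all it takes (service names, images and pinned versions here are illustrative, not our actual setup):

```yaml
version: "2"
services:
  app:
    build: .                 # our tailored Dockerfile for the Node.js app
    ports:
      - "3000:3000"
    depends_on:
      - db
    environment:
      DB_HOST: db
  db:
    image: mariadb:10.1      # version pinned to match the target stack
    environment:
      MYSQL_ROOT_PASSWORD: example
```

Pinning image versions in one shared file is what makes the “very specific version of everything” requirement reproducible on every developer’s machine.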
By the way, I gave a short talk about docker-compose. If you are interested (and comfortable with Italian), check this out: https://blog.linkme.it/adventures-with-docker-compose-and-node-js-1e11f98ae462#.dvlsb9jtc
Lesson 3: Looks like Node.js modules do everything
This is more a celebration of the Node.js ecosystem than a “lesson learned.”
There was an existing and very battle-tested Java client for interacting with the signature backend, and I was terrified at the idea of porting it to Node.js. But as I desperately searched for “node java” on npm, this showed up: https://www.npmjs.com/package/node-java. And it worked. It is still important to pay a lot of attention to the quality of the packages you are going to use; there are tools out there that help with this, such as Snyk. Another tool we would like to try in the near future is npm Enterprise.
Lesson 4: TypeScript is a friend
OK, you can edit the code and run the tests to see if something breaks, but wouldn’t it be cool to just let the compiler work for you? Yes, it is, and I love it. Your functions’ parameters and return values have implicit types that you should document anyway, so why not let the machine know about them so it can help?
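A tiny illustration of the point, with a hypothetical domain type (the names are made up for this example):

```typescript
// Once a domain type is declared, the compiler enforces it everywhere.
interface Workflow {
  id: string;
  signers: string[];
  completed: boolean;
}

function pendingSigners(w: Workflow, signedBy: string[]): string[] {
  // The compiler knows both arrays hold strings; no doc comment needed.
  return w.signers.filter((s) => signedBy.indexOf(s) === -1);
}

const w: Workflow = { id: 'wf-1', signers: ['alice', 'bob'], completed: false };
const pending = pendingSigners(w, ['alice']);
// pendingSigners(w, 42) would be rejected at compile time, before any test runs.
```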
Actually, this wasn’t so easy, because the type definition (*.d.ts) files for dependencies are often outdated, and sometimes we had to play dirty and manually modify typings to make things work. In the end, however, TypeScript has been invaluable in helping us maintain the codebase. It’s not perfect, but it’s easy to pick up and allows you to incrementally refine your typings as your software (and TypeScript itself) matures. The tooling is also great and getting better every day.
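When a dependency ships no typings at all (or hopelessly outdated ones), a minimal hand-rolled declaration file can unblock you; the module and function names here are hypothetical:

```typescript
// typings/legacy-signer.d.ts -- a minimal declaration for a hypothetical
// untyped dependency; refine it incrementally as your usage grows.
declare module 'legacy-signer' {
  export interface SignOptions {
    certificatePath: string;
    reason?: string;
  }
  export function sign(document: Buffer, options: SignOptions): Buffer;
}
```

This is exactly the “incremental refinement” mentioned above: start with just the functions you call, and tighten the types as you learn more about the library.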
As Atom is our editor of choice, we worked with the atom-typescript plugin. It was an enjoyable experience, as it provides functionality such as autocomplete, type information on hover, finding usage references and more.
We are planning to try other editors and IDEs to see what else is out there to help us in our quest to improve our TypeScript tooling, also because atom-typescript development seems to have slowed to a halt. Let’s also not forget tslint, an honest and solid TypeScript linting tool.
We also witnessed the evolution of external typings managers: from the now-deprecated tsd, to typings, to the rise of the @types organization on npm, which lets you install type definitions from the npm registry with a simple npm i @types/<insert package name>.
So, is Node.js mature enough to be safely and productively used in an enterprise context? Absolutely yes, as long as you choose the right tools. For us, the combination of hapi.js + docker-compose + TypeScript + Snyk worked really well. If you are bringing Node.js into the enterprise, it is imperative to invest a little time in tailoring the right environment for your needs.