The Skaffold team recently announced an exciting new feature that syncs files between your machine and the development container, a big step forward in supporting interpreted languages like Node.js.

Skaffold is a command-line tool for developing applications against a Kubernetes cluster (either a local Minikube or a remote cluster). Skaffold handles the build, push, and deploy process for the image upon each code change. Until today, Skaffold was (IMHO) not well suited to interpreted languages like Node.js, due to the inherent slowness of that process. With version 0.16.0, Skaffold supports a hybrid approach that takes advantage of the usual auto-reload mechanisms used by Node.js developers (e.g., nodemon):

  • When a JS file changes, Skaffold syncs it into the container and nodemon restarts the app (see the configuration sketch after this list)
  • When a file change requires…
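
For context, the sync stanza in skaffold.yaml might look like the following minimal sketch. This is illustrative only: the image name, glob, and manifest path are assumptions, and the exact apiVersion string depends on the Skaffold release.

    apiVersion: skaffold/v1alpha4   # exact version depends on the Skaffold release
    kind: Config
    build:
      artifacts:
        - image: my-node-app        # illustrative image name
          sync:
            # Files matching the glob are copied straight into the running
            # container instead of triggering a rebuild; nodemon then
            # restarts the app inside the container.
            '*.js': .
    deploy:
      kubectl:
        manifests:
          - k8s/*.yaml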


Use your preferred Git workflow to achieve fast and error-free environment promotion with Docker

TL;DR Deploy Docker FastPath in your build system.

One of the most useful pieces of information I learned from the classic Continuous Delivery by Humble and Farley (go get your copy now!) is to Only Build Your Binaries Once.

Many build systems use the source code held in the version control system as the canonical source for many steps. The code will be compiled repeatedly in different contexts […]. Every time you compile the code, you run the risk of introducing some difference
[J. Humble, D. Farley, Continuous Delivery, p. 133]

You may notice that the book has been…
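
To make the principle concrete: promoting an image from one environment to the next amounts to retagging and pushing the artifact that was already built, never rebuilding it from source. The commands below are a generic sketch of that pattern (registry and tag names are made up), not the Docker FastPath tooling itself:

    # Build the image exactly once, tagged with the commit SHA
    docker build -t registry.example.com/myapp:3f5a2c1 .
    docker push registry.example.com/myapp:3f5a2c1

    # Promote the very same image to staging by retagging; the binary
    # artifact is never rebuilt along the way
    docker tag registry.example.com/myapp:3f5a2c1 registry.example.com/myapp:staging
    docker push registry.example.com/myapp:staging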


Facebook recently released Yarn, a new Node.js package manager built on top of the npm registry, massively reducing install times and shipping a deterministic build out of the box.

Determinism has always been a problem with npm, and solutions like npm shrinkwrap do not work well. This makes it hard to use an npm-based workflow with multiple developers and in continuous integration. Moreover, npm's slowness with complex package.json files leads to long build times, a serious blocker when using Docker for local development.

This article discusses how to use Yarn with Docker for Node.js development and deployment (a Dockerfile sketch follows the TL;DR below).

xkcd's take on installing code

TL;DR

  • Clone the…
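
The core idea can be sketched in a Dockerfile: copy package.json and yarn.lock before the application source, so the layer produced by yarn install is cached and reused for as long as the dependencies do not change. A hedged sketch (base image, paths, and entry point are assumptions):

    FROM node:6

    WORKDIR /app

    # Copy only the dependency manifests first: this layer, and the yarn
    # install below, are rebuilt only when the manifests change.
    COPY package.json yarn.lock ./
    RUN yarn install

    # Copy the application source afterwards; source edits no longer
    # invalidate the cached dependency layer.
    COPY . .

    CMD ["node", "index.js"]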


Speed matters, up to a point

The Internet

A rock-solid internet connection is the essential prerequisite for effective remote working. But what constitutes a good connection? As you can imagine, speed is important, but it is not the whole picture. Several factors contribute to your remote-working experience, and not all of them are captured by the speed reported by online measurement tools.

Speed

The faster, the better, as people reached by Google Fiber can attest. The rest of us need some minimum requirements against which to rate our connections. Your mileage may vary, but in short, as a remote worker you need a fast connection to:

  1. Download or upload bulky content (design files…


At Clevertech, we built an integrated technology stack based on Docker containers that supports local application development, building, testing, and deployment in the cloud. Containers offer big advantages in software development, quality assurance, and software deployment: namely consistency, reliability, and scalability. In particular, scalability is built in from the beginning, so the system is ready to grow as the application gains traction.

What are containers?

Docker containers are an evolution of virtual machines, which are the foundation of cloud environments like Amazon AWS, and they provide similar benefits in a lightweight and efficient way. …


Speed up the building process of your Node.js project

There are a lot of tutorials about using Travis CI to build a Node.js project. However, when I started using Travis for our projects, what I really wanted to read about was performance issues, that is, how to achieve a fast build.

This post details the best practices you can use to speed up the build process. In particular, you can move faster by using Travis's new container-based infrastructure and by caching the npm dependencies. These techniques can cut your build time by more than half.
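
As a sketch, a .travis.yml along these lines enables both optimizations (the Node version and the cached directory are assumptions):

    language: node_js
    node_js:
      - "6"

    # sudo: false opts the build in to the container-based infrastructure
    sudo: false

    # Persist node_modules between builds, so dependency installation is
    # mostly a cache hit instead of a full download
    cache:
      directories:
        - node_modules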

The basics: Project boilerplate + GitHub repo

We start by creating a boilerplate Node.js project


Lean methodologies allow us to develop reliable software, and to do so more quickly. This must be coupled with a software delivery process that runs smoothly and keeps pace with software development.

DevOps (Development + Operations) is an interdisciplinary approach that encompasses software development and IT infrastructure operations. DevOps advocates a holistic approach: lean methodologies are applied to every aspect of company activity, from software development, to cloud service operations, to maintenance and support.

The common goal is automation. By automating processes, it is possible to reduce the time spent handling basic operations, like manually deploying the latest version of the application to the development server, preparing a cloud server for a live release, or monitoring the application's health. This allows team members to focus solely on development and testing.

A…


One of the main obstacles to the adoption of Ethernet technology in carrier-grade metropolitan and wide-area networks is the large recovery latency, in case of failure, due to spanning tree reconfiguration. In this paper we present a technique called Bounded Latency Spanning Tree Reconfiguration (BLSTR), which guarantees a worst-case recovery latency in the case of single faults by adopting a time-bounded bridge port reconfiguration mechanism and by eliminating the bandwidth-consuming station discovery phase that follows reconfiguration. BLSTR does not replace the Rapid and Multiple Spanning Tree reconfiguration protocols, which remain in control of network reconfiguration; rather, it operates in parallel with them.

Martino Fornasa, Michele Stecca, Massimo Maresca, Pierpaolo Baglietto, “Bounded latency spanning tree reconfiguration”, Computer Networks, Volume 76, 15 January 2015, Pages 259–274.

Get the full paper at: http://www.sciencedirect.com/science/article/pii/S1389128614003934.


Cloud computing has taken over many areas of computing and communication systems, in particular in companies with large distributed computer systems and databases. Cloud computing is based on both dynamic computer system virtualization and dynamic network virtualization. In this article we focus on dynamic network virtualization, and more specifically on Software Defined Networking (SDN) technology, which essentially involves a shift from increasingly "smart" routers/switches to much simpler routers/switches that provide only data-plane functionality, while the control-plane functionality is provided by software platforms hosted on computer systems possibly distant from the actual networks.

Philip A. McGillivary, Martino Fornasa, Pierpaolo Baglietto, Michele Stecca, Giovanni Caprino and Massimo Maresca, “Cloud Computing Goes To Sea: Ship Design Around Software Defined Networking (SDN) Systems”, Ocean News & Technology, September 2014

Read the full article.


A huge amount of data is managed every day in large organizations in many critical business sectors with the support of spreadsheet applications. The process of elaborating spreadsheet data is often performed in a distributed, collaborative way, in which many actors enter data belonging to their local business domain to contribute to a global business view. The manual fusion of such data may lead to errors in copy-paste operations, loss of alignment and coherency due to multiple spreadsheet copies in circulation, and loss of data due to broken cross-spreadsheet links. In this paper we describe a methodology, based on a…

Martino Fornasa
