Albert Alises
Sep 10, 2018

At QMENTA we have been rebuilding the front-end from scratch, seeking simplicity and performance not only in design, but also in the technology stack behind it. This article provides a comprehensive overview of the different parts that compose the new platform, such as the choice of technology stack and how decorators are extensively used.


The QMENTA platform has been around for quite some time now. The old front-end was built using well-established technologies like jQuery, Dojo and D3. While feature rich, it suffered from the scalability and maintainability problems of such a big codebase, and it had grown quite complex in terms of UX/UI. In medical imaging platforms, data and operations are complex enough for the user to manage, so adding friction through the user experience or visual design is a no-go zone. 🙅🏻

One of the main challenges coming into this year was to rebuild the front-end from scratch to accommodate the needs of a growing neuroimaging processing hub: to make it clean, easily maintainable, scalable and up to date with the latest front-end technologies.

Figure 1: Register Page for the new front-end
Figure 2: Project List view for all our projects on the new front-end

With speed, performance and minimalism as our flagship, the aim was to build a single-page web application using the FrameworkOfChoice, be it VueJS, AngularJS, React, or another framework such as Hyperapp. Up to that point, a pretty standard and mainstream definition of a web application project. After weighing the different frameworks and technologies, we chose Preact coupled with Typescript as the core libraries for the project.

Figure 3: Preact and Typescript logos.

But… why Preact + Typescript?

Preact, according to its official documentation, is a “fast 3kB alternative to React with the same modern API”. It offers everything we like and love about React, such as the Virtual DOM, component-based development, lifecycle hooks and JSX syntax for generating dynamic, data-driven interfaces.

Preact does all this while keeping the API leaner (React 16.2.0 + ReactDOM is 31.8Kb gzipped, while Preact is only 4Kb), making it faster, more lightweight, and cutting a lot of the noise added by React. The main differences with React can be seen here. Some of them are:

  • this.props and this.state are passed to render() for you.
  • You can just use class for CSS classes (instead of className).
  • A lot of React/ReactDOM-specific aliasing like React.createElement or ReactDOM.render disappears; h() and render() are plain separate exports, making your code cleaner and more readable.

Regarding performance, testing it on a simple TODO application, the results are astounding: Preact is among the fastest frameworks, closely approached by Vue.js and circa 170ms faster than React. Amazing, isn’t it? 🌠

Figure 4: TodoMVC Performance comparison between different frameworks. Seen at: https://developit.github.io/preact-perf/

You can run the performance test on your own device by going here.

Not everything is great, of course: the community is not as big or responsive as the React one (Preact is not that widely used, after all), and some sweet functionalities are still missing, such as support for fragments (so you still have to create some good old div wrappers). Furthermore, some libraries are still not supported, but worry not: preact-compat creates a layer on top of Preact that makes it fully compatible with React. Magic!⚡️

Figure 5: Analysis List View

We ❤️ Typescript. A lot. For an application that is rich in data and manages a lot of different metadata and results from APIs, Typescript offers static type checking that comes in very handy for defining the data structures our application handles, as well as the shape of the props and state of the different components. We know what to expect, have it documented, and keep the data consistent at all stages.

With Typescript you can create interfaces that characterize the input/output data your application manages. In our case, it helps model the data that characterizes a subject or an analysis, so everything is consistent and we know what to expect when dealing with those structures.
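For illustration, a hedged sketch of what such interfaces can look like. The field names here are assumptions for the example; the real interfaces are much richer.

```typescript
// Hypothetical sketch: typed models for the data our application handles.
interface Subject {
  id: number;
  name: string;
  age?: number;     // optional demographic metadata
  files: string[];  // file names attached to the subject
}

interface Analysis {
  id: number;
  name: string;
  // Allowed states expressed as a union type, checked at compile time
  state: "running" | "completed" | "exception";
  subjectId: number; // reference to the Subject it was run on
}

// The compiler now rejects inconsistent data at build time:
const analysis: Analysis = {
  id: 1,
  name: "DTI fiber tracking",
  state: "completed",
  subjectId: 42,
};
```

With this in place, a typo in a field name or an unexpected state string becomes a compile error instead of a runtime surprise.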

With Typescript, you can also create custom components like the one below. It forces the classes extending ComponentForm to implement methods for handling form changes and for setting the form model (i.e. the object with all the fields the form requires, such as the user, password, etc.). This model is then required as the state of the component, and a submitForm() method has to be implemented as well. With that, we have a skeleton that all forms follow: a generic form component that can have any given number of fields.

An example of a simplified generic Preact component our application uses that enforces these Typescript contracts. We can see the different lifecycle hooks and how the props and state of the component are typed as Typescript interfaces.
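A minimal standalone sketch of that contract, with Preact's Component base and lifecycle hooks omitted for brevity; all names besides ComponentForm and submitForm() are assumptions for the example.

```typescript
// The form model is just a bag of string fields.
interface FormModel {
  [field: string]: string;
}

// Hypothetical sketch: the real class extends Preact's Component.
abstract class ComponentForm<M extends FormModel> {
  // The form model doubles as the component state (public here for brevity)
  state: M;

  constructor() {
    this.state = this.getFormModel();
  }

  // Every concrete form must define its fields...
  abstract getFormModel(): M;
  // ...how it reacts to a field change...
  abstract handleFormChange(field: keyof M, value: string): void;
  // ...and what submitting it does.
  abstract submitForm(): void;
}

// A concrete form just fills in the contract:
class LoginForm extends ComponentForm<{ user: string; password: string }> {
  getFormModel() {
    return { user: "", password: "" };
  }
  handleFormChange(field: "user" | "password", value: string) {
    this.state[field] = value; // the real component would call setState
  }
  submitForm() {
    /* would send this.state to the API */
  }
}
```

Forgetting any of the three methods in a subclass is a compile error, which is exactly the skeleton-enforcing behavior described above.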


Decorators, Decorators… 💎

Ah, Javascript decorators. As a concept, they simply act as wrappers around another function, extending its functionality, much like higher-order functions. We use decorators extensively to keep our code cleaner, to separate back-end, state and application management concerns, and to provide a more structured way to define common behavior across components: connecting to the Redux store, connecting to the API, or defining asynchronous behavior to happen before or after responses are sent (sort of like method proxies). This way we do not repeat code, and we get an elegant, minimal way to define these behaviors. We use them for:

  • Asynchronous Method Interceptors

For managing asynchronous behavior we use kaop-TS, a decorator library that provides method interceptors written in Typescript. With them, we can plug behavior in at a join point of an asynchronous method: performing some operations before the method starts or after it has finished, or intercepting an error and performing some logic. The beforeMethod decorator can be seen in use in the APIDecorators.ts snippet.
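As a rough standalone illustration of the before-advice idea: the advice and meta shapes below are simplified assumptions for the example, not kaop-TS's actual API.

```typescript
// An advice receives metadata about the intercepted call.
type Advice = (meta: { args: unknown[] }) => void;

// Wrap a function so the advice runs at the join point before it starts.
function beforeMethod<T extends (...args: any[]) => any>(advice: Advice, method: T): T {
  return function (this: unknown, ...args: any[]) {
    advice({ args }); // runs before the original method
    return method.apply(this, args);
  } as T;
}

// Example: record every call before it executes.
const calls: unknown[][] = [];
const fetchAnalyses = beforeMethod(
  (meta) => calls.push(meta.args),
  (projectId: number) => `analyses-for-${projectId}`
);
```

The library version plugs this in with decorator syntax on class methods, but the mechanics are the same: the interceptor sees the call before the method body does.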

  • Connecting with and Managing API calls

For managing calls to the QMENTA API, we implemented a @platformRequest(url, method) decorator that you can wrap around any function with the signature function(params, error?, result?) to convert it into an asynchronous call to the API. It sends the given parameters in the request and receives either the result JSON object from the call or the error thrown by the request. The implementation of the decorator can be seen below, along with an example method using it.

The decorator uses axios under the hood to perform the request 🕵
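As a hedged sketch of what such a decorator can look like: the real implementation uses axios, while here the HTTP layer is a pluggable stub so the pattern is visible on its own, and every name is an assumption for the example.

```typescript
// Swappable transport: axios.request in a real implementation.
type RequestFn = (url: string, method: string, params: object) => Promise<object>;
let transport: RequestFn = async () => ({});

// Legacy-style method decorator, used as @platformRequest(url, method).
function platformRequest(url: string, method: string) {
  return function (_target: object, _key: string, descriptor: PropertyDescriptor) {
    const original = descriptor.value;
    descriptor.value = async function (params: object) {
      try {
        const result = await transport(url, method, params);
        // Call through with the result so the method body handles it
        return original.call(this, params, undefined, result);
      } catch (error) {
        return original.call(this, params, error, undefined);
      }
    };
    return descriptor;
  };
}

// Hypothetical usage, applied manually (equivalent to the @ syntax):
class PlatformAPI {
  // signature function(params, error?, result?) as described in the text
  login(_params: object, error?: unknown, result?: object) {
    return error ? { ok: false } : { ok: true, ...result };
  }
}
const desc = Object.getOwnPropertyDescriptor(PlatformAPI.prototype, "login")!;
platformRequest("/a/login", "POST")(PlatformAPI.prototype, "login", desc);
Object.defineProperty(PlatformAPI.prototype, "login", desc);
```

The method body only deals with params, error and result; all the request plumbing lives in the decorator.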

The state management and routing of our application also use decorators to extend components with functionality, for example connecting them to the store or registering them to be rendered at a specific route. In the next section we will talk a little bit more about it.

Figure 6: Specific Subject files viewing with the edit file metadata modal.

State Management and Routing 🚀

State management is one of the main concerns of any growing React application. As the number of entities and components keeps growing, the need for global state management arises. Redux is one of the mainstream solutions that aims to solve that. This post assumes some previous knowledge of how Redux operates; if not, here you can find a guide on how it works.

In our application we did not want a big store, so we tried to keep it as small as possible, reducing the use of shared state and enforcing encapsulation and separation of concerns. With that in mind, we decided to use a lightweight (less than 1Kb) alternative called Redux-Zero. While it has some differences, it removes a lot of the overhead that Redux carries and that our application does not need. It still has a store (albeit just one) that holds all the global state we want to manage; in our case, session and project information, current pages, notifications of the current project id, among others. It has no reducers, which ironically reduces the complexity of the library quite a lot.

Figure 7: Brain Tractography Fiber visualization on the front-end using WebGL

To get it up and working, just wrap your root component with the <Provider store={store}/> component, where the store is an object created by the createStore(state) method and state is the object that contains our shared state. To change that state, you create actions, which are just pure functions that return an update to this global state, e.g. setSession = (state, val) => ({ sessions: val });
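As an illustration, a stripped-down re-implementation of the Redux-Zero store idea. This is just a sketch to show what createStore and actions look like; the real library also ships the <Provider/> component and connect helpers, and the state fields here are assumed.

```typescript
type Listener<S> = (state: S) => void;

// Minimal single-store sketch: getState, shallow-merging setState, subscribe.
function createStore<S extends object>(initial: S) {
  let state = initial;
  const listeners: Listener<S>[] = [];
  return {
    getState: () => state,
    setState(partial: Partial<S>) {
      state = { ...state, ...partial }; // shallow merge, like Redux-Zero
      listeners.forEach((l) => l(state));
    },
    subscribe(l: Listener<S>) {
      listeners.push(l);
    },
  };
}

// Our shared state: session info, current pages, and so on.
const store = createStore({ session: "", currentPages: [] as string[] });

// Actions are just pure functions that return a state patch:
const setSession = (state: { session: string }, val: string) => ({ session: val });
store.setState(setSession(store.getState(), "token-123"));
```

Note there are no reducers and no action types: an action's return value is merged straight into the single store.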

To call those actions from a component, we have to connect that component to the store. Connecting a given component to the store allows the component to gain access to some actions and the global state via props. We created a decorator to simplify the process of connecting a component to the store.

We can plug this decorator in on top of any component, specifying the actions the component will get as props. E.g. by putting @connectStore({removeSession, setCurrentPages}); right on top of the component declaration, you get access to these actions inside the component (this.props.removeSession();), which update the global state on the store, while also having access to the global state itself via props. With this method, we get a cleaner, more elegant and compact way to connect a component to the store.
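A conceptual sketch of such a decorator. Simplified and hypothetical: the real connectStore resolves the store internally, while this version takes it as an argument so the snippet stays self-contained, and all names are assumptions.

```typescript
interface Store<S> {
  getState(): S;
  setState(patch: Partial<S>): void;
}

// An action takes the current state (plus arguments) and returns a patch.
type Action<S> = (state: S, ...args: any[]) => Partial<S>;

function connectStore<S>(store: Store<S>, actions: Record<string, Action<S>>) {
  // Class decorator: in our code this sits on top of the component class.
  return function <T extends new (...args: any[]) => { props: any }>(Base: T) {
    return class extends Base {
      constructor(...args: any[]) {
        super(...args);
        const bound: Record<string, (...a: any[]) => void> = {};
        for (const name of Object.keys(actions)) {
          // Each injected prop dispatches the action's patch into the store
          bound[name] = (...a) => store.setState(actions[name](store.getState(), ...a));
        }
        // Actions and global state become available via this.props
        this.props = { ...this.props, ...bound, state: store.getState() };
      }
    };
  };
}
```

The component never touches the store directly; it only sees ordinary props, which keeps it testable in isolation.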

Another integral part of any modern application is routing between the different views of the project depending on the URL or other parameters, being able to pass parameters to the routes, and so on. A common solution is the amazing router that comes with React. As much as we like it, it carries a lot of functionality that we would not really use, so we made our own preact-routlet, a simple router for Preact/React based on decorators.

The intricacies of it are simple: just wrap your application with the router component <RouterOutlet /> and you are ready to go. You can also wrap components with <PathLookup shouldRender={condition}/>, specifying a condition to render some path, or use <RouterOutlet redirect="/route" shouldRedirect={condition} /> to specify a condition under which the router automatically redirects to the given route (for example, we use it to redirect to the login if the session is invalid or has expired).

To navigate to a route you just call navigate("/route"), and to have a component rendered at a specific route you just plug the decorator on top of it, e.g. @renderOnRoute("forgot-password"), making it clear and visual in which route the component is rendered.

With that, we have a compact way of representing routing and state management, with the decorator signatures on top of the component keeping it very readable. A dummy component that connects to the store and is rendered on a given route can be seen below.
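As a sketch of the routing idea, assuming a simple route registry. Only renderOnRoute and navigate come from the text; the registry and resolver are assumptions, and the real preact-routlet renders the resolved component through <RouterOutlet/>.

```typescript
type ComponentClass = new (...args: any[]) => object;

const routes = new Map<string, ComponentClass>();
let currentRoute = "";

// Registers a component to be rendered at the given route.
// In our code this sits on top of the class: @renderOnRoute("forgot-password")
function renderOnRoute(path: string) {
  return function <T extends ComponentClass>(Component: T): T {
    routes.set(path, Component);
    return Component;
  };
}

// Point the router at a new route; a real router would re-render here.
function navigate(path: string) {
  currentRoute = path;
}

// What the outlet would render for the current route.
function resolveComponent(): ComponentClass | undefined {
  return routes.get(currentRoute);
}

// Applied manually here, equivalent to the decorator syntax:
class ForgotPassword {}
renderOnRoute("forgot-password")(ForgotPassword);
```

Because registration happens in the decorator, reading a component's declaration immediately tells you where it renders.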


Bundling with Parcel 📦

Parcel, not to be confused with the Australian band Parcels, defines itself as a “blazing fast, zero configuration web application bundler”. At the start of the project we used the fairly standard Webpack for bundling our application, which required quite a lot of plugins and configuration (webpack.config.ts). Switching to Parcel, we no longer need any configuration for Typescript or HTML files; just running npm install --save-dev parcel-bundler parcel-plugin-typescript does the trick.

The only remaining thing is to specify an entry point and an output directory and voilà, it just works. The difference in speed compared to Webpack is not very acute in our case (it is essentially the same), but it is the zero configuration and minimalism of Parcel that makes it our bundler of choice for the project.

Figure 8: Screenshot of the build of the whole application. Parcel’s magic💥

The only downside is that, to get hot reloading working on the dev server with Preact + Typescript, you have to call module.hot.accept() in your root component and cast some types in the render function (the third argument, as foo.lastChild as Element) for Typescript not to complain. The fix can be seen in the following snippet.
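A sketch of what that entry point can look like; the component import path and names are assumptions for the example.

```tsx
// index.tsx: assumed shape of the entry point. module.hot exists only
// when Parcel's dev server injects it.
import { h, render } from "preact";
import App from "./components/App";

const root = document.body;
// Third argument cast so Typescript does not complain:
render(<App />, root, root.lastChild as Element);

// Opt in to hot module replacement under the Parcel dev server
if ((module as any).hot) {
  (module as any).hot.accept();
}
```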


Unit and e2e acceptance testing with Jest + Puppeteer

Testing is an integral part of any project. We use the fairly standard Jest for testing our components, coupled with Puppeteer, a headless Chrome automation library: think of it as your own highly trained monkey performing the operations you tell them in the browser, like clicking a certain button or dragging the mouse over an element. With those, we can perform operations on the front-end via a headless Chrome API and then check for the expected results with Jest, like checking that a confirmation element appears or that the warning message displayed is correct 👌. If you want to learn more about using Jest + Puppeteer for testing in React, there is a nice article about it.

Just an example of testing the pagination component using Jest
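A hedged sketch of what such a test can look like; the URL, selectors and expected label are assumptions for the example, not our real ones.

```typescript
import * as puppeteer from "puppeteer";

let browser: puppeteer.Browser;
let page: puppeteer.Page;

beforeAll(async () => {
  browser = await puppeteer.launch({ headless: true });
  page = await browser.newPage();
  await page.goto("http://localhost:1234/projects"); // assumed dev server URL
});

afterAll(() => browser.close());

test("clicking next page updates the page indicator", async () => {
  await page.click(".pagination .next"); // assumed selector
  const label = await page.$eval(".pagination .current", (el) => el.textContent);
  expect(label).toBe("2");
});
```

Puppeteer drives the browser; Jest supplies the lifecycle hooks and assertions.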

The building, testing and deploying process is automated using Jenkins, with the tests running in a Docker container so that they execute in an environment with all the graphical libraries Puppeteer requires. The *fairly simple* pipeline can be seen in Figure 9:

Figure 9: Jenkins pipeline for deploying the front-end

We build the binaries, run the unit tests and then deploy to a testing domain that varies based on the git branch we are on. Then we run the e2e tests against that domain.

The Docker image used for the container in which Jenkins runs the pipeline is a custom version of the Puppeteer Docker image, based on the Ubuntu 16.04 image with Node 8. The dockerfile can be seen below:
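A hedged sketch of such a Dockerfile; the package list and uid are assumptions, not the contents of the actual image.

```dockerfile
FROM ubuntu:16.04

# Node 8 plus the shared graphical libraries headless Chromium needs
RUN apt-get update && apt-get install -y curl gnupg \
    && curl -sL https://deb.nodesource.com/setup_8.x | bash - \
    && apt-get install -y nodejs \
       libx11-xcb1 libxcomposite1 libxcursor1 libxdamage1 libxi6 \
       libxtst6 libnss3 libgtk-3-0 libasound2 fonts-liberation \
    && rm -rf /var/lib/apt/lists/*

# Match the uid of the user that owns the Jenkins workspace (assumed 1000)
RUN useradd --uid 1000 --create-home deployuser
USER deployuser
```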

We add the “deployuser” to the container because Jenkins mounts its workspace directory (owned by deployuser), and the user inside the container and outside must be the same for all the operations required by the deployment to be successful.

This way, we prevent the whole application from suddenly crashing and burning, or turning into one of those amazing 90’s websites.


Summary

In this article we have presented an overview of the technology stack we used to rebuild the QMENTA front-end and how it focuses on being small, efficient and simple. We have also seen how we use decorators in our project and the different parts that compose it, such as API calls, routing and state management.

By setting the foundations and patterns of the project, the next steps are to keep improving the new front-end in a scalable and maintainable way, adding meaningful features to match the functionality of the old platform while keeping it minimal. You can see how it goes here. 👋


Special kudos to Ciro Ivan for setting the foundations of the project, carrying the heavyweight through most of it and teaching me the ways of the H̶a̶c̶k̶e̶r̶m̶a̶n̶ Javascript ninja. 🤖 🏋🏻

Want to experiment with these technologies in your next project? Yay! Here are some boilerplates we created on GitHub to get you up and running as soon as possible. 🤘🏼


Who are we?

QMENTA is a brain imaging and data analytics platform in the cloud aiming to accelerate the discovery and development of cures for neurological diseases.

Want to contribute unraveling the brain mysteries and join our growing team? Check out all our engineering jobs and apply here! 😃

QMENTA Tech Blog

Tech blog of QMENTA

Thanks to Amelia Hocine, Tim Peeters, and Ciro Ivan

Written by Albert Alises
Software Engineer at QMENTA Inc. MSc. in Computational Biomedical Engineering. AR/VR, Web Development and Whatnot.
