How Overlay compiles React and Vue.js components from design tools

Kévin Jean
Published in Overlay Blog
8 min read · Jul 23, 2020


Overlay is a design-to-code tool for creating production-ready web components from design tools. In short, it is a Sketch plugin (plugins for other design tools are coming) that extracts design data as JSON and generates code from it.

In this article, we will explain how the Overlay engineering team has designed and built Hyperion, the API responsible for building web components.

Before designing the solution, we set out our technical constraints:

  • The web development world changes continuously, and we want to be able to adapt our product to it. The architecture must allow us to easily add new design tool inputs (Figma, Adobe XD) and code generation outputs (new JS frameworks, new UI frameworks, etc.).
  • Each framework-specific code generation algorithm must be independent of the others. We don’t want React code generation to impact Vue.js code generation.
  • We don’t want any dependency between a design tool input and a code generation output: a team can choose Sketch/React, Figma/React, or Sketch/Vue.js.

Here is what we want it to look like:

A vision of the Overlay engineer’s mind

With these constraints in mind, we decided to go with one of the most well-known architectures in programming: a compiler. But we didn’t have a clue about how or where to start, so here is our journey into the wonderful world of compilers.

To keep this article concise, we will focus on compiler principles and how we implement them at Overlay. Further articles will cover styles, performance, automation, and plenty of other tasty tech topics.

A little pinch of compiler theory

A compiler is a computer program that translates code written in a language A into code in a language B. There are many types of compilers; the term “compiler” primarily refers to programs that translate high-level code into lower-level code.

In our case, the source code is the JSON input sent by the design tool and our targets are JS frameworks (React, Vue.js). Programs that translate between source and target code at the same level of abstraction (in this case, high level) are usually called transpilers or transcompilers.

Under the hood, a compiler is structured in three consecutive stages: the front end, the optimizer, and the back end.

To be able to generate several targets (components coded with a particular framework) from several sources (design tools), our compiler uses an abstract data structure called an intermediate representation (IR). A good IR must be able to represent the source code without loss of information and be independent of any particular source or target code. At Overlay, we decided to use a rose tree mapped onto the HTML structure as our IR because:

  • It’s straightforward to translate HTML structure to modern JS framework code.
  • As web developers, we know how to optimize layout and HTML to build clean and reusable components, which is important for the compiler’s optimizer stage.

Here is an example of a component’s intermediate representation.

This component will be translated into this intermediate representation (IR).

As you can see, a node in this IR is a Layer and a branch represents a parent/child relationship. Let’s see how we can transform this JSON input into a first simple React component.
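As a rough illustration, the JSON sent for a button made of an icon and a text label could look something like this (field names are simplified; real payloads also carry position, size and style data):

{
  "name": "Button",
  "children": [
    { "name": "Icon", "children": [] },
    { "name": "Label", "children": [] }
  ]
}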

The Front-End stage: Ensure quality

The front-end stage’s task is very simple: analyze and validate the source code to build the compiler’s intermediate representation. At Overlay we use Symfony, a PHP framework, to build our backend APIs. To build our parser/analyzer, we used the Symfony Serializer component, which is designed to translate JSON into PHP objects. So all we have to do is define the symbols of our intermediate representation as PHP objects.

First we defined the Layer PHP representation
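Here is a minimal sketch of what that Layer model could look like (the property names are illustrative and simplified):

<?php

class Layer
{
    /** @var string|null The layer name coming from the design tool. */
    private $name;

    /** @var Layer[] The children of this node: the IR is a rose tree of layers. */
    private $children = [];

    // Style and layout properties (width, height, padding, …) would live here too.

    public function getName(): ?string
    {
        return $this->name;
    }

    public function setName(string $name): void
    {
        $this->name = $name;
    }

    /** @return Layer[] */
    public function getChildren(): array
    {
        return $this->children;
    }

    /** @param Layer[] $children */
    public function setChildren(array $children): void
    {
        $this->children = $children;
    }
}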

With the Serializer, this JSON input will be transformed into this PHP Object
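A minimal sketch of that step, assuming the Layer class above and standard Symfony components; the property-info extractor is what lets the Serializer denormalize the nested children array into Layer objects recursively:

<?php

use Symfony\Component\PropertyInfo\Extractor\PhpDocExtractor;
use Symfony\Component\PropertyInfo\Extractor\ReflectionExtractor;
use Symfony\Component\PropertyInfo\PropertyInfoExtractor;
use Symfony\Component\Serializer\Encoder\JsonEncoder;
use Symfony\Component\Serializer\Normalizer\ArrayDenormalizer;
use Symfony\Component\Serializer\Normalizer\ObjectNormalizer;
use Symfony\Component\Serializer\Serializer;

// $json holds the payload sent by the Sketch plugin, e.g. the example above.
$json = file_get_contents('php://input');

// The type extractor tells the serializer that "children" is a Layer[]
// so that nested layers are denormalized recursively.
$typeExtractor = new PropertyInfoExtractor([], [new PhpDocExtractor(), new ReflectionExtractor()]);
$serializer = new Serializer(
    [new ArrayDenormalizer(), new ObjectNormalizer(null, null, null, $typeExtractor)],
    [new JsonEncoder()]
);

/** @var Layer $rootLayer */
$rootLayer = $serializer->deserialize($json, Layer::class, 'json');

echo $rootLayer->getName();                   // "Button"
echo $rootLayer->getChildren()[0]->getName(); // "Icon"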

With this approach, we created a strong coupling between our models and our API endpoint interface, which is a very good thing in this case because we want to ensure we don’t lose any source code information in our IR.

Now we have to ensure that our IR is correct. To do so, we used the Symfony Validator component with constraint annotations. For example, we ensured that each layer has a name.

Layer class with name validation
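A sketch of what that constraint could look like, assuming Symfony’s annotation-based constraint mapping (the wiring around it is simplified):

<?php

use Symfony\Component\Validator\Constraints as Assert;
use Symfony\Component\Validator\Validation;

class Layer
{
    /**
     * Every layer coming from the design tool must have a name.
     *
     * @Assert\NotBlank
     *
     * @var string|null
     */
    private $name;

    /**
     * Cascade validation to the whole tree.
     *
     * @Assert\Valid
     *
     * @var Layer[]
     */
    private $children = [];

    // Getters and setters as before…
}

// Validating the deserialized IR root:
$validator = Validation::createValidatorBuilder()
    ->enableAnnotationMapping()
    ->getValidator();

$violations = $validator->validate($rootLayer);
if (count($violations) > 0) {
    // Reject the payload: we refuse to build an IR from an invalid source.
}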

And that’s it for the front-end part! As you can see, all of the front-end stage’s “logic” lives in our models, which are our IR; that is a very good way to centralize our logic and keep it easily maintainable for our team. Moreover, we used existing Symfony components to build this part, which let the team focus on designing the IR rather than implementing validation or parsing.

Overlay Front end

The Optimizer stage: where the power of algorithms comes into play

The optimizer stage is where all the “magic” happens. This is the part responsible for optimizing the IR in order to improve the performance and the quality of the produced code. We perform several types of operations on the IR, such as adding new data (padding, margin, flex behavior, etc.) and changing the IR tree (removing useless layers, reordering layer children, etc.). We chose to create one optimizer for each optimization and chain them one after another. It is not the best choice from a performance point of view, but it avoids concurrency issues since all the optimizers are dealing with the same IR tree. Optimizer A optimizes the tree, then sends the new IR to optimizer B, and so on.

Overlay optimizer
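As a sketch of the idea, the chain can be expressed with a small optimizer interface; the optimizer names in the wiring example are illustrative:

<?php

interface OptimizerInterface
{
    // Takes the IR root and returns the optimized IR root.
    public function optimize(Layer $root): Layer;
}

final class OptimizerChain
{
    /** @var OptimizerInterface[] Ordered: optimizer A feeds optimizer B, and so on. */
    private $optimizers;

    /** @param OptimizerInterface[] $optimizers */
    public function __construct(array $optimizers)
    {
        $this->optimizers = $optimizers;
    }

    public function optimize(Layer $root): Layer
    {
        foreach ($this->optimizers as $optimizer) {
            $root = $optimizer->optimize($root);
        }

        return $root;
    }
}

// Illustrative wiring (these optimizer names are examples):
// $chain = new OptimizerChain([
//     new ReorderChildrenOptimizer(),
//     new ComputeMarginOptimizer(),     // depends on the re-ordering above
//     new RemoveUselessLayerOptimizer(),
// ]);
// $optimizedRoot = $chain->optimize($rootLayer);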

We set a few rules for building our optimizers:

  • An optimizer must perform only one optimization. We chose this to avoid hard-to-maintain algorithms.
  • An optimizer can depend on other optimizers. For example, re-ordering layer children is necessary before computing margins between the right siblings.
  • An optimizer must be developed using test-driven development. We chose this approach because we want to make sure these algorithms work as expected against a set of IRs.

To avoid regressions and improve maintainability, we built unit tests for each optimizer. Moreover, we created functional tests on the whole optimizer chain to check that there is no conflict between optimizers and that linked optimizers work together as expected. We really put an emphasis on automated testing for this part because it is the heart of our application. For each optimizer we have between 10 and 20 unit test cases, and we have more than 50 functional test cases on the chain. This series of tests has saved us from regressions many times and helps us a lot when updating optimizers.
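As an illustration, a unit test for a “remove useless layer” optimizer could look like the following PHPUnit sketch (the optimizer class name is illustrative):

<?php

use PHPUnit\Framework\TestCase;

final class RemoveUselessLayerOptimizerTest extends TestCase
{
    public function testItRemovesAWrapperThatAddsNothing(): void
    {
        // Given: Button > Wrapper > Label, where "Wrapper" carries no useful information.
        $label = new Layer();
        $label->setName('Label');

        $wrapper = new Layer();
        $wrapper->setName('Wrapper');
        $wrapper->setChildren([$label]);

        $root = new Layer();
        $root->setName('Button');
        $root->setChildren([$wrapper]);

        // When
        $optimized = (new RemoveUselessLayerOptimizer())->optimize($root);

        // Then: the useless wrapper is gone and the label became a direct child.
        $this->assertCount(1, $optimized->getChildren());
        $this->assertSame('Label', $optimized->getChildren()[0]->getName());
    }
}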

The Back-End stage: specify and generate code

The back-end stage is responsible for target-technology optimization and target-specific code generation. As we have to build and maintain several target technologies, we decided to use the same three-step architecture for each of them. First, the Translator is responsible for creating a target specific representation (TSR) from the intermediate representation. Then the Specific Optimizer is responsible for optimizing the TSR with language-specific rules (Fragments for React, Slots for Vue.js). Finally, the Code Generator is responsible for building code from the TSR.

Why do we use a target specific representation (TSR) instead of using our intermediate representation (IR) directly?

To avoid coupling between the back-end parts of the target languages. This allows engineers to work on language-specific features without the risk of impacting other languages.

To describe our TSR, we used a PHP interface. For React, we called this interface ReactElement. As an example, for React we created a Div symbol class:
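A minimal sketch of that TSR (a real Div element would also carry props, styles, etc.):

<?php

// The TSR contract for React. (The renderElement() method used for code
// generation is introduced a bit further down.)
interface ReactElement
{
}

// The Div symbol: a node of the React-specific tree.
final class Div implements ReactElement
{
    /** @var ReactElement[] */
    private $children;

    /** @param ReactElement[] $children */
    public function __construct(array $children = [])
    {
        $this->children = $children;
    }

    /** @return ReactElement[] */
    public function getChildren(): array
    {
        return $this->children;
    }
}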

Then we created a React Translator, which takes a Layer object (our IR) as an argument and returns a ReactElement (our TSR). For this example, we only return Div objects.
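A sketch of such a translator, reusing the Layer and Div classes from the sketches above:

<?php

final class ReactTranslator
{
    // Walks the IR (Layer tree) and produces the TSR (ReactElement tree).
    public function translate(Layer $layer): ReactElement
    {
        $children = [];
        foreach ($layer->getChildren() as $childLayer) {
            $children[] = $this->translate($childLayer);
        }

        // Later, the layer type (text, image, …) will decide which ReactElement
        // to build; at this stage we only emit Div elements.
        return new Div($children);
    }
}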

Finally, we built the code generator. To do so, we used the Strategy pattern on all React TSR objects. To keep it simple, we used the ReactElement interface to define one method, renderElement, and implemented this method in all our TSR objects.

Div object implementing ReactElement interface
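A sketch of what that could look like:

<?php

interface ReactElement
{
    // Strategy pattern: every TSR element knows how to render itself as code.
    public function renderElement(): string;
}

final class Div implements ReactElement
{
    /** @var ReactElement[] */
    private $children;

    /** @param ReactElement[] $children */
    public function __construct(array $children = [])
    {
        $this->children = $children;
    }

    public function renderElement(): string
    {
        // A Div renders its children recursively and wraps them in a <div> tag.
        $inner = '';
        foreach ($this->children as $child) {
            $inner .= $child->renderElement();
        }

        return '<div>' . $inner . '</div>';
    }
}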

Then, we created our React code generator class, which calls the renderElement interface method on the root ReactElement of the tree. We use Twig for templating the React component; it’s a lot easier to manipulate strings inside a template than to use string concatenation.
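A sketch of that generator, with an inline Twig template to keep it self-contained (the template content and naming are illustrative):

<?php

use Twig\Environment;
use Twig\Loader\ArrayLoader;

final class ReactCodeGenerator
{
    /** @var Environment */
    private $twig;

    public function __construct()
    {
        // In a real setup the template lives in its own .twig file; an inline
        // loader keeps this sketch self-contained.
        $this->twig = new Environment(new ArrayLoader([
            'react_component.twig' => <<<'TWIG'
import React from 'react';

const {{ componentName }} = () => (
  {{ markup|raw }}
);

export default {{ componentName }};
TWIG
        ]));
    }

    public function generate(string $componentName, ReactElement $root): string
    {
        return $this->twig->render('react_component.twig', [
            'componentName' => $componentName,
            'markup'        => $root->renderElement(),
        ]);
    }
}

// Usage with the button-with-icon IR: for now the generated JSX is only nested
// <div> tags, e.g. <div><div></div><div></div></div>.
// $code = (new ReactCodeGenerator())->generate(
//     'ButtonWithIcon',
//     (new ReactTranslator())->translate($rootLayer)
// );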

And this is the result with our button with icon! Our first generated component 👏

Button with Icon with basic layout

There are still improvements to make before we get perfect React components: we only generate div tags while our button needs a p and an img tag, and we don’t have any text content inside the text part. But with this approach and more time, the Overlay engineering team managed to build the first production-ready web component compiler: Hyperion.

Conclusion

Building a web compiler is a difficult thing to do, and we learned a lot doing so:

  • When you design software, think about what already exists. Scientists, mathematicians, and engineers have worked a lot on theoretical computing concepts. Don’t ignore their work; use it.
  • Always build your software as if you are going to maintain it (even if that’s not the case). All the time spent maintaining your software is time when you don’t add any value to your product. Don’t waste it.

If you want to know more about compilation theory, feel free to post your questions. We recommend the book Compilers: Principles, Techniques, and Tools, which is quite difficult to read but covers the state of the art in compilers.

If you want to know more about Overlay, go here; it’s free to try:

We will write a series of technical articles to explain how the Overlay engineering team designs, builds, and operates our systems, and we will continue to post tech content every month. If you are interested in complex tech problems, follow the Overlay blog.
