Piotr Bazydło
Aug 27 · 4 min read

I'm an engineer at Guestline, and among other things I've worked extensively on our main product, Rezlynx, a mature, cloud-based property management system (PMS).

One of our team's achievements was taking the full release cycle of Rezlynx from a release every few years to one every week. However, we know there is always room for improvement and want to go faster still.

I'd like to tell you a short story about how and why we started using a micro frontend architecture here at Guestline. I'll also share our outcomes after more than a year in production.

How and Why We Started Using Micro Frontends

I picked up a task that required adding a configuration page and exposing some configuration data to one of our applications. I needed to implement both the UI and backend for it — a full vertical.

I decided to see if we could speed up our development process by decoupling this work from Rezlynx. I discussed the idea with my team and went ahead with creating our first micro frontend at Guestline.

In the past, I had used a framework called Open Components (OC). It is an open-source framework for self-hosting micro frontends; you can check it out on GitHub. Open Components has all the features we needed:

  • independent components,
  • semantic versioning,
  • statelessness, giving out-of-the-box scalability,
  • simple deployment process,
  • and great developer experience.

I started with a simple implementation which included two OC components, one to modify the data, and one to display it.
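
As a rough sketch of what such a component looks like on disk (the names below are illustrative, not our real ones), an OC component is a small folder with a view template, an optional server-side data provider and a package.json that describes it to the registry:

    config-editor/
      package.json    <- name, version and an "oc" section pointing at the files below
      template.hbs    <- the view (Handlebars in this sketch)
      server.js       <- optional server-side data provider

    // server.js
    module.exports.data = (context, callback) => {
      // context.params carries any parameters the consuming page passed in;
      // the callback returns the view model the template is rendered with
      callback(null, { title: 'Configuration', settings: [] });
    };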

Embedding components into Rezlynx was very simple. All that needed to be done was adding a reference to a single JavaScript file and the component tag to the page.
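
Concretely, that looks roughly like this (the registry URL and component name are placeholders):

    <!-- load the OC browser client once, served by the registry -->
    <script src="https://oc-registry.example.com/oc-client/client.js"></script>

    <!-- render the component wherever it should appear on the page -->
    <oc-component href="https://oc-registry.example.com/config-editor/1.x.x"></oc-component>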

The only real challenge was authentication: how do we make sure that the user is authenticated and can only access their own data? Our first approach was to give OC the ability to read the application cookie, which contains the information needed to authenticate the user. That way the component can verify on the server side what information the user has access to.
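
A simplified sketch of that first approach is below. It is not our production code: it assumes the registry forwards the incoming cookie header to the component's data provider, and the validateSession helper is hypothetical.

    // server.js of the configuration component (simplified sketch)
    const { validateSession } = require('./auth'); // hypothetical helper that checks the application cookie

    module.exports.data = (context, callback) => {
      // requestHeaders are only available if the registry forwards them to the component
      const cookie = (context.requestHeaders || {}).cookie;
      const session = validateSession(cookie);

      if (!session) {
        // fail fast: the consumer gets an error instead of someone else's data
        return callback(new Error('Not authenticated'));
      }

      // only return the data this user's site is allowed to see
      callback(null, { siteId: session.siteId, settings: [] });
    };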

The whole implementation took me more time than it would have via the old method. However, we were now able to deploy updates to this component, and any other new components, independently, which is a huge win.

A slide from our first Open Components brown bag

My team was impressed by the success of this approach and helped me share it with the rest of the development organisation. We have weekly Brown Bag peer-training sessions and decided they were a good forum for it. Later, my team followed up by hosting an on-site workshop for other developers.

Open Components in Retrospect

We have been using Open Components in production for over a year now, and it has been successfully adopted by all teams. It's especially rewarding to hear one of the product owners ask:

Please tell me that this feature is implemented as an OC and we can roll out this change quickly.

Thinking about this quote, and after a few conversations with fellow developers, I can see how our development process has improved. The culture has changed: independent deployments of small, self-contained components now reduce our cycle time. This results in an even better customer experience through frequent and safe releases.

Another great feature is the ability to semantically version our components. This allows us to safely introduce breaking changes and deploy them to production without waiting on consuming applications; each consumer picks up a new major version within its own deployment cycle.
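
For example, a consumer can reference a component by version range (placeholders again): patch and minor updates flow through automatically, while a breaking major version is only picked up when the consumer changes its own reference.

    <!-- resolves to the latest 1.x release, so non-breaking updates arrive automatically -->
    <oc-component href="https://oc-registry.example.com/config-editor/1.x.x"></oc-component>

    <!-- a breaking 2.0.0 can be published straight away; consumers opt in on their own schedule -->
    <oc-component href="https://oc-registry.example.com/config-editor/2.x.x"></oc-component>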

It is also much easier to write proper tests for a small, self-contained component. We can simulate its inputs and outputs within a reduced context, which saves time when trying to find out how exactly a specific feature works.
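
As a sketch (Jest syntax, illustrative names), the data provider can be exercised directly by faking the context the registry would pass in:

    // config-editor.test.js - unit test sketch for the data provider above
    const component = require('./server');

    test('rejects a request without a valid session cookie', done => {
      // simulate the registry's context object with no cookie header
      const context = { params: {}, requestHeaders: {} };

      component.data(context, (err, viewModel) => {
        expect(err).not.toBeNull();
        expect(viewModel).toBeUndefined();
        done();
      });
    });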

The biggest disadvantage I can see so far is that we have another system to monitor, maintain and keep up to date. Thankfully, so far it hasn't consumed too many of our resources; over the last year it has probably taken around a week or two of dev time. Open Components is stable, and we have been able to leverage our cloud infrastructure, including Azure Blob Storage, Application Insights and Azure Web Apps, to host and monitor the service.

Summary

This approach is working out well for us so far. We have just finished our second OC workshop, which took the form of a small coding competition. OC has become a go-to tool in most Guestline developers' toolboxes.

Dynamic scoreboard during our coding competition

The next step is to improve the observability of this system. We want to start leveraging the logs OC generates to automatically create dashboards for each component. We also want to improve the functionality shared by multiple components.

If you're considering introducing micro frontends into your solution, this would be my shortlist of things to think about:

  • What benefits are you expecting to see? (For us it was quick, independent deployments.)
  • Do those benefits outweigh the disadvantage of making the overall setup more complicated?
  • Start small, share with others and iterate on your idea

P.S. We are constantly looking for the best people to work with. If you want to join us, check our careers page.


Thanks to Dave Hillier, Marcin Bazydlo, and Juanjo Cerezuela
