Making a digital business card in 5 KB

John Granström

Check out the result, or view the source code.

I had been planning for some time to replace my personal website with a simplified "business card"-like page with some social links, mostly because my previous personal website was more or less a bad clone of a subset of my LinkedIn profile. Why duplicate that information when I can simply link anyone interested to a platform made for that purpose?

I decided to make something very simple and minimal, but with some kind of original touch to make it stand out. What came to mind was doing something graphical that was interactive in some way.

One requirement I set was to make it very lightweight. This was mostly because I see so many webpages that ship far more data than their appearance would suggest is necessary, and I wanted to make an example of the opposite. My goal was to make the page seem to load more or less instantaneously from anywhere, even under suboptimal network conditions.

Another requirement was to use modern tools for development and have a readable, acceptable quality code base. I did not want to optimize to the extent where I would have to start sacrificing things without any real benefit.


Content

I wanted as little content as possible while still having a useful page. Mainly I wanted to include my email and social links.

Many websites include one or more fonts to style the text. I opted to use fonts that are already available on the user's system, as I think it is excessive to fetch a font just for a couple of paragraphs.
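For reference, relying on system fonts can be as small as a single declaration. This is an illustrative sketch of a common system font stack, not necessarily the exact values the site uses:

```css
/* Use fonts the OS already ships instead of downloading one */
body {
  font-family: -apple-system, BlinkMacSystemFont, "Segoe UI",
               Roboto, "Helvetica Neue", Arial, sans-serif;
}
```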

The social links are simply recognizable icons for each platform. There are many ways to include icons in a webpage nowadays, a common one being icon fonts like Font Awesome that bundle a bunch of commonly used icons and are very easy to use. However, I only need a few of those icons, so it would be overkill to include the entire icon font. This can be managed using something like IcoMoon, where you can generate an icon font with a select few icons, but I felt I should not have to include a font asset at all just for those icons.

Since I only need each icon once, I went for another option: including them as inline SVG markup. By optimizing them using SVGOMG the resulting markup is very lean, and it does not require any additional assets outside of the HTML itself.

// The feather icon (link to blog) as optimized SVG markup (path data truncated here)
<svg viewBox="0 0 32 32">
  <path d="M0 32C4 20 14.469 0 32 0c-8.219 6.594-12 22-18 22H8L2 …"/>
</svg>

Graphical + Interactive

For this part I wanted to do something dynamic that would be visible in the background and somehow respond to user actions. I decided on making something that would generate a random maze to be used as the background of the page.

Generated maze as background

To highlight that it really is a maze, and to make the page more interactive, I also wanted to generate a path through the maze to wherever the mouse pointer was. This would add a dynamic flair while still keeping it simple. It should also support touch, to ensure similar behavior on mobile devices.

Path through maze tracking mouse pointer
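One way to trace such a path is a depth-first search over the maze cells. This is a self-contained sketch using a wall-grid representation of my own choosing, not the site's actual code:

```typescript
// right[y][x] / bottom[y][x] are true when a wall blocks movement
// right/down from cell (x, y).
type Point = { x: number; y: number };

function findPath(
  right: boolean[][], bottom: boolean[][],
  start: Point, goal: Point
): Point[] {
  const h = right.length, w = right[0].length;
  const id = (p: Point) => p.y * w + p.x;
  const prev = new Map<number, number>(); // cell index -> parent index
  const stack: Point[] = [start];
  prev.set(id(start), -1);

  // Depth-first search; in a perfect maze each cell is reached once.
  while (stack.length > 0) {
    const c = stack.pop()!;
    if (c.x === goal.x && c.y === goal.y) break;
    const neighbors: Point[] = [];
    if (c.x + 1 < w && !right[c.y][c.x]) neighbors.push({ x: c.x + 1, y: c.y });
    if (c.x > 0 && !right[c.y][c.x - 1]) neighbors.push({ x: c.x - 1, y: c.y });
    if (c.y + 1 < h && !bottom[c.y][c.x]) neighbors.push({ x: c.x, y: c.y + 1 });
    if (c.y > 0 && !bottom[c.y - 1][c.x]) neighbors.push({ x: c.x, y: c.y - 1 });
    for (const n of neighbors) {
      if (!prev.has(id(n))) {
        prev.set(id(n), id(c));
        stack.push(n);
      }
    }
  }

  // Walk the parent links back from the goal to rebuild the path.
  const path: Point[] = [];
  let cur = id(goal);
  while (cur !== -1) {
    path.push({ x: cur % w, y: Math.floor(cur / w) });
    cur = prev.get(cur)!;
  }
  return path.reverse();
}
```

Since the maze is perfect, the search finds the one and only path, so no cost comparison or backtracking bookkeeping beyond the parent map is needed.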

The path would start at the top left corner of the window, but I also wanted a bit of an easter egg where clicking in the maze would set that as a new starting point for the path. To illustrate that the background is dynamic I also settled on including a “magic button” in the bottom right corner that simply generates a new maze.

These features dictate that there must always exist a path from some location in the maze to any other location. Because of this, I limited the generated mazes to perfect mazes, meaning there is exactly one path between any two cells. A very efficient algorithm for generating such mazes is Eller's algorithm, which is the one I ended up implementing.
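Eller's algorithm builds the maze one row at a time, tracking which cells are already connected via disjoint sets. The following is my own minimal sketch of the idea, assuming a wall-grid representation; it is not the site's actual implementation:

```typescript
type Cell = { right: boolean; bottom: boolean };

function eller(width: number, height: number, rand: () => number = Math.random): Cell[][] {
  const maze: Cell[][] = [];
  // Cells in the same set are already connected somewhere in the maze.
  let sets = Array.from({ length: width }, (_, i) => i);
  let nextSet = width;

  for (let y = 0; y < height; y++) {
    const row: Cell[] = Array.from({ length: width }, () => ({ right: true, bottom: true }));
    const last = y === height - 1;

    // 1. Randomly knock down walls between adjacent cells in different
    //    sets; in the last row, join all of them so the maze connects.
    for (let x = 0; x < width - 1; x++) {
      if (sets[x] !== sets[x + 1] && (last || rand() < 0.5)) {
        row[x].right = false;
        const old = sets[x + 1];
        for (let i = 0; i < width; i++) if (sets[i] === old) sets[i] = sets[x];
      }
    }

    if (!last) {
      // 2. Every set must open at least one passage down to the next row.
      const bySet = new Map<number, number[]>();
      for (let x = 0; x < width; x++) {
        const cells = bySet.get(sets[x]);
        if (cells) cells.push(x); else bySet.set(sets[x], [x]);
      }
      const next = new Array<number>(width).fill(-1);
      for (const cells of bySet.values()) {
        const open = cells.filter(() => rand() < 0.5);
        if (open.length === 0) open.push(cells[Math.floor(rand() * cells.length)]);
        for (const x of open) {
          row[x].bottom = false;
          next[x] = sets[x]; // the cell below inherits this set
        }
      }
      // 3. Cells with no downward passage start fresh sets in the next row.
      sets = next.map(s => (s === -1 ? nextSet++ : s));
    }
    maze.push(row);
  }
  return maze;
}
```

Because walls are only removed between cells in different sets, the passages always form a tree, which is exactly the "one path between any two cells" property. The algorithm only ever needs one row of state, so memory use is independent of maze height.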


JavaScript

About 95% of websites today use some form of JavaScript. It almost seems expected to be a requirement for building a useful website. That is most likely not true, and a lot of websites could probably be just as useful without it. Still, I do not think having some JavaScript on a website is a major concern; it is clearly very useful for achieving things that would not be possible otherwise. However, websites often ship a lot of it, and it is not always clear why that amount is really necessary to accomplish whatever the goal is.

One reason for this might be that it is so effortless to include third-party libraries for tasks of any size, and web applications especially include frameworks like React together with all kinds of support libraries. In addition, many features of modern browsers are not supported by older browsers, so one might also ship a bunch of code for compatibility and interoperability reasons. Then there are common things like user tracking, which is often implemented with some amount of JavaScript.

In my case I do need some JavaScript for the maze feature but there is no need to include a fully fledged framework, and there is actually no need to use any third party library at all. I do make use of the Web Worker API which is supported in all modern browsers, but for older browsers I simply leave that feature out.

There are two separate JavaScript files shipped: one that does all the rendering and listens to user actions, and another that runs in a web worker and generates mazes and paths. They are separated to avoid UI responsiveness issues while generating the maze or tracing paths through it.
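The split can be sketched roughly as follows. The message shapes, names, and placeholder implementations here are my assumptions for illustration, not the actual protocol:

```typescript
type Request =
  | { kind: "maze"; width: number; height: number }
  | { kind: "path"; targetX: number; targetY: number };

// Placeholders standing in for the real maze generator and path tracer.
function generateMaze(w: number, h: number): number[] {
  return new Array(w * h).fill(0);
}
function tracePath(x: number, y: number): Array<[number, number]> {
  return [[0, 0], [x, y]];
}

// worker.ts — the heavy work lives behind a pure handler that runs off
// the main thread; in the browser it would be wired to self.onmessage.
function handleRequest(msg: Request) {
  if (msg.kind === "maze") {
    return { kind: "maze", payload: generateMaze(msg.width, msg.height) };
  }
  return { kind: "path", payload: tracePath(msg.targetX, msg.targetY) };
}

// main.ts would then do roughly:
//   const worker = new Worker("worker.js");
//   worker.postMessage({ kind: "maze", width: 40, height: 30 });
//   worker.onmessage = (e) => draw(e.data);
```

Keeping the worker side a pure request-to-response function also makes it easy to test without a browser.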

It is also worth mentioning that I use TypeScript on top of JavaScript, as I normally do, due to the major benefits of static typing.


Styles

Most websites require some form of styling. How that is done varies a lot, but it is common to use a framework like Bootstrap or Semantic UI, which can be very useful as they provide many layout options and supporting constructs.

In my case the layout is very simple, and I had no reason to use any particular framework to accomplish it. I simply include some minimal CSS served as a separate asset, which is useful for caching, as discussed later on. I did, however, use Sass, an extension to CSS, purely for its syntactic benefits.


Bundling

As most websites include more than one asset, it is often useful to bundle them together in some form before shipping them to a browser. There are many tools available today for this purpose, such as Webpack and Rollup, which vary in complexity and features but share a common goal: packing everything into neatly shippable assets.

This stage usually consists of transforming and compressing source code, optionally splitting it into multiple bundles, and resolving dependencies to ensure everything is loaded at the right time.

For this simple website the bundling is rather simple: it is mainly used to minify code and append hashes to filenames in order to enable proper caching. For this reason I went with Parcel, a bundler that comes with a lot of features without requiring any configuration.
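With Parcel 1 (current at the time), the build setup can be as small as a single script entry in package.json. The script name and version below are illustrative:

```json
{
  "scripts": {
    "build": "parcel build index.html"
  },
  "devDependencies": {
    "parcel-bundler": "^1.12.0"
  }
}
```

Parcel discovers the scripts and stylesheets referenced from index.html, minifies them, and writes out hashed filenames without any configuration file.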


Hosting

All websites need somewhere to run so that people can access them. I deploy this site to the cheapest available instance on Digital Ocean using a Docker image that is automatically built from the GitHub repository. The machine itself does not pack much of a punch and would struggle if the site received a lot of traffic for some reason. This is a non-issue, as a CDN sits in front of the server, in this case Cloudflare, which effectively handles all requests.

There are many benefits to using a CDN. One is that it is globally distributed, so it can mitigate the high latency that would otherwise be expected for a user far from where the server sits. Another is that the CDN can cache assets, so they can be returned to the user without even reaching out to the origin server.

This is where the hash in the filename comes in: by appending a hash based on the contents of the asset, the CDN can avoid fetching that asset from the origin server if it has not changed since it was last requested. By the same logic, the asset can in turn be cached by the browser: if it changes on the server it will get a new hash, and the browser will ask for that new version. Returning users do not even have to ask the CDN for the asset, as it is already in the browser cache. This substantially improves load times and saves a lot of resources.
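The naming scheme itself is the bundler's job, but the idea is simple enough to sketch. This illustrative example hashes the asset's contents and splices the first few hex characters into the filename:

```typescript
import { createHash } from "crypto";

// Derive a cache-busting filename from the asset's contents.
// Same contents -> same name (cache hit); any change -> new name.
function hashedName(name: string, contents: string): string {
  const hash = createHash("sha256").update(contents).digest("hex").slice(0, 8);
  const dot = name.lastIndexOf(".");
  return `${name.slice(0, dot)}.${hash}${name.slice(dot)}`;
}
```

With names like this, both the CDN and the browser can cache the asset indefinitely, since any change to the contents produces a different URL.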

Assets with hash in filename highlighted

The only asset that should not be cached in the same way is index.html, since this file effectively acts as a manifest specifying the hashes of the other assets. If it were cached by the browser, the user would have to clear the browser cache to get a new version of the site. Likewise, if it were cached by the CDN, no user would get a new version when one becomes available on the origin server.

Not caching index.html on the CDN defeats some of the benefits: on each visit the origin server would have to be reached before the browser even knows which other assets are required, and only then could it potentially benefit from those being cached. This means the latency induced by the user's geographical location would remain.

What I did to work around this is cache index.html on the CDN and use the API provided by Cloudflare to purge that file from the cache whenever a new version of the site is available at the origin server. A deploy script on the server includes a simple curl request to automate the procedure. The effect is that requests reach the origin server on Digital Ocean only on rare occasions, and the entire site is served by Cloudflare from a server geographically close to the user. An added benefit is that I could shut down the origin server and the website would still continue to be served as usual.
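The purge call the deploy script makes can be sketched as follows. The zone ID and file URL are placeholders, and this is my illustration of the request shape (following Cloudflare's v4 purge_cache endpoint), not the actual deploy script:

```typescript
// Build the HTTP request that invalidates index.html at the CDN edge.
// The deploy script would issue this with curl plus an API token in an
// Authorization header.
function purgeRequest(zoneId: string, files: string[]) {
  return {
    method: "POST",
    url: `https://api.cloudflare.com/client/v4/zones/${zoneId}/purge_cache`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ files }),
  };
}
```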

The browser must always ask the CDN about index.html to ensure it gets the latest assets should they have changed, but by using headers such as ETag, Last-Modified, and If-Modified-Since, the CDN can respond with an empty 304 response if the browser already has an up-to-date index.html, so it does not have to be downloaded again.
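The revalidation logic on the CDN side boils down to comparing the validators the browser sends back. This is a simplified sketch of that behavior (my illustration, not CDN code); note that If-None-Match takes precedence over If-Modified-Since:

```typescript
type Req = { ifNoneMatch?: string; ifModifiedSince?: string };
type Res = { status: number; body?: string };

// Return an empty 304 when the client's cached copy is still current,
// otherwise the full 200 response.
function revalidate(req: Req, etag: string, lastModified: string, body: string): Res {
  if (req.ifNoneMatch === etag) return { status: 304 };
  if (!req.ifNoneMatch && req.ifModifiedSince === lastModified) return { status: 304 };
  return { status: 200, body };
}
```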


The result

The result is a page that loads fast under a number of different network conditions. This was tested with Firefox with the cache disabled, using various network throttling configurations. I include one test with the browser cache enabled for reference.

Page load times for different network configurations

Of course, on some networks there will be some level of latency that you simply have to accept. The page is also fast to load from different locations around the world, as tested using Sucuri's load time tester.

Global page load times

As a bonus, I can run the site on very limited hardware, since Cloudflare takes care of most of the traffic, as indicated by their analytics overview.

Cloudflare analytics showing bandwidth saved

Overall I am happy with the result and I think it shows that you might not need to ship massive payloads to make something interesting, and that there are great tools available to help you distribute your creation efficiently.
