React, Websockets & a Raspberry Pi — Building Keen’s New Site

James Mikrut
Published in TRBL · Feb 20, 2018

Hi Medium — happy to be here. I’ve made great use of this platform for years and for the first time I’ve decided to write a post of my own.

I’d like to share a little about the design thinking and development techniques we used for Keen’s new website. We learned a ton of things along the way and I hope some of it might be of benefit to others.

Jake creepin’ in the bottom corner

Back in October 2017, we started toying around with the idea of rebuilding our studio’s website around a few goals: differentiating ourselves from our competition, demonstrating our skill sets more accurately, and building a more honest connection to who we are as individuals and as a collective. Over the years, we’ve drawn inspiration from tons of studio / agency websites, but most have this sort of holier-than-thou way of communicating who they are and what they do. They’re all competing to see who can speak marketing lingo better than the others.

Our branding process is better than yours. Focus, alignment, inspiration, execution, etc. Blah blah blah.

Hundreds of different shops with hundreds of different creative processes to do the same thing. We don’t want in on that rat race.

We want to be who we are and want our website to act as a window into exactly that. Don’t get me wrong, you need at least some level of that marketing lingo if for nothing else other than SEO, and lots of people definitely do read through and value that type of content, but we didn’t want it to be the cornerstone of our site.

We’ve always had an issue with how websites are typically thought of as a collection of pages. It’s 2018! The internet is capable of so much more. Asking a server to send you an entirely different “page” every time you click a link seems primitive. And why should a visitor of our website be considered so differently from a visitor to the studio itself? Both cases are opportunities for us to connect with another person but the former is often underutilized.

The studio slid some creative juices toward finding a way to connect to our audience in as honest of a way as possible, and somehow try to close the gap on the current differences between a visitor coming to our website and to our studio. We landed on introducing a combination of a few different ideas — a live webcam of our studio and an interactive chat laid directly on top of the webcam stream.

We were certainly not the first people to think of these ideas, but we didn’t need to be. The combination could allow for a variety of interesting ways to connect to our visitors — and that’s what we were after. A visitor could say hi and we could wave back to the camera. Maybe one day we’d all come to work with horse masks on. Who knows.

But hopefully people would leave our site feeling like they have an idea of who we are and not like they were just spoon-fed marketing gibberish.

Time to shine

Now that we had a little sapling of an idea, we needed to figure out how to make it happen.

We began mocking up ideas and tossing different approaches back and forth between the designers. We landed on an aesthetic and UX fairly quickly. Everything fed off our original idea and I personally find design to be the most fulfilling when it unravels that way.

Client side

Development started with scaffolding out the project’s client side first. We’ve been using React & Redux quite a bit at Keen and decided to build the entirety of our frontend with Create React App, which I highly recommend. It’s just painless to use, and we don’t run into any problems due to it being overly opinionated. We barely even realize it’s there once we’re up and running writing code. And I have to say — I feel like React makes writing code fun again.

Components and 2018

When you start thinking about UI in terms of components, it’s like having a baseball bat made of logic smack you in the dome. The compartmentalization of code makes functionality much more reusable and extensible. It’s like each component is a little bundle of joy that you can just drag and drop wherever you want and have it render & perform perfectly. The web has a long way to go before this becomes native, but we’re excited to see progressions with the Shadow DOM and the ideas arising from it. The future looks bright, but also far away. C’mon, digital prodigies, prove me wrong.

A good example of exactly why components are so helpful to today’s web ecosystem can be found in our Subscriptions for SaaS project.

In the midst of building our own site, we designed, developed and launched a quick microsite for one of our favorite clients, based in Germany. The site uses the WP REST API for content combined with a client-side React app, and was a very fulfilling project to design and build.

We ended up writing a few animated, graphical modules as React components for the microsite, were pretty pleased with how they turned out, and wanted to include them in our new site as a portfolio project.

It was basically as easy as copying and pasting a folder into another project, and boom — all the logic, styles, and markup were embedded in a completely different site, looking and performing perfectly. Sure, we had to bring over some typefaces and a tiny amount of supporting CSS, but it was a breath of fresh air. That’s some good stuff.

The state of CSS

Being a project for ourselves, we were able to do some exploration into new technologies and new techniques without the risk of introducing something too new into a production environment. Early on, we played around a good bit with how we would architect the CSS side.

We have been evaluating CSS-in-JS for a while now. We really like its premise and see the benefits of scoping styles directly to components, but it comes with drawbacks and shortcomings that keep it from being a real candidate for anything close to universal adoption on our end. Inline styles are slower for the browser to parse, and pseudo-elements, hover states, and other CSS features are still more difficult to implement. Also, in the back of my mind it’s always felt like a shoe-horned solution to a much larger problem with web development, and I don’t like relying on stop-gap solutions like that.

I mostly disagree with Create React App’s documentation and its stance on the use of Sass within React. Yes, the component-based nature of the library means there should be less sharing of code, which eliminates a lot of why Sass is used in the first place, but we still use Sass functions, nesting and variables fairly often. Also, what would happen if we needed to provide a CMS with pre-styled HTML components that could be used in WYSIWYG fields? Think giving an admin the ability to insert a <button class="btn"> into a WYSIWYG field at the end of a paragraph. That’s gotta be styled from somewhere, and it’s not a React component. And any styles that compose this button should be reused throughout any other actual React Button components present in the codebase.

For these reasons, we decided to keep our CSS heavily reliant on Sass, and scoped each component to its own partial that is as isolated as possible from the components surrounding it, just as React recommends. But almost all of our components do import a small subset of shared Sass partials as needed (mixins, variables, baseline, structure, etc.), which are the same for us from project to project. This works for us but still leaves something to be desired. I plan to write a post about this in the future.

The Server

For most of Keen’s web projects, we build locally, then spin up a fresh droplet from DigitalOcean and configure it for production based on the project’s tech stack.

We often use Let’s Encrypt for SSL and did so for our own site. Nginx would act as a reverse proxy to our Node app. Pretty simple stuff, so far.
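For anyone unfamiliar with that setup, a minimal sketch of the kind of Nginx server block we mean follows. The hostnames, ports and paths here are placeholders, not our actual config, and the websocket upgrade headers are included because the site relies on websockets further down the stack:

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    # Certificates issued via Let's Encrypt / certbot
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://localhost:3000;  # the Node app
        proxy_http_version 1.1;

        # Allow websocket connections to upgrade through the proxy
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}

# Redirect plain HTTP to HTTPS
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}
```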

CMS

I could talk for three years about the state of CMS on the web. I yearn for a fully JS stack, and if we were going to use a PHP / MySQL CMS, we might as well use the WP REST API paired with Advanced Custom Fields and keep it decoupled from the client (which is pretty awesome, to be honest). I don’t see much of a real reason to go playing around with any of the other PHP options out there — different ways to do the same thing. But we had big plans for our Node server: it needed to beam a webcam stream to users as well as provide a realtime chat interface. The last thing we wanted to do was hodge-podge a LAMP stack with Node and websockets.

To fulfill our dreams of a fully JS stack, we ended up implementing KeystoneJS and relied on its Express webserver to serve the Keystone admin UI, the static files from Create React App, and a few specific routes we added on top of Keystone.

Keystone actually limited us a great deal in practice when it came to content management, though. For example, each one of the projects featured on the site has a unique layout and therefore a unique content model. Keystone is totally based on Lists, which, as far as I can see, are limited to one content model per list. Each of our projects was to be built with React components, though — and we have never liked or condoned the idea of just plopping a big bundle of HTML into a WYSIWYG field and then parsing it back out on the client.

That meant Keystone was of no use for the bulk of what we’d normally use a CMS for. We ended up statically storing our projects, team members and services pages as objects in JS arrays, each with their own properties and React components — imported into CRA and bundled. Lean and mean. We’re developers. That being said, we do plan to use Keystone for future Keen site content management needs where its list-based nature is more appropriately applied (think blog posts, a future studio mood board, etc.).
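In practice that looked something like the sketch below. The names, fields and component are illustrative, not our actual data model — the point is just that each project is a plain object carrying its own metadata and React component, and the whole array gets imported and bundled by CRA:

```javascript
// projects.js — hypothetical shape of our statically-stored content.
// Each entry carries its own metadata plus the React component that
// renders its unique layout. (The component import is stubbed here.)

const SubscriptionsForSaaS = () => null; // stand-in for a real component import

const projects = [
  {
    slug: 'subscriptions-for-saas',
    title: 'Subscriptions for SaaS',
    services: ['Design', 'Development'],
    component: SubscriptionsForSaaS,
  },
  // ...one object per project
];

// Simple lookup used by the router to resolve a URL slug to a project.
function findProject(slug) {
  return projects.find((project) => project.slug === slug) || null;
}

module.exports = { projects, findProject };
```

No database round-trip, no admin UI — just data that ships with the bundle.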

CMS UPDATE (27 Aug, 2021):

Since I wrote this post, I left Keen to start a new digital agency called TRBL and there, my team and I built Payload CMS as a direct solution to what we’ve always wanted. We launched it in January of 2021 to insanely positive reception, and its community is bustling. It’s a super powerful self-hosted, Node / TypeScript headless CMS meant specifically for React developers.

Check it out!

Webcam Setup

The first real challenge of the site was trying to figure out how to stream a live video feed from our office to visitors on our site. There are lots of out-of-the-box solutions that can provide a live webcam stream to an app or to the internet, but we wanted a little more control over what was happening than what could have been provided with a premade solution.

After some research, we found two very helpful software packages that we could use to make the webcam happen.

But first, on the hardware side, we picked up a Raspberry Pi and a cheap $40 webcam from Amazon to capture and stream the feed. From there, we selected ffmpeg, a command-line tool for converting and streaming media, to send a stream from the USB webcam to the HTTP server we’d be launching somewhere in the wild.

Setting up the Pi before mounting it on the ceiling

ffmpeg is really cool. One command beams a webcam feed from our office to anywhere we want.

ffmpeg -re -f video4linux2 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 2000k -vf hue=s=0 http://recipient-url-goes-here.com

Here, we set ffmpeg to use a physical camera connected via USB at /dev/video0, at a resolution of 640x480 and bitrate of 2000k, to stream to a URL set up to accept it and distribute it. We also used some handy built-in filters to convert the stream to grayscale.

We wrote a quick shell script on the Pi that would automatically run the above command on reboot and keep it alive if it failed.
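The script itself was nothing fancy — something along these lines. This is a reconstruction, not the actual script; the `RETRY_DELAY` variable and the optional restart cap are our additions so the loop can be exercised in isolation:

```shell
#!/bin/sh
# Keep a long-running command alive: run it, and whenever it exits,
# wait a moment and start it again.

keep_alive() {
  cmd="$1"
  max="${2:-0}"   # optional restart cap; 0 means restart forever
  count=0
  while :; do
    sh -c "$cmd"
    count=$((count + 1))
    if [ "$max" -gt 0 ] && [ "$count" -ge "$max" ]; then
      break
    fi
    sleep "${RETRY_DELAY:-5}"
  done
  echo "$count"
}

# On the Pi this would be invoked on boot (e.g. from /etc/rc.local):
# keep_alive "ffmpeg -re -f video4linux2 -i /dev/video0 ... http://recipient-url-goes-here.com"
```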

The URL set up to receive the stream is a Node script running on our DigitalOcean droplet, based mostly on this very helpful repo. It’s pretty simple, actually: an HTTP server accepts the incoming stream and then distributes it to all connected clients via websocket. The droplet running the script can be responsible for pushing out quite a bit of bandwidth, so we’re keeping an eye on it and will scale it as necessary — but we don’t get a massive amount of traffic right now, so the 2000k bitrate per client should be plenty doable for the time being.
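The core of that relay is just a fan-out. The sketch below shows the broadcast logic in isolation — the real script (per the repo we based ours on) wires this up to Node’s http module and the `ws` package, which we’ve left out here so the logic stands on its own:

```javascript
// Sketch of the relay's core: fan each incoming MPEG-TS chunk out to
// every connected websocket client that is still open.

const OPEN = 1; // ws readyState value for an open connection

function createRelay() {
  const clients = new Set();

  return {
    // Register a new websocket client and clean up when it closes.
    addClient(socket) {
      clients.add(socket);
      socket.on('close', () => clients.delete(socket));
    },
    // Called for every chunk ffmpeg pushes to the ingest endpoint.
    // Returns how many clients the chunk was sent to.
    broadcast(chunk) {
      let sent = 0;
      for (const socket of clients) {
        if (socket.readyState === OPEN) {
          socket.send(chunk);
          sent += 1;
        }
      }
      return sent;
    },
  };
}

module.exports = { createRelay, OPEN };
```

Wiring it up is then just a matter of piping the ingest request’s body into `broadcast` and handing each new websocket connection to `addClient`.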

The client React app relies on a package called JSMpeg to connect to an MPEG-1 stream, decode it and display it, which can also be found in the above repo. We actually opted for an extension of JSMpeg called jsmpeg-player, because it provides a few more options as well as event callbacks.

Man, JavaScript is great. So many otherwise impossibly hard tasks done so gracefully for us by other secret super geniuses, for free.

Chat

We used Socket.IO for the chat component itself. There are about 100,000 tutorials out there on how to build chat interfaces with Socket.IO, so I won’t go into too much detail, but it was a pretty fun experience working it into React and Redux, and being able to tailor it so easily with Node on the backend.

If a Keen admin is present and logged into the site, the chat enables itself to the public automatically, but otherwise it stays disabled. The admin is given a list of all online sockets, each with their own conversation thread, and notifications are fired to both the admin and the user when either party sends a message. Users are only exposed to their own conversations with Keen admins, and messages don’t persist past memory at all. The system is totally surface-level deep. Disconnect, lose your history. The chat is not made for work — it’s made for play.
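On the Redux side, that per-socket, memory-only behavior falls out naturally from a plain reducer. This is a hypothetical sketch — the action names and state shape are ours, for illustration, not Keen’s actual store:

```javascript
// Hypothetical chat reducer: threads keyed by socket id, chat only
// enabled while an admin is online, and a disconnect wipes the thread.

const initialState = {
  enabled: false, // mirrors whether a Keen admin is logged in
  threads: {},    // socketId -> array of messages
};

function chatReducer(state = initialState, action) {
  switch (action.type) {
    case 'ADMIN_ONLINE':
      return { ...state, enabled: true };
    case 'ADMIN_OFFLINE':
      return { ...state, enabled: false };
    case 'MESSAGE_RECEIVED': {
      const { socketId, message } = action;
      const thread = state.threads[socketId] || [];
      return {
        ...state,
        threads: { ...state.threads, [socketId]: [...thread, message] },
      };
    }
    case 'SOCKET_DISCONNECTED': {
      // No persistence: losing the connection loses the history.
      const threads = { ...state.threads };
      delete threads[action.socketId];
      return { ...state, threads };
    }
    default:
      return state;
  }
}

module.exports = { chatReducer, initialState };
```

Since nothing ever writes to a database, “disconnect, lose your history” is the default behavior rather than something we had to build.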

The rest of the site

Development went on for around four months, worked on between client projects and slotted in after hours. During that time, multiple dependencies of ours released breaking changes, most notably react-transition-group, which we relied on heavily for our randomized view transitions. It never ceases to amaze me how fast things change on the web.

The way we present our work within the site went through a few sets of drastic changes due to rendering limitations of CSS and older hardware, and we actually scrapped our original approach just weeks before launch. As of early 2018, keeping at (or close to) 60 FPS in all CSS animations on all devices, including older phones, is a tough task indeed. There are still so many gotchas in the way CSS handles intensive rendering, and I think the web has a long way to go performance-wise when dealing with animations and transitions.

Regardless, we’re pumped to hear what everyone thinks of the new site and excited to read some of the nonsense that’ll inevitably be sent through the chat. Bring it on.

Thanks for reading. Give us a shout on the site if you feel inclined.
