A Case Study on Feeling Limited by the Web

or the hundredth article complaining about web development

Jamie Wilkenson
8 min read · Aug 19, 2018

At the beginning of the year, inspired by one too many blog posts, I started having this desire to combine all the features I’ve heard great websites have and build the perfect web app. It had to be fast. Fast to first paint and first interactivity. Fast in the million ways websites have to be these days, because you are both sending the app itself and having the app run on any of a thousand different types of devices. I also wanted to make it accessible and work, to some extent, with any browser settings. For instance, it had to still be operational if JavaScript or cookies are disabled. And, on top of that, I wanted a comfortable and understandable development experience, which, to me, meant using as few command line tools and libraries as possible.

The whole thing felt pretty daunting, and I was a little scared to start since I didn’t want to write the whole thing only to realize I didn’t hit one of my benchmarks and have to rewrite it all to make it fit. In the back of my head I kept thinking there was some key ingredient to web development I was missing that would make everything click. Until then though, I just let the idea float around my head nagging me.

Then Google IO 2018 came around and I stumbled upon Jeff Posnick’s excellent video on multipage PWAs. The idea of actually using a browser’s streaming capabilities and writing universal JavaScript, so you could generate the HTML from the server for the initial load (or as a fallback) and then also generate it from the service worker, seemed like a really novel way to make your app fast and available, all while using very few libraries or command line tools and avoiding complicated build steps and configurations.

The one flaw I felt it had was that it wasn’t using components, so it seemed a little limited compared to what other web development methods allowed. I think that might be slightly part of its charm, but I figured I could always stream JavaScript at the end of the file to replace certain HTML elements with components if I needed to.

After watching the video a few times and reading a blog post on streaming, I felt I had finally found the design I’d been looking for.
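
To make the pattern concrete, here is a minimal sketch of the server half of the idea: stream the static shell immediately, then stream the data-dependent markup as it becomes ready. The placeholder data stands in for a real feed fetch; none of the names here come from the actual project.

```javascript
// server.mjs — stream the page in pieces so the browser can start
// rendering the shell before the slow, data-dependent part is ready.
import http from 'http';

http.createServer(async (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });

  // 1. Flush the static head/shell right away for a fast first paint.
  res.write('<!doctype html><html><head><title>Podcasts</title></head><body><h1>Podcasts</h1>');

  // 2. Do the slow work (a stand-in for fetching and parsing a feed),
  //    then stream the resulting markup.
  const episodes = await new Promise(resolve =>
    setTimeout(() => resolve(['Episode 1', 'Episode 2']), 100));
  res.write(`<ul>${episodes.map(e => `<li>${e}</li>`).join('')}</ul>`);

  // 3. Close out the document.
  res.end('</body></html>');
}).listen(3000);
```

The appeal is that the same template code can later run in the service worker, so repeat visits don’t have to wait on the server for the shell at all.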

The Project

Now that I had figured out the how, I needed to figure out the what. I wanted to build something slightly unique that I would actually use on occasion. I noticed that most of the popular podcasting apps are native (except maybe Spotify, if you count it). I felt, with the advances in offline web support, I could create just as good of an experience in a web app. Maybe my app wouldn’t be able to store quite as many audio files (it’s hard to tell, because the way some browsers ration space to websites can be a little mysterious), but, besides that, I should be able to create a fast web app at almost feature parity with any native experience.

Why isn’t this article about the months I spent building this? Well, I ended up quitting two weeks into playing around with the idea. So why am I writing an article about it anyway? I want to share my struggles with people who are thinking of setting out on a similar path, and I hope this will be useful to them. Or maybe I just want to whine. But anyway, below are the problems I ran into converting a mainly native experience to a web one. A few of these issues are specific to the project, and others have workarounds, but facing them turned what I thought would be an interesting experience into a frustrating one.

The Problems

Universal JavaScript

Universal in that it can run on the server and the client. It can’t run on IE. That would be crazy.

As mentioned above, I wanted to write JavaScript that worked in both Node.js and in the browser. Since Node.js supports modules in experimental mode (as long as you use the .mjs extension) and most browsers support them natively now, I felt comfortable writing JavaScript in modules without relying on build tools. Any browser that doesn’t support modules would just get the more static experience that users with JavaScript disabled get.
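
The plan, roughly, was to keep all the HTML generation in small modules like the one below and import it everywhere. The file name and episode shape are made up for illustration:

```javascript
// templates.mjs — meant to be shared by the Node server and the browser
// (and, ideally, the service worker too).
export function renderEpisodeList(episodes) {
  return `<ul>${episodes
    .map(ep => `<li><a href="${ep.url}">${ep.title}</a></li>`)
    .join('')}</ul>`;
}

// On the server (Node 8/10 era): node --experimental-modules server.mjs
//   import { renderEpisodeList } from './templates.mjs';
//
// In the browser, with no build step:
//   <script type="module">
//     import { renderEpisodeList } from '/templates.mjs';
//   </script>
//   <!-- older browsers simply fall back to the server-rendered page -->
```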

The obvious issue, that I’m not sure why I didn’t realize sooner, is that no browsers currently support running modules in service workers. They only support importing scripts into service workers via importScripts. And since a large portion of the JavaScript I was writing would be used to generate HTML and needed to be run in the service worker, I ran into some problems. I briefly thought I could just import the files using importScripts instead of import, but the service worker couldn’t parse the export or import keywords in the modules themselves so it didn’t work.

This problem isn’t that big of a deal, however. Browser vendors are aware of the issue and are actively working on it. And, in the meantime, there are some workarounds. I could run a build tool/library that would solve it, like this one, or just not use modules. At the time, I decided to just copy all the module code into one file for the service worker to use, and once I saw it working I planned to bring in the build tool or a library. But it wasn’t long until I ran into another problem.
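
The stopgap looked something like this: a module-free copy of the template code, hung off the worker’s global scope so that importScripts() can load it (file names are illustrative):

```javascript
// templates-sw.js — the same logic as templates.mjs, but with no
// import/export so a classic worker script can parse it.
self.renderEpisodeList = function (episodes) {
  return `<ul>${episodes
    .map(ep => `<li><a href="${ep.url}">${ep.title}</a></li>`)
    .join('')}</ul>`;
};

// sw.js
importScripts('/templates-sw.js');
// self.renderEpisodeList is now usable inside the service worker.
```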

DOMParser

The worker context laughing at my folly once again.

The first real step in this project involved parsing the RSS feed of the selected podcast to get episode information. Browsers have the DOMParser interface, which makes it easy to parse XML. Unfortunately, Node.js doesn’t have an equivalent in its standard library. After searching through some libraries on NPM, I found one with the same API as DOMParser. So, when a user searches for a podcast, I fetch the feed, parse it, generate the HTML, and stream that to the user. Great! Now I just need to do it from the service worker.
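
The nice thing about sharing the DOMParser API is that the parsing code itself doesn’t care where it runs; only the constructor you hand it changes. The article doesn’t name the NPM package, but xmldom is one library that exposes the same interface, so treat that part as an assumption. A sketch:

```javascript
// Shared parsing logic: given the raw XML of a podcast feed, pull out the
// episode titles and audio URLs. Real feeds need more defensive code.
function parseFeed(xmlText, DOMParserImpl) {
  const doc = new DOMParserImpl().parseFromString(xmlText, 'text/xml');
  const items = doc.getElementsByTagName('item');
  const episodes = [];
  for (let i = 0; i < items.length; i++) {
    const item = items.item(i);
    episodes.push({
      title: item.getElementsByTagName('title').item(0).textContent,
      audioUrl: item.getElementsByTagName('enclosure').item(0).getAttribute('url'),
    });
  }
  return episodes;
}

// In the browser:  parseFeed(xml, DOMParser);
// In Node:         parseFeed(xml, require('xmldom').DOMParser); // assumed package
```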

It seemed pretty straightforward. Once the service worker was installed, when a user made a request for episode information I would just intercept the request. Pull out the information on what podcast they wanted. Fetch that feed. And finally, parse the XML, generate HTML, and feed it to the response I created. All from the service worker.
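
In code, the plan was a fetch handler along these lines (the route, query parameter, and helper names are all illustrative, and, as the next paragraph explains, the DOMParser line is exactly where it falls apart):

```javascript
// sw.js — intercept requests for a podcast's episode page and build the
// HTML right in the service worker instead of going to the server.
self.addEventListener('fetch', event => {
  const url = new URL(event.request.url);
  if (url.pathname !== '/podcast') return; // let every other request pass through

  event.respondWith((async () => {
    const feedUrl = url.searchParams.get('feed');
    const xml = await fetch(feedUrl).then(res => res.text());
    // parseFeed/renderEpisodeList are the shared helpers from earlier;
    // DOMParser, however, simply doesn't exist in a worker context.
    const episodes = parseFeed(xml, DOMParser);
    const html = renderEpisodeList(episodes);
    return new Response(html, { headers: { 'Content-Type': 'text/html' } });
  })());
});
```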

Since I was writing universal JavaScript, I thought it wouldn’t be a problem to just plug in my server-side code. But, and you may see where this is going, you can’t do DOM-related things in the service worker. It also seems that DOMParser not being thread safe is an issue. I looked a bit for an external library to parse XML that would work in the service worker (the one I was using server side didn’t seem to), but couldn’t find anything with the same API. Perhaps I was being too picky, but I decided to just give up at this point, because while I was dealing with this, another, larger issue kept popping up that made me feel like this project was futile.

CORS

Every time I deal with CORS the browser turns into HAL 9000. “Give me the response information, browser.” “I’m sorry, developer, I’m afraid I can’t do that.” Although, at least I understand HAL’s reasoning.

In the above section I made it sound like fetching the feed in the service worker was a simple process. And why wouldn’t it be? Well, it turns out very few feeds have a CORS header, so the service worker, or any JavaScript run from the domain, won’t have access to the content of the response. I could proxy the feed from my server and just let the service worker parse that. That wasn’t a bad idea, until I realized that in most cases the audio files also wouldn’t have a CORS header on them.

You don’t actually need to make a CORS request for audio files in order for the user to be able to play them from your site, but I had recently read this article by Gerardo Rodriquez that discusses how opaque requests for cross origin images take up a lot of space in the service worker cache for various reasons and I assume the same is true for audio files. Since a large part of the app involved saving the audio files in the cache for offline use, it was going to be a problem.
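
For context, here is roughly what saving an episode for offline use looks like without CORS. The mode: 'no-cors' request succeeds, but it produces an opaque response, which is what gets padded in the cache. The cache name and helper are illustrative:

```javascript
// Saving a cross-origin audio file that has no CORS header: the fetch has
// to use mode 'no-cors', which yields an opaque response. The <audio>
// element can still play it, but JavaScript can't read it, and the cached
// entry is padded so it counts against quota far beyond its real size.
const AUDIO_CACHE = 'audio-v1'; // illustrative cache name

async function saveEpisodeForOffline(audioUrl) {
  const cache = await caches.open(AUDIO_CACHE);
  const response = await fetch(audioUrl, { mode: 'no-cors' }); // opaque response
  // status reads as 0 and the body is off-limits to JS, but cache.put()
  // will happily store it for later offline playback.
  await cache.put(audioUrl, response);
}
```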

I did eventually find one podcast hosting company that did add a CORS header to both their feeds and audio files: Libsyn. I figured I could at least test the app using one of the podcasts they host. But there was a very devious trap waiting for me. It turns out that the link Libsyn provides in their feed for the audio files redirects you to another site. That’s fine by itself, since browsers were recently fixed to handle CORS requests that respond with a 302 status. But the server also responds to the preflight request with a 302 status, which is an invalid response, so the request fails. At least that’s what I think is happening. It’s always a little hard to tell when it comes to CORS.

This is definitely going on my list of top 10 anime betrayals.

That was pretty disheartening and made me realize it’s probably going to be a while until this is really feasible. Of course, like I mentioned above, there’s always the option to set up a proxy to fulfill your requests. However, I didn’t like that method because not only was bandwidth a concern, but part of the design of the app was that the service worker would be able to handle most of the requests and very little communication to the server would be needed. Proxying everything feels like it would undermine that idea somewhat.
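
For what it’s worth, the proxy itself wouldn’t have been much code; something like this sketch, which just forwards the feed and adds the missing header (the route shape and wildcard origin are my choices, not anything from the project):

```javascript
// proxy.mjs — fetch a feed server-side and hand it back with a permissive
// CORS header so the service worker (or any client-side JS) can read it.
import http from 'http';
import https from 'https';
import { URL } from 'url';

http.createServer((req, res) => {
  // Expected shape: /feed?url=https://example.com/podcast.rss
  // (assumes an https feed URL; redirects aren't handled in this sketch)
  const feedUrl = new URL(req.url, 'http://localhost').searchParams.get('url');
  https.get(feedUrl, upstream => {
    res.writeHead(200, {
      'Content-Type': 'application/xml',
      'Access-Control-Allow-Origin': '*', // the header the podcast hosts leave out
    });
    upstream.pipe(res); // stream the feed straight through
  });
}).listen(3001);
```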

Conclusion

I really like web development, as well as the direction it’s going. I tried to build something very specific at the edges of what is currently possible, without making any compromises, so it’s no surprise I ran into problems. It’s honestly amazing I got as far as I did, and a lot of that wouldn’t even have been possible a year ago on most browsers. Additionally, some of the problems I mentioned will likely be fixed in the near future.

Overall, it was a great learning experience and I really like this idea of streaming a site from both the server and the service worker. So far I’ve only heard of it being used for toy websites, but I think it will really gain traction in the future. I can’t wait to try making this website again in a year or two and see how far I get.

P.S. While writing this article I realized it had essentially turned into a pessimistic version of a much better article written by Paul Kinlan, where he accomplishes something very similar and works around all the issues I mention above. I found it right around when I was giving up on this project, but didn’t realize how similar my article had turned out until I had finished. This article is basically an unofficial glass-half-empty companion piece to that one.
