Welcome DenoStore:

Modular Caching for GraphQL APIs with Servers in Deno

Jake Van Vorhis
10 min read · May 5, 2022

When Inspiration Struck

The team behind DenoStore spends most of its time building full stack JavaScript apps with technologies such as Node/Express, React, SQL and NoSQL databases, and a variety of design architectures. Through the development of these web applications, the team noticed an increasingly pressing problem that needed to be addressed: the difficulty of caching GraphQL queries, especially in apps running in Deno.

Caching is part of every modern application’s system design architecture at scale. If you are unfamiliar, in a system design context, caching is the semi-temporary, flexible storage of fetched data held closer to the client in order to avoid duplicate server calls. You can think of a cache as a layer of data between the client and server, most often holding data the client has requested at least once already. By adding commonly requested data to the cache and responding with cached data first, the application avoids the cost and latency of repeatedly hitting an overwhelmed server.

When using RESTful API architecture, the client’s browser handles some caching of fetched resources automatically so they are readily available on re-request. Specific header options can also be sent with every HTTP GET request to direct the client’s caching, so implementing a caching system can be fairly simple.
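For example, a REST response can carry headers like the following (values illustrative), telling the browser it may reuse the cached copy for an hour and revalidate it afterward using the ETag:

```http
HTTP/1.1 200 OK
Content-Type: application/json
Cache-Control: public, max-age=3600
ETag: "33a64df5"
```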

However, GraphQL has no such cache-control header options given that it receives query strings via POST requests. There are tools available to assist with GraphQL caching, but we realized when trying to build an application with both GraphQL and a server in Deno that an efficient and modular open source solution was sorely needed. Deno is such a new backend technology that it lacks support and tooling in many areas, and it became clear the caching functionality we were seeking needed to be built from scratch. As a result, the idea for DenoStore was born.

A Brief Intro to GraphQL

If this is your first time hearing about it, GraphQL is a powerful API query language that allows the client to request exactly the data it needs from the backend, no more and no less, as predefined in a strictly typed schema shared between both sides of the stack. This allows specific queries to be made concisely (including only the necessary fields), all routed to a single endpoint. GraphQL ensures that only the precise information sought by the client is fetched, avoiding the over- and under-fetching of information that is common in traditional REST architecture.
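For example, a client might ask for just two fields of a rocket resource (field names here match the SpaceX example used later in this article; response values are illustrative), and the server responds with exactly that shape:

```graphql
query {
  oneRocket(id: "1") {
    rocket_name
    country
  }
}
```

```json
{
  "data": {
    "oneRocket": {
      "rocket_name": "Falcon 1",
      "country": "Republic of the Marshall Islands"
    }
  }
}
```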

Through the specification of types and fields, GraphQL achieves its well-known efficiency not only by reducing the amount of data transferred, but also by making the shape of that data explicit, so the frontend receives data back from the server in exactly the format it expects. Tracing the data flow: all client requests are queries sent to a single endpoint; GraphQL parses each query and retrieves the data at the resolver-function level (defined in the schema for each query); and the exact information queried is then sent back in the response.

A Brief Intro to Deno

Deno, released in 2018 by Ryan Dahl, is a “simple, modern and secure runtime for JavaScript and TypeScript that uses V8 and is built in Rust” according to its documentation. Deno was built with the intention of improving some key weaknesses of Node.js.

One of the largest advantages of Deno over Node.js is its portability. Rather than using the Node Package Manager to install and import modules, Deno downloads and caches modules in a global directory when the script encounters an import from a URL containing the required module. This allows Deno to avoid a cumbersome package.json file and node_modules folder, keeping projects compact and lightweight.
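For instance, a single URL import is all it takes to pull in a module. This sketch assumes the standard-library HTTP server of that era; the version is pinned in the URL by convention:

```typescript
// Deno resolves this URL, downloads the module on first run, and caches
// it in a global directory -- no package.json or node_modules needed.
import { serve } from "https://deno.land/std@0.140.0/http/server.ts";

serve(() => new Response("Hello from Deno"), { port: 8000 });
```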

Some other advantages are its TypeScript default, its built-in functionality around tasks like testing and environment variables (without the need for modules), its top-level await, and its greater security around limiting access to the file system, network, and environment.

While there are many strengths to Deno, and even though the core team is building out a truly featureful developer experience for a JavaScript/TypeScript backend, it is so new that support in the developer community is not yet as robust as that of more established technologies. There is definitely a “chicken or egg” problem where the lack of support contributes to the lack of use of Deno, which in turn leads to the creation of fewer modules and packages.

A Brief Intro to Redis

Redis is a versatile, in-memory, non-relational data store, and it serves as the foundation of DenoStore’s caching functionality.

Redis is frequently used as a caching tool because it pairs key-value storage with built-in expiration controls (time to live) for each cache entry, which made it a perfect foundational element of the DenoStore developer tool. And because Redis stores data in memory, it delivers incredibly fast response times for cached data.
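A quick illustration with redis-cli: a key-value pair can be stored with a time to live and will expire on its own (key and value illustrative):

```
SET rocket:1 '{"rocket_name":"Falcon 1"}' EX 60
TTL rocket:1
GET rocket:1
```

`SET ... EX 60` stores the value with a 60-second time to live, `TTL` reports the seconds remaining, and once the key expires `GET` returns nil.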

Enter: DenoStore

Having identified the glaring lack of GraphQL caching functionality in Deno, our team of developers had one goal: to make caching GraphQL in apps running in Deno both easy to implement and highly effective. We sought to combine the technologies above, each of which is powerful in isolation, and forge them into a caching tool that would contribute to the growing Deno developer community.

We started with the strongly typed GraphQL schema and the Oak framework for Deno servers. If this were caching for a RESTful API, a developer could easily write a middleware function that checks for a cache hit or miss and sets a cache key from the URL string of the requested data. However, because this is GraphQL, all queries go to a single endpoint. We began parsing the GraphQL abstract syntax tree (AST) and realized two things at once: that caching would happen most effectively at the resolver level of the query schema, and that a one-size-fits-all approach would not suffice.
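For contrast, here is a rough sketch of what that RESTful middleware could look like. A plain in-memory Map stands in for Redis, the context shape is a simplified stand-in for Oak’s, and every name is illustrative:

```typescript
// A rough REST-style caching middleware (Oak-like shape, simplified).
type Ctx = {
  request: { url: { pathname: string; search: string } };
  response: { body?: unknown };
};

const restCache = new Map<string, unknown>();

async function cacheMiddleware(ctx: Ctx, next: () => Promise<void>) {
  // In REST, the URL string uniquely identifies the resource,
  // so it works directly as the cache key.
  const key = ctx.request.url.pathname + ctx.request.url.search;
  if (restCache.has(key)) {
    // Cache hit: respond immediately and skip the downstream handler.
    ctx.response.body = restCache.get(key);
    return;
  }
  // Cache miss: run the handler, then store its response for next time.
  await next();
  restCache.set(key, ctx.response.body);
}
```

No equivalent keying strategy exists for GraphQL, since every query arrives at the same URL.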

By setting the cache at the resolver level of each query, DenoStore provides middleware-esque decision-making power during implementation over which queried data to cache and how long it should live in the cache. Data that is likely to be mutated, perhaps very frequently, can simply be left uncached. Queries for data that is mutated periodically or at a fixed interval can be cached for a developer-specified amount of time. Queries for things like historical data that never change but are frequently requested can be cached indefinitely.

This implementation-level optionality allows the developer utilizing DenoStore in their GraphQL schema to make a caching decision for each query based on the type of data it contains and its mutability. Effectively, DenoStore brings a modular, middleware-style design pattern into GraphQL’s architecture, even though the frontend still sends all queries to a single endpoint.

DenoStore Caching Design Benefits

An additional design benefit of implementing caching in the GraphQL resolver function is that queries are cached more completely, once again boosting an app’s performance and potentially lowering its server bills. Because GraphQL handles the resolver-level caching before responding to the client, the cache for a query is set with all fields that could be included within the query on its initial fetch. Upon a subsequent query of the same name and arguments but with differing fields inside, the response would actually still be sent from a cache hit rather than from the server.

Consider a query for oneRocket that receives an argument of an id and has access to the fields: id, active, stages, first_flight, country, wikipedia, description, rocket_id, rocket_name, and rocket_type.

Though it’s just for the rocket_name field, the following initial query (utilizing DenoStore) would set the cache with all possible fields for oneRocket with an id of 1:
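Reconstructed from the fields listed above, that initial query could look like this:

```graphql
query {
  oneRocket(id: "1") {
    rocket_name
  }
}
```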

Because of this, a future oneRocket query with different fields included would still be found in the DenoStore cache, even if this query had no fields in common with the original query:
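For instance, this follow-up query shares no fields with the first, yet would still be served from the cache:

```graphql
query {
  oneRocket(id: "1") {
    country
    first_flight
  }
}
```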

Tradeoffs

As with all things in life, there are tradeoffs to this approach, some more obvious than others. On the one hand, the performance boost is immediate and powerful because more data is going to be cached. The client will receive a response from the cache more often because all fields in a query’s schema, not just the initially requested ones, are set to the cache. However, this speed and performance boost can come with a cost.

Take for example a social media app running a GraphQL query for all the friend_connections of a user, and in nested subfields, all the friend_connections of the user’s friends. If the DenoStore caching functionality were added to the resolver of this query, the application could end up with a bloated cache, which could be expensive in its own right.

To address this tradeoff, our team made a few design decisions that leave as much of the decision making as possible in the hands of the developer implementing a DenoStore cache. First, we added a configuration option to the first argument of the DenoStore cache function so any individual cached entry can be given its own expiration time. Second, we added a default expiration option to the initial instantiation of DenoStore so the developer can give every cached key-value pair an automatic expiration time. Third, we believe a tool is most powerful when it is flexibly wielded, and that the schema design and implementation should ultimately determine which queries to cache. Lastly, when the Redis server is started from the command line or during containerization, there is always an option to set a maximum cache size in the configuration, so the risk of a runaway cache is low.
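For example, Redis can be started with a hard memory ceiling and an eviction policy (values illustrative):

```shell
redis-server --maxmemory 100mb --maxmemory-policy allkeys-lru
```

With allkeys-lru, Redis evicts the least recently used keys once the 100 MB ceiling is reached, so the cache cannot grow without bound.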

One additional feature our team decided to include is an IDE playground for query visualization during development; it is optional, enabled during DenoStore class instantiation, and defaults to false.

DenoStore in Summary

With the performance-optimized functionality of Redis and the power of GraphQL’s schema resolvers at its core, DenoStore can dramatically reduce an application’s total server and API call volume and response times. Server calls that initially take several seconds can instead be returned from the cache in single- or double-digit milliseconds. This means that apps built with DenoStore caching run faster and cost less to operate. Implementation is simple, customization is modular and flexible, and caching is powerful.

Setup

For greatest detail regarding setup, implementation, and configuration, see the DenoStore documentation.

To set up the DenoStore cache router, add the below code to your Deno/Oak server file:
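A sketch of that server file follows. The import URLs, option names (route, usePlayground, schema, redisPort, defaultEx), and the routes()/allowedMethods() helpers are assumptions based on the DenoStore documentation; consult the docs for the current API and pin module versions before use:

```typescript
// Sketch of a Deno/Oak server wired up with DenoStore (names assumed).
import { Application } from "https://deno.land/x/oak/mod.ts";
import { DenoStore } from "https://deno.land/x/denostore/mod.ts";
import { typeDefs, resolvers } from "./schema.ts";

const denostore = new DenoStore({
  route: "/graphql",               // the single GraphQL endpoint
  usePlayground: true,             // optional IDE playground; defaults to false
  schema: { typeDefs, resolvers },
  redisPort: 6379,                 // local Redis instance
  defaultEx: 3600,                 // default expiration for cached queries, in seconds
});

const app = new Application();
app.use(denostore.routes(), denostore.allowedMethods());

await app.listen({ port: 3000 });
```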

Caching

After your DenoStore instance is configured in your server, all GraphQL resolvers have access to that DenoStore instance and its methods through the resolver’s Context object argument. Your schemas do not require any DenoStore imports.

Here is a simple example of a query resolver before and after adding the cache method from DenoStore: a query that pulls information about a particular rocket from the SpaceX API.

Example query resolver without DenoStore caching:
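A sketch of such a resolver, with an illustrative SpaceX REST endpoint URL:

```typescript
// Hypothetical `oneRocket` resolver with no caching: every query for the
// same rocket triggers a fresh network request to the SpaceX REST API.
const resolvers = {
  oneRocket: async (_parent: unknown, { id }: { id: string }) => {
    const response = await fetch(`https://api.spacexdata.com/v3/rockets/${id}`);
    return await response.json();
  },
};
```

Nothing here is DenoStore-specific; the resolver simply re-fetches on every request.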

Example query resolver with DenoStore caching:
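And a sketch of the cached version. The exact property and method names here (a DenoStore instance on the resolver context, a cache method, an ex expiration option in seconds) are assumptions drawn from the DenoStore documentation; check the docs for the current signature:

```typescript
// Hypothetical `oneRocket` resolver using the DenoStore instance exposed
// on the resolver's context. On a cache miss the callback runs and its
// result is stored in Redis; on a hit the callback is skipped entirely.
const resolvers = {
  oneRocket: async (
    _parent: unknown,
    { id }: { id: string },
    context: any,
    info: any,
  ) => {
    return await context.denostore.cache({ info, ex: 3600 }, async () => {
      const response = await fetch(`https://api.spacexdata.com/v3/rockets/${id}`);
      return await response.json();
    });
  },
};
```

A per-query `ex` like this would override the instance-wide default expiration set at instantiation.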

As you can see, it only takes a few lines of code to add modular caching exactly how and where you need it.

How to Contribute

DenoStore is an open-source product developed in collaboration with tech accelerator OS Labs. We at DenoStore encourage seasoned and aspiring developers alike to iterate on our project and provide feedback on their experience with our developer tool.

If DenoStore made your life as a developer easier, we would greatly appreciate a GitHub star. As with any new Deno module, the more DenoStore is introduced to the programming world, the greater the potential for a great developer experience in Deno.

Learn More

To learn more or test out DenoStore in your application, view our demo website, documentation, GitHub repository, or the module on Deno.land.

Team

Jake Van Vorhis // GitHub | LinkedIn

James Kim // GitHub | LinkedIn

Jessica Wachtel // GitHub | LinkedIn

Scott Tatsuno // GitHub | LinkedIn

Ting Xian Ho // GitHub | LinkedIn
