
Add Response Caching to a Node.js Express Server

In this tutorial we will use Node.js Express middleware to implement basic in-memory caching for our server. Caching can improve our server’s performance and help us serve responses to users faster.

By the end of this tutorial, you should be able to:

  • Set up an in-memory cache
  • Cache responses

This tutorial is part 7 of a 7-part series that walks through using Express.js to create an API proxy server.

Goal

Add a caching middleware to increase server performance using Node.js Express.

Prerequisites


Check page load times

Let’s see how the server is performing before we add caching. Open a terminal and start the server:

Terminal window
npm run dev

Load localhost:3000/planets in your browser and open the browser’s DevTools panel to the “Network” tab. Do a hard refresh of the page (CMD + SHIFT + R on Mac for most browsers, and CTRL + SHIFT + R on Windows).

If you look at the terminal window, you should see the output of the middleware we created earlier to log how long requests take to complete from the server’s perspective. Your browser’s DevTools Network tab shows how long a response took to complete from the client’s perspective.
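That logging middleware was built in an earlier part of this series and isn’t repeated here. As a rough refresher, a minimal timing middleware along those lines might look something like the sketch below (the exact code from the earlier tutorial may differ):

const logTiming = (req, res, next) => {
// Record when the request arrived
const start = process.hrtime.bigint();
// Once the response has been sent, log how long the request took
res.on("finish", () => {
const seconds = Number(process.hrtime.bigint() - start) / 1e9;
console.log(`${req.method} ${req.originalUrl} Completed in ${seconds.toFixed(3)} seconds`);
});
next();
};

// Registered before the routes, e.g. app.use(logTiming);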

The planets endpoint can take anywhere from 4 seconds to 30 seconds to complete, which is something we’ll want to improve. It’s so slow because the NASA Exoplanets API we are requesting data from can sometimes take up to 30 seconds to respond.

In this tutorial we’re going to add caching to decrease the amount of time it takes to send a response.

Set up caching

We will implement our own caching middleware using a simple in-memory cache library. Creating your own middleware for caching will make it easier to swap out the memory cache for another caching layer, like Redis or Memcached, if you wish to do so in the future.

Install the node-cache module, which we will use as our in-memory cache store:

Terminal window
npm i node-cache

Create a file called routeCache.js in the root of the project:

Terminal window
touch routeCache.js

Open routeCache.js in your editor. Require the node-cache package and create a new cache instance.

const NodeCache = require("node-cache");
const cache = new NodeCache();

This module will export a function that takes a duration (in seconds) to cache a response for, and returns an Express middleware function that implements the get-and-set caching logic.

Let’s create that function and export it from the file:

const NodeCache = require("node-cache");
const cache = new NodeCache();
module.exports = duration => (req, res, next) => {};

This is how our caching middleware will work:

  • Is the request a GET request? If not, skip caching, call next, and end execution.
  • Use the path of the request as the cache key.
  • Check if the key exists in the cache.
  • If it does, send the cached result as the response. End execution; don’t call next.
  • If it’s not in the cache, replace Express’s send method with a new method that will put the response body into the cache. Call next.
  • If any errors occur with caching, log them, but allow the request to be handled like normal without caching. Call next, but without passing the error to Express.

Let’s implement the first part, checking if the request is a GET:

const NodeCache = require("node-cache");
const cache = new NodeCache();
module.exports = duration => (req, res, next) => {
  if (req.method !== "GET") {
    console.error("Cannot cache non-GET methods!");
    return next();
  }
};

If the request isn’t a GET, we will call next() and skip the rest of the middleware. Using return makes sure that none of the rest of the function executes.

For the next part, we will use the cache.get method to check if we have an entry in the cache for the path requested:

module.exports = duration => (req, res, next) => {
  if (req.method !== "GET") {
    console.error("Cannot cache non-GET methods!");
    return next();
  }
  const key = req.originalUrl;
  const cachedResponse = cache.get(key);
  if (cachedResponse) {
    console.log(`Cache hit for ${key}`);
    res.send(cachedResponse);
  }
};

The cache.get function will return undefined if the key is not found, or if the entry has expired past its time to live (TTL).
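To see that behavior in isolation, here is a tiny standalone sketch of node-cache’s set/get behavior with a TTL (illustrative only, not part of our middleware):

const NodeCache = require("node-cache");
const demo = new NodeCache();

// Store a value with a 2 second time to live
demo.set("greeting", "hello", 2);
console.log(demo.get("greeting")); // "hello"

// Once the TTL has passed, the entry is gone
setTimeout(() => {
  console.log(demo.get("greeting")); // undefined
}, 3000);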

If we do have a cached response, send it to the client.

If a cached response isn’t found, we need to handle the request like normal, and then store the generated response into the cache.

To do that, we will augment the res.send function and call next() to hand execution off to the next middleware. This way, the request is handled by our route handlers like normal, but when it’s time to send the response to the user, our augmented send function will be called.

module.exports = duration => (req, res, next) => {
  if (req.method !== "GET") {
    console.error("Cannot cache non-GET methods!");
    return next();
  }
  const key = req.originalUrl;
  const cachedResponse = cache.get(key);
  if (cachedResponse) {
    console.log(`Cache hit for ${key}`);
    res.send(cachedResponse);
  } else {
    console.log(`Cache miss for ${key}`);
    res.originalSend = res.send;
    res.send = body => {
      res.originalSend(body);
      cache.set(key, body, duration);
    };
    next();
  }
};

The cache.set function takes the path as the key (including query parameters), the response body we generated, and a duration in seconds for the response to be cached on the server.

Note: To be clear, when an item isn’t in the cache, we don’t send the response right now in this middleware. We augment the send function so that when Express eventually calls send to send the response back, our special function runs instead and stores the body in the cache on the way out. Express calls the res.send function whenever you’re sending a response, even if you use the res.json function to send it.
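One step from the checklist above, logging caching errors without breaking the request, isn’t shown in the code. Assuming a try/catch around the cache calls is an acceptable approach, a version of routeCache.js with that step added might look something like this sketch (not necessarily the original implementation):

const NodeCache = require("node-cache");
const cache = new NodeCache();

module.exports = duration => (req, res, next) => {
  if (req.method !== "GET") {
    console.error("Cannot cache non-GET methods!");
    return next();
  }
  const key = req.originalUrl;
  try {
    const cachedResponse = cache.get(key);
    if (cachedResponse) {
      console.log(`Cache hit for ${key}`);
      return res.send(cachedResponse);
    }
    console.log(`Cache miss for ${key}`);
    res.originalSend = res.send;
    res.send = body => {
      res.originalSend(body);
      try {
        cache.set(key, body, duration);
      } catch (err) {
        // If storing the response fails, log it but still let the response go out
        console.error(`Failed to cache ${key}:`, err);
      }
    };
    next();
  } catch (err) {
    // If reading the cache fails, log it and fall back to normal, uncached handling
    console.error(`Cache error for ${key}:`, err);
    next();
  }
};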

Now that we’ve created our caching middleware, we can put it to work to speed up our requests.

Let’s cache the responses from the planets routes for 5 minutes (300 seconds). Require the routeCache.js module in our routes/planets.js file and pass the cache middleware, with a duration, to each route:

const cache = require("../routeCache");

router.get("/planets", cache(300), async (req, res, next) => {
  /* ... */
});

router.get("/planets/since/:year", cache(300), async (req, res, next) => {
  /* ... */
});
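As a side note on the design: because our middleware already skips non-GET requests, you could also mount it once for every route in this router with router.use instead of passing it to each route individually. A sketch of that alternative:

const cache = require("../routeCache");

// Cache every GET route defined after this line for 5 minutes
router.use(cache(300));

router.get("/planets", async (req, res, next) => {
  /* ... */
});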

Start the server in a terminal window and open your browser to localhost:3000/planets.

In the terminal you should see output similar to this:

Terminal window
Cache miss for /planets
GET /planets Completed in 4.817 seconds

Then reload the page and you should see output like this:

Terminal window
Cache hit for /planets
GET /planets Completed in 0.008 seconds

At first we didn’t have anything in the cache, so we got a cache miss and had to wait for data from the NASA Exoplanet API. The second time around we had the response cached, a cache hit, and were able to start sending the response after only a few milliseconds. That’s a significant improvement! Keep in mind, our server is only logging how long it took before we sent the response, not how long it took for the browser to receive the data we sent it.

If you do the same test in your browser and check the Network section of your DevTools, you can see the time it took for the request to complete. With a cache hit, my browser completed the request in less than 500 milliseconds, which is a huge improvement over the 4+ seconds we were originally seeing.

By caching the result of the time-intensive API call used to generate the response, we are saving our server from doing that work again. While it has the data cached, it’s able to respond as quickly as the data can be retrieved from memory.

Recap

In this tutorial we sped up the response time of our Express server by implementing a cache. We cached responses to our most expensive API routes with a simple in-memory cache on the server. Caching the responses allowed our application to serve any previously cached request at the speed it can read data out of memory. Ultimately we increased the performance of our server several times over by implementing the cache.

Further your understanding

  • Currently our caching is implemented entirely on the server. Browsers can also cache results locally based on Cache-Control headers. How would adding Cache-Control headers to our responses change how requests are handled by the browser? (See the sketch after these questions.)
  • How could you implement the cache to only cache data we request from the Exoplanet API, instead of the responses our server sends?
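
For the first question, a hedged starting point: Express lets you set response headers with res.set, so the caching middleware (or a route handler) could advertise how long browsers may reuse a response, for example:

// Hypothetical addition inside the caching middleware, before the response is sent:
// lets browsers and shared caches reuse this response for `duration` seconds
res.set("Cache-Control", `public, max-age=${duration}`);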

Additional resources