An ideal pattern to combine React Router with TanStack Query

November 18, 2024
1 comment React, JavaScript

I'm writing this blog post from an admin interface I built, which is a web app frontend for the backend behind peterbe.com. It's built as a single-page app in Vite, with React.
Vite, unlike frameworks like Remix or Next, doesn't come with its own routing. You have to add that yourself, and I added React Router. Another thing you have to set up yourself is a way to load remote data into the app, for display and for manipulation. This is done with XHR requests on the client side, and for that I chose TanStack Query. If you haven't used it but have used React, it's essentially sugar for this type of code:


// DON'T DO THIS. USE TANSTACK QUERY

const [stuff, setStuff] = useState(null)
const [error, setError] = useState(null)
useEffect(() => {
  fetch('/api/some/thing')
    .then(r => r.json())
    .then(data => setStuff(data))
    .catch(err => setError(err))
}, [])
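
For contrast, a minimal sketch of what the same thing could look like with TanStack Query's useQuery (the query key and endpoint are placeholders here):


// A SKETCH OF THE TANSTACK QUERY EQUIVALENT

import { useQuery } from '@tanstack/react-query'

const { data: stuff, error, isPending } = useQuery({
  queryKey: ['some-thing'],
  queryFn: () => fetch('/api/some/thing').then((r) => r.json()),
})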

Truncated! Read the rest by clicking the link below.

How I make my Vite dev server experience faster

October 22, 2024
0 comments React, Node, JavaScript

I have a web app that operates as a SPA (Single Page App). It's built on regular Vite + React + TypeScript. In production you just host the built static assets, but in development (for hot module reloading and stuff) I use the built-in dev server in Vite. This is what you get when you type vite without any other command line arguments.

But here's what happens: you're in the terminal, you type npm run dev, and hit Enter. Then, using your trackpad or keyboard shortcuts, you switch over to a browser tab and go to http://localhost:5173/. On that first request, Vite starts at the entry point, which is src/main.tsx, and from there it looks at its imports and starts transpiling the files needed. You can see what happens with vite --debug transform.

With debug:

Truncated! Read the rest by clicking the link below.

The performance benefits of code-splitting an SPA

October 12, 2024
0 comments React

This isn't a comprehensive in-depth analysis but I have this SPA which is built with Vite + React.
When you run npm run build it produces:

vite v5.4.8 building for production...
✓ 8210 modules transformed.
dist/index.html                             0.76 kB │ gzip:   0.43 kB
dist/assets/images-xTxpPavl.css             2.02 kB │ gzip:   0.55 kB
dist/assets/index-IHK6QBxo.css            200.12 kB │ gzip:  29.85 kB
dist/assets/index-jgmGYYS9.js               0.79 kB │ gzip:   0.51 kB
dist/assets/open-graph-image-Ca6hLYnz.js    1.47 kB │ gzip:   0.82 kB
dist/assets/images-CwbhV2EW.js             28.75 kB │ gzip:  10.37 kB
dist/assets/pageviews-C6NSq649.js         378.67 kB │ gzip: 106.42 kB
dist/assets/index-HpyQl1NK.js             490.15 kB │ gzip: 154.11 kB
✓ built in 4.46s
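
Code-splitting here means lazy-loading parts of the app so they end up in their own chunks, which is presumably why pageviews-*.js above is its own file. A minimal sketch of that pattern with React.lazy (component and file names are made up):


// A SKETCH OF ROUTE-LEVEL CODE-SPLITTING (names are made up)

import { lazy, Suspense } from 'react'

// This dynamic import is what makes Vite emit a separate chunk
const Pageviews = lazy(() => import('./pageviews'))

export function App() {
  return (
    <Suspense fallback={<p>Loading...</p>}>
      <Pageviews />
    </Suspense>
  )
}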

Truncated! Read the rest by clicking the link below.

How to handle success and failure in @tanstack/react-query useQuery hook

September 16, 2024
0 comments React, JavaScript

What @tanstack/react-query is, essentially, is a fancy way of fetching data, on the client, in a React app.

Simplified primer by example; instead of...


function MyComponent() {
  const [userInfo, setUserInfo] = useState(null)
  useEffect(() => {
    fetch('/api/user/info')
    .then(response => response.json())
    .then(data => {
      setUserInfo(data)
    })
  }, [])

  return <div>Username: {userInfo ? userInfo.user_name : <em>not yet known</em>}</div>
}
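
With useQuery, the fetching and the success/failure handling instead look roughly like this (a sketch using TanStack Query v5 naming; the key and endpoint mirror the example above):


// A SKETCH OF THE useQuery VERSION, WITH SUCCESS AND FAILURE HANDLED

import { useQuery } from '@tanstack/react-query'

function MyComponent() {
  const { data: userInfo, error, isPending } = useQuery({
    queryKey: ['user-info'],
    queryFn: () =>
      fetch('/api/user/info').then((response) => {
        if (!response.ok) throw new Error(`${response.status} on ${response.url}`)
        return response.json()
      }),
  })

  if (isPending) return <div>Username: <em>not yet known</em></div>
  if (error) return <div>Failed to load user info: {error.message}</div>
  return <div>Username: {userInfo.user_name}</div>
}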

Truncated! Read the rest by clicking the link below.

Wouter + Vite is the new create-react-app, and I love it

August 16, 2024
0 comments React, Node, Bun

If you've done React for a while, you most likely remember Create React App. It was/is a prepared config that combines React with webpack and ESLint. Essentially, you get immediate access to making apps with React in a local dev server, and it produces a complete build artefact that you can upload to a web server and host your SPA (Single Page App). I loved it and blogged about it a lot in the distant past.

The create-react-app project died, and what came onto the scene were tools that solved React rendering configs with SSR (Server Side Rendering). In particular, we now have great frameworks like Gatsby, Next.js, Remix, and Astro. They're great, especially if you want server-side rendering with code-splitting by route and that sweet TypeScript integration between your server (fs, databases, secrets) and your rendering components.

However, I still think there is a place for a super light and simple SPA tool that only adds routing, hot module reloading, and build artefacts. For that, I love Vite + Wouter. At least for now :)
What's so great about it? Speed

Truncated! Read the rest by clicking the link below.

Notes on porting a Next.js v14 app from Pages to App Router

March 2, 2024
0 comments React, JavaScript

Unfortunately, the app I ported from the Pages Router to the App Router is in a private repo. It's a Next.js static site SPA (Single Page App).

It's built with npm run build and then exported so that the out/ directory is the only thing I need to ship to the CDN and it just works. There's a home page and a few dynamic routes whose slugs depend on an SQL query. So the SQL (PostgreSQL) connection, using knex, has to be present when running npm run build.
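
To make that concrete: in the App Router, that build-time SQL dependency lives in generateStaticParams (in the Pages Router it was getStaticPaths). A rough sketch, where the route, table, and column names are hypothetical:


// app/[slug]/page.tsx -- a rough sketch; route, table and column names are hypothetical

import knex from "knex";

const db = knex({ client: "pg", connection: process.env.DATABASE_URL });

// Runs at `npm run build` time, which is why the PostgreSQL connection has to be up
export async function generateStaticParams() {
  const rows = await db("pages").select("slug");
  return rows.map(({ slug }) => ({ slug }));
}

export default async function Page({ params }: { params: { slug: string } }) {
  return <h1>{params.slug}</h1>;
}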

In no particular order, let's look at some of the differences.

Build times

With caching

After running next build a bunch of times, the rough averages are:

  • Pages Router: 20.5 seconds
  • App Router: 19.5 seconds

Without caching

After running rm -fr .next && next build a bunch of times, the rough averages are:

  • Pages Router: 28.5 seconds
  • App Router: 31 seconds

Note

Truncated! Read the rest by clicking the link below.

Switching from Next.js to Vite + wouter

July 28, 2023
0 comments React, Node, JavaScript

Next.js is a full front-end web framework. Vite is a build tool, so they don't easily compare. But if you're building a single-page app ("SPA"), the difference isn't that big, especially if you bolt on a routing library, which is something Next.js has built in.

My SPA is a relatively straightforward one. It's a React app that uses the wonderful Mantine UI framework. The app is a CRM for real-estate agents that I've been hacking on with my wife. SEO is not a concern because you can't do anything until you've signed in, so server-side rendering is not a requirement. In that sense, it's like loading Gmail. Yes, users might want a speedy first load when they open it in a fresh new browser tab, but the static assets are most likely going to be heavily (browser) cached by the few users it has.

With that out of the way, let's skim through some of the differences.

Build times

Immediately, this is a tricky one to compare because Next.js has the ability to cache. You get that .next/cache/ directory which is black magic to me, but it clearly speeds things up. And it's incremental so the caching can help partially when only some of the code has changed.

Running npm run build && npm run export a couple of times yields:

Next.js

Without a .next/cache/ directory

Total time to run npm run build && npm run export: 52 seconds

With the .next/cache/ directory left in place before each build

Total time to run npm run build && npm run export: 30 seconds

Vite

Total time to run npm run build: 12 seconds

A curious thing about Vite here is that its output contains a measurement of the time it took. But I ignored that and used /usr/bin/time -h ... instead. This gives me the total time.
I.e. the output of npm run build will say:

✓ built in 7.67s

...but it actually took 12.2 seconds with /usr/bin/time.

Build artifacts

Perhaps not very important because Next.js automatically code splits in its wonderfully clever way.

Next.js

❯ du -sh out
1.8M    out
❯ tree out | rg '\.js|\.css' | wc -l
      52

Vite

❯ du -sh dist
960K    dist

and

❯ tree dist/assets
dist/assets
├── index-1636ae43.css
└── index-d568dfbf.js

Again, it's probably unfair to compare at this point. Most of the weight of these static assets (particularly the .js files) is due to Mantine components being so heavy.

Routing

This isn't really a judgment in any way. More of a record of how it differs in functionality.

Next.js

In the app that I'm switching from Next.js to Vite + wouter, I use the old way of using Next.js, which is a src/pages/* directory. For example, to make a route to the /account/settings page, I first create:


// src/pages/account/settings.tsx

import { Settings } from "../../components/account/settings"

const Page = () => {
  return <Settings />
}
export default Page

I'm glad I built it this way in the first place. When I now port to Vite + wouter, I don't really have to touch that src/components/account/settings.tsx code because that component kinda assumes it's been invoked by some routing.

Vite + wouter

First, I set up the router in src/App.tsx. Abbreviated code:


// src/App.tsx

import { Routes } from "./routes"

export default function App() {
  const { myTheme, colorScheme, toggleColorScheme } = useMyTheme()
  return (
    <ColorSchemeProvider
      colorScheme={colorScheme}
      toggleColorScheme={toggleColorScheme}
    >
      <MantineProvider withGlobalStyles withNormalizeCSS theme={myTheme}>
        <Routes />
      </MantineProvider>
    </ColorSchemeProvider>
  )
}

By the way, the code for Next.js looks very similar in its src/pages/_app.tsx with all those contexts that Mantine makes you wrap things in.

And here's the magic routing:


// src/routes.tsx

import { Router, Switch, Route } from "wouter"

import { Home } from "./components/home"
import { Authenticate } from "./components/authenticate"
import { Settings } from "./components/account/settings"
import { Custom404 } from "./components/404"

export function Routes() {
  return (
    <Router>
      <Switch>
        <Route path="/signin" component={Authenticate} />
        <Route path="/account/settings" component={Settings} />
        {/* many more lines like this ... */}

        <Route path="/" component={Home} />

        <Route>
          <Custom404 />
        </Route>
      </Switch>
    </Router>
  )
}

Redirecting with router

This is a made-up example, but it demonstrates the pattern with wouter compared to Next.js.

Next.js


const { push } = useRouter()

useEffect(() => {
  if (user) {
    push('/signedin')
  }
}, [user])

wouter


const [, setLocation] = useLocation()

useEffect(() => {
  if (user) {
    setLocation('/signedin')
  }
}, [user])

Linking

Next.js


import Link from 'next/link'

// ...

<Link href="/settings" passHref>
  <Anchor>Settings</Anchor>
</Link>

wouter


import { Link } from "wouter"

// ...

<Link href="/settings">
  <Anchor>Settings</Anchor>
</Link>

Getting a query string value

Next.js


import { useRouter } from "next/router"

// ...

const { query } = useRouter()

if (query.name) {
  const name = Array.isArray(query.name) ? query.name[0] : query.name
  // ...
}

wouter


import { useSearch } from "wouter/use-location"

// ...

const search = useSearch()
const searchParams = new URLSearchParams(search)

if (searchParams.get('name')) {
  const name = searchParams.get('name')
  // ...
}

Conclusion

The best thing about Next.js is its momentum. It gets lots of eyes on it. Lots of support opportunities and a great chance of its libraries being maintained well into the future. Vite also has great momentum and adoption. But wouter is less "common".

Comparing apples and oranges is often counter-productive if you don't take all constraints and angles into account, and those are usually quite specific. In my case, I just want to build a single-page app. I don't want a Node server. In fact, my particular app has a Python backend that does all the API responses from a fetch in the JavaScript app. That Python app also serves the built static files, including the dist/index.html file. That's how the app can be served straight away even if the current URL is something like /account/settings: a piece of Python code (more or less the only code that doesn't serve /api/* URLs) collapses all initial serving URLs to serve the dist/index.html file. It's a classic pattern and honestly feels a bit dated in 2023. But it works. And what's so great about all of this is that I have a multi-stage Dockerfile that first does the npm run build (and some COPY --from=frontend /home/node/app/dist ./server/out), and now I can "lump" together the API backend and the front-end code in just one server (which I host on Digital Ocean).

If you had to write an SPA in 2023, what would you use? In particular, if it has to be React. Remix is all about server-side rendering. Create-react-app is completely unsupported. Building it from scratch yourself, rolling your own TypeScript + ESLint + Rollup/esbuild/Parcel/Webpack, does not feel productive unless you have enough time and energy to really get it all right.

In terms of comparing the performance between Next.js and Vite + wouter, the time it takes to build the whole app is actually not that big a deal. It's a rare thing to do. It's something I do after a long coding/debugging session. What's more pressing is how npm run dev works.
With Vite, I type npm run dev and hit Enter. Almost faster than I can notice, I see...

VITE v4.4.6  ready in 240 ms

  ➜  Local:   http://localhost:3000/
  ➜  Network: use --host to expose
  ➜  press h to show help

and I'm ready to open http://localhost:3000/ to play. With Next.js, after having typed npm run dev and Enter, there's this slight but annoying delay before it's ready.

The technology behind You Should Watch

January 28, 2023
0 comments You Should Watch, React, Firebase, JavaScript

I recently launched You Should Watch, a mobile-friendly web app for keeping a to-watch list of movies and TV shows, as well as being able to quickly share the links if you want someone to "you should watch" it.

I'll be honest, much of the motivation of building that web app was to try a couple of newish technologies that I wanted to either improve on or just try for the first time. These are the interesting tech pillars that made it possible to launch this web app in what was maybe 20-30 hours of total time.

All the code for You Should Watch is here: https://github.com/peterbe/youshouldwatch-next

The Movie Database API

The cornerstone that made this app possible in the first place. The API is free for developers who don't intend to earn revenue on whatever project they build with it. More details in their FAQ.

The search functionality is important. The way it works is that you can do a "multisearch", which means it finds movies, TV shows, or people. Then, when you have each search result's id and media_type, you can fetch a lot more specific information. For example, that's how the page for a person displays things differently from the page for a movie.
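
Roughly, a multisearch call looks like this (a sketch based on the TMDB v3 API; the API key and query are placeholders):


// A SKETCH OF A TMDB MULTISEARCH CALL (API key and query are placeholders)

const url = new URL("https://api.themoviedb.org/3/search/multi");
url.searchParams.set("query", "the wire");
url.searchParams.set("api_key", process.env.TMDB_API_KEY!);

const response = await fetch(url);
const { results } = await response.json();

// Each result has an `id` and a `media_type` ("movie", "tv" or "person"),
// which you then use to fetch the full details for that particular thing.
for (const result of results) {
  console.log(result.media_type, result.id);
}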

Next.js and the new App dir

In Next.js 13 you have a choice between the regular pages directory or an app directory, where every page (which becomes a URL) has to be called page.tsx.

No judgment here. It was a bit cryptic to re-wrap my brain around how this works. In particular, head.tsx is now separate from page.tsx, and since both need some async data during server-side rendering, I have to duplicate the await getMediaData() instead of being able to fetch it once and share it with prop-drilling or context.
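
Roughly what that duplication looks like (the route segment, file paths, and the getMediaData import path are hypothetical):


// app/media/[id]/page.tsx -- hypothetical paths; getMediaData is the app's own helper

import { getMediaData } from "../../../lib/media";

export default async function Page({ params }: { params: { id: string } }) {
  const media = await getMediaData(params.id); // fetched once here...
  return <h1>{media.title}</h1>;
}

// app/media/[id]/head.tsx

import { getMediaData } from "../../../lib/media";

export default async function Head({ params }: { params: { id: string } }) {
  const media = await getMediaData(params.id); // ...and fetched again here
  return <title>{media.title}</title>;
}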

Vercel deployment

Wow! This was the most pleasant deployment experience I've had in years. So polished and so much "just works". You sign in with your GitHub auth, click to select the GitHub repo (that has a next.config.js and package.json etc.), and you're done. That's it! Now, not only does every merged PR automatically (and fast!) get deployed, but you also get a preview deployment for every PR (which I didn't use).

I'm still using the free hobby tier, but if, god forbid, this app gets serious traffic, I'd just bump it up to $20/month, which is cheap. Besides, the app is almost entirely CDN cacheable, so only the search XHR backend would increase its load linearly with traffic, I think.

Well done Vercel!

Playwright and VS Code

Not the first time I used Playwright but it was nice to return and start afresh. It definitely has improved in developer experience.

Previously I used npx and the terminal to run tests, but this time I tried "Playwright Test for VSCode", which was just fantastic! There are some slightly annoying things in that I had to use the mouse cursor more than I'd hoped, but it genuinely helped me be productive. Playwright also has the ability to generate JS code based on me clicking around in a temporary incognito browser window. You do a couple of things in the browser, then paste the generated source code into tests/basics.spec.ts and do some manual tidying up. To run the code generator like that, you simply type pnpm dlx playwright codegen.

pnpm

It seems hip and a lot of people seem to recommend it. Kinda like yarn was hip and often recommended over npm (me included!).

Sure it works and it installs things fast but is it noticeable? Not really. Perhaps it's 4 seconds when it would have been 5 seconds with npm. Apparently pnpm does clever symlinking to avoid a disk-heavy node_modules/ but does it really matter ...much?
It's still large:

du -sh node_modules
468M    node_modules

A disadvantage with pnpm is that GitHub Dependabot currently doesn't support it :(
An advantage with pnpm is that pnpm up -i --latest is a great interactive CLI that works like yarn upgrade-interactive --latest.

just

just is like make but written in Rust. Now I have a justfile in the root of the repo and can type shortcut commands like just dev or just emu[TAB] (to tab autocomplete).

In hindsight, my justfile ended up being just a list of pnpm run ... commands, but the idea is that just would be for any and all commands under one roof.

At the end of the day, it becomes a nifty little file of "recipes" of useful commands, and you can easily string them together. For example, just lint is the combination of typing pnpm run prettier:check, pnpm run tsc, and pnpm run lint.

Pico.css

A gorgeously simple-looking pure-CSS framework. Yes, it's very limited in components and I don't know how well it "tree shakes", but it's so small and so neat that it had everything I needed.

My favorite React component library is Mantine, but I definitely love the peace of mind that Pico.css is just CSS, so you're A) not stuck with React forever, and B) not shipping unnecessary JS code that slows things down.

Firebase

Good old Firebase. The bestest and easiest way to get a reliable and realtime database that is dirt cheap, simple, and has great documentation. I do regret not trying Supabase but I knew that getting the OAuth stuff to work with Google on a custom domain would be tricky so I stayed with Firebase.

react-lite-youtube-embed

A port of Paul Irish's Lite YouTube Embed which makes it easy to display YouTube thumbnails in a web performant way. All you have to do is:


import LiteYouTubeEmbed from "react-lite-youtube-embed";

<LiteYouTubeEmbed
   id={youtubeVideo.id}
   title={youtubeVideo.title} />

In conclusion

It's amazing how much time these tools saved compared to just years ago. I could build a fully working side-project with automation and high quality entirely thanks to great open source or well-tuned proprietary components, in just about one day if you sum up the hours.

How to change the current query string URL in NextJS v13 with next/navigation

December 9, 2022
4 comments React, JavaScript

At the time of writing, I don't know if this is the optimal way, but after some trial and error, I got it working.

This example demonstrates a hook that gives you the current value of the ?view=... (or a default) and a function you can call to change it so that ?view=before becomes ?view=after.

In NextJS v13 with the pages directory:


import { useRouter } from "next/router";

type Options = "buttons" | "table";

export function useNamesView() {
    const KEY = "view";
    const DEFAULT_NAMES_VIEW = "buttons";
    const router = useRouter();

    let namesView: Options = DEFAULT_NAMES_VIEW;
    const raw = router.query[KEY];
    const value = Array.isArray(raw) ? raw[0] : raw;
    if (value === "buttons" || value === "table") {
        namesView = value;
    }

    function setNamesView(value: Options) {
        const [asPathRoot, asPathQuery = ""] = router.asPath.split("?");
        const params = new URLSearchParams(asPathQuery);
        params.set(KEY, value);
        const asPath = `${asPathRoot}?${params.toString()}`;
        router.replace(asPath, asPath, { shallow: true });
    }

    return { namesView, setNamesView };
}

In NextJS v13 with the app directory:


import { useRouter, useSearchParams, usePathname } from "next/navigation";

type Options = "buttons" | "table";

export function useNamesView() {
    const KEY = "view";
    const DEFAULT_NAMES_VIEW = "buttons";
    const router = useRouter();
    const searchParams = useSearchParams();
    const pathname = usePathname();

    let namesView: Options = DEFAULT_NAMES_VIEW;
    const value = searchParams.get(KEY);
    if (value === "buttons" || value === "table") {
        namesView = value;
    }

    function setNamesView(value: Options) {
        const params = new URLSearchParams(searchParams);
        params.set(KEY, value);
        router.replace(`${pathname}?${params}`);
    }

    return { namesView, setNamesView };
}

The trick is that you only want to change one query string value and respect whatever was there before. So if the existing URL was /page?foo=bar and you want that to become /page?foo=bar&and=also, you have to consume the existing query string, and you do that with:


const searchParams = useSearchParams();
...
const params = new URLSearchParams(searchParams);
params.set('and', 'also')
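
Using the hook in a component then looks something like this (the toggle button is just for illustration):


function NamesViewToggle() {
    const { namesView, setNamesView } = useNamesView();
    const other = namesView === "buttons" ? "table" : "buttons";

    return (
        <button type="button" onClick={() => setNamesView(other)}>
            Switch to {other} view
        </button>
    );
}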

Make your NextJS site 10-100x faster with Express caching

February 18, 2022
0 comments React, Node, Nginx, JavaScript

UPDATE: Feb 21, 2022: The original blog post didn't mention the caching of custom headers. So warm cache hits would lose Cache-Control from the cold cache misses. Code updated below.

I know, I know. The title sounds ridiculous. But it's not untrue. I managed to make my NextJS site 20x faster by letting the Express server, which serves NextJS, cache the output in memory. And cache invalidation is not a problem.

Layers

My personal blog is a stack of layers:

KeyCDN -> Nginx (on my server) -> Express (same server) -> NextJS (inside Express)

And inside the NextJS code, to get the actual data, it uses HTTP to talk to a local Django server to get JSON based on data stored in a PostgreSQL database.

The problems I have are as follows:

  • The CDN sometimes asks for the same URL more than once when in theory you'd think it should be cached by them for a week. And if the traffic is high, my backend might get a stampeding herd of requests until the CDN has warmed up.
  • It's technically possible to bypass the CDN by going straight to the origin server.
  • NextJS is "slow" and the culprit is actually critters, which computes the critical CSS, inlines it, and lazy-loads the rest.
  • Using Nginx to do in-memory caching (which is powerfully fast by the way) does not allow cache purging at all (unless you buy Nginx Plus)

I really like NextJS and it's a great developer experience. There are definitely many things I don't like about it, but that's more because my site isn't SPA'y enough to benefit from much of what NextJS has to offer. By the way, I blogged about rewriting my site in NextJS last year.

Quick detour about critters

If you're reading my blog right now in a desktop browser, right-click and view source and you'll find this:


<head>
  <style>
  *,:after,:before{box-sizing:inherit}html{box-sizing:border-box}inpu...
  ... about 19k of inline CSS...
  </style>
  <link rel="stylesheet" href="/_next/static/css/fdcd47c7ff7e10df.css" data-n-g="" media="print" onload="this.media='all'">
  <noscript><link rel="stylesheet" href="/_next/static/css/fdcd47c7ff7e10df.css"></noscript>  
  ...
</head>

It's great for web performance because a <link rel="stylesheet" href="css.css"> is a render-blocking thing and it makes the site feel slow on first load. I wish I didn't need this, but since I lack the CSS styling skills to hand-code every bit of CSS, I rely on a bloated CSS framework that comes as a massive kitchen sink.

To add critical CSS optimization in NextJS, you add:


experimental: { optimizeCss: true },

inside your next.config.js. Easy enough, but on my Intel MacBook it slows down each rendered page from ~80ms to ~230ms.
So you see, if it wasn't for this need for critical CSS inlining, NextJS would be at about ~80ms per page, and that includes getting all the data via HTTP JSON for each page too.

Express caching middleware

My server.mjs looks like this (simplified):


import express from "express";
import next from "next";

import renderCaching from "./middleware/render-caching.mjs";

// Note: `dev`, `port`, `rollbar`, and the shrinkRay() compression middleware
// are set up elsewhere in the real (non-simplified) file.
const app = next({ dev });
const handle = app.getRequestHandler();

app
  .prepare()
  .then(() => {
    const server = express();

    // For Gzip and Brotli compression
    server.use(shrinkRay());

    server.use(renderCaching);

    server.use(handle);

    // Use the rollbar error handler to send exceptions to your rollbar account
    if (rollbar) server.use(rollbar.errorHandler());

    server.listen(port, (err) => {
      if (err) throw err;
      console.log(`> Ready on http://localhost:${port}`);
    });
  })

And the middleware/render-caching.mjs looks like this:


import express from "express";
import QuickLRU from "quick-lru";

const router = express.Router();

const cache = new QuickLRU({ maxSize: 1000 });

router.get("/*", async function renderCaching(req, res, next) {
  if (
    req.path.startsWith("/_next/image") ||
    req.path.startsWith("/_next/static") ||
    req.path.startsWith("/search")
  ) {
    return next();
  }

  const key = req.url;
  if (cache.has(key)) {
    res.setHeader("x-middleware-cache", "hit");
    const [body, headers] = cache.get(key);
    Object.entries(headers).forEach(([key, value]) => {
      if (key !== "x-middleware-cache") res.setHeader(key, value);
    });
    return res.status(200).send(body);
  } else {
    res.setHeader("x-middleware-cache", "miss");
  }

  // Monkey-patch res.end so the rendered body (and its headers) can be
  // captured and stored in the LRU cache before the response is sent
  const originalEndFunc = res.end.bind(res);
  res.end = function (body) {
    if (body && res.statusCode === 200) {
      cache.set(key, [body, res.getHeaders()]);
      // console.log(
      //   `HEAP AFTER CACHING ${(
      //     process.memoryUsage().heapUsed /
      //     1024 /
      //     1024
      //   ).toFixed(1)}MB`
      // );
    }
    return originalEndFunc(body);
  };

  next();
});

export default router;

It's far from perfect and I only just coded this yesterday afternoon. My server runs a single Node process, so the max heap memory would theoretically be 1,000 x the average size of those response bodies. If you're worried about bloating your memory, just adjust the QuickLRU maxSize to something smaller.

Let's talk about your keys

In my basic version, I chose this cache key:


const key = req.url;

but that means that http://localhost:3000/foo?a=1 is different from http://localhost:3000/foo?b=2, which might be a mistake if you're certain that no rendering ever depends on the query string.

But this is totally up to you! For example, suppose you know your site depends on the darkmode cookie; then you can do something like this:


const key = `${req.path} ${req.cookies['darkmode']==='dark'} ${req.headers['accept-language']}`

Or,


const key = req.path.startsWith('/search') ? req.url : req.path

Purging

As soon as I launched this code, I watched the log files, and voilà:

::ffff:127.0.0.1 [18/Feb/2022:12:59:36 +0000] GET /about HTTP/1.1 200 - - 422.356 ms
::ffff:127.0.0.1 [18/Feb/2022:12:59:43 +0000] GET /about HTTP/1.1 200 - - 1.133 ms

Cool. It works. But the problem with a simple LRU cache is that it's sticky. And it's stored inside a running process's memory. How is the Express server middleware supposed to know that the content has changed and needs a cache purge? It doesn't. It can't know. The only one that knows is my Django server which accepts the various write operations that I know are reasons to purge the cache. For example, if I approve a blog post comment or an edit to the page, it triggers the following (simplified) Python code:


import keycdn
import requests
from django.conf import settings


def cache_purge(url):
    if settings.PURGE_URL:
        # Tell the Express server's /__purge__ endpoint to drop this URL from its LRU cache
        print(requests.post(settings.PURGE_URL, json={
            "pathnames": [url]
        }, headers={
            "Authorization": f"Bearer {settings.PURGE_SECRET}"
        }))

    if settings.KEYCDN_API_KEY:
        api = keycdn.Api(settings.KEYCDN_API_KEY)
        print(api.delete(
            f"zones/purgeurl/{settings.KEYCDN_ZONE_ID}.json",
            {"urls": [url]}
        ))

Now, let's go back to the simplified middleware/render-caching.mjs and look at how we can purge from the LRU over HTTP POST:


const cache = new QuickLRU({ maxSize: 1000 })

router.get("/*", async function renderCaching(req, res, next) {
// ... Same as above
});


router.post("/__purge__", async function purgeCache(req, res, next) {
  const { body } = req;
  const { pathnames } = body;
  try {
    validatePathnames(pathnames)
  } catch (err) {
    return res.status(400).send(err.toString());
  }

  const bearer = req.headers.authorization;
  const token = bearer.replace("Bearer", "").trim();
  if (token !== PURGE_SECRET) {
    return res.status(403).send("Forbidden");
  }

  const purged = [];

  for (const pathname of pathnames) {
    for (const key of cache.keys()) {
      if (
        key === pathname ||
        (key.startsWith("/_next/data/") && key.includes(`${pathname}.json`))
      ) {
        cache.delete(key);
        purged.push(key);
      }
    }
  }
  res.json({ purged });
});

What's cool about that is that it can purge both the regular HTML URL and those _next/data/ URLs. Because when NextJS hijacks the <a> click, it can just request the data in JSON form and use existing React components to re-render the page with the different data. So, because of how NextJS works, GET /_next/data/RzG7kh1I6ZEmOAPWpdA7g/en/plog/nextjs-faster-with-express-caching.json?oid=nextjs-faster-with-express-caching and GET /plog/nextjs-faster-with-express-caching are different URLs but, in terms of content, they're the same. Worth pointing out: the same piece of content can be represented by different URLs.

Another thing to point out is that this caching is specifically about individual pages. In my blog, for example, the homepage is a mix of the 10 latest entries. But I know this within my Django server, so when a particular blog post has been updated for some reason, I actually send a bunch of different URLs to the purge endpoint where I know its content will be included. It's not perfect but it works pretty well.

Conclusion

The hardest part about caching is cache invalidation. It's usually the crux of the matter. Sometimes, you're so desperate to survive a stampeding herd problem that you don't care about cache invalidation, but as a compromise, you just set the caching time-to-live short.

But I think the most important tenet of good caching is: have full control over it. I.e. don't take it lightly. Build something where you can fully understand and change how it works, exactly to your specific business needs.

This idea of letting Express cache responses in memory isn't new, but I didn't find any decent third-party solution on NPMJS that I liked or felt fully comfortable with. And I needed to tailor it exactly to my specific setup.

Go forth and try it out on your own site! Not all sites or apps need this at all, but if you do, I hope I have inspired a foundation of a solution.
