
Tutorial

Simple Hot Module Replacement

Over the past few weeks I’ve been fixated on designing a minimalist Hot Module Replacement (HMR) pattern for any web development server. I want it to be incredibly simple: no Node, no Vite or webpack, no npm dependencies, no transpiling or plugins – just a simple little server that can hot reload CSS or live reload JavaScript when a file changes.

“Why?” surely you must ask. Sometimes I work on a project that uses Go or PHP and doesn’t have Node as a dependency, nor much JavaScript in general, so having to default to Vite to serve some assets sometimes feels like overkill. Then I started using entr and this got me thinking…

Can I design an HMR client with the fewest possible dependencies?

I think so! So if a strange foray into making an h1 change colour when you save a .css file, without reloading the page, in around 120 lines of code sounds interesting, read on!

Setup

This article assumes a mainly static project structure like the one below, but the approach can easily be adapted to any structure.

/hmr-demo
    /src
        main.js
        main.css
    /public
        index.html
        /scripts
            hmr.js
    dev.sh
    server.file

Requirements

  1. An HTTP server that can handle both GET and POST requests written in any language.
  2. entr installed on your system.

Part 1: The Shell Script

Let’s start by going through the dev.sh file chunk by chunk. The example repo will have a more expanded version that includes some checks. For brevity I’ve hard-coded most of the dynamic values, like the URL, but in a working script I recommend sourcing these values from a .env file.

Handling File Changes

handle_file_change() {
  current_dir=$PWD
  file_path="$0" # entr passes the changed file's path as $0
  result=${file_path#"$current_dir"}
  curl -X POST -d "name=$result" http://localhost:8000/_hmr
}

export -f handle_file_change

The function shortens the full path of the changed file ("$0"), e.g. /user/projects/demo/src/main.css to /src/main.css, then uses curl to send it to the server in a POST request.

Both the URL and how you format the file path are completely determined by your needs. If your server delivers assets from a certain file path, rewrite it to that in this function:

/src/main.css -> /assets/styles/main.css

trimmed=${file_path#"$current_dir"}
public_path="assets/styles"
result=${trimmed/src/"$public_path"}

Watching For File Changes

watcher() {
    echo "Development server started"
    echo "Ctrl+C to stop"

    while true; do
        find ./src -name "*.js" -o -name "*.css" |
        entr -cdp bash -c 'handle_file_change' /_ &&
        break
    done
}

The find directory, the file types, or even the search program can be swapped out (I use fd).

The watcher function starts the entr process inside a while loop, so new files added to the project are also tracked and published. The executed portion of the entr command is bash -c 'handle_file_change' /_: entr substitutes the changed file’s path for /_, and the function reads it as $0. The -cdp flags are:

  • c clears the terminal screen on change.
  • d tracks the directories of the watched files and exits when a new file is added, which lets the while loop restart entr.
  • p postpones the first execution until a file actually changes.

server() {
  # INSERT YOUR SERVER COMMAND HERE, npm start, etc.
}

The server function runs any server initialization: npm start, node server.js, deno run -A main.ts, cargo run, go run main.go, or whatever you are using to start a server.

(trap 'kill 0' SIGINT; server & watcher)

The final part runs the server and watcher processes in a subshell with a trap, so pressing Ctrl+C cancels both together (kill 0 signals every process in the subshell’s process group).

Make the shell script executable by running $ chmod +x dev.sh.

Part 2: The HTTP API

I want to stress that Node is in no way prescribed as the only way to accomplish this portion of the setup; I’ve just gone with JavaScript here and in the example repository because it’s the simplest fit for demonstration purposes.

The first part is the GET request at /_hmr – it needs to return a text/event-stream response:

HTTP/1.1 200 OK
Content-Type: text/event-stream
Cache-Control: no-cache
Connection: keep-alive

The request is initiated by EventSource in the browser, which expects these headers and a streamed body. Make sure not to close the response, e.g. with res.end(), while the client is still connected.

import http from "node:http";
import crypto from "node:crypto";

const clients = new Map();

const server = http.createServer((req, res) => {
  if (req.url === "/_hmr" && req.method === "GET") {
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      "Connection": "keep-alive",
    });

    const id = crypto.randomUUID();
    clients.set(id, res);

    req.on("close", () => {
      clients.delete(id);
      res.end();
    });
  }
});

The body should be configured in this format:

event: change\n
data: /src/main.css\n\n

Note the single newline after the event line and the two newlines after the data line; the blank line terminates the message and is required for a valid EventStream response.
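That framing can be captured in a tiny helper; a sketch for illustration only (the function name is mine, not from the example repo):

```javascript
// Build one server-sent event message: each field ends with "\n",
// and a trailing blank line ("\n\n" after the last field) terminates the message.
function sseMessage(eventName, data) {
  return `event: ${eventName}\ndata: ${data}\n\n`;
}

console.log(sseMessage("change", "/src/main.css"));
```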

The last piece is handling notifications from the file watcher script, which sends an HTTP POST request with the changed file path in the body.

const server = http.createServer((req, res) => {
  // ...

  if (req.url === "/_hmr" && req.method === "POST") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const params = new URLSearchParams(body);
      const file = params.get("name");
      clients.forEach((client) => {
        client.write("event: change\n");
        client.write(`data: ${file}\n\n`);
      });
      res.end("OK");
    });
  }
});

The endpoint just needs to deserialize the request body and write to the stream of each connected client.
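Since curl -d sends the body as form data, the parsing step can be checked in isolation; a small sketch assuming the same name=... body the watcher script sends:

```javascript
// The watcher posts a body like "name=/src/main.css";
// URLSearchParams handles both plain and percent-encoded values.
const body = "name=/src/main.css";
const file = new URLSearchParams(body).get("name");
console.log(file); // "/src/main.css"
```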

Part 3: The Client Side Code

The last piece of the puzzle is to include this JavaScript either as an inline script or as a separate, development-only include. It connects to the event stream URL and listens for change events, which is a custom event type. Read more about EventSource here.

const url = new URL("/_hmr", location.href);
const listener = new EventSource(url);
listener.addEventListener("change", handler);

function handler(event) {
  for (const link of document.getElementsByTagName("link")) {
    const linkUrl = new URL(link.href);

    if (linkUrl.host === location.host && event.data) {
      if (linkUrl.pathname === event.data) {
        const next = link.cloneNode();
        next.href = `${linkUrl.pathname}?t=${Date.now()}`;
        next.onload = () => link.remove();
        link.parentNode.insertBefore(next, link.nextSibling);
        return;
      }
    }
  }

  location.reload();
}

On change events the handler loops over the link tags; for the one whose pathname matches the changed file, it clones the tag, points the clone at a cache-busted URL, and removes the old tag once the clone has loaded. Any other event just reloads the page. Hit save on a CSS file and watch the webpage’s styles update without a reload.

Next Steps

This is really just a fun little proof-of-concept, and there are so many directions it could be improved or extended, like:

  • Reading host, port and other server details from a .env file.
  • Supporting JavaScript hot modules and no-build variations of frameworks like Preact, Lit or Petite Vue.
  • Listening for changes to a database and notifying the client that new data is ready.
  • Skipping HTTP and using PIDs instead to signal the server that a file has changed.
  • Inserting esbuild or another build tool between the entr file change event and the curl request. This could cover most of the dependency tree shortcomings.
  • Adding some form of unit or end-to-end tests.
  • Adding a build command.

Wrapping Up

Hope this was at the very least an entertaining deep dive into a shallow little project. Feel free to fork from the example repo and adapt it to your project!
