Developing a Web Site *Without* Node/NPM in 2020

Every few months, I find myself booting up an old client project to make a few quick changes, only to discover that one or a dozen of its build dependencies no longer exist, are no longer compatible with each other, or fail to build for some obscure reason. Five minutes turns into an hour, and I'm reminded:

NPM, like magic, is free, but that doesn't mean it has no cost.

Unfortunately, modern web development is virtually impossible without Node. It doesn't matter if you use Grunt or Gulp or Webpack as your task runner. It doesn't matter if you use Tailwind, Bulma, or Bootstrap as your CSS framework. They all require Node.

So is that it? Can you even build a production-quality web site — separate source and distribution files, asset minification, syntax linting, etc. — without Node anymore?

No, probably not.

But you can certainly reduce Node's role. The less Node is entangled in your build process, the fewer points of failure it will have when you go to resurrect it a year from now.

just

The simplest and most beneficial workflow change you can make is to move away from Node and Node-based task runners for running your build tasks.

Enter just, a language-agnostic task runner favored by the Rust development ecosystem!

It functions essentially like a glorified Makefile, with visible and hidden tasks consisting of arbitrary code or commands executed in the shell of your choosing. That means you can run plain CLI commands in a shell like /bin/sh, pass code through an interpreter like /usr/bin/env php, or both!

To give you an idea of what this might look like in a web development context, let's take a look at the common task of generating Gzip- and Brotli-encoded variants of compiled CSS and Javascript assets. In a Node-based workflow, you would probably use plugins like gulp-brotli and gulp-gzip to handle this, but all those actually do is install brotli and gzip on your computer and run those two programs for you. Install brotli and gzip yourself — once — and you can control your own destiny!

To do it the just way, simply create a new file in your project root called justfile, and populate it with something like the following:

# Define a path variable or two:
css_dir := justfile_directory() + "/dist/assets/css"
js_dir  := justfile_directory() + "/dist/assets/js"

# Compress CSS/JS.
@compress:
    #!/usr/bin/env bash

    # Clean old files first.
    find "" \( -name "*.css.br" -o -name "*.css.gz" \) -type f -delete
    find "" \( -name "*.js.br" -o -name "*.js.gz" \) -type f -delete

    # Gzip!
    find "" -name "*.css" -type f -print0 | xargs -0 gzip -k -9
    find "" -name "*.js" -type f -print0 | xargs -0 gzip -k -9

    # Brotli!
    find "" -name "*.css" -type f -print0 | xargs -0 brotli -q 11
    find "" -name "*.js" -type f -print0 | xargs -0 brotli -q 11

There is a little nuance here that is covered in more detail on the just GitHub page, but essentially, this gives you a task called "compress" that can be run via just compress. Under the hood, it simply uses find to locate the relevant files, and gzip and brotli to encode them.
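Day to day, you then drive everything through the just binary itself; something like this (a quick sketch, run from anywhere inside the project):

# See the available recipes (hidden ones, prefixed with "_", are omitted):
just --list

# Run the compress task:
just compress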

The justfile above might be a little verbose, though. It is often useful to move repetitive code into a hidden subtask that takes arguments. The following example does the same thing as the previous one:

# Define a path variable or two:
css_dir := justfile_directory() + "/dist/assets/css"
js_dir  := justfile_directory() + "/dist/assets/js"

# Compress CSS/JS.
@compress:
    just _compress "{{ css_dir }}" "css"
    just _compress "{{ js_dir }}" "js"

# Actually Compress *.EXT in DIR.
_compress DIR EXT:
    #!/usr/bin/env bash

    # Check the "raw" inputs real quick:
    [ -d "" ] || exit 1
    [ -n "" ] || exit 1

    # Clean the old.
    find "" \( -name "*..br" -o -name "*..gz" \) -type f -delete

    # Compress anew.
    find "" -name "*." -type f -print0 | xargs -0 gzip -k -9
    find "" -name "*." -type f -print0 | xargs -0 brotli -q 11

Short and sweet and re-usable! And best of all, it won't ever break unless Gzip or Brotli themselves die, in which case you won't need the task anyway!

"Okay," I hear you say, "that's fine for executing real programs, but what about running tasks that require Node scripts?"

Node doesn't really change the equation. If the script offers a CLI "bin", you can run it from the terminal, which means you can make a task for it. For example, say you want to minify Javascript using terser. Add a task like:

# Minify JS.
@minify-js:
    cat "{{ js_dir }}/some-file.js" | terser -m -o "{{ js_dir }}/some-file.min.js"

If terser isn't installed globally, use npx terser -m -o... instead. Same diff.

Find Alternatives

With the task runner detached from Node dependence, you're free to replace individual build components when and where a viable alternative presents itself.

Whether or not there are alternatives depends on your workflow and your persistence. If you're on Team Typescript, you're shit out of luck. Typescript requires NPM. The same goes for virtually any framework that has to be built or customized to "work". (If you can download a raw .js file, of course, do that, and you've avoided NPM.)

Nonetheless, here are a few suggestions that might help some readers:

Watching for Changes

You need look no further than watchexec. Point it at a directory, filter by the extensions you care about, tell it what to do when something happens, and you've got the equivalent of grunt watch!
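As a rough sketch, re-running the earlier compress task whenever a compiled asset changes might look like this (the watched path and extensions are assumptions based on the justfile above):

# Re-run "just compress" whenever a CSS/JS file under dist/assets changes:
watchexec --watch dist/assets --exts css,js -- just compress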

SCSS

If you write framework-free SCSS, you can simply use SASSC to compile it down to CSS. As the name suggests, SASSC is a SASS/SCSS compiler written in C/C++. Debian/Ubuntu users, for example, can install it with apt-get install sassc and be off to the races. The usage is simply:

# Make some CSS:
sassc --style=compressed /path/to/in.scss /path/to/out.css

Javascript Minification

If you write framework-free Javascript and don't mind marking up your code with (a ton of) JSDOC, check out Google's Closure Compiler. The documentation is terrible, but if you're willing to fight with it for a while, it will eventually give you insanely compact production JS code. (And unlike with Typescript, if Google ever discontinues Closure, at least you're left with actual Javascript sources and not a bunch of unusable meta-language gibberish.)
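Invocation is just another CLI call, so it slots into a justfile like anything else. Roughly (the jar name depends on whichever release you downloaded, and ADVANCED_OPTIMIZATIONS assumes your code is fully annotated):

# Minify JS with the standalone Closure Compiler jar (requires Java):
java -jar closure-compiler.jar \
    --compilation_level ADVANCED_OPTIMIZATIONS \
    --js /path/to/in.js \
    --js_output_file /path/to/out.min.js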

Lossless Image Compression

If you need to losslessly compress JPEG and PNG images, look no further than Oxipng for PNGs, and MozJPEG for JPEGs. You can also use programs like cwebp from the WebP libraries to convert images to more compact formats.
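For reference, each of these is a simple one-liner (file names are placeholders; check each project's --help for the full flag list):

# Losslessly crunch a PNG:
oxipng -o 4 --strip safe image.png

# Losslessly optimize a JPEG with MozJPEG's jpegtran:
jpegtran -copy none -optimize -progressive -outfile image.opt.jpg image.jpg

# Convert a PNG to lossless WebP:
cwebp -lossless image.png -o image.webp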

Random Tasks

Chances are, your system already has a suite of useful CLI programs to cover a lot of bases. If you need to concatenate files, just use cat. If you need to replace substrings, just use sed. If you need to fetch files, use curl or wget or aria2c.
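A couple of purely illustrative one-liners (the file names and URL are placeholders):

# Concatenate a few sources into one bundle:
cat src/a.js src/b.js > dist/bundle.js

# Fetch a remote asset:
curl -L -o dist/assets/js/vendor.js https://example.com/vendor.js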

If you're feeling more adventurous, a lot of these basic Unix apps have alternatives of their own. If you enjoy the power of sed but find its regular expression syntax frustrating, check out sd. If you're tired of piping find into parallel, check out fd.
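Side by side, the newer tools read a little friendlier (again, the patterns and file names are just placeholders):

# sed vs. sd, replacing a placeholder in place:
sed -i 's/__VERSION__/1.2.3/g' dist/bundle.js
sd '__VERSION__' '1.2.3' dist/bundle.js

# find piped into parallel vs. fd's built-in exec:
find . -name "*.js" -print0 | parallel -0 gzip -k -9
fd -e js -x gzip -k -9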

The Rest

Well, for everything else, there's still Node.

Bundling Node

There is, actually, a semi-workaround for the Node problem. By using (Node) libraries like nodec, pkg, or nexe, you might be able to convert a critical, one-of-a-kind Node app like SVGO into a single-file executable for Linux, Mac, or Windows. (One file each.)

This is a semi-workaround because you still need Node to make the bundled file in the first place, and the bundled file will include Node within it, but you'll be able to run that file on a system that does not have Node installed, using exactly the same syntax you would if you were just doing things the NPM way.
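The mechanics look roughly like this, using pkg and terser as an example (a sketch only; the bin path, target triple, and output name are assumptions that will vary by setup):

# Bundle terser's CLI (plus an embedded Node) into a single Linux executable:
npx pkg node_modules/terser/bin/terser --targets node14-linux-x64 --output terser-standalone

# Later, on a machine with no Node installed at all:
./terser-standalone some-file.js -m -o some-file.min.js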

Out of curiosity, I tried all three programs against terser, CSSO, SVGO, and Eslint, four Node apps I can't seem to get around using. None of them worked for Eslint. Nodec worked for SVGO but nothing else. Pkg worked for CSSO and Terser but not SVGO. And Nexe didn't work on my particular system at all.

For the bundles that did get built, the file sizes were enormous, ~40–80MB, and the build time for SVGO took twenty minutes!

In theory, because the bundles stuff a specific version of Node and all the relevant dependencies inside the one file, saving that file carries less risk of future breakage than depending on NPM to pull the same things down from the cloud later on. But the process was so tedious and fragile, and the results so underwhelming, that it probably isn't worth trying in most cases.

The Future?

Until the next Nodepocalypse, the web development landscape is unlikely to pivot away from Node-based everything.

Which is not to say all hope is lost.

Rust, especially, is seeing a lot of general cleanups and ports of projects from other languages. There are a number of different JS/CSS minifiers and image tools in the works. We're also working on a number of standalone CLI tools for common build tasks, but more on that in a future post.

It's like that tall mountain, visited each year by a bird, who grinds its beak against the rock to sharpen it before continuing its journey. Though small, that simple act of beak-sharpening weathers the mountain ever so slightly, and year after year, the mountain is made that much smaller, until after just a few million years, there's no mountain at all.

Josh Stoik
20 October 2020