Node.js 22 dropped and here is what actually matters for production backends
Node.js 22 shipped with require(esm) and a built-in WebSocket client. Here is what I benchmarked and what I adopted immediately.
Node.js 22 dropped on April 24th and the release notes were long. Experimental require(esm), a built-in WebSocket client, the new --run flag for package.json scripts, V8 12.4 with Maglev compiler improvements, and a handful of smaller changes. Most of the coverage I read focused on the headline features. I spent a weekend benchmarking the things that actually matter for a production fintech backend and the results surprised me.
Back then I was still proving I could think like a head of engineering without losing the hands-on instinct that got me there. This post also builds on what I learned earlier in “The mass migration: moving 200 API endpoints to TypeScript strict mode in one sprint.” The same full-stack bias shows up later in portfolio and bisen-apps: keep the surface area small, own the sharp edges, and do not create distributed-systems ceremony before the product has earned it.
The ESM Interop Story Finally Gets Usable
The require(esm) experiment is the change I have been waiting for since Node.js 12. For three years our codebase has maintained awkward dual-format compatibility because some dependencies ship ESM-only and our test runner needed CommonJS. Dynamic imports worked but they made the code ugly and broke IntelliSense in half our files.
With Node.js 22 and the --experimental-require-module flag, CommonJS files can require ESM modules synchronously as long as the ESM module does not use top-level await. I tested this against our top ten ESM-only dependencies. Nine worked perfectly. The tenth used top-level await to load configuration and needed a minor wrapper.
```javascript
// Before Node 22: dynamic import required
const { nanoid } = await import('nanoid');

// Node 22: synchronous require works
const { nanoid } = require('nanoid');

// Our tsconfig can now target CommonJS without import hacks
// This fixed 23 TypeScript errors across our test files
```

The practical impact: I removed 23 dynamic import workarounds from our test suite in a single commit. The tests run 4% faster because the module loader is not spinning up async contexts for every import. Small win per test, meaningful win across 800 test files.
V8 12.4 Async Performance Gains Are Real
The V8 team shipped Maglev as the mid-tier compiler in V8 12.4, and the async performance improvements are not marketing fluff. I ran our payment processing benchmark suite on Node.js 20.12 and Node.js 22.0 with identical hardware and configuration.
- Sequential async/await operations (database queries chained in order): 8% faster. This matches our payment validation pipeline where each step depends on the previous result.
- Parallel Promise.all with 10 concurrent operations: 12% faster. This is our reconciliation batch processor that checks multiple payment states simultaneously.
- Mixed async workloads (our actual API handler benchmark with database, cache, and external API calls): 6% faster end-to-end.
- Memory usage under sustained async load: 11% lower peak RSS. The garbage collector handles short-lived Promise objects more efficiently.
The 12% improvement on parallel Promise.all was the standout. Our reconciliation batch job processes 5,000 payments per run. A 12% throughput gain means each run finishes 40 seconds faster. That compounds across the four daily runs to save almost three minutes per day of database connection time.
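The shape of the parallel benchmark is simple to reproduce. Below is a minimal sketch, not the real suite: `checkPaymentState` is a stand-in for our actual database query, simulated here with a timer.

```javascript
// Sequential vs. parallel async benchmark shape. checkPaymentState is a
// hypothetical stand-in for the real reconciliation query (~5 ms each).
const checkPaymentState = (id) =>
  new Promise((resolve) => setTimeout(() => resolve({ id, ok: true }), 5));

async function sequential(ids) {
  // Each await depends on the previous result finishing first.
  const results = [];
  for (const id of ids) results.push(await checkPaymentState(id));
  return results;
}

async function parallel(ids) {
  // Promise.all with ~10 concurrent operations: where the V8 12.4
  // improvement showed up most clearly in our runs.
  return Promise.all(ids.map(checkPaymentState));
}

async function main() {
  const ids = Array.from({ length: 10 }, (_, i) => i);
  let t = process.hrtime.bigint();
  await sequential(ids);
  console.log('sequential ms:', Number(process.hrtime.bigint() - t) / 1e6);
  t = process.hrtime.bigint();
  await parallel(ids);
  console.log('parallel ms:', Number(process.hrtime.bigint() - t) / 1e6);
}
main();
```

Run the same script on Node 20 and Node 22 with identical hardware and the relative difference is what matters, not the absolute numbers.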
The Built-In WebSocket Client
Node.js 22 ships a built-in WebSocket client based on the browser WebSocket API. I evaluated it as a replacement for the ws package in our notification service. The API is clean and standards-compliant but it is not ready for production use in a fintech backend.
- No automatic reconnection. The ws package with reconnecting-websocket handles flaky connections gracefully. The built-in client drops and does not retry.
- No per-message compression. Our notification payloads are JSON blobs averaging 2 KB each. Compression at the WebSocket level saves meaningful bandwidth at scale.
- Limited error information. Connection failures return generic events without the detailed error codes that ws provides.
- Still marked experimental. We do not ship experimental Node.js features to production.
I will revisit this when it moves out of experimental, probably in Node.js 24. For now, ws stays in our dependency tree.
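To make the reconnection gap concrete, here is a sketch of the retry logic you would have to write yourself around the built-in client. `connectWithRetry` is a hypothetical helper; the socket factory is injected so the sketch works with anything exposing the browser-style event API, but in real use it would be `() => new WebSocket(url)` on Node 22+.

```javascript
// Hypothetical reconnect wrapper illustrating what the built-in WebSocket
// client does not do for you: retry a dropped or failed connection.
function connectWithRetry(makeSocket, { retries = 3, delayMs = 250 } = {}) {
  return new Promise((resolve, reject) => {
    let attempt = 0;
    const tryOnce = () => {
      attempt += 1;
      const socket = makeSocket();
      socket.addEventListener('open', () => resolve(socket));
      socket.addEventListener('error', () => {
        // The built-in client surfaces a generic error event; without
        // logic like this, a flaky connection simply stays dropped.
        if (attempt < retries) setTimeout(tryOnce, delayMs);
        else reject(new Error(`gave up after ${attempt} attempts`));
      });
    };
    tryOnce();
  });
}

// Real usage on Node 22+ would look like:
// const socket = await connectWithRetry(() => new WebSocket('wss://example.com'));
```

Even this sketch ignores backoff, jitter, and resubscribing after reconnect, which is exactly why ws plus reconnecting-websocket stays in our dependency tree for now.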
What I Adopted Immediately
We upgraded to Node.js 22 in production within two weeks of the release. The process was straightforward: update the Dockerfile base image, run the full test suite, benchmark key paths, deploy to staging for a week, then promote to production.
- The V8 12.4 performance gains required zero code changes. Drop-in upgrade.
- The require(esm) flag eliminated our dynamic import workarounds in the test suite.
- The --run flag replaced our npx invocations in CI scripts, shaving about two seconds per step.
The total production impact was a 6% reduction in average API response time and an 11% reduction in peak memory usage. For a runtime upgrade that required zero application code changes, that is an exceptional return on investment.
The builder phase was less glamorous than people imagine. It was mostly a series of stubborn, unfashionable choices that kept future-me out of 2 a.m. incident calls. I still make the same kind of choices inside portfolio, pipeline-sdk, and dotfiles.
Not every Node.js release matters for production. This one does. The async performance gains alone justify the upgrade. Everything else is a bonus.
The broader lesson is that runtime upgrades deserve the same planning rigor as framework migrations. Treat one as a checkbox item and you will pay for it in debugging time. The features that matter most in a new runtime version are rarely the headline items in the changelog. They are the subtle changes to module resolution, timer behavior, and memory management that only surface under production load. Upgrade in staging first, load test second, and read the migration guide before touching package.json. The discipline pays for itself every time.