Building runtime-aware JavaScript packages
Building JavaScript packages for multiple runtimes, for better compatibility.
One of the great things about JavaScript is how easy it is to share modules. Using package managers like npm, developers can easily publish their code as packages, share them with the world, and distribute them with global CDNs like jspm.
This makes it easy for other developers to reuse the code in their own projects, saving time and effort. Additionally, the open-source nature of many of these packages means they can be constantly improved and updated by the community. All in all, this makes JavaScript a powerful and collaborative language for building applications.
However, there was no defined standard for how these modules should be written and distributed. With JavaScript running on multiple runtimes (browsers, Node.js, Deno, mobile), and each runtime having its own built-ins to work with, there was a need to establish standard practices.
There are different standardizing bodies for various technologies. For example, TC39 is responsible for standardizing and proposing changes to the JavaScript language. Meanwhile, the W3C and the Node.js TSC take care of standards in browsers and Node.js, respectively.
Example
- The file system module (`fs`) is available in Node, but not in browsers.
- The DOM (`document`) is available in browsers, but not in Node.
These are runtime specifics called built-ins which are dependent on the environment. However, since each environment uses JavaScript, there is nothing preventing us from using a module written for the browser in Node, Deno, or any other runtime. So package authors started using polyfills to make sure their modules continued to work in each runtime. But adding polyfills into the module itself and publishing them has its own downsides.
Built-ins
Let's say we are building a package for an authentication layer that needs to work in both browsers and Node environments. If we follow the polyfill approach, a few problems appear (sketched in the snippet after this list):

- We use `node-fetch` for making fetch calls in Node, so we bundle this into the module. But browsers have their own implementation of `fetch` that doesn't require a third-party polyfill.
- If each npm package bundles its own polyfills, the dependencies get bloated. Every module in `node_modules` will bring its own copy of each polyfill, causing the total build size to increase.
- If the modules don't bring their own copy, end users need to install the polyfills manually.
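To make the first problem concrete, here is a minimal sketch of the bundled-polyfill pattern (the module and URL are hypothetical):

```js
// auth.js: unconditionally imports the polyfill
import fetch from "node-fetch";

export const authenticate = () => {
  // Browsers ship fetch natively, so bundling node-fetch here
  // only adds dead weight to browser builds.
  return fetch("https://example.com/authenticate");
};
```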
The good news: with the recent additions to the package specifications, building packages for different runtimes is easier now. Let's look at some examples of how we can build better packages.
Imports
With the addition of import path mapping in Node (the `imports` field, available since v14.6), we can pick which dependency to load depending on whether the module is running in the browser or Node. So, a call to `fetch` can be mapped using:
```json
{
  "name": "@example/authentication",
  "type": "module",
  "imports": {
    "#middleware": {
      "node": "./middleware.js",
      "default": "./middleware-browser.js"
    }
  }
}
```
And we can implement `fetch` for both environments, either by using their built-ins or by polyfilling them if needed:
```js
// middleware-browser.js: uses the browser's built-in fetch
export const authenticate = () => {
  return fetch("https://example.com/authenticate");
};
```

```js
// middleware.js: polyfills fetch for Node
import fetch from "node-fetch";

export const authenticate = () => {
  return fetch("https://example.com/authenticate");
};
```
When we consume `authenticate` inside the module, we import it using:

```js
import { authenticate } from "#middleware";
```
This takes care of loading the module that is specific to the environment. By following these patterns, we reduce the duplication of polyfills and use native modules in their respective environments.
Example
Let's look at an example of how `chalk`, one of the most used packages, uses import path mapping and loads the built-ins efficiently.
Here is the `imports` configuration from its `package.json`:
"imports": {
"#ansi-styles": "./source/vendor/ansi-styles/index.js",
"#supports-color": {
"node": "./source/vendor/supports-color/index.js",
"default": "./source/vendor/supports-color/browser.js"
}
}
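Inside chalk's source, these hash-prefixed specifiers are then imported like regular modules, and Node (or the bundler) resolves them per environment. Roughly:

```js
// Resolved via the "imports" map above: Node gets the Node build,
// everything else falls back to the browser build.
import ansiStyles from "#ansi-styles";
import supportsColor from "#supports-color";
```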
Exports
In addition to the `imports` field, there is also an `exports` field in `package.json` that allows package authors to specify how their package should be imported in different environments. The `exports` field specifies different entry points for each runtime (browser, Node, default), as well as distinct formats (such as CommonJS or ES modules) for the same entry point. This allows for more efficient loading and usage of packages in different environments.
Don't get confused between `imports` and `exports`. Imports let you consume runtime-aware external packages in your code. On the other hand, exports expose your module to different environments, making it work across runtimes and formats.
A good example of an exports field can be found in Preact's `package.json`:
"type": "module",
"exports": {
".": {
"types": "./src/index.d.ts",
"browser": "./dist/preact.module.js",
"umd": "./dist/preact.umd.js",
"import": "./dist/preact.mjs",
"require": "./dist/preact.js"
}
}
Note: `umd` is not a standard field in the specification.
WinterCG, the collaboration working group for different JavaScript runtimes, is currently standardizing these runtime keys. If you're curious, here is the specification for more details. The proposed specification tries to bring in more identifiers for the different runtimes. This helps optimize packages by taking advantage of the built-ins that are available in the target runtime—and adding polyfills only when needed.
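As a sketch of where this is heading, a package might eventually target individual runtimes with conditions along these lines (`deno` and `workerd` appear in the proposal, but the exact set of keys is still being standardized, so treat these as illustrative):

```json
"exports": {
  ".": {
    "deno": "./dist/index.deno.js",
    "workerd": "./dist/index.workerd.js",
    "node": "./dist/index.node.js",
    "default": "./dist/index.js"
  }
}
```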
ENV
While bundling projects, it's a common practice to use `process.env.NODE_ENV` to create different builds for production and development.
While this is common practice when building applications, it becomes a problem when packaging libraries. Unfortunately, `process` is not a browser built-in.
It is in the best interest of library authors to serve different versions for different build targets: development builds can ship richer error messages, while production builds can stay small. However, when we try to load these types of modules in browsers directly from CDNs, script execution crashes.
For example, if we try to load `react-router` into the browser as an ES module, it fails to load because the entrypoint uses `process.env.NODE_ENV`:
```js
// react-router/index.js
if (process.env.NODE_ENV === "production") {
  module.exports = require("./cjs/react-router.min.js");
} else {
  module.exports = require("./cjs/react-router.js");
}
```
This can be handled by importing `process` from `node:process`. The bundlers and CDNs that build these packages will then detect the usage of the built-in and polyfill it accordingly.
So the above snippet becomes:
```js
const process = require("node:process");

if (process.env.NODE_ENV === "production") {
  module.exports = require("./cjs/react-router.min.js");
} else {
  module.exports = require("./cjs/react-router.js");
}
```
This context-aware usage of built-ins gives bundlers better control to polyfill only the modules that are actually needed, instead of polyfilling all of the Node built-ins every time a build targets browsers.
Note: This is only handled in CommonJS and is generally discouraged! It is suggested to use conditional exports and imports for branching rather than relying on code analysis.
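As a sketch of that suggestion, the same branching can be expressed with the `development` and `production` export conditions instead of a `process.env` check (paths are illustrative; bundlers like webpack resolve these conditions from their mode, and Node can opt in via the `--conditions` flag):

```json
"exports": {
  ".": {
    "development": "./cjs/lib.js",
    "production": "./cjs/lib.min.js",
    "default": "./cjs/lib.min.js"
  }
}
```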
JSPM
We have seen multiple use cases for improving package compatibility within the ecosystem. Now, let's examine how JSPM handles pre-existing packages in npm. This is a well-known issue from the time when ES modules came into existence: the transition was not at all smooth.
For a project to work in today's JavaScript ecosystem, build pipelines must be quite complex, requiring the proper combination of bundlers, transpilers, Node versions, and multiple module formats. Loading a `cjs` module from npm into a project that is configured to build as `esm`, and vice versa, is difficult. At some point, the transition left many builds out of sync.
Luckily, JSPM builds all packages from npm ahead of time into spec-compliant ES modules and serves them from a distributed global CDN, regardless of the format in which the packages were published to npm. This makes it seamless to load any module from npm and use it in any project at any time.
package.json
browser
If a package specifies the `browser` field in its `package.json`, and the package is used in an import map with `browser` as the target, then JSPM uses this field instead of `main`. In other words, the `browser` field is the package author explicitly stating which module should be used when loading inside a browser environment.
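A minimal example of what that field looks like (paths are illustrative):

```json
"main": "./dist/index.js",
"browser": "./dist/index.browser.js"
```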
module
The `module` field is not an official specification but was popularised by bundlers.

Let's say we expose a `cjs` build from `main` and include an `esm` version in the `module` field, but we don't have `type` set to `module` for the package, and the entrypoint does not end with the `.mjs` extension:
"main": "dist/index.js",
"module": "dist/index.es.js"
This is a common pattern when publishing packages: the builder uses the `main` entrypoint and builds an ES module.
I highly recommend not using the `module` field. We don't need the `main` field anymore if we expose the package using `exports`. Always rely on specifications when available.
exports
The package builder from JSPM parses packages and creates an export map for those that don't already have one.
During this process, the builder is intelligent enough to detect subpaths depending on how the module's imports are used internally. This makes the package exports more efficient.
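For instance, if consumers are only expected to reach the entry point and a `utils` subpath, the generated map might look something like this (a hypothetical sketch, not actual JSPM output):

```json
"exports": {
  ".": "./index.js",
  "./utils": "./utils/index.js"
}
```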
If the package already exposes an `exports` map, JSPM uses it instead of generating a new one:
"exports": {
".": {
"import": "./dist/default/lib.mjs",
"require": "./dist/default/lib.js"
}
}
Let's look at an example with `react-router`. If we look into the `package.json` of the package loaded from unpkg:
"main": "./dist/main.js",
"unpkg": "./dist/umd/react-router.production.min.js",
"module": "./dist/index.js",
"types": "./dist/index.d.ts"
Note: We can deprecate all these fields in favor of exports now.
Now, if we load the same `package.json` from jspm:
"exports": {
".": {
"module": "./dist/index.js",
"default": {
"development": "./dist/dev.main.js",
"default": "./dist/main.js"
}
},
"./package.json": "./package.json.js",
"./package": "./package.json.js",
"./dist/main.js": {
"development": "./dist/dev.main.js",
"default": "./dist/main.js"
},
"./dist/index.js": "./dist/index.js",
"./dist/main.js!cjs": "./dist/main.js",
"./dist/dev.main.js!cjs": "./dist/dev.main.js",
"./package.json.js!cjs": "./package.json.js"
}
These export maps help load the modules more efficiently using import maps.
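To sketch how that plays out in a browser, a page would embed an import map like the following (the URL follows the JSPM CDN's `npm:package@version/path` shape, but the exact version and path here are illustrative):

```json
{
  "imports": {
    "react-router": "https://ga.jspm.io/npm:react-router@6/dist/index.js"
  }
}
```

Placed inside a `<script type="importmap">` tag, this lets a plain `import { Router } from "react-router"` in a module script resolve straight through the CDN.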
That's all for now! In the future, we might do another write-up exploring more of how these import maps help in loading modules into any environment. Let us know if you'd like to read more about that!