Bun is a complete toolkit for building and testing full-stack JavaScript and TypeScript applications. If you're new to Bun, you can learn more from the Bun 1.0 blog post.
Bun 1.2 is a huge update, and we're excited to share it with you.
Here's the tl;dr of what changed in Bun 1.2:
- `Bun.s3`
- `Bun.sql` (with MySQL coming soon)
- `bun install` now uses a text-based lockfile: `bun.lock`

We also made Express 3x faster in Bun.
Bun is designed as a drop-in replacement for Node.js.
In Bun 1.2, we started to run the Node.js test suite for every change we make to Bun. Since then, we've fixed thousands of bugs and the following Node.js modules now pass over 90% of their tests with Bun.
Here's how we did it.
In Bun 1.2, we changed how we test and improve Bun's compatibility with Node.js. Previously, we prioritized and fixed Node.js bugs as they were reported, usually from GitHub issues where someone tried to use an npm package that didn't work in Bun.
While this fixed actual bugs real users ran into, it was too much of a "whack-a-mole" approach. It discouraged doing the large refactors necessary for us to have a shot at 100% Node.js compatibility.
That's when we thought: what if we just run the Node.js test suite?
Node.js has thousands of test files in its repository, with most of them in the `test/parallel` directory. While it might seem simple enough to "just run" their tests, it's more involved than you might think.

For example, many tests rely on the internal implementation details of Node.js. In the following test, `getnameinfo` is stubbed to always error, to test the error handling of `dns.lookupService()`.
```js
const { internalBinding } = require("internal/test/binding");
const cares = internalBinding("cares_wrap");
const { UV_ENOENT } = internalBinding("uv");

cares.getnameinfo = () => UV_ENOENT;
```
To run this test in Bun, we had to replace the internal bindings with our own stubs.
```js
Bun.dns.lookupService = (addr, port) => {
  const error = new Error(`getnameinfo ENOENT ${addr}`);
  error.code = "ENOENT";
  error.syscall = "getnameinfo";
  throw error;
};
```
There are also Node.js tests that check the exact string of error messages. And while Node.js usually doesn't change error messages, they don't guarantee that messages won't change between releases.
```js
const common = require("../common");
const assert = require("assert");

assert.throws(
  () => Buffer.allocUnsafe(5).copy(Buffer.allocUnsafe(5), -1, 0),
  {
    name: 'RangeError',
    code: 'ERR_OUT_OF_RANGE',
    message: 'The value of "targetStart" is out of range. It must be >= 0. Received -1'
  }
);
```
To work around this, we had to change the assertion logic in some tests to check the `name` and `code`, instead of the `message`. This is also the standard practice for checking error types in Node.js. Additionally, we sometimes update the message when Bun provides more information for the user than Node.js.

```diff
  {
    name: "RangeError",
    code: "ERR_OUT_OF_RANGE",
-   message: 'The value of "targetStart" is out of range. It must be >= 0. Received -1',
+   message: 'The value of "targetStart" is out of range. It must be >= 0 and <= 5. Received -1',
  },
```

While we do try to match the error messages of Node.js as much as possible, there are times where we want to provide a more helpful error message, as long as the `name` and `code` are the same.
We've ported thousands of files from the Node.js test suite to Bun. That means for every commit we make to Bun, we run the Node.js test suite to ensure compatibility.
Every day, we are adding more and more passing Node.js tests to Bun, and we're excited to share more progress on Node.js compatibility very soon.
In addition to fixing existing Node.js APIs, we've also added support for the following Node.js modules.
node:http2 server

You can now use `node:http2` to create HTTP/2 servers. HTTP/2 is also necessary for gRPC servers, which are now supported in Bun. Previously, there was only support for the HTTP/2 client.

```js
import { createSecureServer } from "node:http2";
import { readFileSync } from "node:fs";

const server = createSecureServer({
  key: readFileSync("key.pem"),
  cert: readFileSync("cert.pem"),
});

server.on("stream", (stream, headers) => {
  stream.respond({
    ":status": 200,
    "content-type": "text/html; charset=utf-8",
  });
  stream.end("<h1>Hello from Bun!</h1>");
});

server.listen(3000);
```
In Bun 1.2, the HTTP/2 server is 2x faster than in Node.js. When we add new APIs to Bun, we spend a lot of time tuning performance to ensure they not only work, but are also faster.
node:dgram

You can now bind and connect to UDP sockets using `node:dgram`. UDP is a low-level, unreliable messaging protocol, often used by telemetry providers and game engines.

```js
import { createSocket } from "node:dgram";

const server = createSocket("udp4");
const client = createSocket("udp4");

server.on("listening", () => {
  const { port, address } = server.address();
  for (let i = 0; i < 10; i++) {
    client.send(`data${i}`, port, address);
  }
  server.unref();
});

server.on("message", (data, { address, port }) => {
  console.log(`Received: data=${data} source=${address}:${port}`);
  client.unref();
});

server.bind();
```
This allows packages like DataDog's `dd-trace` and `@clickhouse/client` to work in Bun 1.2.
node:cluster

You can use `node:cluster` to spawn multiple instances of Bun. This is often used to enable higher throughput by running tasks across multiple CPU cores.

Here's an example of how you can create a multi-threaded HTTP server using `cluster`:

- The primary process spawns `n` child workers (usually equal to the number of CPU cores)
- Each worker listens on the same port (using `reusePort`)

```js
import cluster from "node:cluster";
import { createServer } from "node:http";
import { cpus } from "node:os";

if (cluster.isPrimary) {
  console.log(`Primary ${process.pid} is running`);

  // Start N workers for the number of CPUs
  for (let i = 0; i < cpus().length; i++) {
    cluster.fork();
  }

  cluster.on("exit", (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} exited`);
  });
} else {
  // Incoming requests are handled by the pool of workers
  // instead of the primary worker.
  createServer((req, res) => {
    res.writeHead(200);
    res.end(`Hello from worker ${process.pid}`);
  }).listen(3000);
  console.log(`Worker ${process.pid} started`);
}
```

Note that `reusePort` is only effective on Linux. On Windows and macOS, the operating system does not load balance HTTP connections as one would expect.
node:zlib

In Bun 1.2, we rewrote the entire `node:zlib` module from JavaScript to native code. This not only fixed a bunch of bugs, but made it 2x faster than in Bun 1.1.

We also added support for Brotli in `node:zlib`, which was missing in Bun 1.1.

```js
import { brotliCompressSync, brotliDecompressSync } from "node:zlib";

const compressed = brotliCompressSync("Hello, world!");
compressed.toString("hex"); // "0b068048656c6c6f2c20776f726c642103"

const decompressed = brotliDecompressSync(compressed);
decompressed.toString("utf8"); // "Hello, world!"
```
If you want to use C++ addons alongside your JavaScript code, the easiest way is to use N-API.

However, before N-API existed, some packages used the internal V8 C++ APIs in Node.js. What makes this complicated is that Node.js and Bun use different JavaScript engines: Node.js uses V8 (used by Chrome), and Bun uses JavaScriptCore (used by Safari).
Previously, npm packages like `cpu-features`, which rely on these V8 APIs, would not work in Bun.

```js
require("cpu-features")();
```

```
dyld[94465]: missing symbol called
fish: Job 1, 'bun index.ts' terminated by signal SIGABRT (Abort)
```
To fix this, we undertook the unprecedented engineering effort of implementing V8's public C++ API in JavaScriptCore, so these packages can "just work" in Bun. It's so complicated and nerdy to explain that we wrote a 3-part blog series on how we supported the V8 APIs... without using V8.
In Bun 1.2, packages like `cpu-features` can be imported and just work.

```
$ bun index.ts
{
  arch: "aarch64",
  flags: {
    fp: true,
    asimd: true,
    // ...
  },
}
```
The V8 C++ APIs are very complicated to support, so most packages will still have missing features. We're continuing to improve support, so packages like `node-canvas@v2` and `node-sqlite3` can work in the future.
node:v8

In addition to the V8 C++ APIs, we've also added support for heap snapshots using `node:v8`.

```js
import { writeHeapSnapshot } from "node:v8";

// Writes a heap snapshot to the current working directory in the form:
// `Heap-{date}-{pid}.heapsnapshot`
writeHeapSnapshot();
```

In Bun 1.2, you can use `getHeapSnapshot` and `writeHeapSnapshot` to read and write V8 heap snapshots. This allows you to use Chrome DevTools to inspect the heap of Bun.
While compatibility is important for fixing bugs, it also helps us fix performance issues in Bun.
In Bun 1.2, the popular `express` framework can serve HTTP requests up to 3x faster than in Node.js. This was made possible by improving compatibility with `node:http`, and by optimizing Bun's HTTP server.
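For reference, a minimal Express app like the one below runs unchanged on Bun (the route and port here are just illustrative placeholders):

```js
// server.js (run with `bun server.js`)
const express = require("express");

const app = express();

// A simple route for demonstration purposes
app.get("/", (req, res) => {
  res.send("Hello from Express on Bun!");
});

app.listen(3000, () => {
  console.log("Listening on http://localhost:3000");
});
```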
Bun.s3
Bun aims to be a cloud-first JavaScript runtime. That means supporting all the tools and services you need to run a production application in the cloud.
Modern applications store files in object storage, instead of the local POSIX file system. When end-users upload a file attachment to a website, it's not being stored on the server's local disk, it's being stored in an S3 bucket. Decoupling storage from compute prevents an entire class of reliability issues: low disk space, high p95 response times from busy I/O, and security issues with shared file storage.
S3 is the de facto standard for object storage in the cloud. The S3 APIs are implemented by a variety of cloud services, including Amazon S3, Google Cloud Storage, Cloudflare R2, and dozens more.
That's why Bun 1.2 adds built-in support for S3. You can read, write, and delete files from an S3 bucket using APIs that are compatible with Web standards like `Blob`.

You can use the new `Bun.s3` API to access the default `S3Client`. The client provides a `file()` method that returns a lazy reference to an S3 file, which has the same API as Bun's `File`.

```js
import { s3 } from "bun";

const file = s3.file("folder/my-file.txt");
// file instanceof Blob

const content = await file.text();
// or:
// file.json()
// file.arrayBuffer()
// file.stream()
```
Bun's S3 client is written in native code, instead of JavaScript. When you compare it to using packages like `@aws-sdk/client-s3` with Node.js, it's 5x faster at downloading files from an S3 bucket.
You can use the `write()` method to upload a file to S3. It's that simple:

```js
import { s3 } from "bun";

const file = s3.file("folder/my-file.txt");
await file.write("hello s3!");
// or:
// file.write(new Uint8Array([1, 2, 3]));
// file.write(new Blob(["hello s3!"]));
// file.write(new Response("hello s3!"));
```
For larger files, you can use the `writer()` method to obtain a file writer that does a multi-part upload, so you don't have to worry about the details.

```js
import { s3 } from "bun";

const file = s3.file("folder/my-file.txt");
const writer = file.writer();

for (let i = 0; i < 1000; i++) {
  writer.write(String(i).repeat(1024));
}

await writer.end();
```
When your production service needs to let users upload files to your server, it's often more reliable for the user to upload directly to S3 instead of your server acting as an intermediary.
To make this work, you use the `presign()` method to generate a presigned URL for a file. This generates a URL with a signature that allows a user to securely upload that specific file to S3, without exposing your credentials or granting them unnecessary access to your bucket.

```js
import { s3 } from "bun";

const url = s3.presign("folder/my-file.txt", {
  expiresIn: 3600, // 1 hour
  acl: "public-read",
});
```
Bun.serve()

Since Bun's S3 APIs extend the `File` API, you can use `Bun.serve()` to serve S3 files over HTTP.

```js
import { serve, s3 } from "bun";

serve({
  port: 3000,
  async fetch(request) {
    const { url } = request;
    const { pathname } = new URL(url);
    // ...
    if (pathname === "/favicon.ico") {
      const file = s3.file("assets/favicon.ico");
      return new Response(file);
    }
    // ...
  },
});
```
When you use `new Response(s3.file(...))`, instead of downloading the S3 file to your server and sending it back to the user, Bun redirects the user to the presigned URL for the S3 file.

```
Response (0 KB) {
  status: 302,
  headers: Headers {
    "location": "https://s3.amazonaws.com/my-bucket/assets/favicon.ico?...",
  },
  redirected: true,
}
```
This saves you memory, time, and the bandwidth cost of downloading the file to your server.
Bun.file()

If you want to access S3 files using the same code as the local file system, you can reference them using the `s3://` URL protocol. It's the same concept as using `file://` to reference local files.

```js
import { file } from "bun";

async function createFile(url, content) {
  const fileObject = file(url);
  if (await fileObject.exists()) {
    return;
  }
  await fileObject.write(content);
}

await createFile("s3://folder/my-file.txt", "hello s3!");
await createFile("file://folder/my-file.txt", "hello posix!");
```
fetch()

You can even use `fetch()` to read, write, and delete files from S3.

```js
// Upload to S3
await fetch("s3://folder/my-file.txt", {
  method: "PUT",
  body: "hello s3!",
});

// Download from S3
const response = await fetch("s3://folder/my-file.txt");
const content = await response.text(); // "hello s3!"

// Delete from S3
await fetch("s3://folder/my-file.txt", {
  method: "DELETE",
});
```
S3Client

When you import `Bun.s3`, it returns a default client that is configured using well-known environment variables, such as `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`.

```js
import { s3, S3Client } from "bun";
// s3 instanceof S3Client
```
You can also create your own `S3Client`, then set it as the default.

```js
import { S3Client } from "bun";

const client = new S3Client({
  accessKeyId: "my-access-key-id",
  secretAccessKey: "my-secret-access-key",
  region: "auto",
  endpoint: "https://<account-id>.r2.cloudflarestorage.com",
  bucket: "my-bucket",
});

// Sets the default client to be your custom client
Bun.s3 = client;
```
Bun.sql
Just like object storage, another datastore that production applications often need is a SQL database.
Since the beginning, Bun has had a built-in SQLite client. SQLite is great for smaller applications and quick scripts, where you don't want to worry about the hassle of setting up a production database.
In Bun 1.2, we're expanding Bun's support for SQL databases by introducing `Bun.sql`, a built-in SQL client with Postgres support. We also have a pull request to add MySQL support very soon.
Bun.sql

You can use `Bun.sql` to run SQL queries using tagged-template literals. This allows you to pass JavaScript values as parameters to your SQL queries.

Most importantly, it escapes strings and uses prepared statements for you to prevent SQL injection.

```js
import { sql } from "bun";

const users = [
  { name: "Alice", age: 25 },
  { name: "Bob", age: 65 },
];

await sql`
  INSERT INTO users (name, age)
  VALUES ${sql(users)}
`;
```
Reading rows is just as easy. Results are returned as an array of objects, with the column name as the key.
```js
import { sql } from "bun";

const seniorAge = 65;
const seniorUsers = await sql`
  SELECT name, age FROM users
  WHERE age >= ${seniorAge}
`;

console.log(seniorUsers); // [{ name: "Bob", age: 65 }]
```
`Bun.sql` is written in native code with a number of low-level optimizations.

Optimizations stack like buffs in World of Warcraft.
The result is that `Bun.sql` is up to 50% faster at reading rows than using the most popular Postgres clients with Node.js.
postgres.js to Bun.sql

The `Bun.sql` APIs are inspired by the popular `postgres.js` package. This makes it easy to migrate your existing code to Bun's built-in SQL client.

```diff
- import { postgres } from "postgres";
+ import { postgres } from "bun";

  const sql = postgres({
    host: "localhost",
    port: 5432,
    database: "mydb",
    user: "...",
    password: "...",
  });

  const users = await sql`SELECT name, age FROM users LIMIT 1`;
  console.log(users); // [{ name: "Alice", age: 25 }]
```
Bun is an npm-compatible package manager that makes it easy to install and update your node modules. You can use `bun install` to install dependencies, even if you're using Node.js as a runtime.

Replace `npm install` with `bun install`:

```sh
$ npm install
$ bun install
```
In Bun 1.2, we've made the biggest change yet to the package manager.
bun.lockb

Since the beginning, Bun has used a binary lockfile: `bun.lockb`.

Unlike other package managers that use text-based lockfiles, like JSON or YAML, a binary lockfile allowed us to make `bun install` almost 30x faster than `npm`.
However, we found that there were a lot of paper cuts when using a binary lockfile. First, you couldn't view the contents of the lockfile on GitHub and other platforms. This sucked.
What happens if you receive a pull request from an external contributor that changes the `bun.lockb` file? Do you trust it? Probably not.

That's also assuming there isn't a merge conflict, which, for a binary lockfile, is almost impossible to resolve, aside from manually deleting the lockfiles and running `bun install` again.
This also made it hard for tools to read the lockfile. For example, dependency management tools like Dependabot would need an API to parse the lockfile, and we didn't offer one.
Bun will continue to support `bun.lockb` for a long time. However, for all these reasons, we've decided to switch to a text-based lockfile as the default in Bun 1.2.
bun.lock

In Bun 1.2, we're introducing a new, text-based lockfile: `bun.lock`.

You can migrate to the new lockfile by using the `--save-text-lockfile` flag.
bun install --save-text-lockfile
`bun.lock` is a JSONC file, which is JSON with added support for comments and trailing commas.

```jsonc
// bun.lock
{
  "lockfileVersion": 0,
  "packages": [
    ["express@4.21.2", /* ... */, "sha512-..."],
    ["body-parser@1.20.3", /* ... */],
    /* ... and more */
  ],
  "workspaces": { /* ... */ },
}
```
This makes it much easier to view diffs in pull requests, and trailing commas make it much less likely to cause merge conflicts.
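As a rough illustration (the exact entries depend on your dependency tree), a version bump shows up in review as a small, readable diff:

```diff
  "packages": [
-   ["express@4.21.2", /* ... */, "sha512-..."],
+   ["express@4.21.3", /* ... */, "sha512-..."],
  ],
```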
For new projects without a lockfile, Bun will generate a new `bun.lock` file.

For existing projects with a `bun.lockb` file, Bun will continue to support the binary lockfile, without migrating it to the new lockfile. We will continue to support the binary lockfile for a long time, so you can keep using commands like `bun add` and `bun update`, and they will update your `bun.lockb` file.
bun install gets 30% faster

You might think that after we migrated to a text-based lockfile, `bun install` would be slower. Wrong!

Most software projects get slower as more features are added; Bun is not one of those projects. We spent a lot of time tuning and optimizing Bun, so we could make `bun install` even faster.

That's why in Bun 1.2, `bun install` is 30% faster than in Bun 1.1.
package.json

Have you ever added something to your `package.json` and forgotten why months later? Or wanted to explain to your teammates why a dependency needs a specific version? Or have you ever had a merge conflict in a `package.json` file because of a comma?

Often these problems stem from the fact that `package.json` is a JSON file, which means you can't use comments or trailing commas in it.

```jsonc
{
  "dependencies": {
    // this would cause a syntax error
    "express": "4.21.2"
  }
}
```
This is a bad experience. Modern tools like TypeScript allow comments and trailing commas in their configuration files, `tsconfig.json` for example, and it's great. We also asked the community for their thoughts, and it seemed that the status quo needed to change.
What JS ecosystem upgrade path would you prefer to permit comments in package.json?

— Rob Palmer (@robpalmer2) April 17, 2024
In Bun 1.2, you can use comments and trailing commas in your `package.json`. It just works.

```jsonc
{
  "name": "app",
  "dependencies": {
    // We need 0.30.8 because of a bug in 0.30.9
    "drizzle-orm": "0.30.8", /* <- trailing comma */
  },
}
```
Since there are many tools that read `package.json` files, we've added support to `require()` or `import()` these files with comments and trailing commas. You don't need to change your code.

```js
const pkg = require("./package.json");

const {
  default: { name },
} = await import("./package.json");
```
Since this isn't widely supported in the JavaScript ecosystem, we'd advise you to use this feature "at your own risk." However, we think this is the right direction to go: to make things easier for you.
.npmrc support

In Bun 1.2, we added support for reading npm's config file: `.npmrc`.

You can use `.npmrc` to configure your npm registry and configure scoped packages. This is often necessary in corporate environments, where you might need to authenticate to a private registry.

```ini
@my-company:registry=https://packages.my-company.com
@my-org:registry=https://packages.my-company.com/my-org
```

Bun will look for an `.npmrc` file in your project's root directory, and in your home directory.
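For example, token-based authentication for a private registry in `.npmrc` looks something like this (the registry URL and environment variable below are placeholders):

```ini
@my-company:registry=https://packages.my-company.com
//packages.my-company.com/:_authToken=${NPM_TOKEN}
```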
bun run --filter

You can now use `bun run --filter` to run a script in multiple workspaces at the same time.

bun run --filter='*' dev

This will run the `dev` script, concurrently, in all workspaces that match the glob pattern. It will also interleave the output of each script, so you can see the output of each workspace as it runs.
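For context, `--filter` matches the workspaces defined in your root `package.json`; each matched workspace then runs its own `dev` script. A minimal sketch of such a root file (the names here are placeholders):

```jsonc
// package.json at the repository root
{
  "name": "my-monorepo",
  "workspaces": ["packages/*"],
}
```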
You can also pass multiple filters to `--filter`, and you can just use `bun` instead of `bun run`.

bun --filter 'api/*' --filter 'frontend/*' dev
bun outdated

You can now view which dependencies are out-of-date using `bun outdated`.

It will show a list of your `package.json` dependencies, and which versions are out-of-date. The "update" column shows the next semver-matching version, and the "latest" column shows the latest version.
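The output looks roughly like this (package names and versions are only illustrative):

```
$ bun outdated
┌───────────────────────────┬─────────┬────────┬────────┐
│ Package                   │ Current │ Update │ Latest │
├───────────────────────────┼─────────┼────────┼────────┤
│ @typescript-eslint/parser │ 7.17.0  │ 7.18.0 │ 8.2.0  │
└───────────────────────────┴─────────┴────────┴────────┘
```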
If you notice there's a specific dependency you want to update, you can use `bun update`.

bun update @typescript-eslint/parser # Updates to "7.18.0"

bun update @typescript-eslint/parser --latest # Updates to "8.2.0"
You can also filter which dependencies you want to check for updates. Just make sure to quote patterns, so your shell doesn't expand them as glob patterns!
bun outdated "is-*" # check is-even, is-odd, etc.

bun outdated "@discordjs/*" # check @discordjs/voice, @discordjs/rest, etc.

bun outdated jquery --filter="foo" # check jquery in the `foo` workspace
bun publish

You can now publish npm packages using `bun publish`.

It's a drop-in replacement for `npm publish`, and supports many of the same features, like:

- `.npmrc` files for authentication
- `.gitignore` and `.npmignore` files in multiple directories
- `package.json` fields like `bin`, `files`, etc.
- `README` files

We've also added support for commands that are useful for publishing, like:

- `bun pm whoami`, which prints your npm username
- `bun pm pack`, which creates an npm package tarball for publishing or installing locally

bun patch
Sometimes, your dependencies have bugs or missing features. While you could fork the package, make your changes, and publish it — that's a lot of work. What if you don't want to maintain a fork?
In Bun 1.2, we've added support for patching dependencies. Here's how it works:

1. Run `bun patch <package>` to patch a package.
2. Edit the files in the `node_modules/<package>` directory.
3. Run `bun patch --commit <package>` to save your changes. That's it!

Bun generates a `.patch` file with your changes in the `patches/` directory, which is automatically applied on `bun install`. You can then commit the patch file to your repository, and share it with your team.
For example, you could create a patch to replace a dependency with your own code.
```diff
diff --git a/index.js b/index.js
index 832d92223a9ec491364ee10dcbe3ad495446ab80..2a61f0dd2f476a4a30631c570e6c8d2d148d419a 100644
--- a/index.js
+++ b/index.js
@@ -1,14 +1 @@
-'use strict';
-
-var isOdd = require('is-odd');
-
-module.exports = function isEven(i) {
-  return !isOdd(i);
-};
+module.exports = (i) => (i % 2 === 0)
```
Bun clones the package from the `node_modules` directory with a fresh copy of itself. This allows you to safely make edits to files in the package's directory without impacting shared file caches.
We've also made a bunch of small improvements to make `bun install` easier to use.

You can now configure CA certificates for `bun install`. This is useful when you need to install packages from your company's private registry, or if you want to use a self-signed certificate.

```toml
[install]
# The CA certificate as a string
ca = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----"
# A path to a CA certificate file. The file can contain multiple certificates.
cafile = "path/to/cafile"
```
If you don't want to change your `bunfig.toml` file, you can also use the `--ca` and `--cafile` flags.
bun install --cafile=/path/to/cafile
bun install --ca="..."
If you are using an existing `.npmrc` file, you can also configure CA certificates there.

```ini
cafile=/path/to/cafile
ca="..."
```
bundleDependencies support

You can now use `bundleDependencies` in your `package.json`.
{"bundleDependencies": ["is-even"]}
These are dependencies that you expect to already exist in your `node_modules` folder, and they are not installed like other dependencies.
bun add respects package.json indentation

We fixed a bug where `bun add` would not respect the spacing and indentation in your `package.json`. Bun will now preserve the indentation of your `package.json`, no matter how wacky it is.
bun add is-odd
```jsonc
// an intentionally wacky package.json
{
  "dependencies": {
    "is-even": "1.0.0",
    "is-odd": "1.0.0"
  }
}
```
--omit=dev|optional|peer support

Bun now supports the `--omit` flag with `bun install`, which allows you to omit dev, optional, or peer dependencies.

bun install --omit=dev # omit dev dependencies

bun install --omit=optional # omit optional dependencies

bun install --omit=peer # omit peer dependencies

bun install --omit=dev --omit=optional # omit dev and optional dependencies
Bun has a built-in test runner that makes it easy to write and run tests in JavaScript, TypeScript, and JSX. It supports many of the same APIs as Jest and Vitest, including the `expect()`-style APIs.

In Bun 1.2, we've made a lot of improvements to `bun test`.

To use `bun test` with CI/CD tools like Jenkins, CircleCI, and GitLab CI, you can use the `--reporter` option to output test results to a JUnit XML file.

bun test --reporter=junit --reporter-outfile=junit.xml
```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="bun test" tests="1" assertions="1" failures="1" time="0.001">
  <testsuite name="index.test.ts" tests="1" assertions="1" failures="1" time="0.001">
    <!-- ... -->
  </testsuite>
</testsuites>
```

You can also enable JUnit reporting by adding the following to your `bunfig.toml` file.

```toml
[test.reporter]
junit = "junit.xml"
```
You can use `bun test --coverage` to generate a text-based coverage report of your tests.

In Bun 1.2, we added support for LCOV coverage reporting. LCOV is a standard format for code coverage reports, and is used by many tools, like Codecov.

bun test --coverage --coverage-reporter=lcov
By default, this outputs a `lcov.info` coverage report file in the `coverage` directory. You can change the coverage directory with `--coverage-dir`.
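For example (the directory name here is just an example):

```sh
bun test --coverage --coverage-reporter=lcov --coverage-dir=./test-coverage
```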
If you want to always enable coverage reporting, you can add the following to your `bunfig.toml` file.

```toml
[test]
coverage = true
coverageReporter = ["lcov"]  # default ["text"]
coverageDir = "./path/to/folder"  # default "./coverage"
```
You can now use inline snapshots with `expect().toMatchInlineSnapshot()`.

Unlike `toMatchSnapshot()`, which stores the snapshot in a separate file, `toMatchInlineSnapshot()` stores snapshots directly in the test file. This makes it easier to see, and even change, your snapshots.
First, write a test that uses `toMatchInlineSnapshot()`.

```ts
import { expect, test } from "bun:test";

test("toMatchInlineSnapshot()", () => {
  expect(new Date()).toMatchInlineSnapshot();
});
```
Next, update the snapshot with `bun test -u`, which is short for `--update-snapshots`.

bun test -u

Then, voilà! Bun has updated the test file with your snapshot.

```diff
  import { expect, test } from "bun:test";

  test("toMatchInlineSnapshot()", () => {
-   expect(new Date()).toMatchInlineSnapshot();
+   expect(new Date()).toMatchInlineSnapshot(`2025-01-18T02:35:53.332Z`);
  });
```
You can also use these matchers, which do a similar thing:
test.only()

You can use `test.only()` to run a single test, excluding all other tests. This is useful when you're debugging a specific test and don't want to run the entire test suite.

```ts
import { test } from "bun:test";

test.only("test a", () => {
  /* Only run this test */
});

test("test b", () => {
  /* Don't run this test */
});
```
Previously, for this to work in Bun, you had to use the `--only` flag.

bun test --only
This was annoying: you'd usually forget to do it, and test runners like Jest don't need it! In Bun 1.2, we've made this "just work", without the need for flags.

bun test
expect() matchers

In Bun 1.2, we added a bunch of matchers to the `expect()` API. These are the same matchers that are implemented by Jest, Vitest, or the `jest-extended` library.

You can use `toContainValue()` and its derivatives to check if an object contains a value.

```ts
const object = new Set(["bun", "node", "npm"]);

expect(object).toContainValue("bun");
expect(object).toContainValues(["bun", "node"]);
expect(object).toContainAllValues(["bun", "node", "npm"]);
expect(object).not.toContainAnyValues(["done"]);
```
Or, use `toContainKey()` and its derivatives to check if an object contains a key.

```ts
const object = new Map([
  ["bun", "1.2.0"],
  ["node", "22.13.0"],
  ["npm", "9.1.2"],
]);

expect(object).toContainKey("bun");
expect(object).toContainKeys(["bun", "node"]);
expect(object).toContainAllKeys(["bun", "node", "npm"]);
expect(object).not.toContainAnyKeys(["done"]);
```
You can also use `toHaveReturned()` and its derivatives to check if a mocked function has returned a value.

```ts
import { jest, test, expect } from "bun:test";

test("toHaveReturned()", () => {
  const mock = jest.fn(() => "foo");

  mock();
  expect(mock).toHaveReturned();

  mock();
  expect(mock).toHaveReturnedTimes(2);
});
```
We've also added support for custom error messages using `expect()`.

You can now pass a string as the second argument to `expect()`, which will be used as the error message. This is useful when you want to document what the assertion is checking.

```diff
  import { test, expect } from 'bun:test';

  test("custom error message", () => {
-   expect(0.1 + 0.2).toBe(0.3);
+   expect(0.1 + 0.2, "Floating point has precision error").toBe(0.3);
  });
```

```
1 | import { test, expect } from 'bun:test';
2 |
3 | test("custom error message", () => {
4 |   expect(0.1 + 0.2, "Floating point has precision error").toBe(0.3);
      ^
error: expect(received).toBe(expected)
error: Floating point has precision error

Expected: 0.3
Received: 0.30000000000000004
```
jest.setTimeout()

You can now use Jest's `setTimeout()` API to change the default timeout for tests in the current scope or module, instead of setting the timeout for each test.

```ts
import { jest, test } from "bun:test";

jest.setTimeout(60 * 1000); // 1 minute

test("do something that takes a long time", async () => {
  await Bun.sleep(Infinity);
});
```
You can also import `setDefaultTimeout()` from Bun's test APIs, which does the same thing. We chose a different name to avoid confusion with the global `setTimeout()` function.

```ts
import { setDefaultTimeout } from "bun:test";

setDefaultTimeout(60 * 1000); // 1 minute
```
Bun is a JavaScript and TypeScript bundler, transpiler, and minifier that can be used to bundle code for the browser, Node.js, and other platforms.
In Bun 1.2, we've added support for HTML imports. This allows you to replace your entire frontend toolchain with a single import statement.
To get started, pass an HTML import to the `static` option in `Bun.serve()`:

```ts
import homepage from "./index.html";

Bun.serve({
  static: {
    "/": homepage,
  },
  async fetch(req) {
    // ... api requests
  },
});
```
When you make a request to `/`, Bun automatically bundles the `<script>` and `<link>` tags in the HTML files, exposes them as static routes, and serves the result.
An index.html file like this:
```html
<!DOCTYPE html>
<html>
  <head>
    <title>Home</title>
    <link rel="stylesheet" href="./reset.css" />
    <link rel="stylesheet" href="./styles.css" />
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="./sentry-and-preloads.ts"></script>
    <script type="module" src="./my-app.tsx"></script>
  </body>
</html>
```
Becomes something like this:
```html
<!DOCTYPE html>
<html>
  <head>
    <title>Home</title>
    <link rel="stylesheet" href="/index-[hash].css" />
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/index-[hash].js"></script>
  </body>
</html>
```
To read more about HTML imports and how they're implemented, check out the HTML imports documentation.
You can use `bun build --compile` to compile your application, and Bun, into a standalone executable.
In Bun 1.2, we've added support for cross-compilation. This allows you to build a Windows or macOS binary on a Linux machine, and vice versa.
You can run the following command on a macOS or Linux machine, and it will compile a Windows binary.
bun build --compile --target=bun-windows-x64 app.ts
```
[8ms]  bundle  1 modules
[1485ms]  compile  app.exe bun-windows-x64-v1.2.0
```
For Windows-specific builds, you can customize the icon and hide the console window.
bun build --compile --windows-icon=./icon.ico --windows-hide-console app.ts
You can also use the `bun build --bytecode` flag to generate a bytecode cache. This makes the startup time of applications like `eslint` 2x faster.
bun build --bytecode --compile app.ts
./app
Hello, world!
You can also use the bytecode cache without `--compile`.
bun build --bytecode --outdir=dist app.ts
ls dist
app.js app.jsc
When Bun generates output files, it will also generate `.jsc` files, which contain the bytecode cache of their respective `.js` files. Both files are necessary to run, as the bytecode compilation doesn't currently compile async functions, generators, or eval.
The bytecode cache can be 8x larger than the source code, so this makes startup faster at a cost of increased disk space.
You can now set the output format to CommonJS with `bun build`. Previously, only ESM was supported.
bun build --format=cjs app.ts
This makes it easier to create libraries and applications meant for older versions of Node.js.
```ts
// app.ts
export default "Hello, world!";
```

```js
var __defProp = Object.defineProperty;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __moduleCache = /* @__PURE__ */ new WeakMap;
var __toCommonJS = (from) => {
  var entry = __moduleCache.get(from), desc;
  if (entry) return entry;
  entry = __defProp({}, "__esModule", { value: true });
  if (from && typeof from === "object" || typeof from === "function")
    __getOwnPropNames(from).map((key) =>
      !__hasOwnProp.call(entry, key) &&
      __defProp(entry, key, {
        get: () => from[key],
        enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable,
      }));
  __moduleCache.set(from, entry);
  return entry;
};
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, {
      get: all[name],
      enumerable: true,
      configurable: true,
      set: (newValue) => (all[name] = () => newValue),
    });
};

// app.js
var exports_site = {};
__export(exports_site, {
  default: () => site_default,
});
module.exports = __toCommonJS(exports_site);
var site_default = "Hello, world!";
```
Some packages really want to trick bundlers and get the current module's file path, do a runtime require, or check if the current module is the main module. They try all kinds of things to make it work, such as:

```js
"use strict";
if (eval("require.main") === eval("module.main")) {
  // ...
}
```
Bun supports both CommonJS and ESM; in fact, you can use `require()` and `import` in the same file. However, one of the challenges of supporting both is that there's a lot of ambiguity.

Consider the following code. Is it CommonJS or ESM?
console.log("123");
There's no way to tell. Then, how about this?
console.log(module.require("path"));
CommonJS, because it's using `module.require()` to get the `path` module. And this?

```js
import path from "path";
console.log(path);
```
ESM, because it's using `import`. But what about this?

```js
import path from "path";
const fs = require("fs");
console.log(fs.readFileSync(path.resolve("package.json"), "utf8"));
```

ESM, because it's using `import`. If we said it was CommonJS due to the `require`, then the `import` would break the code. We want to simplify building stuff in JavaScript, so let's just say it's ESM and not be fussy.
Finally, what about this?
"use strict";console.log(eval("module.require('path')"));
Previously, Bun would have said ESM, because it's the default when there's no way to tell (including when the file extension is ambiguous, there's no "type" field in package.json, no export, no import, etc).
In Bun 1.2, Bun will say CommonJS, because of the "use strict" directive at the top of the file. ESM is always in strict mode, so an explicit "use strict" would be redundant.
Also, most build tools that output CommonJS include "use strict" at the top of the file. So we can now use this as a last-chance heuristic when it's completely ambiguous whether the file is CommonJS or ESM.
Bun has a universal plugin API for extending the bundler and the runtime.

You can use plugins to intercept `import()` statements, add custom loaders for extensions like `.yaml`, and implement frameworks for Bun.
onBeforeParse()

In Bun 1.2, we're introducing a new lifecycle hook for plugins, `onBeforeParse()`.

Unlike the existing lifecycle hooks that run JavaScript code, this hook must be an N-API addon, which can be implemented in a compiled language like Rust, C/C++, or Zig.
The hook is called immediately before parsing, without cloning the source code, without undergoing string conversion, and with practically zero overhead.
For example, you can create a Rust plugin that replaces all occurrences of `foo` with `bar`.
bun add -g @napi-rs/cli
napi new
cargo add bun-native-plugin
From there, you can implement the `onBeforeParse()` hook. These are advanced APIs, primarily designed for plugin and framework authors who want to use native code to make their plugins really fast.

```rust
use bun_native_plugin::{define_bun_plugin, OnBeforeParse, bun, Result, anyhow, BunLoader};
use napi_derive::napi;

define_bun_plugin!("foo-bar-plugin");

#[bun]
pub fn replace_foo_with_bar(handle: &mut OnBeforeParse) -> Result<()> {
  let input_source_code = handle.input_source_code()?;
  let output_source_code = input_source_code.replace("foo", "bar");

  handle.set_output_source_code(output_source_code, BunLoader::BUN_LOADER_JSX);

  Ok(())
}
```

```ts
import { build } from "bun";
import fooBarPlugin from "./foo-bar-plugin";

await build({
  entrypoints: ["./app.tsx"],
  plugins: [
    {
      name: "foo-bar-plugin",
      setup(build) {
        build.onBeforeParse(
          {
            namespace: "file",
            filter: "**/*.tsx",
          },
          {
            napiModule: fooBarPlugin,
            symbol: "replace_foo_with_bar",
          },
        );
      },
    },
  ],
});
```
We also made a lot of other improvements to `bun build` and the `Bun.build()` APIs.
You can now inject environment variables from your system environment into your bundle.
bun build --env="PUBLIC_*" app.tsx
```ts
import { build } from "bun";

await build({
  entrypoints: ["./app.tsx"],
  outdir: "./out",
  // Environment variables starting with "PUBLIC_"
  // will be injected in the build as process.env.PUBLIC_*
  env: "PUBLIC_*",
});
```
bun build --drop

You can use `--drop` to remove function calls from your JavaScript bundle. For example, if you pass `--drop=console`, all calls to `console.log()` will be removed from your code.

```ts
import { build } from "bun";

await build({
  entrypoints: ["./index.tsx"],
  outdir: "./out",
  drop: ["console", "anyIdentifier.or.propertyAccess"],
});
```
bun build ./index.tsx --outdir ./out --drop=console --drop=anyIdentifier.or.propertyAccess
You can now use the banner and footer options in `bun build` to add content above or below the bundle.

bun build --banner "/* Banner! */" --footer "/* Footer! */" app.ts

```ts
import { build } from "bun";

await build({
  entrypoints: ["./app.ts"],
  outdir: "./dist",
  banner: "/* Banner! */",
  footer: "/* Footer! */",
});
```
This is useful for appending content above or below the bundle, such as a license or copyright notice.
```js
/**
 * Banner!
 */
export default "Hello, world!";
/**
 * Footer!
 */
```
Bun.embeddedFiles()

You can use the new `Bun.embeddedFiles()` API to see a list of all embedded files in a standalone executable compiled with `bun build --compile`.

```ts
import { embeddedFiles } from "bun";

for (const file of embeddedFiles) {
  console.log(file.name); // "logo.png"
  console.log(file.size); // 1234
  console.log(await file.bytes()); // Uint8Array(1234) [...]
}
```
require.main === module

Previously, using `require.main === module` would mark the module as CommonJS. Now, Bun rewrites this into `import.meta.main`, meaning you can use this pattern alongside import statements.

```js
import * as fs from "fs";

if (typeof require !== "undefined" && require.main === module) {
  console.log("main!", fs);
}
```
--ignore-dce-annotations

Some JavaScript tools support special annotations that can influence behavior during dead-code elimination. For example, the `@__PURE__` annotation tells bundlers that a function call is pure (regardless of whether it actually is), and that the call can be removed if it is not used.

```js
let button = /* @__PURE__ */ React.createElement(Button, null);
```
Sometimes, a library may include incorrect annotations, which can cause Bun to remove side effects which were needed.
To work around these issues, you can use the `--ignore-dce-annotations` flag when running `bun build` to ignore all annotations. This should only be used if dead-code elimination breaks bundles, and fixing the annotations should be preferred to leaving this flag on.
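For example:

```sh
bun build ./app.ts --outdir ./out --ignore-dce-annotations
```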
--packages=external

You can now control whether package dependencies are included in your bundle or not. If an import does not start with `.`, `..`, or `/`, then it is considered a package.

bun build ./index.ts --packages external

```ts
await Bun.build({
  entrypoints: ["./index.ts"],
  packages: "external",
});
```
This is useful when bundling libraries. It lets you reduce the number of files your users have to download, while continuing to support peer or external dependencies.
In Bun 1.2, we implemented a new CSS parser and bundler in Bun.

It's derived from the great work of LightningCSS, and rewritten from Rust to Zig so it can be integrated with Bun's custom JavaScript and TypeScript parser, bundler, and runtime.

Bun is a complete toolkit for running and building JavaScript and TypeScript. One of the missing pieces of Bun's built-in JavaScript bundler, `bun build`, was support for bundling and minifying CSS.
CSS bundlers combine multiple CSS files and assets referenced using directives like `url`, `@import`, and `@font-face` into a single CSS file you can send to browsers, avoiding a waterfall of network requests.

```css
/* index.css */
@import "foo.css";
@import "bar.css";
```

```css
/* foo.css */
.foo {
  background: red;
}
```

```css
/* bar.css */
.bar {
  background: blue;
}
```
To see how it works, you can try it using `bun build`.
bun build ./index.css
You'll see how the CSS files are combined into a single CSS file.
```css
/** foo.css */
.foo {
  background: red;
}

/** bar.css */
.bar {
  background: blue;
}
```
.css files from JavaScript

We've also made it possible to import `.css` files in your JavaScript and TypeScript code. This will create an additional CSS entrypoint that combines all the CSS files imported from a JavaScript module graph, along with `@import` rules.

```ts
import "./style.css";
import MyComponent from "./MyComponent.tsx";

// ... rest of your app
```
In this example, if `MyComponent.tsx` imports another CSS file, instead of adding extra `.css` files to the bundle, all the CSS imported per entrypoint is flattened into a single CSS file.
bun build ./index.ts --outdir=dist
```
  index.js   0.10 KB
  index.css  0.10 KB

[5ms] bundle 4 modules
```
Bun.build()

You can also bundle CSS using the programmatic `Bun.build()` API. This allows you to bundle both CSS and JavaScript in the same build, with the same API.

```ts
import { build } from "bun";

const results = await build({
  entrypoints: ["./index.css"],
  outdir: "./dist",
});

console.log(results);
```
In addition to supporting Node.js and Web APIs, Bun also has a growing standard library that makes it easy to do common tasks, without adding more external dependencies.
Bun.serve()

Bun has a built-in HTTP server that makes it easy to respond to HTTP requests using standard APIs like `Request` and `Response`. In Bun 1.2, we added support for static routes using the new `static` property.

To define a static route, pass the request path as the key and a `Response` object as the value.

```ts
import { serve } from "bun";

serve({
  static: {
    "/health-check": new Response("Ok!"),
    "/old-link": Response.redirect("/new-link", 301),
    "/api/version": Response.json(
      {
        app: require("./package.json").version,
        bun: Bun.version,
      },
      {
        headers: { "X-Powered-By": "bun" },
      },
    ),
  },
  async fetch(request) {
    return new Response("Dynamic!");
  },
});
```
Static routes are up to 40% faster than doing it yourself in the `fetch()` handler. The response body, headers, and status code are cached in memory, so there's no JavaScript allocation or garbage collection.

If you want to reload the static routes, you can use the `reload()` method. This is useful if you want to update the static routes on a schedule, or when a file changes.

```ts
import { serve } from "bun";

const server = serve({
  static: {
    "/": new Response("Static!"),
  },
  async fetch(request) {
    return new Response("Dynamic!");
  },
});

setInterval(() => {
  const date = new Date().toISOString();
  server.reload({
    static: {
      "/": new Response(`Static! Updated at ${date}`),
    },
  });
}, 1000);
```
Bun.udpSocket()

While we added support for `node:dgram` in Bun 1.2, we also introduced UDP socket support in Bun's own APIs. `Bun.udpSocket()` is a faster, modern alternative, and is similar to the existing `Bun.listen()` API.

```ts
import { udpSocket } from "bun";

const server = await udpSocket({
  socket: {
    data(socket, data, port, addr) {
      console.log(`Received data from ${addr}:${port}:`, data.toString());
    },
  },
});

const client = await udpSocket({ port: 0 });
client.send("Hello!", server.port, "127.0.0.1");
```
Bun's UDP socket API is built for performance. Unlike Node.js, it can send multiple UDP datagrams with a single syscall, and supports responding to backpressure from the operating system.
```ts
const socket = await Bun.udpSocket({
  port: 0,
  socket: {
    drain(socket) {
      // Socket is no longer under backpressure
    },
  },
});

// Send multiple UDP datagrams with a single syscall:
// [ <data>, <port>, <address> ][]
socket.sendMany([
  ["Hello", 12345, "127.0.0.1"],
  ["from", 12346, "127.0.0.1"],
  ["Bun 1.2", 12347, "127.0.0.1"],
]);
```
This is great for building game servers that need to broadcast game state updates to every peer.
Bun.file()

Bun has a built-in `Bun.file()` API that makes it easy to read and write files. It extends the Web-standard `Blob` API, and makes it easier to work with files in a server environment.

In Bun 1.2, we've added support for even more `Bun.file()` APIs.
delete()

You can now delete files using the `delete()` method. An alias of `unlink()` is also supported.

```ts
import { file } from "bun";

await file("./package.json").delete();
await file("./node_modules").unlink();
```
stat()

You can now use the `stat()` method to get a file's metadata. This returns the same `Stats` object as `fs.stat()` in Node.js.

```ts
import { file } from "bun";

const stat = await file("./package.json").stat();

console.log(stat.size); // => 1024
console.log(stat.mode); // => 33206
console.log(stat.isFile()); // => true
console.log(stat.isDirectory()); // => false
console.log(stat.ctime); // => 2025-01-21T16:00:00+00:00
```
With the newly added built-in support for S3, you can use the same `Bun.file()` APIs with an S3 file.

```ts
import { s3 } from "bun";

const stat = await s3("s3://folder/my-file.txt").stat();
console.log(stat.size); // => 1024
console.log(stat.type); // => "text/plain;charset=utf-8"

await s3("s3://folder/").unlink();
```
Bun.color()

To support CSS with `bun build`, we implemented our own CSS parser in Bun 1.2. In doing this work, we decided to expose some useful APIs for working with colors.

You can use `Bun.color()` to parse, normalize, and convert colors into a variety of formats. It supports CSS, ANSI color codes, RGB, HSL, and more.

```ts
import { color } from "bun";

color("#ff0000", "css"); // => "red"
color("rgb(255, 0, 0)", "css"); // => "red"
color("red", "ansi"); // => "\x1b[31m"
color("#f00", "ansi-16m"); // => "\x1b[38;2;255;0;0m"
color(0xff0000, "ansi-256"); // => "\u001b[38;5;196m"
color({ r: 255, g: 0, b: 0 }, "number"); // => 16711680
color("hsl(0, 0%, 50%)", "{rgba}"); // => { r: 128, g: 128, b: 128, a: 1 }
```
dns.prefetch()

You can use the new `dns.prefetch()` API to prefetch DNS records before they are needed. This is useful if you want to pre-warm the DNS cache on startup.

```ts
import { dns } from "bun";

// ...on startup
dns.prefetch("example.com");

// ...later on
await fetch("https://example.com/");
```
This will prefetch the DNS record for example.com and make it available for use in `fetch()` requests. You can also use the `dns.getCacheStats()` API to observe the DNS cache.

```ts
import { dns } from "bun";

await fetch("https://example.com/");

console.log(dns.getCacheStats());
// {
//   cacheHitsCompleted: 0,
//   cacheHitsInflight: 0,
//   cacheMisses: 1,
//   size: 1,
//   errors: 0,
//   totalCount: 1,
// }
```
We also added a few random utilities to Bun's APIs.
Bun.inspect.table()

You can now use `Bun.inspect.table()` to format tabular data into a string. It's similar to `console.table()`, except it returns a string rather than printing to the console.

```ts
console.log(
  Bun.inspect.table([
    { a: 1, b: 2, c: 3 },
    { a: 4, b: 5, c: 6 },
    { a: 7, b: 8, c: 9 },
  ]),
);

// ┌───┬───┬───┬───┐
// │   │ a │ b │ c │
// ├───┼───┼───┼───┤
// │ 0 │ 1 │ 2 │ 3 │
// │ 1 │ 4 │ 5 │ 6 │
// │ 2 │ 7 │ 8 │ 9 │
// └───┴───┴───┴───┘
```
Bun.randomUUIDv7()

You can use `Bun.randomUUIDv7()` to generate a UUID v7, a monotonic UUID suitable for sorting and databases.

```ts
import { randomUUIDv7 } from "bun";

const uuid = randomUUIDv7();
// => "0192ce11-26d5-7dc3-9305-1426de888c5a"
```
Bun has a built-in SQLite client that makes it easy to query SQLite databases. In Bun 1.2, we've added a few new features to make it even easier to use.

When you query a SQL database, you often want to map your query results to JavaScript objects. That's why there are so many popular ORM (object-relational mapping) packages, like Prisma and TypeORM.
You can now use `query.as(Class)` to map query results to instances of a class. This lets you attach methods, getters, and setters without using an ORM.

```ts
import { Database } from "bun:sqlite";

class Tweet {
  id: number;
  text: string;
  username: string;

  get isMe() {
    return this.username === "jarredsumner";
  }
}

const db = new Database("tweets.db");
const tweets = db.query("SELECT * FROM tweets").as(Tweet);

for (const tweet of tweets.all()) {
  if (!tweet.isMe) {
    console.log(`${tweet.username}: ${tweet.text}`);
  }
}
```
For performance reasons, class constructors, default initializers, and private fields are not supported. Instead, Bun uses the equivalent of `Object.create()` to create a new object with the class's prototype and assigns the values of the row to it.
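Conceptually, each row is materialized roughly like this (an illustrative sketch, not Bun's actual internals):

```ts
// A raw result row, as returned by SQLite
const row = { id: 1, text: "hello", username: "jarredsumner" };

// Roughly what `query.as(Tweet)` does per row:
const tweet = Object.create(Tweet.prototype); // methods and getters come from the prototype
Object.assign(tweet, row);                    // column values become own properties

console.log(tweet.isMe); // => true
```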
It's also important to note that this is not an ORM. It doesn't manage relationships, generate SQL queries, or anything like that. However, it does remove a lot of boilerplate to get JavaScript objects from SQLite!
You can now use `query.iterate()` to get an iterator that yields rows as they are returned from the database. This is useful when you want to process rows one at a time, without loading them all into memory.

```ts
import { Database } from "bun:sqlite";

class User {
  id: number;
  email: string;
}

const db = new Database("users.db");
const rows = db.query("SELECT * FROM users").as(User).iterate();

for (const row of rows) {
  console.log(row);
}
```
You can also iterate over the query using a `for` loop, without calling `iterate()`.

```ts
for (const row of db.query("SELECT * FROM users")) {
  console.log(row); // { id: 1, email: "hello@bun.sh" }
}
```
You can now omit the `$`, `@`, or `:` prefix when passing JavaScript values as query parameters.

```diff
  import { Database } from "bun:sqlite";

  const db = new Database(":memory:", {
-   strict: false,
+   strict: true,
  });

  const query = db.query(`select $message;`);

  query.all({
-   $message: "Hello world",
+   message: "Hello world",
  });
```
To use this behavior, enable the `strict` option. This will allow you to omit the `$`, `@`, or `:` prefixes, and will throw an error if a parameter is missing.
You can now access the number of rows changed and the last inserted row ID when running queries.
```ts
import { Database } from "bun:sqlite";

const db = new Database(":memory:");
db.run(`CREATE TABLE users (id INTEGER, username TEXT)`);

const { changes, lastInsertRowid } = db.run(
  `INSERT INTO users VALUES (1, 'jarredsumner')`,
);

console.log({
  changes, // => 1
  lastInsertRowid, // => 1
});
```
If you want to use 64-bit integers, you can enable the `safeIntegers` option. This will return integers as a `BigInt`, instead of a truncated `number`.

```ts
import { Database } from "bun:sqlite";

const db = new Database(":memory:", { safeIntegers: true });
const query = db.query(
  `SELECT ${BigInt(Number.MAX_SAFE_INTEGER) + 1n} as maxInteger`,
);

const { maxInteger } = query.get();
console.log(maxInteger); // => 9007199254740992n
```
You can also enable this on a per-query basis using the `safeIntegers()` method.

```ts
import { Database } from "bun:sqlite";

const db = new Database(":memory:", { strict: true });
const query = db.query("SELECT $value as value").safeIntegers(true);

const { value } = query.get({
  value: BigInt(Number.MAX_SAFE_INTEGER) + 1n,
});

console.log(value); // => 9007199254740992n
```
using

With JavaScript's `using` syntax, you can automatically close statements and databases when their variables go out of scope. This allows you to clean up database resources, even if there's a thrown error. Read on for more details on Bun's support for this new JavaScript feature.

```ts
import { Database } from "bun:sqlite";

{
  using db = new Database("file.db");
  using query = db.query("SELECT * FROM users");

  for (const row of query.all()) {
    throw new Error("Oops!"); // no try/catch block needed!
  }
}
// scope ends here, so `db` and `query` are automatically closed
```
We've added experimental support for compiling and running C from JavaScript. This is a simple way to use C system libraries from JavaScript without a build step.

```c
#include <stdio.h>
#include <stdlib.h>

int random() {
  return rand() + 42;
}
```

```ts
import { cc } from "bun:ffi";

const {
  symbols: { random },
} = cc({
  source: "./random.c",
  symbols: {
    random: {
      returns: "int",
      args: [],
    },
  },
});

console.log(random()); // 42
```
For advanced use-cases, or where performance is really important, you sometimes need to use system libraries from JavaScript. Today, the most common way to do this is by compiling an N-API addon using `node-gyp`. You might notice when a package uses this, because it runs a postinstall script when you install it.

However, this isn't a great experience. Your system needs a modern version of Python and a C compiler, which is usually installed using a command like `apt install build-essential`.
And hopefully you don't run into a compiler or node-gyp error, which can be quite frustrating.
```
gyp ERR! command "/usr/bin/node" "/tmp/node-gyp@latest--bunx/node_modules/.bin/node-gyp" "configure" "build"
gyp ERR! cwd /bun/test/node_modules/bktree-fast
gyp ERR! node -v v12.22.9
gyp ERR! node-gyp -v v9.4.0
gyp ERR! Node-gyp failed to build your package.
gyp ERR! Try to update npm and/or node-gyp and if it does not help file an issue with the package author.
error: "node-gyp" exited with code 7 (SIGBUS)
```
In case you didn't know, Bun embeds a built-in C compiler called `tinycc`. Surprise!

Unlike traditional C compilers like `gcc` or `clang`, which can take seconds to compile a simple program, `tinycc` compiles simple C code in milliseconds. This makes it possible for Bun to compile your C code on-demand, without a build step.
Using the `bun:ffi` APIs, you can compile and run C code from JavaScript. Here's an example project that uses the N-API to return a JavaScript string from C code.

```c
#include <node/node_api.h>

napi_value hello_napi(napi_env env) {
  napi_value result;
  napi_create_string_utf8(env, "Hello, N-API!", NAPI_AUTO_LENGTH, &result);
  return result;
}
```

```ts
import { cc } from "bun:ffi";
import source from "./hello-napi.c" with { type: "file" };

const hello = cc({
  source,
  symbols: {
    hello_napi: {
      args: ["napi_env"],
      returns: "napi_value",
    },
  },
});

console.log(hello()); // => "Hello, N-API!"
```
Instead of requiring a build step with `node-gyp`, as long as you have Bun, this just works.

In Bun 1.2, we've introduced a new build of Bun that works on Linux distros that use the musl libc instead of glibc, like Alpine Linux. This is supported on both Linux x64 and aarch64.

You can also use the alpine version of Bun in Docker.

docker run --rm -it oven/bun:alpine bun --print 'Bun.file("/etc/alpine-release").text()'
3.20.5
While musl enables smaller container images, it tends to perform slightly slower than the glibc version of Bun. We recommend using glibc unless you have a specific reason to use musl.
JavaScript is a language that is constantly evolving. In Bun 1.2, even more JavaScript features are available thanks to the collaboration of the TC39 committees, and the hard work of the WebKit team.

You can now specify an import attribute when importing a file. This is useful when you want to import a file that isn't JavaScript code, like a JSON object or a text file.

```ts
import json from "./package.json" with { type: "json" };
typeof json; // "object"

import html from "./index.html" with { type: "text" };
typeof html; // "string"

import toml from "./bunfig.toml" with { type: "toml" };
typeof toml; // "object"
```
You can also specify import attributes using `import()`.

```ts
const {
  default: json,
} = await import("./package.json", {
  with: { type: "json" },
});

typeof json; // "object"
```
using

With the newly introduced `using` syntax in JavaScript, you can automatically close resources when a variable goes out of scope.

Instead of defining a variable with `let` or `const`, you can now define it with `using`.

```ts
import { serve } from "bun";

{
  using server = serve({
    port: 0,
    fetch(request) {
      return new Response("Hello, world!");
    },
  });

  doStuff(server);
}

function doStuff(server) {
  // ...
}
```
In this example, the server is automatically closed when the `server` variable goes out of scope, even if an exception is thrown. This is useful for ensuring that resources are properly cleaned up, especially in tests.

To support this, an object's prototype must define a `[Symbol.dispose]` method, or a `[Symbol.asyncDispose]` method if it's an async resource.

```ts
class Resource {
  [Symbol.dispose]() {
    /* ... */
  }
}
using resource = new Resource();

class AsyncResource {
  async [Symbol.asyncDispose]() {
    /* ... */
  }
}
await using asyncResource = new AsyncResource();
```
We've also added support for `using` in dozens of Bun APIs, including `Bun.spawn()`, `Bun.serve()`, `Bun.connect()`, `Bun.listen()`, and `bun:sqlite`.

```ts
import { spawn } from "bun";
import { test, expect } from "bun:test";

test("able to spawn a process", async () => {
  using subprocess = spawn({
    cmd: [process.execPath, "-e", "console.log('Hello, world!')"],
    stdout: "pipe",
  });

  // Even if this expectation fails, the subprocess will still be closed.
  const stdout = new Response(subprocess.stdout).text();
  await expect(stdout).resolves.toBe("Hello, world!");
});
```
Promise.withResolvers()
You can use Promise.withResolvers() to create a promise that resolves or rejects when you call the resolve or reject functions.
const { promise, resolve, reject } = Promise.withResolvers();
setTimeout(() => resolve(), 1000);
await promise;
This is a useful alternative to new Promise(), since you don't need to create a new scope.
const promise = new Promise((resolve, reject) => {
  setTimeout(() => resolve(), 1000);
});
await promise;
Promise.try()
You can use Promise.try() to create a promise that wraps a synchronous or asynchronous function.
const syncFn = () => 1 + 1;
const asyncFn = async (a, b) => 1 + a + b;

await Promise.try(syncFn); // => 2
await Promise.try(asyncFn, 2, 3); // => 6
This is useful if you don't know if a function is synchronous or asynchronous. Previously, you would have to do something like this:
await new Promise((resolve) => resolve(syncFn()));
await new Promise((resolve) => resolve(asyncFn(2, 3)));
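Promise.try() also turns a synchronous throw into a rejection, so a single .catch() handles failures from both kinds of functions. A small sketch, where mightThrow is a hypothetical function:
const mightThrow = () => {
  throw new Error("sync failure");
};

// The synchronous throw becomes a rejected promise instead of escaping immediately.
await Promise.try(mightThrow).catch((error) => {
  console.log(error.message); // "sync failure"
});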
Error.isError()
You can now check if an object is an Error instance using Error.isError().
Error.isError(new Error()); // => true
Error.isError({}); // => false
Error.isError(new (class Error {})()); // => false
Error.isError({ [Symbol.toStringTag]: "Error" }); // => false
This is more correct than using instanceof because the prototype chain can be tampered with, and instanceof can return false negatives when using node:vm.
import { runInNewContext } from "node:vm";

const crossRealmError = runInNewContext("new Error()");
crossRealmError instanceof Error; // => false
Error.isError(crossRealmError); // => true
Uint8Array.toBase64()
You can now encode and decode base64 strings using Uint8Array.
toBase64() converts a Uint8Array to a base64 string
fromBase64() converts a base64 string to a Uint8Array
new Uint8Array([1, 2, 3, 4, 5]).toBase64(); // "AQIDBAU="
Uint8Array.fromBase64("AQIDBAU="); // [1, 2, 3, 4, 5]
These APIs are standard alternatives to the usage of Buffer.toString("base64") in Node.js.
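The TC39 proposal behind these methods also defines an options bag. Assuming Bun ships the full proposal, the option names below let you ask for the URL-safe alphabet and skip the "=" padding:
// URL-safe base64, without "=" padding (option names from the TC39 proposal).
new Uint8Array([251, 255, 191]).toBase64({ alphabet: "base64url", omitPadding: true });
// => "-_-_"

Uint8Array.fromBase64("-_-_", { alphabet: "base64url" });
// => Uint8Array(3) [ 251, 255, 191 ]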
Uint8Array.toHex()
You can also convert a Uint8Array to and from hex strings.
toHex() converts a Uint8Array to a hex string
fromHex() converts a hex string to a Uint8Array
new Uint8Array([1, 2, 3, 4, 5]).toHex(); // "0102030405"
Uint8Array.fromHex("0102030405"); // [1, 2, 3, 4, 5]
These APIs are standard alternatives to the usage of Buffer.toString("hex") in Node.js.
There are new APIs that make it easier to work with JavaScript iterators and generators.
iterator.map(fn)
Returns an iterator that yields the results of the fn function applied to each value of the original iterator, similar to Array.prototype.map.
function* range(start: number, end: number): Generator<number> {
  for (let i = start; i < end; i++) {
    yield i;
  }
}

const result = range(3, 5).map((x) => x * 2);
result.next(); // { value: 6, done: false }
iterator.flatMap(fn)
Returns an iterator that yields the values of the original iterator, but flattens the results of the fn function, similar to Array.prototype.flatMap.
function* randomThoughts(): Generator<string> {
  yield "Bun is written in Zig";
  yield "Bun runs JavaScript and TypeScript";
}

const result = randomThoughts().flatMap((x) => x.split(" "));
result.next(); // { value: "Bun", done: false }
result.next(); // { value: "is", done: false }
// ...
result.next(); // { value: "TypeScript", done: false }
iterator.filter(fn)
Returns an iterator that only yields values that pass the fn predicate, similar to Array.prototype.filter.
function* range(start: number, end: number): Generator<number> {
  for (let i = start; i < end; i++) {
    yield i;
  }
}

const result = range(3, 5).filter((x) => x % 2 === 0);
result.next(); // { value: 4, done: false }
iterator.take(n)
Returns an iterator that yields the first n values of the original iterator.
function* odds(): Generator<number> {
  let i = 1;
  while (true) {
    yield i;
    i += 2;
  }
}

const result = odds().take(1);
result.next(); // { value: 1, done: false }
result.next(); // { done: true }
iterator.drop(n)
Returns an iterator that yields all values of the original iterator, except the first n values.
function* evens(): Generator<number> {
  let i = 0;
  while (true) {
    yield i;
    i += 2;
  }
}

const result = evens().drop(2);
result.next(); // { value: 4, done: false }
result.next(); // { value: 6, done: false }
iterator.reduce(fn, initialValue)
Reduces the values of an iterator with a function, similar to Array.prototype.reduce.
function* powersOfTwo(): Generator<number> {
  let i = 1;
  while (true) {
    yield i;
    i *= 2;
  }
}

const result = powersOfTwo()
  .take(5)
  .reduce((acc, x) => acc + x, 0);
console.log(result); // 31
iterator.toArray()
Returns an array that contains all the values of the original iterator. Make sure that the iterator is finite, otherwise this will cause an infinite loop.
function* range(start: number, end: number): Generator<number> {
  for (let i = start; i < end; i++) {
    yield i;
  }
}

const result = range(1, 5).toArray();
console.log(result); // [1, 2, 3, 4]
iterator.forEach(fn)
Calls the fn function on each value of the original iterator, similar to Array.prototype.forEach.
function* randomThoughts(): Generator<string> {
  yield "Bun is written in Zig";
  yield "Bun runs JavaScript and TypeScript";
}

randomThoughts().forEach((x) => console.log(x));
// Bun is written in Zig
// Bun runs JavaScript and TypeScript
iterator.find(fn)
Returns the first value of the original iterator that passes the fn predicate, similar to Array.prototype.find. If no such value exists, it returns undefined.
function* range(start: number, end: number): Generator<number> {
  for (let i = start; i < end; i++) {
    yield i;
  }
}

const result = range(1, 99).find((x) => x % 100 === 0);
console.log(result); // undefined
Float16Array
There's now support for 16-bit floating point arrays using Float16Array. While 16-bit floating point numbers are less precise than 32-bit floating point numbers, they are much more memory efficient.
const float16 = new Float16Array(3);
const float32 = new Float32Array(3);

for (let i = 0; i < 3; i++) {
  float16[i] = i + 0.123;
  float32[i] = i + 0.123;
}

console.log(float16); // Float16Array(3) [ 0, 1.123046875, 2.123046875 ]
console.log(float32); // Float32Array(3) [ 0, 1.1230000257492065, 2.122999906539917 ]
In addition to new JavaScript features, there are also new Web-standard APIs that you can use in Bun.
TextDecoderStream
You can now use TextDecoderStream and TextEncoderStream to encode and decode streams of data. These APIs are the streaming equivalents of TextDecoder and TextEncoder.
You can use TextDecoderStream to decode a stream of bytes into a stream of UTF-8 strings.
const response = await fetch("https://example.com");

const body = response.body.pipeThrough(new TextDecoderStream());
for await (const chunk of body) {
  console.log(chunk); // typeof chunk === "string"
}
Or you can use TextEncoderStream to encode a stream of UTF-8 strings into a stream of bytes. In Bun, this is up to 30x faster than in Node.js.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!");
    controller.close();
  },
});

const body = stream.pipeThrough(new TextEncoderStream());
for await (const chunk of body) {
  console.log(chunk); // chunk instanceof Uint8Array
}
TextDecoder with stream option
There is also support for the stream option in TextDecoder. This tells the decoder that chunks are part of a larger stream, and it should not throw an error if a chunk is not a complete UTF-8 code point.
const decoder = new TextDecoder("utf-8");

const first = decoder.decode(new Uint8Array([226, 153]), { stream: true });
const second = decoder.decode(new Uint8Array([165]), { stream: true });

console.log(first); // ""
console.log(second); // "♥"
bytes() API
You can now use the bytes() method on streams, which returns a Uint8Array of the stream's data.
const response = await fetch("https://example.com/");

const bytes = await response.bytes();
console.log(bytes); // Uint8Array(1256) [ 60, 33, ... ]
Previously, you'd have to use arrayBuffer(), then create a new Uint8Array:
const blob = new Blob(["Hello, world!"]);

const buffer = await blob.arrayBuffer();
const bytes = new Uint8Array(buffer);
The bytes() method is supported by several APIs, including Response, Blob, and Bun.file().
import { file } from "bun";

const content = await file("./hello.txt").bytes();
console.log(content); // Uint8Array(1256) [ 60, 33, ... ]
fetch() uploads
You can now send a fetch() request with a streaming body. This is useful for uploading large files, or streams of data where the content length is not known ahead of time.
await fetch("https://example.com/upload", {
  method: "POST",
  body: async function* () {
    yield "Hello";
    yield " ";
    yield "world!";
  },
});
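You can also pass a ReadableStream as the body. As a sketch, assuming a local ./big-file.bin exists, this streams a file upload without buffering the whole file in memory:
import { file } from "bun";

await fetch("https://example.com/upload", {
  method: "POST",
  body: file("./big-file.bin").stream(), // ReadableStream of the file's bytes
});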
console.group()
You can now use console.group() and console.groupEnd() to create nested log messages. Previously, these were not implemented in Bun and did nothing.
console.group("begin");
console.log("indent!");
console.groupEnd();
// begin
//   indent!
URL.createObjectURL()
There is now support for URL.createObjectURL(), which creates a URL from a Blob object. These URLs can then be used in APIs like fetch(), Worker, and import().
When combined with Worker, it allows for an easy way to spawn additional threads without creating a separate file for the worker's script. Since worker scripts also run through Bun's transpiler, TypeScript syntax is supported.
const code = `
  const foo: number = 123;
  postMessage({ foo } satisfies Data);
`;

const blob = new File([code], "worker.ts");
const url = URL.createObjectURL(blob);
const worker = new Worker(url);

worker.onmessage = ({ data }) => {
  console.log("Received data:", data);
};
AbortSignal.any()
You can use AbortSignal.any() to combine multiple instances of AbortSignal. If one of the child signals is aborted, the parent signal is also aborted.
const { signal: firstSignal } = new AbortController();
fetch("https://example.com/", { signal: firstSignal });

const { signal: secondSignal } = new AbortController();
fetch("https://example.com/", { signal: secondSignal });

// Cancels if either `firstSignal` or `secondSignal` is aborted
const signal = AbortSignal.any([firstSignal, secondSignal]);
await fetch("https://example.com/slow", { signal });
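A common pattern is to combine a manual abort with AbortSignal.timeout(), so the request is cancelled by whichever happens first. A minimal sketch, where the 5-second timeout and the URL are placeholders:
const controller = new AbortController();

// Aborts when controller.abort() is called *or* after 5 seconds, whichever comes first.
const signal = AbortSignal.any([controller.signal, AbortSignal.timeout(5000)]);

await fetch("https://example.com/slow", { signal });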
Bun 1.2 contains a few behavior tweaks that you should be aware of, but which we think are unlikely to break your code. We avoid making these changes unless we think the status quo is so broken that it's worth it.
bun run uses the correct directory
Previously, when you ran a package.json script using bun run, the working directory of the script was the same as the current working directory of your shell.
In most cases, you don't notice a difference, because your shell's working directory is usually the same as the parent directory of your package.json file.
cd /path/to/project
ls
package.json
bun run pwd
/path/to/project
However, if you cd into a different directory, you'll notice the difference.
cd dist
bun run pwd
/path/to/project/dist
This does not match what other package managers do, like npm or yarn, and more often than not causes unexpected behavior.
In Bun 1.2, the working directory of the script is now the parent directory of the package.json file, instead of the current working directory of your shell.
cd /path/to/project/dist
bun run pwd
/path/to/project
bun test
Previously, bun test would not fail when there was an uncaught error or rejection between test cases.
import { test, expect } from "bun:test";

test("should have failed, but didn't", () => {
  setTimeout(() => {
    throw new Error("Oops!");
  }, 1);
});
In Bun 1.2, this has now been fixed, and bun test will report the failure.
# Unhandled error between tests
-------------------------------
1 | import { test, expect } from "bun:test";
2 |
3 | test("should have failed, but didn't", () => {
4 |   setTimeout(() => {
5 |     throw new Error("Oops!");
                ^
error: Oops!
      at foo.test.ts:5:11
-------------------------------
server.stop() returns a Promise
Previously, there was no way to gracefully wait for connections to close from Bun's HTTP server.
To make this possible, we made stop() return a promise, which resolves when in-flight HTTP connections are closed.
interface Server {
  stop(): Promise<void>; // previously: stop(): void
}
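As a sketch of a graceful shutdown, you can now await stop() so in-flight requests finish before the process exits; the port, handler, and SIGTERM hook below are placeholders:
import { serve } from "bun";

const server = serve({
  port: 3000,
  fetch() {
    return new Response("Hello, world!");
  },
});

process.on("SIGTERM", async () => {
  // Resolves once in-flight HTTP connections have closed.
  await server.stop();
  process.exit(0);
});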
Bun.build() rejects when it fails
Previously, when Bun.build() would fail, it would report the error in the logs array. This was often confusing, because the promise would resolve successfully.
import { build } from "bun";

const result = await build({
  entrypoints: ["./bad.ts"],
});

console.log(result.logs[0]);
// error: ModuleNotFound resolving "./bad.ts" (entry point)
In Bun 1.2, Bun.build() will now reject when it fails, instead of returning errors in the logs array.
const result = build({
  entrypoints: ["./bad.ts"],
});

await result;
// error: ModuleNotFound resolving "./bad.ts" (entry point)
If you want to restore the old behavior, you can set the throw: false option.
const result = await build({
  entrypoints: ["./bad.ts"],
  throw: false,
});
bun -p is an alias for bun --print
Previously, bun -p was an alias for bun --port, which was used to change the port of Bun.serve(). The alias was added before Bun supported bun --print.
To match Node.js, we've changed bun -p to be an alias for bun --print.
bun -p 'new Date()'
2025-01-17T22:55:27.659Z
bun build --sourcemap
Previously, using bun build --sourcemap would default to inlined source maps.
bun build --sourcemap ./index.ts --outfile ./index.js
console.log("Hello Bun!");
//# sourceMappingURL=data:application/json;base64,...
This was confusing, because it is the opposite of what other tools do, like esbuild.
In Bun 1.2, bun build --sourcemap now defaults to linked source maps.
console.log("Hello Bun!");
{"version":3,"sources": ["index.ts"],// ...}
If you want to restore the old behavior, you can use --sourcemap=inline.
We spend a lot of time improving performance in Bun. We post almost daily updates of "In the next version of Bun," which you can follow on @bunjavascript.
Here's a preview of some of the performance improvements we made in Bun 1.2.
node:http2 is 2x faster
node:http2 server support is implemented. For the same code:
Bun v1.1.31: 128,879 req/s (2.4x faster)
Node v23.0.0: 52,785 req/s
node:http is 5x faster at uploading to S3
Uploading files via @aws-sdk/client-s3 is now 5x faster.
Not to be confused with Bun's built-in S3 client, which is even faster.
path.resolve() is 30x faster
fetch() is 2x faster at DNS resolution
bun --hot uses 2x less memory
After many hot reloads, bun --hot now uses less memory (Bun v1.1.22 vs. v1.1.21).
fs.readdirSync() is 5% faster on macOS
fs.readdirSync() gets 5% faster at reading small directories on macOS.
String.at() is 44% faster
"foo".at(i) gets 44% faster, thanks to @__sosukesuzuki.
atob() is 8x faster
For large string inputs, atob() is up to 8x faster, thanks to @lemire's simdutf library.
fetch() decompresses 30% faster
fetch() decompresses gzip'd data 30% faster, thanks to libdeflate.
Buffer.from(String, "base64") is 30x faster
For large string inputs, Buffer.from(string, "base64") is 6x to 30x faster, thanks to @lemire's simdutf.
JSON.parse() is up to 4x faster
For large string inputs, JSON.parse() is between 2x and 4x faster. For object inputs, it's 6% faster. Thanks @Constellation!
Bun.serve() has 2x more throughput
The fast path for request.json() and similar methods now works after accessing the request body. This makes throughput for some Bun.serve() applications up to 2x faster.
New: 65,000 req/s
Prev: 29,000 req/s
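As a rough sketch of the pattern this affects (the port and route logic below are hypothetical), a handler that checks request.body before parsing JSON now stays on the fast path:
import { serve } from "bun";

serve({
  port: 3000,
  async fetch(request) {
    // Accessing request.body no longer disables the fast path for request.json().
    if (request.body === null) {
      return new Response("Expected a request body", { status: 400 });
    }
    const data = await request.json();
    return Response.json(data);
  },
});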
Error.captureStackTrace() is 9x faster
fs.readFile() is 10% faster
For small files, fs.readFile() is up to 10% faster.
console.log(String) is 50% faster
When you use console.log() with a string as an argument, it's now 50% faster, thanks to @justjs14.
In Bun 1.2, we enabled the JIT on Windows. Previously, the JIT was only available on macOS and Linux.
Nine years ago, JavaScriptCore's FTL JIT was disabled on Windows. Today, @iangrunert brought it back.
JIT, or just-in-time compilation, is a technique that compiles code at runtime instead of ahead of time. This makes JavaScript faster, but it's also a lot more complex to implement.
JavaScript, across the board, now runs faster on Windows. For example:
Object.entries() is 20% faster
Array.map() is 50% faster
The JIT does a lot; it's over 25,000 lines of C++ code!
That's it — that's Bun 1.2, and it's still just the beginning for Bun.
We've added a ton of new features and APIs that make it easier than ever to build full-stack JavaScript and TypeScript applications.
To get started, run any of the following commands in your terminal.
curl -fsSL https://bun.sh/install | bash
powershell -c "irm bun.sh/install.ps1 | iex"
npm install -g bun
brew tap oven-sh/bun
brew install bun
docker pull oven/bun
docker run --rm --init --ulimit memlock=-1:-1 oven/bun
If you already installed Bun, you can upgrade with the following command.
bun upgrade
We're hiring engineers, designers, and contributors to JavaScript engines like V8, WebKit, Hermes, and SpiderMonkey to join our team in-person in San Francisco to build the future of JavaScript.
You can check out our careers page or send us an email.
Bun is free, open source, and MIT-licensed.
We receive a lot of open source contributions from the community. So, we'd like to thank everyone who has fixed a bug or contributed a feature. We appreciate your help!