Bun v1.1.23 is here! This release fixes 27 bugs (addressing 128 👍). Support for `TextEncoderStream` and `TextDecoderStream`, 50% faster `console.log(string)`, support for `Float16Array`, better out-of-memory handling, Node.js <> Bun IPC on Windows, truncate large arrays in `console.log()`, and many more bug fixes and Node.js compatibility improvements.

We're hiring systems engineers in San Francisco to build the future of JavaScript!
To install Bun:

```sh
# with install script (recommended)
curl -fsSL https://bun.sh/install | bash

# on Windows
powershell -c "irm bun.sh/install.ps1 | iex"

# with npm
npm install -g bun

# with Scoop
scoop install bun

# with Homebrew
brew tap oven-sh/bun
brew install bun

# with Docker
docker pull oven/bun
docker run --rm --init --ulimit memlock=-1:-1 oven/bun
```

To upgrade Bun:

```sh
bun upgrade
```
## `TextEncoderStream` & `TextDecoderStream`
We've implemented the `TextEncoderStream` and `TextDecoderStream` Web APIs. These APIs are the streaming equivalents of `TextEncoder` and `TextDecoder`.

You can use `TextDecoderStream` to decode a stream of bytes into a stream of UTF-8 strings.
```js
const response = await fetch("https://example.com");
const body = response.body.pipeThrough(new TextDecoderStream());

for await (const chunk of body) {
  console.log(chunk); // typeof chunk === "string"
}
```
You can use `TextEncoderStream` to encode a stream of UTF-8 strings into a stream of bytes.
```js
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!");
    controller.close();
  },
});
const body = stream.pipeThrough(new TextEncoderStream());

for await (const chunk of body) {
  console.log(chunk); // chunk instanceof Uint8Array
}
```
`TextEncoderStream` in Bun is relatively fast.

> TextEncoderStream in Bun v1.1.23 encodes the same input 3x - 30x faster than in Node v22.5.1
>
> — Jarred Sumner (@jarredsumner), August 10, 2024
The `TextEncoderStream` API is used in popular packages like Next.js' App Router for middleware.

Since most natively-implemented streams in Bun support both text and binary data, you rarely need to use `TextEncoderStream` in Bun. In fact, using `TextEncoderStream` in Bun is slower than not using it because it adds significant overhead to the stream. So, don't use this API unless a library you depend on is already using it.
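If you control the stream yourself, you can usually skip the encoding step entirely. Here's a minimal sketch, relying on the behavior described above (Bun's native streams accepting plain string chunks); the stream contents are illustrative:

```js
// A ReadableStream that enqueues plain strings — no TextEncoderStream needed.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!"); // a string, not a Uint8Array
    controller.close();
  },
});

// Bun's native Response can consume the text chunks directly.
const response = new Response(stream);
console.log(await response.text()); // "Hello, world!"
```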
## `stream` option in `TextDecoder`
This release also adds support for the `stream` option in `TextDecoder`. This tells the decoder that chunks are part of a larger stream, and that it should not throw an error if a chunk is not a complete UTF-8 code point.
```js
const decoder = new TextDecoder("utf-8");
const first = decoder.decode(new Uint8Array([226, 153]), { stream: true });
const second = decoder.decode(new Uint8Array([165]), { stream: true });

console.log(first); // ""
console.log(second); // "♥"
```
## 50% faster `console.log(string)`
In this release of Bun, we made `console.log(string)` 50% faster.
> In the next version of Bun, console.log(string) gets 50% faster, thanks to @justjs14
>
> — Bun (@bunjavascript), August 13, 2024
Previously, Bun had a fast path for printing a single string with `console.log()`. In theory, this meant only one syscall would be needed to print the string. In practice, that wasn't the case, because `console.log()` also needs to print a newline character and reset ANSI escape codes. We made it faster by removing this not-so-fast path and buffering the write like in the other cases.

Thanks to @billywhizz and @nektro for making this faster!
## Truncate large arrays in `console.log()`
When you `console.log(largeArray)` a large array in Bun, instead of printing the entire array element by element, Bun will now stop after printing 100 elements and print an ellipsis with a count (`... 1,234 more items`) to indicate that there are more elements.
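For example, with a hypothetical array of 1,334 numbers:

```js
// A hypothetical large array: 1,334 numbers in total.
const largeArray = Array.from({ length: 1334 }, (_, i) => i);

console.log(largeArray);
// Prints the first 100 elements, followed by:
// ... 1,234 more items
```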
> In the next version of Bun, console.log truncates arrays after 100 elements
>
> — meghan 🌻 (@nektro), August 10, 2024
Thanks to @nektro for working on this feature!
## Support for `Float16Array`
This release of Bun adds support for `Float16Array`, a stage 3 TC39 proposal that was implemented in WebKit.
```js
const float16 = new Float16Array(3);
const float32 = new Float32Array(3);

for (let i = 0; i < 3; i++) {
  float16[i] = i + 0.123;
  float32[i] = i + 0.123;
}

console.log(float16); // Float16Array(3) [ 0, 1.123046875, 2.123046875 ]
console.log(float32); // Float32Array(3) [ 0, 1.1230000257492065, 2.122999906539917 ]
```
Thanks to the WebKit team for implementing this feature!
## Better out-of-memory error handling

We improved how out-of-memory errors are handled in `Response`, `Request`, `Blob`, and `node:fs`. Previously, Bun would crash or potentially truncate data if an operation exceeded an engine limit; now, Bun checks whether the operation will definitely exceed a limit and properly throws an error if it does.
```js
import { expect } from "bun:test";

const buf = Buffer.alloc(4294967295, "abc");

try {
  const blob = new Blob([buf, buf]);
  await blob.text();
} catch (e) {
  expect(e.message).toBe("Cannot create a string longer than 2^32-1 characters");
  expect(e.code).toBe("ERR_STRING_TOO_LONG");
}

// Before: `catch` block would not be called
// After: `catch` block is called
```
The improved out-of-memory error handling affects the following APIs:
- `text()`, `json()`, `bytes()`, `formData()`, and `arrayBuffer()` methods on `Blob`, `Request`, and `Response`
- `fs.writeFile()` & `fs.readFile()`
## `fs.F_OK`, `fs.R_OK`, `fs.W_OK`, and similar constants

We fixed a bug where `node:fs` constants, such as `F_OK`, were not defined in Bun. These were deprecated in favor of `fs.constants` in Node.js 20, but they are still defined for compatibility reasons.
```js
import fs from "node:fs";

console.log(fs.F_OK); // old way
console.log(fs.constants.F_OK); // new way
```
## `fs.readFile` memory & size limits

Strings and typed arrays in Bun are limited to 2^32 characters in length by the engine (JavaScriptCore). Node.js/V8 has a lower limit for strings and a higher limit for typed arrays.
When reading files larger than this limit with `fs.readFile`, Bun previously behaved incorrectly. In certain cases, Bun would crash due to incorrect JavaScript exception handling, and in other cases, Bun would return a truncated string or typed array.

Now, Bun throws an error as soon as it is known that the file cannot be read into JavaScript. This saves memory and CPU time because it avoids reading the complete file into memory before throwing the error.
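A sketch of the new behavior; the file path is hypothetical, standing in for any file larger than the engine limit:

```js
import fs from "node:fs/promises";

try {
  // Hypothetical file larger than 2^32-1 characters, decoded as a string
  const contents = await fs.readFile("/tmp/huge-file.log", "utf8");
} catch (e) {
  // Bun now throws early instead of crashing or returning truncated data
  console.error(e.message);
}
```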
## `WebSocket` messages sent multiple times

We fixed a bug where calling `WebSocket.send()` under high load would cause a message to be sent multiple times. This was due to incorrect handling of backpressure when a message was rejected by the socket.
```js
import { WebSocket } from "ws";

const ws = new WebSocket("ws://example.com");

ws.on("open", () => {
  for (let i = 0; i < 1_000; i++) {
    ws.send(`Hello, world! ${i}`);
  }
});

// Before:           | After:
// ...               | ...
// Hello, world! 999 | Hello, world! 997
// Hello, world! 999 | Hello, world! 998
// Hello, world! 999 | Hello, world! 999
// ...               | ...
```
This only affected `WebSocket` clients that were created by importing the `ws` package, which Bun changes to use our own implementation of `WebSocket`.
Thanks to @cirospaciari for fixing this bug!
## Node.js <> Bun IPC on Windows

We fixed a bug where inter-process communication (IPC) in `node:child_process` would not work on Windows when sending messages between Bun and Node.js processes. In certain cases, this could cause some build tools to hang on Windows.
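A minimal sketch of the pattern that now works on Windows, assuming a hypothetical `child.js` that echoes messages back over the IPC channel:

```js
// Run with Bun; spawns a Node.js child process with an IPC channel.
import { spawn } from "node:child_process";

const child = spawn("node", ["child.js"], {
  stdio: ["inherit", "inherit", "inherit", "ipc"],
});

child.on("message", (message) => {
  console.log("from Node.js:", message); // "pong"
  child.kill();
});

child.send("ping");

// child.js (the hypothetical Node.js side):
// process.on("message", (message) => {
//   if (message === "ping") process.send("pong");
// });
```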
Thanks to @cirospaciari for fixing this bug!
## Crashes in `node:vm`

We fixed several crashes that could occur when JIT'd code is evaluated in a `node:vm` context. This would occur when the `globalThis` object in the `node:vm` context was not the same as the real `globalThis` object.
```js
import { Script } from "node:vm";

const script = new Script(`
  for (let i = 0; i < 1_000_000; i++) {
    performance.now();
  }
`);

script.runInContext(globalThis); // ok
script.runInContext({ performance }); // would crash
```
This affected only certain code paths, after about a million invocations:

- `performance.now()`
- `TextEncoder.encode()` & `TextDecoder.decode()`
- `crypto.randomUUID()` & `crypto.getRandomValues()`
- `crypto.timingSafeEqual()`
Thanks to @dylan-conway for fixing this bug!
## Truncated responses in `Bun.serve()`

We fixed a bug where receiving and then streaming a large response from `Bun.serve()` could cause the response to be truncated. This happened because flushes of data larger than 2^32-1 bytes were truncated.
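A sketch of the affected shape, with a hypothetical upstream URL standing in for any response larger than 2^32-1 bytes:

```js
// Proxy a large upstream response through Bun.serve()
const server = Bun.serve({
  port: 3000,
  async fetch() {
    // Hypothetical upstream serving a multi-gigabyte body
    const upstream = await fetch("https://example.com/huge-file");

    // Streaming the body back previously risked a truncated flush
    return new Response(upstream.body);
  },
});
```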
Thanks to @cirospaciari for fixing this bug!
## Tagged template literals moved during bundling

We fixed a bundler bug where tagged template literals would be incorrectly moved during bundling. This would cause the template literal to be evaluated in the wrong scope, which would cause it to throw an error.
```js
// The entry point defines a global, then dynamically imports bar.ts
globalThis.foo = () => console.log("foo");
const bar = await import("./bar.ts");
```

```js
// bar.ts
console.log("bar");
export const bar = foo`bar`;

// Before: TypeError: undefined is not a function (near '...foo`')
// After: bar\nfoo
```
This was a regression introduced in Bun v1.1.18, and has now been fixed.

Thanks to @paperclover for fixing this bug!
## `fetch()` timeouts when using custom TLS certificates

We fixed a bug where calling `fetch()` with a custom TLS certificate would not abort the request due to a timeout. Bun supports `tls` options when making a `fetch()` request, which is often needed in non-browser environments where you need to use a custom TLS certificate.
```js
const response = await fetch("https://example.com", {
  signal: AbortSignal.timeout(1000),
  tls: {
    ca: "...",
  },
});
```
Thanks to @cirospaciari for fixing this bug!
## Memory leak when `new Response` threw an error

We fixed a memory leak where setting a custom `statusText` on a `new Response` would not be cleaned up. We've also added more tests to ensure that leaks in `Request` and `Response` are caught.
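A minimal sketch of the leaking pattern; the loop is illustrative, and the out-of-range status is an assumption used to force the constructor to throw:

```js
for (let i = 0; i < 100_000; i++) {
  try {
    // status 99 is outside the valid 200-599 range, so the constructor
    // throws after the custom statusText string has been allocated
    new Response("OK", { status: 99, statusText: "Custom" });
  } catch {}
}
// Before: memory grew on every iteration
// After: the statusText allocations are cleaned up
```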
## Crash when importing an empty `.toml` file

We fixed a bug where importing an empty `.toml` file would cause Bun to crash. This also affected certain `.json` files, like `package.json` and `tsconfig.json`.
```js
import config from "./config.toml";

console.log(config);
// Before: <crash>
// After: { enabled: true }
```
## Crash when a TLS socket failed to connect

We fixed a regression introduced in Bun v1.1.22 that could cause a crash when a TLS socket failed to connect due to a DNS issue. Thanks to @cirospaciari!