
Bun v1.1.23


Ashcon Partovi · August 14, 2024

Bun v1.1.23 is here! This release fixes 27 bugs (addressing 128 👍). Support for `TextEncoderStream` and `TextDecoderStream`, 50% faster `console.log(string)`, support for `Float16Array`, better out-of-memory handling, Node.js <> Bun IPC on Windows, truncation of large arrays in `console.log()`, and many more bug fixes and Node.js compatibility improvements.

We're hiring systems engineers in San Francisco to build the future of JavaScript!

Previous releases

  • v1.1.22 fixes 72 bugs (addressing 63 👍). 30% faster `fetch()` decompression, new `--fetch-preconnect` flag, improved Remix support, Bun gets 4 MB smaller on Linux, bundle packages excluding their dependencies, many bundler fixes and Node.js compatibility improvements.
  • v1.1.0 Bundows. Windows support is here!

To install Bun

curl

```sh
curl -fsSL https://bun.sh/install | bash
```

npm

```sh
npm install -g bun
```

powershell

```sh
powershell -c "irm bun.sh/install.ps1 | iex"
```

scoop

```sh
scoop install bun
```

brew

```sh
brew tap oven-sh/bun
brew install bun
```

docker

```sh
docker pull oven/bun
docker run --rm --init --ulimit memlock=-1:-1 oven/bun
```

To upgrade Bun

```sh
bun upgrade
```

New features

`TextEncoderStream` & `TextDecoderStream`

We've implemented the `TextEncoderStream` and `TextDecoderStream` Web APIs. These APIs are the streaming equivalents of `TextEncoder` and `TextDecoder`.

You can use `TextDecoderStream` to decode a stream of bytes into a stream of UTF-8 strings.

```js
const response = await fetch("https://example.com");
const body = response.body.pipeThrough(new TextDecoderStream());

for await (const chunk of body) {
  console.log(chunk); // typeof chunk === "string"
}
```

You can use `TextEncoderStream` to encode a stream of UTF-8 strings into a stream of bytes.

```js
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, world!");
    controller.close();
  },
});
const body = stream.pipeThrough(new TextEncoderStream());

for await (const chunk of body) {
  console.log(chunk); // chunk instanceof Uint8Array
}
```

`TextEncoderStream` in Bun is relatively fast.

TextEncoderStream in Bun v1.1.23 encodes the same input 3x - 30x faster than in Node v22.5.1

— Jarred Sumner (@jarredsumner), August 10, 2024

Why `TextEncoderStream`?

The `TextEncoderStream` API is used in popular packages like Next.js' App Router for middleware.

Since most natively-implemented streams in Bun support both text and binary data, you rarely need `TextEncoderStream` in Bun. In fact, using `TextEncoderStream` in Bun is slower than not using it, because it adds significant overhead to the stream. So, don't use this API unless a library you depend on is already using it.
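To see the extra hop this adds, here is a small sketch (the stream contents are illustrative): each string chunk gets copied into a `Uint8Array` by the encoder, even though Bun's native streams could consume the strings directly.

```js
// A ReadableStream of plain string chunks.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue("Hello, ");
    controller.enqueue("world!");
    controller.close();
  },
});

// Piping through TextEncoderStream copies every string chunk into a
// Uint8Array — in Bun, that copy is pure overhead for text data.
const encoded = source.pipeThrough(new TextEncoderStream());

let total = 0;
for await (const chunk of encoded) {
  total += chunk.byteLength; // chunk instanceof Uint8Array
}
console.log(total); // 13 bytes for "Hello, world!"
```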

`stream` option in `TextDecoder`

This release also adds support for the `stream` option in `TextDecoder`. This tells the decoder that the chunks are part of a larger stream, and it should not throw an error if a chunk ends with an incomplete UTF-8 byte sequence.

```js
const decoder = new TextDecoder("utf-8");
const first = decoder.decode(new Uint8Array([226, 153]), { stream: true });
const second = decoder.decode(new Uint8Array([165]), { stream: true });

console.log(first); // ""
console.log(second); // "♥"
```

50% faster `console.log(string)`

In this release of Bun, we made `console.log(string)` 50% faster.

In the next version of Bun, console.log(string) gets 50% faster, thanks to @justjs14

— Bun (@bunjavascript), August 13, 2024

Previously, Bun had a fast path for printing a single string with `console.log()`. In theory, this would mean only one syscall was needed to print the string. In practice, that wasn't the case, because `console.log()` also needs to print a newline character and reset ANSI escape codes. We made it faster by removing this not-so-fast path and buffering the write like in the other cases.

Thanks to @billywhizz and @nektro for making this faster!

Truncate large arrays in `console.log()`

When you `console.log(largeArray)` a large array in Bun, instead of printing the entire array element by element, Bun will now stop after printing 100 elements and print an ellipsis with a count (`... 1,234 more items`) to indicate how many elements were omitted.
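For example (the exact output formatting may vary by version):

```js
// Build an array larger than the 100-element print limit.
const items = Array.from({ length: 1234 }, (_, i) => i);

// Prints the first 100 elements, then a count of the omitted rest,
// e.g. "... 1134 more items".
console.log(items);
```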

In the next version of Bun, console.log truncates arrays after 100 elements

— meghan 🌻 (@nektro), August 10, 2024

Thanks to @nektro for working on this feature!

`Float16Array`

In this release of Bun, there is support for the newly added `Float16Array` API. This is a stage 3 TC39 proposal that was implemented in WebKit.

```js
const float16 = new Float16Array(3);
const float32 = new Float32Array(3);

for (let i = 0; i < 3; i++) {
  float16[i] = i + 0.123;
  float32[i] = i + 0.123;
}

console.log(float16); // Float16Array(3) [ 0, 1.123046875, 2.123046875 ]
console.log(float32); // Float32Array(3) [ 0, 1.1230000257492065, 2.122999906539917 ]
```

Thanks to the WebKit team for implementing this feature!

Better handling of out-of-memory errors

We improved how out-of-memory errors are handled in `Response`, `Request`, `Blob`, and `node:fs`. Previously, Bun would crash or potentially truncate data if an operation exceeded an engine limit. Now, Bun checks whether the operation will definitely exceed a limit and properly throws an error if it does.

```js
import { expect } from "bun:test";

const buf = Buffer.alloc(4294967295, "abc");

try {
  const blob = new Blob([buf, buf]);
  await blob.text();
} catch (e) {
  expect(e.message).toBe("Cannot create a string longer than 2^32-1 characters");
  expect(e.code).toBe("ERR_STRING_TOO_LONG");
}

// Before: `catch` block would not be called
// After: `catch` block is called
```

The improved out-of-memory error handling affects the following APIs:

  • `text()`, `json()`, `bytes()`, `formData()`, and `arrayBuffer()` methods on `Blob`, `Request`, and `Response`
  • `fs.writeFile()` & `fs.readFile()`

Node.js compatibility improvements

Fixed: `fs.F_OK`, `fs.R_OK`, `fs.W_OK`, and similar constants

We fixed a bug where `node:fs` constants, such as `F_OK`, were not defined in Bun. These were deprecated in favor of `fs.constants` in Node.js 20, but are still defined for compatibility reasons.

```js
import fs from "node:fs";

console.log(fs.F_OK); // old way
console.log(fs.constants.F_OK); // new way
```

Fixed: `fs.readFile` memory & size limits

Strings and typed arrays in Bun are limited to 2^32 characters in length by the engine (JavaScriptCore). Node.js/V8 has a lower limit for strings and a higher limit for typed arrays.

When reading files larger than this limit with `fs.readFile`, Bun previously behaved incorrectly. In certain cases, Bun would crash due to incorrect JavaScript exception handling; in other cases, it could return a truncated string or typed array.

Now, Bun throws an error as soon as it is known that the file will not be readable from JavaScript. This saves you memory and CPU time because it avoids reading the complete file into memory before throwing an error.
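The idea can be sketched as a size check up front rather than a read-then-fail. This is a minimal illustration, not Bun's implementation; the helper name `canReadAsString` is ours, and the limit is the one from the error message above.

```js
import { statSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// The engine's string-length limit, per the error message above.
const MAX_STRING_LENGTH = 2 ** 32 - 1;

// Hypothetical helper: decide before reading whether the file could
// ever fit in a JavaScript string.
function canReadAsString(path) {
  return statSync(path).size <= MAX_STRING_LENGTH;
}

const file = join(tmpdir(), "bun-readfile-demo.txt");
writeFileSync(file, "hello");
console.log(canReadAsString(file)); // true
```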

Fixed: Backpressure in 'ws' module

We fixed a bug where calling `WebSocket.send()` under high load would cause a message to be sent multiple times. This was due to incorrect handling of backpressure when a message was rejected by the socket.

```js
import { WebSocket } from "ws";

const ws = new WebSocket("ws://example.com");
ws.on("open", () => {
  for (let i = 0; i < 1_000; i++) {
    ws.send(`Hello, world! ${i}`);
  }
});

// Before:            | After:
// ...                | ...
// Hello, world! 999  | Hello, world! 997
// Hello, world! 999  | Hello, world! 998
// Hello, world! 999  | Hello, world! 999
// ...                | ...
```

This only affected `WebSocket` clients created by importing the `ws` package, which Bun replaces with its own implementation of `WebSocket`.

Thanks to @cirospaciari for fixing this bug!

Cross-runtime IPC on Windows with Node.js

We fixed a bug where inter-process communication (IPC) in `node:child_process` would not work on Windows when sending messages between Bun and Node.js processes. In certain cases, this could cause some build tools to hang on Windows.

Thanks to @cirospaciari for fixing this bug!

JIT crashes in `node:vm`

We fixed several crashes that could occur when JIT-compiled code was evaluated in a `node:vm` context. This would occur when the `globalThis` object in the `node:vm` context was not the same as the real `globalThis` object.

```js
import { Script } from "node:vm";

const script = new Script(`
  for (let i = 0; i < 1_000_000; i++) {
    performance.now();
  }
`);

script.runInContext(globalThis); // ok
script.runInContext({ performance }); // would crash
```

This affected only certain code paths, after about a million invocations:

  • `performance.now()`
  • `TextEncoder.encode()` & `TextDecoder.decode()`
  • `crypto.randomUUID()` & `crypto.getRandomValues()`
  • `crypto.timingSafeEqual()`

Thanks to @dylan-conway for fixing this bug!

Bugfixes

Fixed: Buffering gigabytes of data over `Bun.serve()`

We fixed a bug where receiving and then streaming a large response from `Bun.serve()` could cause the response to be truncated: if the data was larger than 2^32-1 bytes, the flush would be truncated.

Thanks to @cirospaciari for fixing this bug!

Fixed: moving tagged template literals when bundling

We fixed a bundler bug where tagged template literals would be incorrectly moved during bundling. This caused the template literal to be evaluated in the wrong scope, where it would throw an error.

foo.ts

```ts
globalThis.foo = () => console.log("foo");
const bar = await import("./bar.ts");
```

bar.ts

```ts
console.log("bar");
export const bar = foo`bar`;
```

Before: threw `TypeError: undefined is not a function`. After: prints `bar`, then `foo`.

This was a regression introduced in Bun v1.1.18, and has now been fixed.

Thanks to @paperclover for fixing this bug!

Fixed: `AbortSignal` with `fetch()` when using custom TLS certificates

We fixed a bug where calling `fetch()` with a custom TLS certificate would not abort the request on a timeout. Bun supports `tls` options when making a `fetch()` request, which is often needed in non-browser environments where you need to use a custom TLS certificate.

```js
const response = await fetch("https://example.com", {
  signal: AbortSignal.timeout(1000),
  tls: {
    ca: "...",
  },
});
```

Thanks to @cirospaciari for fixing this bug!

Fixed: memory leak when `new Response` threw an error

We fixed a memory leak where a custom `statusText` set on a `new Response` would not be cleaned up. We've also added more tests to ensure that leaks in `Request` and `Response` are caught.
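For reference, this is the pattern that leaked (the values here are illustrative); the API behaves the same, the custom `statusText` is simply freed now:

```js
// Constructing a Response with a custom statusText. The statusText
// allocation previously outlived the Response object.
const res = new Response("ok", { status: 200, statusText: "All Good" });

console.log(res.status); // 200
console.log(res.statusText); // "All Good"
```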

Fixed: crash when importing an empty `.toml` file

We fixed a bug where importing an empty `.toml` file would cause Bun to crash. This also affected certain `.json` files, like `package.json` and `tsconfig.json`.

```js
import config from "./config.toml";

console.log(config);
// Before: <crash>
// After: { enabled: true }
```

Fixed: Regression with TLS sockets in 1.1.22

A regression introduced in 1.1.22 that could cause a crash when a TLS socket failed to connect due to a DNS issue has been fixed, thanks to @cirospaciari.

Thank you to 6 contributors!



