@msgpack/msgpack - MessagePack for JavaScript / msgpack.org[ECMA-262/JavaScript/TypeScript]
This library is an implementation of MessagePack for TypeScript and JavaScript, providing a compact and efficient binary serialization format. Learn more about MessagePack at msgpack.org.

This library serves as a comprehensive reference implementation of MessagePack for JavaScript with a focus on accuracy, compatibility, interoperability, and performance.

Additionally, this is also a universal JavaScript library. It is compatible not only with browsers, but with Node.js or other JavaScript engines that implement ES2015+ standards. As it is written in TypeScript, this library bundles up-to-date type definition files (`d.ts`).

*Note that this is the second edition of "MessagePack for JavaScript". The first edition, which was implemented in ES5 and never released to npmjs.com, is tagged as `classic`.*
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

const object = {
  nil: null,
  integer: 1,
  float: Math.PI,
  string: "Hello, world!",
  binary: Uint8Array.from([1, 2, 3]),
  array: [10, 20, 30],
  map: { foo: "bar" },
  timestampExt: new Date(),
};

const encoded: Uint8Array = encode(object);
deepStrictEqual(decode(encoded), object);
```
- Synopsis
- Table of Contents
- Install
- API
encode(data: unknown, options?: EncoderOptions): Uint8Array
decode(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): unknown
decodeMulti(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): Generator<unknown, void, unknown>
decodeAsync(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): Promise<unknown>
decodeArrayStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
decodeMultiStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
- Reusing Encoder and Decoder instances
- Extension Types
- Faster way to decode a large array of floating point numbers
- Decoding a Blob
- MessagePack Specification
- Prerequisites
- Benchmark
- Distribution
- Deno Support
- Bun Support
- Maintenance
- License
This library is published to npmjs.com as `@msgpack/msgpack`.

```shell
npm install @msgpack/msgpack
```
encode(data: unknown, options?: EncoderOptions): Uint8Array

It encodes `data` into a single MessagePack-encoded object, and returns a byte array as `Uint8Array`. It throws errors if `data` is, or includes, a non-serializable object such as a `function` or a `symbol`.

For example:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });
console.log(encoded);
```
If you'd like to convert a `Uint8Array` to a NodeJS `Buffer`, use `Buffer.from(arrayBuffer, offset, length)` in order not to copy the underlying `ArrayBuffer`, while `Buffer.from(uint8array)` copies it:
```typescript
import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });

// `buffer` refers to the same ArrayBuffer as `encoded`.
const buffer: Buffer = Buffer.from(encoded.buffer, encoded.byteOffset, encoded.byteLength);
console.log(buffer);
```
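To see the copy-vs-view difference concretely, here is a sketch using a hand-written byte array in place of an `encode()` result (the bytes are arbitrary, for illustration only):

```typescript
// Arbitrary bytes standing in for an encode() result.
const encodedBytes = Uint8Array.from([0x81, 0xa3, 0x66, 0x6f, 0x6f, 0xa3, 0x62, 0x61, 0x72]);

// Zero-copy: the Buffer is a view over the same ArrayBuffer.
const view = Buffer.from(encodedBytes.buffer, encodedBytes.byteOffset, encodedBytes.byteLength);

// Copy: the Buffer gets its own memory.
const copy = Buffer.from(encodedBytes);

encodedBytes[0] = 0x00;
console.log(view[0]); // 0x00 - the view sees the mutation
console.log(copy[0]); // 0x81 - the copy is unaffected
```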
Name | Type | Default |
---|---|---|
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
context | user-defined | - |
useBigInt64 | boolean | false |
maxDepth | number | 100 |
initialBufferSize | number | 2048 |
sortKeys | boolean | false |
forceFloat32 | boolean | false |
forceIntegerToFloat | boolean | false |
ignoreUndefined | boolean | false |
decode(buffer: ArrayLike&lt;number&gt; | BufferSource, options?: DecoderOptions): unknown

It decodes `buffer` that includes a MessagePack-encoded object, and returns the decoded object typed `unknown`.

`buffer` must be an array of bytes, which is typically `Uint8Array` or `ArrayBuffer`. `BufferSource` is defined as `ArrayBuffer | ArrayBufferView`.

The `buffer` must include a single encoded object. If the `buffer` includes extra bytes after an object or the `buffer` is empty, it throws `RangeError`. To decode a `buffer` that includes multiple encoded objects, use `decodeMulti()` or `decodeMultiStream()` (recommended) instead.

For example:
```typescript
import { decode } from "@msgpack/msgpack";

declare const encoded: Uint8Array; // a MessagePack-encoded byte array

const object = decode(encoded);
console.log(object);
```
NodeJS `Buffer` is also acceptable because it is a subclass of `Uint8Array`.
Name | Type | Default |
---|---|---|
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec |
context | user-defined | - |
useBigInt64 | boolean | false |
rawStrings | boolean | false |
maxStrLength | number | 4_294_967_295 (UINT32_MAX) |
maxBinLength | number | 4_294_967_295 (UINT32_MAX) |
maxArrayLength | number | 4_294_967_295 (UINT32_MAX) |
maxMapLength | number | 4_294_967_295 (UINT32_MAX) |
maxExtLength | number | 4_294_967_295 (UINT32_MAX) |
mapKeyConverter | MapKeyConverterType | throw exception if key is not string or number |
`MapKeyConverterType` is defined as `(key: unknown) => string | number`.

To skip UTF-8 decoding of strings, `rawStrings` can be set to `true`. In this case, strings are decoded into `Uint8Array`.

You can use `max${Type}Length` to limit the length of each type decoded.
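As a sketch of `mapKeyConverter`, here is a hypothetical converter (the name `keyToString` is not part of the library) matching the `(key: unknown) => string | number` shape, which passes strings and numbers through and coerces other key types to strings instead of throwing:

```typescript
// Hypothetical converter matching MapKeyConverterType.
const keyToString = (key: unknown): string | number => {
  if (typeof key === "string" || typeof key === "number") {
    return key; // already a valid map key
  }
  return String(key); // e.g. true -> "true"
};

console.log(keyToString("id")); // "id"
console.log(keyToString(3)); // 3
console.log(keyToString(true)); // "true"
```

It would be passed as `{ mapKeyConverter: keyToString }` in `DecoderOptions`.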
decodeMulti(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): Generator<unknown, void, unknown>
It decodes `buffer` that includes multiple MessagePack-encoded objects, and returns the decoded objects as a generator. See also `decodeMultiStream()`, which is an asynchronous variant of this function.

This function is not recommended to decode a MessagePack binary via I/O stream including sockets because it's synchronous. Instead, `decodeMultiStream()` decodes a binary stream asynchronously, typically spending less CPU and memory.

For example:
```typescript
import { decodeMulti } from "@msgpack/msgpack";

declare const encoded: Uint8Array; // contains multiple MessagePack-encoded objects

for (const object of decodeMulti(encoded)) {
  console.log(object);
}
```
decodeAsync(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): Promise<unknown>
It decodes `stream`, where `ReadableStreamLike<T>` is defined as `ReadableStream<T> | AsyncIterable<T>`, in an async iterable of byte arrays, and returns the decoded object as `unknown` type, wrapped in a `Promise`.

This function works asynchronously, and might use CPU resources more efficiently compared with the synchronous `decode()`, because it doesn't wait for the completion of downloading.

This function is designed to work with whatwg `fetch()` like this:
```typescript
import { decodeAsync } from "@msgpack/msgpack";

const MSGPACK_TYPE = "application/x-msgpack";

declare const url: string;

const response = await fetch(url);
const contentType = response.headers.get("Content-Type");
if (contentType && contentType.startsWith(MSGPACK_TYPE) && response.body != null) {
  const object = await decodeAsync(response.body);
  // do something with object
} else {
  /* handle errors */
}
```
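Besides `ReadableStream`, any `AsyncIterable` of byte chunks qualifies as `ReadableStreamLike`. A minimal sketch, with arbitrary chunk boundaries chosen only for illustration:

```typescript
// An async generator produces an AsyncIterable<Uint8Array>, which
// decodeAsync() accepts as a stream. The chunking is arbitrary here.
async function* chunks(): AsyncGenerator<Uint8Array> {
  yield Uint8Array.from([0x82, 0xa3]);
  yield Uint8Array.from([0x66, 0x6f, 0x6f]);
}

const stream = chunks();
// An AsyncIterable is anything with a [Symbol.asyncIterator] method.
console.log(typeof stream[Symbol.asyncIterator]); // "function"
```

`decodeAsync(stream)` would consume such a stream chunk by chunk.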
decodeArrayStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
It is like `decodeAsync()`, but only accepts a `stream` that includes an array of items, and emits a decoded item one by one.

For example:
```typescript
import { decodeArrayStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>;

// in an async function:
for await (const item of decodeArrayStream(stream)) {
  console.log(item);
}
```
decodeMultiStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>
It is like `decodeAsync()` and `decodeArrayStream()`, but the input `stream` must consist of multiple MessagePack-encoded items. This is an asynchronous variant of `decodeMulti()`.

In other words, it can decode an unlimited stream, emitting decoded items one by one.

For example:
```typescript
import { decodeMultiStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>;

// in an async function:
for await (const item of decodeMultiStream(stream)) {
  console.log(item);
}
```
This function is available since v2.4.0; previously it was called `decodeStream()`.
`Encoder` and `Decoder` classes are provided for better performance by reusing instances:
```typescript
import { deepStrictEqual } from "assert";
import { Encoder, Decoder } from "@msgpack/msgpack";

const encoder = new Encoder();
const decoder = new Decoder();

const object = { foo: "bar" };
const encoded: Uint8Array = encoder.encode(object);
deepStrictEqual(decoder.decode(encoded), object);
```
According to our benchmark, reusing an `Encoder` instance is about 20% faster than the `encode()` function, and reusing a `Decoder` instance is about 2% faster than the `decode()` function. Note that results may vary depending on the environment and the data structure.

`Encoder` and `Decoder` take the same options as `encode()` and `decode()` respectively.
To handle MessagePack Extension Types, this library provides the `ExtensionCodec` class.

This is an example to set up custom extension types that handle `Map` and `Set` classes in TypeScript:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();

// Set<T>
const SET_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: SET_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Set) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<unknown>;
    return new Set(array);
  },
});

// Map<K, V>
const MAP_EXT_TYPE = 1; // Any in 0-127
extensionCodec.register({
  type: MAP_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Map) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<[unknown, unknown]>;
    return new Map(array);
  },
});

const encoded = encode([new Set<any>(), new Map<any, any>()], { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
```
Ensure you include your extensionCodec in any recursive encode and decode statements!
Note that extension types for custom objects must be in the range `[0, 127]`, while `[-1, -128]` is reserved for MessagePack itself.
When you use an extension codec, it might be necessary to have encoding/decoding state to keep track of which objects got encoded/re-created. To do this, pass a `context` to the `EncoderOptions` and `DecoderOptions`:
```typescript
import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

class MyContext {
  track(object: any) {
    /* ... */
  }
}

class MyType {
  /* ... */
}

const extensionCodec = new ExtensionCodec<MyContext>();

// MyType
const MYTYPE_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: MYTYPE_EXT_TYPE,
  encode: (object, context) => {
    if (object instanceof MyType) {
      context.track(object);
      return encode(object.toJSON(), { extensionCodec, context });
    } else {
      return null;
    }
  },
  decode: (data, extType, context) => {
    const decoded = decode(data, { extensionCodec, context });
    const my = new MyType(decoded);
    context.track(my);
    return my;
  },
});

// and later:
const context = new MyContext();

const encoded = encode({ myType: new MyType() }, { extensionCodec, context });
const decoded = decode(encoded, { extensionCodec, context });
```
This library does not handle BigInt by default, but you have two options to handle it:

- Set `useBigInt64: true` to map bigint to MessagePack's int64/uint64
- Define a custom `ExtensionCodec` to map bigint to a MessagePack extension type

`useBigInt64: true` is the simplest way to handle bigint, but it has limitations:

- A bigint is encoded as 8 bytes even if it's a small integer
- A bigint must be smaller than the max value of uint64 and larger than the min value of int64. Otherwise the behavior is undefined.

So you might want to define a custom codec to handle bigint like this:
```typescript
import { deepStrictEqual } from "assert";
import { encode, decode, ExtensionCodec, DecodeError } from "@msgpack/msgpack";

// to define a custom codec:
const BIGINT_EXT_TYPE = 0; // Any in 0-127
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: BIGINT_EXT_TYPE,
  encode(input: unknown): Uint8Array | null {
    if (typeof input === "bigint") {
      if (input <= Number.MAX_SAFE_INTEGER && input >= Number.MIN_SAFE_INTEGER) {
        return encode(Number(input));
      } else {
        return encode(String(input));
      }
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): bigint {
    const val = decode(data);
    if (!(typeof val === "string" || typeof val === "number")) {
      throw new DecodeError(`unexpected BigInt source: ${val} (${typeof val})`);
    }
    return BigInt(val);
  },
});

// to use it:
const value = BigInt(Number.MAX_SAFE_INTEGER) + BigInt(1);
const encoded = encode(value, { extensionCodec });
deepStrictEqual(decode(encoded, { extensionCodec }), value);
```
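The encode branch above picks the wire representation based on `Number.MAX_SAFE_INTEGER`; that selection logic can be sketched in isolation (the helper name `bigintWireValue` is illustrative, not a library API):

```typescript
// Small bigints round-trip through number losslessly; larger ones
// fall back to a decimal string.
function bigintWireValue(input: bigint): number | string {
  if (input <= BigInt(Number.MAX_SAFE_INTEGER) && input >= BigInt(Number.MIN_SAFE_INTEGER)) {
    return Number(input);
  }
  return String(input);
}

console.log(bigintWireValue(BigInt(42))); // 42
console.log(bigintWireValue(BigInt(Number.MAX_SAFE_INTEGER) + BigInt(1))); // "9007199254740992"
```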
There is a proposal for new date/time representations in JavaScript: the TC39 Temporal proposal.

This library maps `Date` to the MessagePack timestamp extension by default, but you can re-map the temporal module (or a Temporal polyfill) to the timestamp extension like this:
```typescript
import { Instant } from "@std-proposal/temporal";
import { deepStrictEqual } from "assert";
import {
  encode,
  decode,
  ExtensionCodec,
  EXT_TIMESTAMP,
  encodeTimeSpecToTimestamp,
  decodeTimestampToTimeSpec,
} from "@msgpack/msgpack";

// to define a custom codec
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: EXT_TIMESTAMP, // override the default behavior!
  encode(input: unknown): Uint8Array | null {
    if (input instanceof Instant) {
      const sec = input.seconds;
      const nsec = Number(input.nanoseconds - BigInt(sec) * BigInt(1e9));
      return encodeTimeSpecToTimestamp({ sec, nsec });
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): Instant {
    const timeSpec = decodeTimestampToTimeSpec(data);
    const sec = BigInt(timeSpec.sec);
    const nsec = BigInt(timeSpec.nsec);
    return Instant.fromEpochNanoseconds(sec * BigInt(1e9) + nsec);
  },
});

// to use it
const instant = Instant.fromEpochMilliseconds(Date.now());
const encoded = encode(instant, { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
deepStrictEqual(decoded, instant);
```
This will become the default in this library with a major-version increment, if the temporal module is standardized.
If there are large arrays of floating point numbers in your payload, there is a way to decode them faster: define a custom extension type for `Float#Array` with alignment.

An extension type's `encode` method can return a function that takes a parameter `pos: number`. This parameter can be used to align the buffer, which makes decoding much more performant.

See an example implementation for `Float32Array`:
```typescript
import { ExtensionCodec } from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();
const EXT_TYPE_FLOAT32ARRAY = 0; // Any in 0-127
extensionCodec.register({
  type: EXT_TYPE_FLOAT32ARRAY,
  encode: (object: unknown) => {
    if (object instanceof Float32Array) {
      return (pos: number) => {
        const bpe = Float32Array.BYTES_PER_ELEMENT;
        const padding = 1 + ((bpe - ((pos + 1) % bpe)) % bpe);
        const data = new Uint8Array(object.buffer);
        const result = new Uint8Array(padding + data.length);
        result[0] = padding;
        result.set(data, padding);
        return result;
      };
    }
    return null;
  },
  decode: (data: Uint8Array) => {
    const padding = data[0]!;
    const bpe = Float32Array.BYTES_PER_ELEMENT;
    const offset = data.byteOffset + padding;
    const length = data.byteLength - padding;
    return new Float32Array(data.buffer, offset, length / bpe);
  },
});
```
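The padding arithmetic in the `encode` callback above can be checked in isolation: one byte stores the padding size itself, and the rest rounds the data start up to the next 4-byte boundary so the `Float32Array` contents begin aligned:

```typescript
const BPE = Float32Array.BYTES_PER_ELEMENT; // 4

// Same formula as in the codec above, extracted as a helper for clarity.
function alignmentPadding(pos: number): number {
  return 1 + ((BPE - ((pos + 1) % BPE)) % BPE);
}

for (const pos of [0, 1, 2, 3]) {
  // The data starts at pos + padding, which is always a multiple of 4.
  console.log(pos, alignmentPadding(pos), (pos + alignmentPadding(pos)) % BPE); // remainder is 0
}
```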
`Blob` is a binary data container provided by browsers. To read its contents when it contains a MessagePack binary, you can use `Blob#arrayBuffer()` or `Blob#stream()`. `Blob#stream()` is recommended if your target platform supports it, because streaming decode should be faster for large objects. Either way, you need to use the asynchronous API.
```typescript
import { decode, decodeAsync } from "@msgpack/msgpack";

async function decodeFromBlob(blob: Blob): Promise<unknown> {
  if (blob.stream) {
    // Blob#stream(): ReadableStream<Uint8Array> (recommended)
    return await decodeAsync(blob.stream());
  } else {
    // Blob#arrayBuffer(): Promise<ArrayBuffer> (if stream() is not available)
    return decode(await blob.arrayBuffer());
  }
}
```
This library is compatible with the "August 2017" revision of MessagePack specification at the point where timestamp ext was added:
- str/bin separation, added at August 2013
- extension types, added at August 2013
- timestamp ext type, added at August 2017
The living specification is here:
https://github.com/msgpack/msgpack
Note that as of June 2019 there is no official "version" of the MessagePack specification. See msgpack/msgpack#195 for the discussions.
The following table shows how JavaScript values are mapped to MessagePack formats and vice versa.

The mapping of integers varies with the setting of `useBigInt64`.

The default, `useBigInt64: false`, is:
Source Value | MessagePack Format | Value Decoded |
---|---|---|
null, undefined | nil | null (*1) |
boolean (true, false) | bool family | boolean (true, false) |
number (53-bit int) | int family | number |
number (64-bit float) | float family | number |
string | str family | string (*2) |
ArrayBufferView | bin family | Uint8Array (*3) |
Array | array family | Array |
Object | map family | Object (*4) |
Date | timestamp ext family | Date (*5) |
bigint | N/A | N/A (*6) |
- *1 Both `null` and `undefined` are mapped to the `nil` (`0xC0`) type, and are decoded into `null`
- *2 If you'd like to skip UTF-8 decoding of strings, set `rawStrings: true`. In this case, strings are decoded into `Uint8Array`.
- *3 Any `ArrayBufferView`s including NodeJS's `Buffer` are mapped to the `bin` family, and are decoded into `Uint8Array`
- *4 In handling `Object`, it is regarded as `Record<string, unknown>` in terms of TypeScript
- *5 MessagePack timestamps may have nanoseconds, which will be lost when decoded into a JavaScript `Date`. This behavior can be overridden by registering `-1` for the extension codec.
- *6 bigint is not supported in `useBigInt64: false` mode, but you can define an extension codec for it.
If you set `useBigInt64: true`, the following mapping is used:
Source Value | MessagePack Format | Value Decoded |
---|---|---|
null, undefined | nil | null |
boolean (true, false) | bool family | boolean (true, false) |
number (32-bit int) | int family | number |
number (except for the above) | float family | number |
bigint | int64 / uint64 | bigint (*7) |
string | str family | string |
ArrayBufferView | bin family | Uint8Array |
Array | array family | Array |
Object | map family | Object |
Date | timestamp ext family | Date |
- *7 If the bigint is larger than the max value of uint64 or smaller than the min value of int64, then the behavior is undefined.
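The bounds in (*7) can be sketched as a range check (the helper `fitsInt64OrUint64` is illustrative, not a library API):

```typescript
// MessagePack int64/uint64 together cover [INT64_MIN, UINT64_MAX].
const UINT64_MAX = (BigInt(1) << BigInt(64)) - BigInt(1); // 18446744073709551615
const INT64_MIN = -(BigInt(1) << BigInt(63)); // -9223372036854775808

function fitsInt64OrUint64(value: bigint): boolean {
  return value >= INT64_MIN && value <= UINT64_MAX;
}

console.log(fitsInt64OrUint64(BigInt(42))); // true
console.log(fitsInt64OrUint64(UINT64_MAX + BigInt(1))); // false
```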
This is a universal JavaScript library that supports major browsers and NodeJS.
- ES2015 language features
- ES2024 standard library, including:
  - Typed arrays (ES2015)
  - Async iterations (ES2018)
  - Features added in ES2015-ES2022
- whatwg encodings (`TextEncoder` and `TextDecoder`)

The ES2022 standard library used in this library can be polyfilled with core-js.
IE11 is no longer supported. If you'd like to use this library in IE11, use v2.x versions.
NodeJS v18 is required.
This module requires type definitions of `AsyncIterator`, `ArrayBufferLike`, whatwg streams, and so on. They are provided by `"lib": ["ES2024", "DOM"]` in `tsconfig.json`.
Regarding the TypeScript compiler version, only the latest TypeScript is tested in development.
Run-time performance is not the only reason to use MessagePack, but it's important when choosing a MessagePack library, so a benchmark suite is provided to monitor the performance of this library.

V8's built-in JSON has been improved for years; in particular, `JSON.parse()` was significantly improved in V8 7.6, making it the fastest deserializer as of 2019, as the benchmark result below suggests.

However, MessagePack can handle binary data effectively, and actual performance depends on the situation. In particular, streaming decoding may be significantly faster than non-streaming decoding where it applies. You'd better benchmark your own use case if performance matters.
Benchmark on NodeJS/v22.13.1 (V8/12.4)
operation | op | ms | op/s |
---|---|---|---|
buf = Buffer.from(JSON.stringify(obj)); | 1348700 | 5000 | 269740 |
obj = JSON.parse(buf.toString("utf-8")); | 1700300 | 5000 | 340060 |
buf = require("msgpack-lite").encode(obj); | 591300 | 5000 | 118260 |
obj = require("msgpack-lite").decode(buf); | 539500 | 5000 | 107900 |
buf = require("@msgpack/msgpack").encode(obj); | 1238700 | 5000 | 247740 |
obj = require("@msgpack/msgpack").decode(buf); | 1402000 | 5000 | 280400 |
buf = /* @msgpack/msgpack */ encoder.encode(obj); | 1379800 | 5000 | 275960 |
obj = /* @msgpack/msgpack */ decoder.decode(buf); | 1406100 | 5000 | 281220 |
Note that the `JSON` cases use `Buffer` to emulate I/O, where a JavaScript string must be converted into a byte array encoded in UTF-8, whereas the MessagePack modules deal with byte arrays.
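The `JSON` benchmark cases correspond to this round-trip, which includes the string-to-UTF-8 conversion on both sides:

```typescript
const obj = { foo: "bar", answer: 42 };

// string -> UTF-8 byte array (the emulated "write" side)
const buf = Buffer.from(JSON.stringify(obj));

// UTF-8 byte array -> string -> object (the emulated "read" side)
const restored = JSON.parse(buf.toString("utf-8"));

console.log(restored); // { foo: 'bar', answer: 42 }
```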
The NPM package distributed on npmjs.com includes both ES2015+ and ES5 files:

- `dist/` is compiled into ES2020 with CommonJS, provided for NodeJS v10
- `dist.umd/` is compiled into ES5 with UMD
  - `dist.umd/msgpack.min.js` - the minified file
  - `dist.umd/msgpack.js` - the non-minified file
- `dist.esm/` is compiled into ES2020 with ES modules, provided for webpack-like bundlers and NodeJS's ESM mode
If you use NodeJS and/or webpack, their module resolvers use the suitable one automatically.
This library is available via CDN:
```html
<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>
```

It loads the `MessagePack` module into the global object.
You can use this module on Deno.
See `example/deno-*.ts` for examples.

`deno.land/x` is not supported.
You can use this module on Bun.
For simple testing:

```shell
npm run test
```
This library uses GitHub Actions.
Test matrix:
- NodeJS
- v18 / v20 / v22
- Browsers:
- Chrome, Firefox
- Deno
- Bun
```shell
# run tests on NodeJS, Chrome, and Firefox
make test-all

# edit the changelog
code CHANGELOG.md

# bump version
npm version patch|minor|major

# run the publishing task
make publish
```
```shell
npm run update-dependencies
```
Copyright 2019 The MessagePack community.
This software uses the ISC license:
https://opensource.org/licenses/ISC
SeeLICENSE for details.