protobuf.js

Protocol Buffers for JavaScript & TypeScript.
Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google (see).
protobuf.js is a pure JavaScript implementation with TypeScript support for Node.js and the browser. It's easy to use, does not sacrifice on performance, has good conformance and works out of the box with .proto files!
- Installation: How to include protobuf.js in your project.
- Usage: A brief introduction to using the toolset.
- Examples: A few examples to get you started.
- Additional documentation: A list of available documentation resources.
- Performance: A few internals and a benchmark on performance.
- Compatibility: Notes on compatibility regarding browsers and optional libraries.
- Building: How to build the library and its components yourself.
npm install protobufjs --save
```js
// Static code + Reflection + .proto parser
var protobuf = require("protobufjs");

// Static code + Reflection
var protobuf = require("protobufjs/light");

// Static code only
var protobuf = require("protobufjs/minimal");
```
The optional command line utility to generate static code and reflection bundles lives in the protobufjs-cli package and can be installed separately:
npm install protobufjs-cli --save-dev
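As a hedged sketch, the CLI's pbjs and pbts commands can then generate a static module and matching type definitions (the file names here are just placeholders):

$> npx pbjs -t static-module -w commonjs -o compiled.js file1.proto file2.proto
$> npx pbts -o compiled.d.ts compiled.js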
Pick the variant matching your needs and replace the version tag with the exact release your project depends upon. For example, to use the minified full variant:
```html
<script src="//cdn.jsdelivr.net/npm/protobufjs@7.X.X/dist/protobuf.min.js"></script>
```
Distribution | Location |
---|---|
Full | https://cdn.jsdelivr.net/npm/protobufjs/dist/ |
Light | https://cdn.jsdelivr.net/npm/protobufjs/dist/light/ |
Minimal | https://cdn.jsdelivr.net/npm/protobufjs/dist/minimal/ |
All variants support CommonJS and AMD loaders and export globally as `window.protobuf`.
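For example, a minimal sketch of using the global export in the browser once the script above has loaded (the file name is just an example):

```js
// window.protobuf is available globally after the script tag has loaded
protobuf.load("awesome.proto", function(err, root) {
    if (err)
        throw err;
    // continue as shown in the Usage section below
});
```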
Because JavaScript is a dynamically typed language, protobuf.js utilizes the concept of a valid message in order to provide the best possible performance (and, as a side product, proper typings):
A valid message is an object (1) not missing any required fields and (2) exclusively composed of JS types understood by the wire format writer.
There are two possible types of valid messages and the encoder is able to work with both of these for convenience:
- Message instances (explicit instances of message classes with default values on their prototype) naturally satisfy the requirements of a valid message and
- Plain JavaScript objects that just so happen to be composed in a way satisfying the requirements of a valid message as well.
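As a minimal sketch (assuming an `AwesomeMessage` type with a string field `awesomeField` has already been obtained, as shown in the Usage examples below), both forms can be handed to the encoder:

```js
// a message instance is a valid message by construction
var instance = AwesomeMessage.create({ awesomeField: "hello" });
var buffer1 = AwesomeMessage.encode(instance).finish();

// a plain object composed of the expected JS types is a valid message as well
var plain = { awesomeField: "hello" };
var buffer2 = AwesomeMessage.encode(plain).finish();
```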
In a nutshell, the wire format writer understands the following types:
Field type | Expected JS type (create, encode) | Conversion (fromObject) |
---|---|---|
s-/u-/int32 s-/fixed32 | number (32 bit integer) | value \| 0 if signed, value >>> 0 if unsigned |
s-/u-/int64 s-/fixed64 | Long-like (optimal), number (53 bit integer) | Long.fromValue(value) with long.js, parseInt(value, 10) otherwise |
float double | number | Number(value) |
bool | boolean | Boolean(value) |
string | string | String(value) |
bytes | Uint8Array (optimal), Buffer (optimal under node), Array.<number> (8 bit integers) | base64.decode(value) if a string; an Object with non-zero .length is assumed to be buffer-like |
enum | number (32 bit integer) | Looks up the numeric id if a string |
message | Valid message | Message.fromObject(value) |
repeated T | Array<T> | Copy |
map<K, V> | Object<K,V> | Copy |
- Explicit `undefined` and `null` are considered as not set if the field is optional.
- Maps are objects where the key is the string representation of the respective value or an 8 characters long hash string for `Long`-likes.
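For example, a hedged sketch of the int64 case (the field name `bigField` is only illustrative and long.js must be installed):

```js
var Long = require("long");

// with create, pass a Long-like directly
var message = AwesomeMessage.create({ bigField: Long.fromString("9007199254740993") });

// with fromObject, strings and numbers are converted via Long.fromValue
var converted = AwesomeMessage.fromObject({ bigField: "9007199254740993" });
```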
With that in mind and again for performance reasons, each message class provides a distinct set of methods with each method doing just one thing. This avoids unnecessary assertions / redundant operations where performance is a concern but also forces a user to perform verification (of plain JavaScript objects that might just so happen to be a valid message) explicitly where necessary - for example when dealing with user input.
Note that `Message` below refers to any message class.
`Message.verify(message: Object): null|string`

verifies that a plain JavaScript object satisfies the requirements of a valid message and thus can be encoded without issues. Instead of throwing, it returns the error message as a string, if any.

```js
var payload = "invalid (not an object)";
var err = AwesomeMessage.verify(payload);
if (err)
    throw Error(err);
```
`Message.encode(message: Message|Object [, writer: Writer]): Writer`

encodes a message instance or valid plain JavaScript object. This method does not implicitly verify the message and it's up to the user to make sure that the payload is a valid message.

```js
var buffer = AwesomeMessage.encode(message).finish();
```
`Message.encodeDelimited(message: Message|Object [, writer: Writer]): Writer`

works like `Message.encode` but additionally prepends the length of the message as a varint.

`Message.decode(reader: Reader|Uint8Array): Message`

decodes a buffer to a message instance. If required fields are missing, it throws a `util.ProtocolError` with an `instance` property set to the so far decoded message. If the wire format is invalid, it throws an `Error`.

```js
try {
    var decodedMessage = AwesomeMessage.decode(buffer);
} catch (e) {
    if (e instanceof protobuf.util.ProtocolError) {
        // e.instance holds the so far decoded message with missing required fields
    } else {
        // wire format is invalid
    }
}
```
`Message.decodeDelimited(reader: Reader|Uint8Array): Message`

works like `Message.decode` but additionally reads the length of the message prepended as a varint.

`Message.create(properties: Object): Message`

creates a new message instance from a set of properties that satisfy the requirements of a valid message. Where applicable, it is recommended to prefer `Message.create` over `Message.fromObject` because it doesn't perform possibly redundant conversion.

```js
var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });
```
`Message.fromObject(object: Object): Message`

converts any non-valid plain JavaScript object to a message instance using the conversion steps outlined within the table above.

```js
var message = AwesomeMessage.fromObject({ awesomeField: 42 });
// converts awesomeField to a string
```
`Message.toObject(message: Message [, options: ConversionOptions]): Object`

converts a message instance to an arbitrary plain JavaScript object for interoperability with other libraries or storage. The resulting plain JavaScript object might still satisfy the requirements of a valid message depending on the actual conversion options specified, but most of the time it does not.

```js
var object = AwesomeMessage.toObject(message, {
    enums: String,  // enums as string names
    longs: String,  // longs as strings (requires long.js)
    bytes: String,  // bytes as base64 encoded strings
    defaults: true, // includes default values
    arrays: true,   // populates empty arrays (repeated fields) even if defaults=false
    objects: true,  // populates empty objects (map fields) even if defaults=false
    oneofs: true    // includes virtual oneof fields set to the present field's name
});
```
For reference, the following diagram aims to display relationships between the different methods and the concept of a valid message:
In other words:

`verify` indicates that calling `create` or `encode` directly on the plain object will [result in a valid message respectively] succeed. `fromObject`, on the other hand, does conversion from a broader range of plain objects to create valid messages. (ref)
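As a minimal sketch of the difference (again assuming `AwesomeMessage` has a string field `awesomeField`):

```js
var plain = { awesomeField: 42 }; // not a valid message: number where a string is expected

var err = AwesomeMessage.verify(plain);         // returns an error string instead of null
var message = AwesomeMessage.fromObject(plain); // converts awesomeField to the string "42"
```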
It is possible to load existing .proto files using the full library, which parses and compiles the definitions to ready to use (reflection-based) message classes:
```protobuf
// awesome.proto
package awesomepackage;
syntax = "proto3";

message AwesomeMessage {
    string awesome_field = 1; // becomes awesomeField
}
```
protobuf.load("awesome.proto",function(err,root){if(err)throwerr;// Obtain a message typevarAwesomeMessage=root.lookupType("awesomepackage.AwesomeMessage");// Exemplary payloadvarpayload={awesomeField:"AwesomeString"};// Verify the payload if necessary (i.e. when possibly incomplete or invalid)varerrMsg=AwesomeMessage.verify(payload);if(errMsg)throwError(errMsg);// Create a new messagevarmessage=AwesomeMessage.create(payload);// or use .fromObject if conversion is necessary// Encode a message to an Uint8Array (browser) or Buffer (node)varbuffer=AwesomeMessage.encode(message).finish();// ... do something with buffer// Decode an Uint8Array (browser) or Buffer (node) to a messagevarmessage=AwesomeMessage.decode(buffer);// ... do something with message// If the application uses length-delimited buffers, there is also encodeDelimited and decodeDelimited.// Maybe convert the message back to a plain objectvarobject=AwesomeMessage.toObject(message,{longs:String,enums:String,bytes:String,// see ConversionOptions});});
Additionally, promise syntax can be used by omitting the callback, if preferred:
protobuf.load("awesome.proto").then(function(root){ ...});
The library utilizes JSON descriptors that are equivalent to a .proto definition. For example, the following is identical to the .proto definition seen above:
// awesome.json{"nested": {"awesomepackage": {"nested": {"AwesomeMessage": {"fields": {"awesomeField": {"type":"string","id":1 } } } } } }}
JSON descriptors closely resemble the internal reflection structure:
Type (T) | Extends | Type-specific properties |
---|---|---|
ReflectionObject | | options |
Namespace | ReflectionObject | nested |
Root | Namespace | nested |
Type | Namespace | fields |
Enum | ReflectionObject | values |
Field | ReflectionObject | rule, type, id |
MapField | Field | keyType |
OneOf | ReflectionObject | oneof (array of field names) |
Service | Namespace | methods |
Method | ReflectionObject | type, requestType, responseType, requestStream, responseStream |
- Bold properties are required. Italic types are abstract.
- `T.fromJSON(name, json)` creates the respective reflection object from a JSON descriptor
- `T#toJSON()` creates a JSON descriptor from the respective reflection object (its name is used as the key within the parent)
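For example, a minimal sketch of the round trip between a loaded root and its JSON descriptor:

```js
protobuf.load("awesome.proto", function(err, root) {
    if (err)
        throw err;
    var jsonDescriptor = root.toJSON();                         // JSON descriptor for everything under the root
    var restoredRoot = protobuf.Root.fromJSON(jsonDescriptor);  // reflection structure rebuilt from it
});
```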
Exclusively using JSON descriptors instead of .proto files enables the use of just the light library (the parser isn't required in this case).
A JSON descriptor can either be loaded the usual way:
protobuf.load("awesome.json",function(err,root){if(err)throwerr;// Continue at "Obtain a message type" above});
Or it can be loaded inline:
```js
var jsonDescriptor = require("./awesome.json"); // exemplary for node

var root = protobuf.Root.fromJSON(jsonDescriptor);

// Continue at "Obtain a message type" above
```
Both the full and the light library include full reflection support. One could, for example, define the .proto definitions seen in the examples above using just reflection:
```js
...

var Root  = protobuf.Root,
    Type  = protobuf.Type,
    Field = protobuf.Field;

var AwesomeMessage = new Type("AwesomeMessage").add(new Field("awesomeField", 1, "string"));

var root = new Root().define("awesomepackage").add(AwesomeMessage);

// Continue at "Create a new message" above
...
```
Detailed information on the reflection structure is available within the API documentation.
Message classes can also be extended with custom functionality and it is also possible to register a custom constructor with a reflected message type:
```js
...

// Define a custom constructor
function AwesomeMessage(properties) {
    // custom initialization code
    ...
}

// Register the custom constructor with its reflected type (*)
root.lookupType("awesomepackage.AwesomeMessage").ctor = AwesomeMessage;

// Define custom functionality
AwesomeMessage.customStaticMethod = function() { ... };
AwesomeMessage.prototype.customInstanceMethod = function() { ... };

// Continue at "Create a new message" above
```
(*) Besides referencing its reflected type through `AwesomeMessage.$type` and `AwesomeMessage#$type`, the respective custom class is automatically populated with:
- `AwesomeMessage.create`
- `AwesomeMessage.encode` and `AwesomeMessage.encodeDelimited`
- `AwesomeMessage.decode` and `AwesomeMessage.decodeDelimited`
- `AwesomeMessage.verify`
- `AwesomeMessage.fromObject`, `AwesomeMessage.toObject` and `AwesomeMessage#toJSON`
Afterwards, decoded messages of this type are `instanceof AwesomeMessage`.
Alternatively, it is also possible to reuse and extend the internal constructor if custom initialization code is not required:
```js
...

// Reuse the internal constructor
var AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage").ctor;

// Define custom functionality
AwesomeMessage.customStaticMethod = function() { ... };
AwesomeMessage.prototype.customInstanceMethod = function() { ... };

// Continue at "Create a new message" above
```
The library also supports consuming services but it doesn't make any assumptions about the actual transport channel. Instead, a user must provide a suitable RPC implementation, which is an asynchronous function that takes the reflected service method, the binary request and a node-style callback as its parameters:
```js
function rpcImpl(method, requestData, callback) {
    // perform the request using an HTTP request or a WebSocket for example
    var responseData = ...;
    // and call the callback with the binary response afterwards:
    callback(null, responseData);
}
```
Below is a working example with a TypeScript implementation using the grpc npm package.
```js
const grpc = require('grpc')

const Client = grpc.makeGenericClientConstructor({})
const client = new Client(
  grpcServerUrl,
  grpc.credentials.createInsecure()
)

const rpcImpl = function(method, requestData, callback) {
  client.makeUnaryRequest(
    method.name,
    arg => arg,
    arg => arg,
    requestData,
    callback
  )
}
```
Example:
```protobuf
// greeter.proto
syntax = "proto3";

service Greeter {
    rpc SayHello (HelloRequest) returns (HelloReply) {}
}

message HelloRequest {
    string name = 1;
}

message HelloReply {
    string message = 1;
}
```
```js
...
var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(/* see above */ rpcImpl, /* request delimited? */ false, /* response delimited? */ false);

greeter.sayHello({ name: 'you' }, function(err, response) {
    console.log('Greeting:', response.message);
});
```
Services also support promises:
```js
greeter.sayHello({ name: 'you' }).then(function(response) {
    console.log('Greeting:', response.message);
});
```
There is also an example for streaming RPC.
Note that the service API is meant for clients. Implementing a server-side endpoint pretty much always requires transport channel (e.g. HTTP, WebSocket) specific code, with the only common denominator being that it decodes and encodes messages.
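For illustration only, a minimal sketch of such an endpoint over plain HTTP (assumptions: node's http module, the `HelloRequest` and `HelloReply` types looked up from the root as above, and no gRPC framing or semantics):

```js
var http = require("http");

http.createServer(function(req, res) {
    var chunks = [];
    req.on("data", function(chunk) { chunks.push(chunk); });
    req.on("end", function() {
        // decode the binary request and encode the binary response
        var request = HelloRequest.decode(Buffer.concat(chunks));
        var reply   = HelloReply.create({ message: "Hello " + request.name });
        res.end(Buffer.from(HelloReply.encode(reply).finish()));
    });
}).listen(8080);
```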
The library ships with its own type definitions and modern editors like Visual Studio Code will automatically detect and use them for code completion.
The npm package depends on @types/node because of `Buffer` and @types/long because of `Long`. If you are not building for node and/or not using long.js, it should be safe to exclude them manually.
The API shown above works pretty much the same with TypeScript. However, because everything is typed, accessing fields on instances of dynamically generated message classes requires either using bracket-notation (i.e. `message["awesomeField"]`) or explicit casts. Alternatively, it is possible to use a typings file generated for its static counterpart.
```ts
import { load } from "protobufjs"; // respectively "./node_modules/protobufjs"

load("awesome.proto", function(err, root) {
  if (err)
    throw err;

  // example code
  const AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage");

  let message = AwesomeMessage.create({ awesomeField: "hello" });
  console.log(`message = ${JSON.stringify(message)}`);

  let buffer = AwesomeMessage.encode(message).finish();
  console.log(`buffer = ${Array.prototype.toString.call(buffer)}`);

  let decoded = AwesomeMessage.decode(buffer);
  console.log(`decoded = ${JSON.stringify(decoded)}`);
});
```
If you generated static code to `bundle.js` using the CLI and its type definitions to `bundle.d.ts`, then you can just do:
```ts
import { AwesomeMessage } from "./bundle.js";

// example code
let message = AwesomeMessage.create({ awesomeField: "hello" });
let buffer  = AwesomeMessage.encode(message).finish();
let decoded = AwesomeMessage.decode(buffer);
```
The library also includes an early implementation of decorators.
Note that decorators are an experimental feature in TypeScript and that declaration order is important depending on the JS target. For example, `@Field.d(2, AwesomeArrayMessage)` requires that `AwesomeArrayMessage` has been defined earlier when targeting `ES5`.
```ts
import { Message, Type, Field, OneOf } from "protobufjs/light"; // respectively "./node_modules/protobufjs/light.js"

export class AwesomeSubMessage extends Message<AwesomeSubMessage> {

  @Field.d(1, "string")
  public awesomeString: string;

}

export enum AwesomeEnum {
  ONE = 1,
  TWO = 2
}

@Type.d("SuperAwesomeMessage")
export class AwesomeMessage extends Message<AwesomeMessage> {

  @Field.d(1, "string", "optional", "awesome default string")
  public awesomeField: string;

  @Field.d(2, AwesomeSubMessage)
  public awesomeSubMessage: AwesomeSubMessage;

  @Field.d(3, AwesomeEnum, "optional", AwesomeEnum.ONE)
  public awesomeEnum: AwesomeEnum;

  @OneOf.d("awesomeSubMessage", "awesomeEnum")
  public which: string;

}

// example code
let message = new AwesomeMessage({ awesomeField: "hello" });
let buffer  = AwesomeMessage.encode(message).finish();
let decoded = AwesomeMessage.decode(buffer);
```
Supported decorators are:
- `Type.d(typeName?: string)` (optional) annotates a class as a protobuf message type. If `typeName` is not specified, the constructor's runtime function name is used for the reflected type.
- `Field.d<T>(fieldId: number, fieldType: string | Constructor<T>, fieldRule?: "optional" | "required" | "repeated", defaultValue?: T)` annotates a property as a protobuf field with the specified id and protobuf type.
- `MapField.d<T extends { [key: string]: any }>(fieldId: number, fieldKeyType: string, fieldValueType: string | Constructor<{}>)` annotates a property as a protobuf map field with the specified id, protobuf key and value type.
- `OneOf.d<T extends string>(...fieldNames: string[])` annotates a property as a protobuf oneof covering the specified fields.
Other notes:
- Decorated types reside in `protobuf.roots["decorated"]` using a flat structure, so no duplicate names.
- Enums are copied to a reflected enum with a generic name on decorator evaluation because referenced enum objects have no runtime name the decorator could use.
- Default values must be specified as arguments to the decorator instead of using a property initializer for proper prototype behavior.
- Property names on decorated classes must not be renamed on compile time (i.e. by a minifier) because decorators just receive the original field name as a string.
ProTip! Not as pretty, but you can use decorators in plain JavaScript as well.
Additional documentation:

- Questions and answers on StackOverflow
The package includes a benchmark that compares protobuf.js performance to native JSON (as far as this is possible) and Google's JS implementation. On an i7-2600K running node 6.9.1 it yields:
```
benchmarking encoding performance ...

protobuf.js (reflect) x 541,707 ops/sec ±1.13% (87 runs sampled)
protobuf.js (static) x 548,134 ops/sec ±1.38% (89 runs sampled)
JSON (string) x 318,076 ops/sec ±0.63% (93 runs sampled)
JSON (buffer) x 179,165 ops/sec ±2.26% (91 runs sampled)
google-protobuf x 74,406 ops/sec ±0.85% (86 runs sampled)

   protobuf.js (static) was fastest
  protobuf.js (reflect) was 0.9% ops/sec slower (factor 1.0)
          JSON (string) was 41.5% ops/sec slower (factor 1.7)
          JSON (buffer) was 67.6% ops/sec slower (factor 3.1)
        google-protobuf was 86.4% ops/sec slower (factor 7.3)

benchmarking decoding performance ...

protobuf.js (reflect) x 1,383,981 ops/sec ±0.88% (93 runs sampled)
protobuf.js (static) x 1,378,925 ops/sec ±0.81% (93 runs sampled)
JSON (string) x 302,444 ops/sec ±0.81% (93 runs sampled)
JSON (buffer) x 264,882 ops/sec ±0.81% (93 runs sampled)
google-protobuf x 179,180 ops/sec ±0.64% (94 runs sampled)

  protobuf.js (reflect) was fastest
   protobuf.js (static) was 0.3% ops/sec slower (factor 1.0)
          JSON (string) was 78.1% ops/sec slower (factor 4.6)
          JSON (buffer) was 80.8% ops/sec slower (factor 5.2)
        google-protobuf was 87.0% ops/sec slower (factor 7.7)

benchmarking combined performance ...

protobuf.js (reflect) x 275,900 ops/sec ±0.78% (90 runs sampled)
protobuf.js (static) x 290,096 ops/sec ±0.96% (90 runs sampled)
JSON (string) x 129,381 ops/sec ±0.77% (90 runs sampled)
JSON (buffer) x 91,051 ops/sec ±0.94% (90 runs sampled)
google-protobuf x 42,050 ops/sec ±0.85% (91 runs sampled)

   protobuf.js (static) was fastest
  protobuf.js (reflect) was 4.7% ops/sec slower (factor 1.0)
          JSON (string) was 55.3% ops/sec slower (factor 2.2)
          JSON (buffer) was 68.6% ops/sec slower (factor 3.2)
        google-protobuf was 85.5% ops/sec slower (factor 6.9)
```
These results are achieved by
- generating type-specific encoders, decoders, verifiers and converters at runtime
- configuring the reader/writer interface according to the environment
- using node-specific functionality where beneficial and, of course
- avoiding unnecessary operations through splitting up the toolset.
You can also run the benchmark ...
$> npm run bench
and the profiler yourself (the latter requires a recent version of node):
$> npm run prof <encode|decode|encode-browser|decode-browser> [iterations=10000000]
Note that as of this writing, the benchmark suite performs significantly slower on node 7.2.0 compared to 6.9.1 because moths.
- Works in all modern and not-so-modern browsers except IE8.
- Because the internals of this package do not rely on `google/protobuf/descriptor.proto`, options are parsed and presented literally.
- If typed arrays are not supported by the environment, plain arrays will be used instead.
- Support for pre-ES5 environments (except IE8) can be achieved by using a polyfill.
- Support for Content Security Policy-restricted environments (like Chrome extensions without unsafe-eval) can be achieved by generating and using static code instead.
- If a proper way to work with 64 bit values (uint64, int64 etc.) is required, just install long.js alongside this library. All 64 bit numbers will then be returned as a `Long` instance instead of a possibly unsafe JavaScript number (see).
- For descriptor.proto interoperability, see ext/descriptor
To build the library or its components yourself, clone it from GitHub and install the development dependencies:
$> git clone https://github.com/protobufjs/protobuf.js.git
$> cd protobuf.js
$> npm install
Building the respective development and production versions with their respective source maps to `dist/`:
$> npm run build
Building the documentation to `docs/`:
$> npm run docs
Building the TypeScript definition to `index.d.ts`:
$> npm run build:types
By default, protobuf.js integrates into any browserify build-process without requiring any optional modules. Hence:
- If int64 support is required, explicitly require the `long` module somewhere in your project as it will be excluded otherwise. This assumes that a global `require` function is present that protobuf.js can call to obtain the long module.
- If there is no global `require` function present after bundling, it's also possible to assign the long module programmatically:

```js
var Long = ...;

protobuf.util.Long = Long;
protobuf.configure();
```
If you have any special requirements, there is the bundler for reference.
License: BSD 3-Clause License