DataDog/protobuf.js

Name: protobuf.js

Owner: Datadog, Inc.

Description: Protocol Buffers for JavaScript (& TypeScript).

Forked from: dcodeIO/protobuf.js

Created: 2016-12-14 21:43:00

Updated: 2017-04-20 07:21:24

Pushed: 2016-12-15 00:07:04

Homepage: http://dcode.io/protobuf.js

Size: 18033

Language: JavaScript


README

protobuf.js


Protocol Buffers are a language-neutral, platform-neutral, extensible way of serializing structured data for use in communications protocols, data storage, and more, originally designed at Google.

protobuf.js is a pure JavaScript implementation for node and the browser. It efficiently encodes plain objects and custom classes and works out of the box with .proto files.

Recommended read: Changes in protobuf.js 6.0

Usage
node.js
$> npm install protobufjs

var protobuf = require("protobufjs");
Browsers

Development:

<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.1.0/dist/protobuf.js"></script>

Production:

<script src="//cdn.rawgit.com/dcodeIO/protobuf.js/6.1.0/dist/protobuf.min.js"></script>

NOTE: Remember to replace the version tag with the exact release your project depends upon.

Or download the library.

The protobuf namespace will always be available globally. The library also supports AMD loaders.
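
For instance, with an AMD loader such as RequireJS, a minimal sketch might look like this (assuming the loader is configured to map the module id "protobufjs" to the downloaded distribution file):

require(["protobufjs"], function(protobuf) {
    // same API as in the node.js examples below
    protobuf.load("awesome.proto", function(err, root) {
        if (err) throw err;
        // use root as shown under "Examples"
    });
});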

Examples
Using .proto files
// awesome.proto
syntax = "proto3";
package awesomepackage;

message AwesomeMessage {
    string awesome_field = 1; // becomes awesomeField
}

protobuf.load("awesome.proto", function(err, root) {
    if (err) throw err;

    // Obtain a message type
    var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");

    // Create a new message
    var message = AwesomeMessage.create({ awesomeField: "AwesomeString" });

    // Encode a message
    var buffer = AwesomeMessage.encode(message).finish();
    // ... do something with buffer

    // Or, encode a plain object
    var buffer = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
    // ... do something with buffer

    // Decode a buffer
    var message = AwesomeMessage.decode(buffer);
    // ... do something with message

    // If your application uses length-delimited buffers, there is also encodeDelimited and decodeDelimited.
});
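
For example, a minimal sketch of the length-delimited variants, using the AwesomeMessage type from above:

var delimited = AwesomeMessage.encodeDelimited(message).finish(); // length-prefixed buffer
var decoded   = AwesomeMessage.decodeDelimited(delimited);        // back to a message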

You can also use promises by omitting the callback:

protobuf.load("awesome.proto")
    .then(function(root) {
        ...
    });
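
Since the returned promise rejects on errors, a rejection handler can be attached as well (a minimal sketch):

protobuf.load("awesome.proto")
    .then(function(root) {
        var AwesomeMessage = root.lookup("awesomepackage.AwesomeMessage");
        // continue as in the callback example above
    })
    .catch(function(err) {
        console.error("failed to load awesome.proto:", err);
    });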
Using reflection only

var Root  = protobuf.Root,
    Type  = protobuf.Type,
    Field = protobuf.Field;

var AwesomeMessage = new Type("AwesomeMessage").add(new Field("awesomeField", 1, "string"));

var root = new Root().define("awesomepackage").add(AwesomeMessage);

// Continue at "Create a new message" above

Using custom classes


function AwesomeMessage(properties) {
    protobuf.Message.call(this, properties);
}

protobuf.Class.create(root.lookup("awesomepackage.AwesomeMessage") /* or use reflection */, AwesomeMessage);

var message = new AwesomeMessage({ awesomeField: "AwesomeString" });

// Continue at "Encode a message" above

Custom classes are automatically populated with static encode, encodeDelimited, decode, decodeDelimited and verify methods and reference their reflected type via the $type property. Note that there are no methods (just $type) on instances by default as method names might conflict with field names.
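
For illustration, a minimal sketch of what the populated class provides, continuing the example above:

var payload = { awesomeField: "AwesomeString" };

var invalid = AwesomeMessage.verify(payload);          // null if valid, otherwise an error message string
var buffer  = AwesomeMessage.encode(payload).finish(); // binary encoding
var decoded = AwesomeMessage.decode(buffer);           // an AwesomeMessage instance

console.log(decoded.$type.name);                       // "AwesomeMessage" (the reflected type)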

Using services
// greeter.proto
service Greeter {
    rpc SayHello (HelloRequest) returns (HelloReply) {}
}

message HelloRequest {
    string name = 1;
}

message HelloReply {
    string message = 1;
}

var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(rpcImpl, false, false); // rpcImpl (see below), requestDelimited?, responseDelimited?

greeter.sayHello({ name: 'you' }, function(err, response) {
    console.log('Greeting:', response.message);
});
To make this work, all you have to do is provide an rpcImpl, which is an asynchronous function that takes the reflected service method, the binary HelloRequest and a node-style callback as its parameters. For example:

function rpcImpl(method, requestData, callback) {
    // perform the request using an HTTP request or a WebSocket for example
    var responseData = ...;
    // and call the callback with the binary response afterwards:
    callback(null, responseData);
}
There is also an example for streaming RPC.
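
As a concrete illustration, a minimal node.js sketch that posts the binary request over HTTP might look like this (the host, port and path are hypothetical assumptions, not part of protobuf.js):

var http = require("http");

function rpcImpl(method, requestData, callback) {
    var req = http.request({
        hostname: "localhost",            // assumed RPC server
        port: 8080,
        path: "/greeter/" + method.name,  // e.g. "/greeter/SayHello"
        method: "POST",
        headers: { "Content-Type": "application/octet-stream" }
    }, function(res) {
        var chunks = [];
        res.on("data", function(chunk) { chunks.push(chunk); });
        res.on("end", function() { callback(null, Buffer.concat(chunks)); });
    });
    req.on("error", callback);
    req.end(Buffer.from(requestData));
}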

Usage with TypeScript
/// <reference path="node_modules/protobufjs/types/protobuf.js.d.ts" />

import * as protobuf from "protobufjs";

Module Structure

The library exports a flat protobuf namespace including but not restricted to the following members, ordered by category:

Parser
Serialization
Reflection
Runtime
Utility
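
A few frequently used members from these categories, all of which appear in the examples above (a non-exhaustive sketch):

var protobuf = require("protobufjs");

protobuf.load;    // Parser: loads .proto or .json definitions
protobuf.Root;    // Reflection: root namespace
protobuf.Type;    // Reflection: message type
protobuf.Field;   // Reflection: message field
protobuf.Class;   // Runtime: creates custom message classes
protobuf.Message; // Runtime: runtime message base class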

For less common members, see the API documentation.

Documentation
Command line

The pbjs command line utility can be used to bundle and translate between .proto and .json files.

Consolidates imports and converts between file formats.

  -t, --target    Specifies the target format. Also accepts a path to require a custom target.

                  json          JSON representation
                  json-module   JSON representation as a module
                  proto2        Protocol Buffers, Version 2
                  proto3        Protocol Buffers, Version 3
                  static        Static code without reflection
                  static-module Static code without reflection as a module

  -p, --path      Adds a directory to the include path.

  -o, --out       Saves to a file instead of writing to stdout.

  -w, --wrap      Specifies the wrapper to use for *-module targets. Also accepts a path.

                  default   Default wrapper supporting both CommonJS and AMD
                  commonjs  CommonJS only wrapper
                  amd       AMD only wrapper

  -r, --root      Specifies an alternative protobuf.roots name for *-module targets.

  Static code generation only:

  --no-encode     Does not generate encode functions.
  --no-decode     Does not generate decode functions.
  --no-verify     Does not generate verify functions.
  --no-delimited  Does not generate delimited encode/decode functions.

usage: pbjs [options] file1.proto file2.json ...

For production environments it is recommended to bundle all your .proto files to a single .json file, which reduces the number of network requests and parser invocations required:

$> pbjs -t json file1.proto file2.proto > bundle.json

Now, either include this file in your final bundle:

var root = protobuf.Root.fromJSON(require("./bundle.json"));

or load it the usual way:

obuf.load("bundle.json", function(err, root) {
...

Generating TypeScript definitions from static modules

Likewise, the pbts command line utility can be used to generate TypeScript definitions from pbjs-generated static modules.

Generates TypeScript definitions from annotated JavaScript files.

  -n, --name      Specifies the module name.

  -o, --out       Saves to a file instead of writing to stdout.

usage: pbts [options] file1.js file2.js ...
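
Putting the two tools together, a typical pipeline might look like this (the file names are illustrative):

$> pbjs -t static-module -w commonjs -o compiled.js file1.proto file2.proto
$> pbts -o compiled.d.ts compiled.js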
Descriptors vs. static modules

While .proto and JSON files require the full library (about 18 kB gzipped, with all features including reflection, the parser and utilities), pretty much all of that code is shared and only the relatively short descriptors differ.

Static code, on the other hand, requires just the minimal runtime (about 5.5kb gzipped, i.e. no reflection features), but generates additional, albeit editable and customizable, source code.
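
For instance, a minimal sketch of consuming a pbjs-generated static module (the file name and generation command are assumptions for illustration):

// generated beforehand with e.g.:  pbjs -t static-module -w commonjs -o compiled.js awesome.proto
var root = require("./compiled.js");

var AwesomeMessage = root.awesomepackage.AwesomeMessage;
var buffer  = AwesomeMessage.encode({ awesomeField: "AwesomeString" }).finish();
var message = AwesomeMessage.decode(buffer);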

When new Function(...) is supported (and it usually is), there is no difference performance-wise as the code generated statically is the same as generated at runtime.

Building

To build the library or its components yourself, clone it from GitHub and install the development dependencies:

$> git clone https://github.com/dcodeIO/protobuf.js.git
$> cd protobuf.js
$> npm install --dev

Building the development and production versions with their respective source maps to dist/:

$> npm run build

Building the documentation to docs/:

$> npm run docs

Building the TypeScript definition to types/:

$> npm run types
Browserify integration

protobuf.js integrates into any browserify build-process; a few optional tweaks are possible.

Performance

The package includes a benchmark that tries to compare performance to native JSON as far as this is possible. On an i7-2600K running node 6.9.1 it yields:

benchmarking encoding performance ...

Type.encode to buffer x 481,172 ops/sec ±0.48% (92 runs sampled)
JSON.stringify to string x 307,509 ops/sec ±1.04% (92 runs sampled)
JSON.stringify to buffer x 164,463 ops/sec ±1.37% (89 runs sampled)

     Type.encode to buffer was fastest
  JSON.stringify to string was 36.4% slower
  JSON.stringify to buffer was 66.1% slower

benchmarking decoding performance ...

Type.decode from buffer x 1,319,810 ops/sec ±0.71% (92 runs sampled)
JSON.parse from string x 298,578 ops/sec ±0.98% (90 runs sampled)
JSON.parse from buffer x 267,471 ops/sec ±0.81% (89 runs sampled)

   Type.decode from buffer was fastest
    JSON.parse from string was 77.4% slower
    JSON.parse from buffer was 79.8% slower

benchmarking combined performance ...

Type to/from buffer x 262,728 ops/sec ±0.92% (92 runs sampled)
JSON to/from string x 129,405 ops/sec ±0.78% (94 runs sampled)
JSON to/from buffer x 89,523 ops/sec ±0.71% (89 runs sampled)

   Type to/from buffer was fastest
   JSON to/from string was 50.7% slower
   JSON to/from buffer was 65.9% slower

benchmarking verifying performance ...

Type.verify x 5,833,382 ops/sec ±0.98% (85 runs sampled)

Type.verify was fastest

Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster?

Note that code generation requires new Function(...) (basically eval) support and that an equivalent but slower fallback will be used where unsupported.

You can also run the benchmark

$> npm run bench

and the profiler yourself (the latter requires a recent version of node):

$> npm run prof <encode|decode|encode-browser|decode-browser> [iterations=10000000]

Note that as of this writing, the benchmark suite performs significantly slower on node 7.2.0 than on 6.9.1.


License: Apache License, Version 2.0. Bundled external libraries may have their own licenses.


