Protons

    Module protons

    protons is a high-performance implementation of Protocol Buffers v3.

    It transpiles .proto files to TypeScript and supports BigInts for 64-bit types.

    The protons module contains the code to compile .proto files to .ts files, while protons-runtime contains the code to serialize to and deserialize from Uint8Arrays during application execution.

    Please ensure you declare them as the correct type of dependency - protons as a dev dependency (it is only needed to generate code) and protons-runtime as a regular dependency:

    $ npm install --save-dev protons
    $ npm install --save protons-runtime

    First generate your .ts files:

    $ protons ./path/to/foo.proto ./path/to/output.ts
    

    Then run tsc over them as normal:

    $ tsc
    

    In your code import the generated classes and use them to transform to/from bytes:

    import { Foo } from './foo.js'

    const foo = {
      message: 'hello world'
    }

    const encoded = Foo.encode(foo)
    const decoded = Foo.decode(encoded)

    console.info(decoded.message)
    // 'hello world'

    This module uses the internal reader/writer from protobuf.js as it is highly optimised and there's no point reinventing the wheel.

    It does have a few differences:

    1. Supports proto3 semantics only
    2. All 64-bit values are represented as BigInts and not Longs (e.g. int64, uint64, sint64, etc.)
    3. Unset optional fields are left as undefined on deserialized objects instead of being set to their default values
    4. Singular fields set to default values are not serialized, and are set to default values when deserialized if not set - protobuf.js diverges from the language guide around this feature
    5. Map fields can have keys of any type - protobuf.js only supports strings
    6. Map fields are deserialized as ES6 Maps - protobuf.js uses Objects
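    The last two points change how map fields are read after decoding. A minimal sketch of the shape a decoded message with a map<string, string> field takes (the `tags` field name and its values are hypothetical, for illustration only):

    ```typescript
    // Map fields decode to ES6 Maps, so entries are read with .get() and
    // counted with .size rather than with bracket notation and Object.keys()
    const decoded: { tags: Map<string, string> } = {
      tags: new Map([
        ['env', 'prod'],
        ['region', 'eu-west']
      ])
    }

    console.info(decoded.tags.get('env')) // 'prod'
    console.info(decoded.tags.size) // 2
    ```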

    To protect decoders from malicious payloads, it's possible to limit the maximum size of repeated/map elements.

    You can either do this at compile time by using the protons.options extension:

    message MyMessage {
      // repeatedField cannot have more than 10 entries
      repeated uint32 repeatedField = 1 [(protons.options).limit = 10];

      // stringMap cannot have more than 10 keys
      map<string, string> stringMap = 2 [(protons.options).limit = 10];
    }

    Or at runtime by passing objects to the .decode function of your message:

    const message = MyMessage.decode(buf, {
      limits: {
        repeatedField: 10,
        stringMap: 10
      }
    })

    Sub messages with repeating elements can be limited in a similar way:

    message SubMessage {
      repeated uint32 repeatedField = 1;
    }

    message MyMessage {
      SubMessage message = 1;
    }

    const message = MyMessage.decode(buf, {
      limits: {
        message: {
          repeatedField: 5 // the SubMessage cannot have more than 5 repeatedField entries
        }
      }
    })

    Sub messages defined in repeating elements can be limited by appending $ to the field name in the runtime limit options:

    message SubMessage {
      repeated uint32 repeatedField = 1;
    }

    message MyMessage {
      repeated SubMessage messages = 1;
    }

    const message = MyMessage.decode(buf, {
      limits: {
        messages: 5, // max 5x SubMessages
        messages$: {
          repeatedField: 5 // no SubMessage can have more than 5 repeatedField entries
        }
      }
    })

    Repeating fields in map entries can be limited by appending $value to the field name in the runtime limit options:

    message SubMessage {
      repeated uint32 repeatedField = 1;
    }

    message MyMessage {
      map<string, SubMessage> messages = 1;
    }

    const message = MyMessage.decode(buf, {
      limits: {
        messages: 5, // max 5x SubMessages in the map
        messages$value: {
          repeatedField: 5 // no SubMessage in the map can have more than 5 repeatedField entries
        }
      }
    })

    By default, 64-bit types are deserialized as BigInts.

    Sometimes this is undesirable for performance or code-legibility reasons.

    It's possible to override the JavaScript type that 64-bit fields will deserialize to:

    message MyMessage {
      int64 bigintField = 1;
      int64 numberField = 2 [jstype = JS_NUMBER];
      int64 stringField = 3 [jstype = JS_STRING];
    }

    const message = MyMessage.decode(buf)

    console.info(typeof message.bigintField) // bigint
    console.info(typeof message.numberField) // number
    console.info(typeof message.stringField) // string
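    The BigInt default exists because plain JavaScript numbers cannot represent every 64-bit integer, so JS_NUMBER is only safe when field values are known to stay at or below Number.MAX_SAFE_INTEGER (2^53 - 1). A self-contained illustration of the precision loss:

    ```typescript
    // 2^53 and 2^53 + 1 are distinct as BigInts but collapse to the same Number
    const a = 2n ** 53n
    const b = a + 1n

    console.info(a === b) // false - BigInts keep full 64-bit precision
    console.info(Number(a) === Number(b)) // true - precision is silently lost
    ```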

    Some features, such as OneOfs, are missing because they have not been needed so far in ipfs/libp2p. If these features are important to you, please open PRs implementing them along with tests comparing the generated bytes to protobuf.js and pbjs.

    protons


    Protobuf to ts transpiler


    Install

    $ npm i protons
    

    API Docs

    License

    Licensed under either of

    Apache 2.0 (LICENSE-APACHE / http://www.apache.org/licenses/LICENSE-2.0)
    MIT (LICENSE-MIT / http://opensource.org/licenses/MIT)

    Contribute

    Contributions welcome! Please check out the issues.

    Also see our contributing document for more information on how we work, and about contributing in general.

    Please be aware that all interactions related to this repo are subject to the IPFS Code of Conduct.

    Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

    Enumerations

    CODEC_TYPES

    Classes

    CodeError

    Functions

    generate