Binaron.Serializer — for .NET Core 3.0 and above

Zach Saw
4 min read · Jan 30, 2020


GitHub Project Page
Binaron.Serializer NuGet package

Performance vs BinaryFormatter and Protobuf

vs BinaryFormatter: 3x faster in both serialization and deserialization
vs Protobuf.NET: 9x faster in serialization and 2.5x faster in deserialization

Head over to .NET Fiddle to see for yourself.

What is Binaron.Serializer?

Binaron.Serializer is an MIT-licensed open source binary serializer. It is designed and written from the ground up to be really fast for modern programming languages, and it uses an open source binary object notation format.

In the GitHub repository above, you'll find two benchmarks using BenchmarkDotNet that compare Binaron.Serializer to Newtonsoft.JSON.

The first one showcases the best-case scenario versus a JSON serializer: serializing an array of doubles with 64k items. In practice, this could be the weights of a CNN model.

Benchmark 1

In this benchmark, Binaron.Serializer is nearly 150x faster than Newtonsoft.JSON in serialization and around 137x faster in deserialization!

Array of Doubles — e.g. Weights from a CNN Model

Benchmark 2

The second benchmark is a typical DTO, where Binaron.Serializer's advantage is not as pronounced, but it is nonetheless over 3.5x faster in serialization and 4.3x faster in deserialization.

Book Object (Typical DTO)

Usage
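The basic round trip is a single call in each direction. Here is a minimal sketch, assuming the BinaronConvert.Serialize / Deserialize entry points described in the project README; the Book type and the MemoryStream are illustrative only.

```csharp
using System.IO;
using Binaron.Serializer;

public class Book
{
    public string Title { get; set; }
    public int Pages { get; set; }
}

public static class BasicUsage
{
    public static Book RoundTrip(Book book)
    {
        using var stream = new MemoryStream();

        // Serialize the object graph to any writable Stream
        BinaronConvert.Serialize(book, stream);

        // Rewind and deserialize back into a concrete type
        stream.Position = 0;
        return BinaronConvert.Deserialize<Book>(stream);
    }
}
```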

Polymorphism Support

Binaron.Serializer can be configured to support serialization / deserialization of interfaces and abstract types.
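As an illustrative sketch only (the base-class names CustomObjectIdentifierProvider&lt;T&gt; and CustomObjectFactory&lt;T&gt;, and their namespace, are assumptions here and should be checked against the GitHub README), the pattern is to write an identifier for each concrete type during serialization and to map that identifier back to a concrete type during deserialization:

```csharp
using System;
using Binaron.Serializer.CustomObject; // namespace is an assumption; see the README

public interface IPerson
{
    string Name { get; set; }
}

public class Employee : IPerson
{
    public string Name { get; set; }
    public decimal Salary { get; set; }
}

public class Customer : IPerson
{
    public string Name { get; set; }
    public decimal AccountBalance { get; set; }
}

// Writes a type identifier alongside each serialized IPerson
public class PersonIdentifierProvider : CustomObjectIdentifierProvider<IPerson>
{
    public override object GetIdentifier(Type objectType) => objectType.Name;
}

// Maps the identifier back to a concrete instance during deserialization
public class PersonFactory : CustomObjectFactory<IPerson>
{
    public override object Create(object identifier)
    {
        switch (identifier as string)
        {
            case nameof(Employee):
                return new Employee();
            case nameof(Customer):
                return new Customer();
            default:
                return null; // unknown identifier
        }
    }
}
```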

If you are using a service provider, you would use it in the PersonFactory to construct Employee and Customer instead.

For serialization / deserialization, you’ll need to provide the PersonIdentifierProvider as well as the PersonFactory as follows.
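A sketch of wiring the two together; the option property names (CustomObjectIdentifierProviders on SerializerOptions and CustomObjectFactories on DeserializerOptions) are likewise assumptions to verify against the README:

```csharp
using System.IO;
using Binaron.Serializer;

IPerson person = new Employee { Name = "Jane", Salary = 100_000m };

using var stream = new MemoryStream();

// Serialize with the identifier provider so the concrete type can be recorded
BinaronConvert.Serialize(person, stream, new SerializerOptions
{
    CustomObjectIdentifierProviders = { new PersonIdentifierProvider() }
});

// Deserialize with the factory so the concrete type can be reconstructed
stream.Position = 0;
var result = BinaronConvert.Deserialize<IPerson>(stream, new DeserializerOptions
{
    CustomObjectFactories = { new PersonFactory() }
});
```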

Ignore Attributes

Binaron.Serializer supports the following ignore attributes: System.NonSerializedAttribute and System.Runtime.Serialization.IgnoreDataMemberAttribute.
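For example, on a hypothetical DTO, either attribute excludes the member it decorates from serialization:

```csharp
using System;
using System.Runtime.Serialization;

public class UserDto
{
    public string UserName { get; set; }

    // Properties can be excluded with IgnoreDataMemberAttribute
    [IgnoreDataMember]
    public string CachedDisplayName { get; set; }

    // NonSerializedAttribute applies to fields
    [NonSerialized]
    public string SessionToken;
}
```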

Limitations

Binaron.Serializer uses and relies heavily on the newly released features of .NET Standard 2.1 for maximum performance and is therefore only compatible with .NET Core 3.0 and above.

High Unit Test Coverage

Writing a serializer was easy. Writing a deserializer that deserializes to ExpandoObject (dynamic type) was just as easy. However, deserializing to a specific type was a PITA, simply because Binaron.Serializer needs to fit the serialized data as best it can (within the sensible limits set in the Binary Object Notation documentation) to the destination object. For example, an int32 value should fit into an int64, and the deserializer should be smart enough to do that transparently. Likewise, an object serialized with properties / fields should be deserializable to a dictionary.
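A sketch of what that flexibility looks like from the caller's side; the widening behaviour follows the description above, and the non-generic Deserialize overload returning an untyped (dynamic) result is an assumption to verify against the README:

```csharp
using System.IO;
using Binaron.Serializer;

public class Source    { public int Count { get; set; } }  // serialized as int32
public class WiderDest { public long Count { get; set; } } // deserialized into int64

public static class FlexibleDeserialization
{
    public static void Demo()
    {
        using var stream = new MemoryStream();
        BinaronConvert.Serialize(new Source { Count = 42 }, stream);

        // int32 -> int64 widening handled transparently
        stream.Position = 0;
        var wider = BinaronConvert.Deserialize<WiderDest>(stream);

        // Or deserialize without a destination type at all (ExpandoObject / dynamic style)
        stream.Position = 0;
        var untyped = BinaronConvert.Deserialize(stream);
    }
}
```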

To make sure all these permutations are covered and tested, the unit tests in this repository have 94% coverage. Not perfect, but most would agree it is high enough, and it will be improved in the near future.

Why Another Serializer?

Big payloads

In the world of microservices, data payloads tend to be pretty big these days because of how data is now consumed. As network bandwidth becomes cheaper, bigger data becomes the norm, as it opens up UI/UX that would otherwise have been impossible (e.g. responsive web apps and mobile apps). Converting from text to object and vice versa is a very slow process, and the bigger the payload, the slower it is.

JSON was created for consumption in the old web days, when JavaScript had limited support for binary. Unfortunately, everyone else on first-class languages has had to dumb down to the lowest common denominator.

But… JSON is human readable

JSON does have its merits, such as human readability. But does a machine really care about human readability? At what cost are we sacrificing performance, and with it infrastructure cost, latency and ultimately user experience? If we really care about human readability, we could simply have the endpoint support two different Accept headers: one for JSON, the other for binary. In normal day-to-day operation, you would go binary. For debugging purposes, give it a JSON-only Accept header and you would get JSON sent back to you. How many microservices are doing this, though?
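As a rough, framework-agnostic sketch of that idea (the Write helper and its parameters are hypothetical, not part of Binaron.Serializer):

```csharp
using System.IO;
using Binaron.Serializer;
using Newtonsoft.Json;

public static class ContentNegotiation
{
    // Hypothetical helper: write the same payload as JSON or binary depending on the Accept header
    public static void Write(object payload, string acceptHeader, Stream body)
    {
        if (acceptHeader != null && acceptHeader.Contains("application/json"))
        {
            // Human-readable path, for debugging
            using var writer = new StreamWriter(body, leaveOpen: true);
            writer.Write(JsonConvert.SerializeObject(payload));
        }
        else
        {
            // Binary path, for normal day-to-day traffic
            BinaronConvert.Serialize(payload, body);
        }
    }
}
```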

What about protobuf?

Granted, there is a myriad of libraries that have tried to do binary serialization. The most popular is arguably protobuf. All of these libraries are fast (by normal standards, though very slow compared to Binaron.Serializer), but they lack the one key feature that JSON serializers offer: the ability to serialize from / deserialize to any unstructured object.

For example, protobuf requires a schema to be defined for the structure of the object, i.e. a .proto file. This by itself is a massive burden on developers, who have to learn, create, debug and maintain it. What do you do if you're storing data in a NoSQL manner, where data is simply unstructured?

Unfortunately, all the binary serializers for .NET, including ZeroFormatter, have one fundamental flaw: they assume your data is structured.

But… Binary serializers are brittle, aren’t they?

No. Only the .NET BinaryFormatter is brittle, because it serializes the type's full name (including namespace). Encoding exact type names makes it useless for archiving data, for example as the document format of an application.

Binaron.Serializer has the same brittleness as JSON serializers.

TL;DR

In other words, I wanted a drop-in replacement for Json.NET with a near-zero learning curve and vastly superior performance, but couldn't find one.
