JSON serialization is already quite fast IMO, so this is a nice win. Last time I compared JSON serialization to Protocol Buffers, JSON was a little slower for typical scenarios, but not materially so. Optimizations like these can shift that balance.
JSON is a great minimalist format which is both human and machine readable. I never quite understood the popularity of ProtoBuf; the binary format is a major sacrifice to readability. I get that some people appreciate the type validation but it adds a lot of complexity and friction to the transport protocol layer.
For me the appeal of protobuf is the wire-format forward-backward compatibility.
It's hard enough to not break logical compatibility, so I appreciate not having to think too hard about wire compat. You can of course solve the same thing with JSON, but, well, YOU have to solve it.
(Also worth noting, there are a lot of things I don't like about the grpc ecosystem so I don't actually use it that much. But this is one of the pieces I really like a lot).
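The wire-compat property described above comes from how protobuf tags fields by number, not name: a decoder that doesn't recognize a field number just skips it. A minimal sketch of that skipping behavior, using a toy decoder for a subset of the wire format (varint and length-delimited fields only; the field numbers and names are made up for illustration):

```python
def read_varint(buf, i):
    """Decode a base-128 varint starting at buf[i]; return (value, next_index)."""
    result = shift = 0
    while True:
        b = buf[i]
        i += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            return result, i
        shift += 7

def decode(buf, known_fields):
    """Decode the fields we know; silently skip the rest (forward compatibility)."""
    out, i = {}, 0
    while i < len(buf):
        key, i = read_varint(buf, i)
        field_no, wire_type = key >> 3, key & 0x7
        if wire_type == 0:            # varint
            value, i = read_varint(buf, i)
        elif wire_type == 2:          # length-delimited (strings, bytes, submessages)
            length, i = read_varint(buf, i)
            value, i = buf[i:i + length], i + length
        else:
            raise ValueError("wire type not covered by this sketch")
        if field_no in known_fields:  # unknown field numbers are simply dropped
            out[known_fields[field_no]] = value
    return out

# A message with field 1 (varint 42) and a newer field 2 (string "hi").
msg = bytes([0x08, 0x2A,                       # field 1, wire type 0, value 42
             0x12, 0x02, ord('h'), ord('i')])  # field 2, wire type 2, len 2, "hi"
old_reader = decode(msg, {1: "id"})            # an "old" reader only knows field 1
# old_reader == {"id": 42} -- the unknown field 2 was skipped without error
```

The real libraries additionally preserve unknown fields for re-serialization, but the skip-don't-crash behavior is the core of it.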
Arguably JSON doesn't have this problem at all since it encodes the field names too. The only thing it doesn't handle is field renames, but I mean, come on, you know you can't rename a field in public API anyways :)
Many JSON parsers allow storing extra fields in a separate dictionary/map, and the format certainly allows for it too, so I'm not sure what makes you say that statically typed languages are at a disadvantage here
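The extra-fields-in-a-map pattern mentioned above can be sketched in a few lines: parse only the fields you model and stash everything else, which is roughly what the catch-all mechanisms in statically typed JSON libraries do. The field names here are invented for illustration:

```python
import json

# Fields our "schema" knows about; anything else goes into extras.
KNOWN = {"id", "name"}

def parse_user(raw):
    """Split a JSON object into modeled fields and unrecognized extras."""
    data = json.loads(raw)
    known = {k: v for k, v in data.items() if k in KNOWN}
    extras = {k: v for k, v in data.items() if k not in KNOWN}
    return known, extras

known, extras = parse_user('{"id": 1, "name": "ada", "new_field": true}')
# known  == {"id": 1, "name": "ada"}
# extras == {"new_field": True} -- preserved, so it can be round-tripped
```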
> You can of course solve the same thing with JSON, but, well, YOU have to solve it.
There is no single well-established convention across languages/impls. The default behavior in many languages when a field is missing is to either panic, or replace it with a null pointer (which will most likely just panic later).
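The panic-now-or-panic-later distinction above can be shown even in Python, where the same two failure modes exist and the fix is an explicit per-field default that the application author has to choose:

```python
import json

data = json.loads('{"name": "ada"}')  # pretend "email" was added in a newer schema

# Strict access: fails immediately when the field is absent.
try:
    data["email"]
    failed_fast = False
except KeyError:
    failed_fast = True

# Lenient access: yields None now; email.lower() would blow up later
# (the deferred-panic case).
email = data.get("email")

# The actual fix -- an explicit default, chosen per field by the application:
email = data.get("email", "")
```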
A format cannot be optimized for both human and machine readability. JSON is human readable; that's the point of it. Human readability is great for debugging, but it comes with overhead because the format isn't machine friendly. Protobuf messages are both smaller and quicker to decode. If you're in an environment where you're handling millions of messages per second, binary formats pay dividends. The number of messages ever viewed by a human is minuscule, so there's no real gain to optimizing for that slow path. Just write a message dump tool.
It is readable, but it's not a good/fast format. IEEE 754 <-> string conversion is just expensive, even with all the shortcuts and improvements. byte[]s have no good representation either.
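Both complaints above are easy to demonstrate: every float has to round-trip through a decimal string (a binary <-> decimal conversion on each encode/decode), and raw bytes aren't a JSON type at all, so in practice they get base64-encoded, which adds roughly 33% size plus the encode/decode work:

```python
import base64
import json

# Floats: encoded as decimal strings, decoded back via string parsing.
x = 0.1 + 0.2
s = json.dumps(x)          # IEEE 754 double -> shortest round-tripping decimal
restored = json.loads(s)   # decimal string -> IEEE 754 double

# Bytes: not representable in JSON; base64 is the usual workaround.
payload = b"\x00\xff\x10"
try:
    json.dumps(payload)    # raises TypeError: bytes is not a JSON type
    bytes_rejected = False
except TypeError:
    bytes_rejected = True
encoded = base64.b64encode(payload).decode("ascii")
```

The round trips are lossless, but each one pays a conversion cost that a binary format like protobuf avoids by writing the 8 raw bytes of the double (or the bytes themselves) directly.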