```nim
import protobuf
import std/streams
import std/monotimes
import std/strformat
import std/random

# Define our protobuf specification and generate Nim code to use it
const protoSpec = """
syntax = "proto3";

message ExampleMessage {
  int32 number = 1;
  string text = 2;
  bytes pixels = 3;
}
"""
parseProto(protoSpec)

# Create our message
var
  msg = new ExampleMessage
  pixels: seq[uint8]
msg.number = 10
msg.text = "Hello world"

let bufferSize = 512 * 512
for i in 0 ..< bufferSize:
  pixels.add((uint8)rand(0..255)) # B channel (0)
  pixels.add((uint8)rand(0..255)) # G channel (1)
  pixels.add((uint8)rand(0..255)) # R channel (2)
  pixels.add((uint8)rand(0..255)) # A channel (3)
msg.pixels = pixels

# Write it to a stream
var stream = newStringStream()
stream.write msg

# Read the message from the stream and output the data, if it's all present
stream.setPosition(0)

# Start timer
var t0 = getMonoTime().ticks()
var readMsg = stream.readExampleMessage()
if readMsg.has(number, text, pixels):
  echo readMsg.number
  echo readMsg.text
  echo readMsg.pixels.len()

let t1 = (float64)(getMonoTime().ticks() - t0) * 0.000001
echo &"{t1:.2f}ms - time taken to unpack ExampleMessage"
```
The output is:

```
10
Hello world
1048576
6.95ms - time taken to unpack ExampleMessage
```
I ended up switching to a pipeline of Zstandard compression -> base64 encoding -> JSON. Decoding this pipeline on the client takes about 0.9ms, but I switched mainly because the other end of the protobuf link (it is in C++) was so twitchy, and it's so much easier to be able to see the output in text vs using a hex editor to try to debug!
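For illustration, here is a minimal sketch of the base64 -> JSON leg of that pipeline using only the standard library (the Zstandard step is omitted, since it needs a third-party binding; the field names and payload are made up):

```nim
import std/[base64, json, strutils]

# Stand-in for the (compressed) pixel buffer; in the real pipeline this
# would be the Zstandard-compressed bytes.
let payload = "raw pixel bytes"

# Server side: pack the fields into JSON, base64-encoding the binary part,
# and terminate the line so the client can use recvLine.
let msg = %*{"number": 10, "text": "Hello world", "pixels": encode(payload)}
let wire = $msg & "\r\L"

# Client side: parse the JSON line and decode the bytes back.
let parsed = parseJson(wire.strip)
doAssert parsed["number"].getInt == 10
doAssert decode(parsed["pixels"].getStr) == payload
```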
Also, the other issue was that when sending protobuf you have to roll your own network protocol (because it's a binary format): e.g. take the byte count of the message, cast it to a big-endian uint32, and prepend it to the message; then on the client, unpack the length and parse the message. With JSON you can just append a \r\L to the JSON string and use recvLine on the client.
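The length-prefix framing described above could be sketched like this (proc names are made up; a Stream stands in for the socket):

```nim
import std/[streams, endians]

# Prefix each message with its byte count as a big-endian uint32.
proc writeFramed(s: Stream, payload: string) =
  var hostLen = uint32(payload.len)
  var beLen: uint32
  bigEndian32(addr beLen, addr hostLen)  # host order -> big-endian
  s.writeData(addr beLen, 4)
  s.write(payload)

# Read the 4-byte length, convert back to host order, then read the payload.
proc readFramed(s: Stream): string =
  var beLen, hostLen: uint32
  doAssert s.readData(addr beLen, 4) == 4
  bigEndian32(addr hostLen, addr beLen)  # big-endian -> host order
  result = s.readStr(int(hostLen))

when isMainModule:
  var stream = newStringStream()
  stream.writeFramed("hello protobuf payload")
  stream.setPosition(0)
  doAssert stream.readFramed() == "hello protobuf payload"
```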
Feel free to close if this is behaving as expected. I was unable to reproduce the super long times I posted earlier, and as I said, have moved on to json, so am not going to dig too much more.
Thanks!
ps amazing use of Nim macros!
Hi! Per #14, I'm creating a new issue.
Here's a quick example.