I am currently testing msgpack vs. raw object storage vs. protobuf in an ES6 Map (using node 5.x). There seems to be some kind of memory issue with the current implementation of node-msgpack.

Surprisingly, when creating 10,000,000 entries with a createRandomEntry() function and inserting them (in a loop, of course) via:

var entry = createRandomEntry();
var packedEntry = msgpack.pack(entry);
map.set(entry.id, packedEntry);
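The body of createRandomEntry is not shown here; a minimal sketch of such a generator, assuming a small flat object whose serialized sizes roughly match the per-entry numbers reported below (all field names are hypothetical, not the exact test data):

// Hypothetical entry generator -- the real field layout is not part of the
// issue; this just produces a small flat object of comparable size.
var nextId = 0;
function createRandomEntry() {
  return {
    id: ++nextId,
    name: Math.random().toString(36).slice(2, 14),
    score: Math.random() * 1000,
    active: Math.random() > 0.5,
    createdAt: Date.now()
  };
}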
With this loop, the heap RAM runs out of control: it wants to consume more than 30 GB and still isn't finished building the map. The maximum RAM usage when storing the original entries without msgpack is only ~12 GB. The original entries are not referenced anymore, and it doesn't matter whether I run global.gc during the insert (every 10,000 entries, for example).
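A minimal sketch of that insert loop, assuming node is started with --expose-gc so that global.gc is defined, and using the hypothetical createRandomEntry from above:

var msgpack = require('msgpack');
var map = new Map();

for (var i = 1; i <= 10000000; i++) {
  var entry = createRandomEntry();
  map.set(entry.id, msgpack.pack(entry)); // pack() returns a Buffer
  // Forcing a collection every 10,000 inserts does not change the outcome:
  if (i % 10000 === 0 && global.gc) global.gc();
}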
Is there any kind of data structure built up per packed item?
Again: the exact same code without running pack (storing the original entry) uses about 12 GB of heap. Also, running with protobuf instead of msgpack maxes out at less than 3 GB of RAM. I would of course expect protobuf to do better (since the schema is known and fixed), but the RAM usage of each single msgpack item should still be less than the original JSON object.
The msgpack-encoded buffer is 104 bytes.
The protobuf-encoded buffer is 92 bytes.
The roughly estimated in-memory size of the raw JSON object is 312 bytes.
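A sketch of how such per-entry numbers can be obtained (the ~312-byte figure for the raw object is only a rough proxy based on the serialized JSON length; exact per-object V8 heap sizes are hard to pin down):

var entry = createRandomEntry();

// msgpack-encoded size (104 bytes in the test above):
console.log('msgpack:', msgpack.pack(entry).length);

// rough proxy for the raw object's in-memory size (~312 bytes):
console.log('json:', Buffer.byteLength(JSON.stringify(entry)));

// the 92-byte protobuf figure comes from the encoder of whichever protobuf
// library is in use, e.g. something like messages.Entry.encode(entry).length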
If anything, I would have expected the msgpack RAM usage to be about a third of that, ~4 GB instead of ~12 GB, since the 104-byte packed buffer is roughly a third of the ~312-byte raw object. The protobuf memory usage (always including the ES6 Map) is 2.1 GB.
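One way to watch memory while the map fills up is periodic logging along these lines (a sketch; note that Buffer contents live outside the V8 heap, so heapUsed and rss can diverge considerably):

// Hypothetical helper: log memory every N inserts; process.memoryUsage()
// reports bytes, so divide by 1e9 for GB.
function logMemory(i) {
  var mem = process.memoryUsage();
  console.log(i + ': heapUsed ' + (mem.heapUsed / 1e9).toFixed(2) + ' GB, rss ' +
    (mem.rss / 1e9).toFixed(2) + ' GB');
}
// e.g. call logMemory(i) every 100,000 iterations of the insert loop above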