RAM usage #68

Open
matthiasg opened this issue Mar 10, 2016 · 0 comments
@matthiasg (Contributor) commented:

I am currently testing msgpack vs. raw object storage vs. protobuf, storing the entries in an ES6 Map (using Node 5.x).

There seems to be some kind of memory issue with the current implementation of node-msgpack.

Surprisingly, when creating 10,000,000 entries with this function

var uuid = require('uuid');

function createRandomEntry() {
  return { id: uuid.v1(), v: 38, p: uuid.v1(), l: [0, 2, 4, 5, 9, 8, 10], seq: 12343432 };
}

and storing each entry with (in a loop, of course)

var entry = createRandomEntry();
var packedEntry = msgpack.pack(entry);  // pack to a Buffer via the native addon
map.set(entry.id, packedEntry);         // keep only the packed Buffer, keyed by id

the heap RAM grows out of control: it wants to consume more than 30 GB and still isn't finished building the map. Maximum RAM usage when storing the original entries without msgpack is only ~12 GB. The original entries are no longer referenced, and it doesn't matter whether I run global.gc during insertion (every 10,000 entries, for example).
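For completeness, the whole test loop is roughly the following (a minimal sketch, assuming the `msgpack` and `uuid` npm modules and Node started with --expose-gc so that global.gc is available):

// memory-test.js — run with: node --expose-gc memory-test.js
var msgpack = require('msgpack');
var uuid = require('uuid');

var COUNT = 10000000;   // 10,000,000 entries
var map = new Map();

function createRandomEntry() {
  return { id: uuid.v1(), v: 38, p: uuid.v1(), l: [0, 2, 4, 5, 9, 8, 10], seq: 12343432 };
}

for (var i = 0; i < COUNT; i++) {
  var entry = createRandomEntry();
  var packedEntry = msgpack.pack(entry);
  map.set(entry.id, packedEntry);

  if (i % 10000 === 0) {
    if (global.gc) global.gc();   // forced GC every 10,000 entries
    var mem = process.memoryUsage();
    console.log(i,
      'heapUsed', Math.round(mem.heapUsed / 1048576) + ' MB',
      'rss', Math.round(mem.rss / 1048576) + ' MB');
  }
}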

Is there any kind of data structure built up per packed item?
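
As a control, storing plain Buffers of the same size without calling pack() should show whether the extra memory is tied to pack() itself rather than to the stored Buffers (a sketch, reusing createRandomEntry from above):

// Control: same Map, same keys, but plain 104-byte Buffers instead of pack() output.
var map = new Map();
for (var i = 0; i < 10000000; i++) {
  var entry = createRandomEntry();
  map.set(entry.id, new Buffer(104));   // same size as a packed entry, no msgpack involved
}
console.log(process.memoryUsage());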

Again: the exact same code without running pack (storing the original entry) uses about 12 GB of heap. Running with protobuf instead of msgpack maxes out at less than 3 GB of RAM. I would expect protobuf to do better, of course (since the schema is known and fixed), but the RAM usage of each packed item should still be less than that of the original JSON object.

The msgpack-encoded buffer is 104 bytes.
The protobuf-encoded buffer is 92 bytes.
The rough estimated in-memory size of the raw JSON object is 312 bytes.
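
The msgpack and JSON numbers can be checked like this (a minimal sketch; the 92-byte protobuf figure comes from my protobuf variant, whose schema I have left out, and the ~312-byte figure is only a rough estimate of the in-memory object size, not a measured value):

var msgpack = require('msgpack');

var entry = createRandomEntry();
var packed = msgpack.pack(entry);
console.log('msgpack bytes    :', packed.length);                             // ~104
console.log('JSON string bytes:', Buffer.byteLength(JSON.stringify(entry)));  // serialized size, for comparison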

If anything, I would have expected the msgpack RAM usage to be about a third (~4 GB) instead of ~12 GB. The protobuf memory usage (always including the ES6 Map) is 2.1 GB.
