Keeping the number of bins constant should, in theory, bound each histogram's size in memory.
To check this, I inserted 100k values into a few histograms and measured their average in-memory size using sizeof.
The results below show a small upward trend in size as values continue to be added,
even after all of the bins (100 in this test) have been populated.
Is this expected behavior? Is there a way to impose a hard limit on the in-memory size of a histogram?
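As a point of comparison, here is a minimal sketch of the kind of measurement described above, assuming "sizeof" means a shallow byte count such as Python's `sys.getsizeof`. The `FixedBinHistogram` class is hypothetical and stands in for the library's actual histogram type; it only illustrates that a fixed-bin structure allocates its counters up front, so any residual growth would have to come from elsewhere (e.g. the language runtime or allocator), not from the bins themselves:

```python
import random
import sys

class FixedBinHistogram:
    """Hypothetical fixed-bin histogram: equal-width bins over [0, 1)."""

    def __init__(self, num_bins=100):
        self.num_bins = num_bins
        self.counts = [0] * num_bins  # one counter per bin, allocated up front

    def insert(self, value):
        # Map the value to a bin index, clamping to the last bin.
        index = min(int(value * self.num_bins), self.num_bins - 1)
        self.counts[index] += 1

    def size_in_bytes(self):
        # Shallow size of the bin list plus its counter objects -- a rough
        # proxy for resident memory, analogous to the sizeof check above.
        return sys.getsizeof(self.counts) + sum(
            sys.getsizeof(c) for c in self.counts
        )

random.seed(0)
h = FixedBinHistogram()
sizes = []
for i in range(1, 100_001):
    h.insert(random.random())
    if i % 20_000 == 0:
        sizes.append(h.size_in_bytes())

# With this structure the bin list never grows after construction,
# so the recorded sizes should stay essentially flat.
print(sizes)
```

In this simplified model the measured size is effectively constant once every bin has been touched, which is the behavior one would naively expect; whether the library's histogram carries additional per-insert state is the open question in this issue.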