
Aggregate Bloom filters

Pathfinder currently iterates over the block range requested in the filter and checks each block's per-block Bloom filter for potential matches. Unfortunately, loading and checking these Bloom filters is fairly slow: over millions of blocks it takes at least a few seconds even on fast hardware.
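
A minimal sketch of this per-block scan, assuming a simple Bloom filter layout; `load_bloom_filter`, the constants, and the hashing below are illustrative stand-ins, not Pathfinder's actual storage code:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

const FILTER_BITS: usize = 16_384; // bits per per-block Bloom filter (illustrative)
const NUM_HASHES: u64 = 3; // hash functions per key (illustrative)

struct BloomFilter {
    bits: Vec<u64>, // FILTER_BITS bits
}

impl BloomFilter {
    /// True if `key` may be present (Bloom filters have no false negatives).
    fn may_contain(&self, key: &[u8]) -> bool {
        (0..NUM_HASHES).all(|i| {
            let mut hasher = DefaultHasher::new();
            (key, i).hash(&mut hasher);
            let bit = (hasher.finish() as usize) % FILTER_BITS;
            (self.bits[bit / 64] >> (bit % 64)) & 1 == 1
        })
    }
}

/// Hypothetical stand-in for reading a block's filter from storage;
/// the real cost is this per-block load, repeated millions of times.
fn load_bloom_filter(_block: u64) -> BloomFilter {
    BloomFilter { bits: vec![0; FILTER_BITS / 64] }
}

/// Scan a block range, returning the blocks whose filter may match `key`.
fn candidate_blocks(from: u64, to: u64, key: &[u8]) -> Vec<u64> {
    (from..=to)
        .filter(|&block| load_bloom_filter(block).may_contain(key))
        .collect()
}

fn main() {
    let candidates = candidate_blocks(0, 1_000, b"transfer_event_key");
    println!("{} candidate blocks", candidates.len());
}
```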

We could implement a more elaborate scheme similar to what Geth does: store an aggregate (rotated) Bloom filter for a range of blocks (~32k) and use that filter to check for potential matches across the whole range at once. This would let us efficiently filter on keys in a block range at a fraction of the cost of loading every per-block Bloom filter in that range.
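
A minimal sketch of such a rotated layout: each Bloom-filter bit position gets one row of ~32k bits (one per block in the range), so checking a key touches only a handful of rows instead of loading ~32k separate filters. The type name and constants here are illustrative assumptions, not Geth's or Pathfinder's actual implementation:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

const FILTER_BITS: usize = 16_384; // bits per per-block Bloom filter (illustrative)
const BLOCK_RANGE_LEN: usize = 32_768; // blocks covered by one aggregate filter
const NUM_HASHES: u64 = 3; // hash functions per key (illustrative)

/// Rotated layout: one BLOCK_RANGE_LEN-bit row per Bloom-filter bit position
/// (~64 MiB per range with these constants). Checking a key reads only
/// NUM_HASHES rows instead of ~32k separate per-block filters.
struct AggregateBloom {
    start_block: u64,
    rows: Vec<Vec<u64>>, // FILTER_BITS rows of BLOCK_RANGE_LEN bits each
}

impl AggregateBloom {
    fn new(start_block: u64) -> Self {
        Self {
            start_block,
            rows: vec![vec![0u64; BLOCK_RANGE_LEN / 64]; FILTER_BITS],
        }
    }

    // Bit positions a key maps to, shared by insertion and lookup.
    fn bit_indices(key: &[u8]) -> impl Iterator<Item = usize> + '_ {
        (0..NUM_HASHES).map(move |i| {
            let mut hasher = DefaultHasher::new();
            (key, i).hash(&mut hasher);
            (hasher.finish() as usize) % FILTER_BITS
        })
    }

    /// Record a key (e.g. an event key) emitted in `block`.
    fn insert(&mut self, key: &[u8], block: u64) {
        let offset = (block - self.start_block) as usize;
        for bit in Self::bit_indices(key) {
            self.rows[bit][offset / 64] |= 1u64 << (offset % 64);
        }
    }

    /// Blocks in the range that may contain `key` (no false negatives).
    fn candidate_blocks(&self, key: &[u8]) -> Vec<u64> {
        // AND together the rows selected by the key's hashes: a block stays a
        // candidate only if all of its corresponding Bloom bits are set.
        let mut acc = vec![u64::MAX; BLOCK_RANGE_LEN / 64];
        for bit in Self::bit_indices(key) {
            for (a, r) in acc.iter_mut().zip(&self.rows[bit]) {
                *a &= *r;
            }
        }
        let mut candidates = Vec::new();
        for (word, bits) in acc.iter().enumerate() {
            for i in 0..64usize {
                if (bits >> i) & 1 == 1 {
                    candidates.push(self.start_block + (word * 64 + i) as u64);
                }
            }
        }
        candidates
    }
}

fn main() {
    let mut filter = AggregateBloom::new(0);
    filter.insert(b"transfer_event_key", 17);
    filter.insert(b"other_key", 1_000);
    // Only the candidate blocks need their per-block data loaded afterwards.
    println!("{:?}", filter.candidate_blocks(b"transfer_event_key"));
}
```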

For an overview (and some extra ideas), check the Geth implementation or this write-up.
