TinyLFU caching policy #1

Open
ben-manes opened this issue Apr 3, 2018 · 0 comments


ben-manes commented Apr 3, 2018

Congrats on the paper. If possible, I would enjoy reading it (ben.manes @ gmail).

I spent a lot of time evaluating eviction policies, and settled on TinyLFU with some minor improvements. While ARC is better than LRU and other classic policies, it does not retain its history robustly enough. LIRS is often overlooked because of its complexity and poorly described algorithm, but it is a very solid performer. TinyLFU matches or exceeds LIRS in many workloads, but is dramatically simpler and more efficient.

LIRS, ARC, and other modern policies rely on ghost entries in LRU lists. TinyLFU instead stores the history in a compact sketch to estimate the frequency. This is used as an admission filter to reject unpopular candidate entries, which would otherwise pollute the cache. By comparing the likelihood of reuse of the candidate and the eviction policy's victim, the cache can achieve good hit rates in a very cheap and straightforward manner.
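The admission idea above can be illustrated with a minimal sketch in Python. This is a hypothetical toy implementation, not Caffeine's actual code: it uses plain integer counters in a count-min-style table (real TinyLFU uses 4-bit counters with periodic aging), and the class and function names are invented for illustration.

```python
import hashlib

class FrequencySketch:
    """Count-min style sketch: approximates access frequencies in a
    fixed-size table, at the cost of possible over-estimation."""
    def __init__(self, width=1024, depth=4):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _indexes(self, key):
        # One hashed slot per row; different rows use different seeds.
        for row in range(self.depth):
            digest = hashlib.blake2b(f"{row}:{key}".encode(),
                                     digest_size=8).digest()
            yield row, int.from_bytes(digest, "big") % self.width

    def increment(self, key):
        for row, col in self._indexes(key):
            self.table[row][col] += 1

    def estimate(self, key):
        # The minimum across rows bounds the over-estimation error.
        return min(self.table[row][col] for row, col in self._indexes(key))

def admit(sketch, candidate, victim):
    """TinyLFU admission filter: admit the candidate (evicting the
    victim) only if the candidate is estimated to be used more often."""
    return sketch.estimate(candidate) > sketch.estimate(victim)
```

On eviction, the cache asks `admit(sketch, candidate, victim)`; a frequently seen key displaces the victim, while a one-hit-wonder is rejected and never pollutes the cache:

```python
sketch = FrequencySketch()
for _ in range(5):
    sketch.increment("hot-key")
sketch.increment("cold-key")
admit(sketch, "hot-key", "cold-key")   # hot candidate displaces cold victim
admit(sketch, "cold-key", "hot-key")   # cold candidate is rejected
```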

I have a Java simulator that you could crib from. I'd like to learn more about your work, since there is always a new angle of attack to further improve performance.

https://github.com/ben-manes/caffeine
(See links to the paper and HighScalability article for details.)

Cheers,
Ben
