
Revise usage of the memory-cache and size-limits #7

Closed

gfoidl opened this issue May 17, 2021 · 2 comments · Fixed by #21

Comments

@gfoidl
Contributor

gfoidl commented May 17, 2021

Cf. #1 (comment) and #1 (comment)

Use IMemoryCache.

@gfoidl gfoidl mentioned this issue May 17, 2021
@gfoidl
Contributor Author

gfoidl commented May 17, 2021

I learned a bit about this; key points are listed below.

If the cache size limit is set, all entries must specify size

This means that if an entry is added without specifying a size, an exception is thrown (which could crash the app).
Mitigation: don't use a shared cache when a size limit is specified.

So one has to be very careful when using the default MemoryCache provided by ASP.NET Core DI (= the shared cache instance), as any other usage of it may trigger the exception.

Also, "All users of a cache instance should use the same unit system", which can't be guaranteed with the shared cache instance.

We have two (?) choices:

  • don't provide cache options, and rely on the user to inject the correct instance of IMemoryCache, where the user is responsible for setting the correct options
  • provide the options as is, but use our own non-shared instance of IMemoryCache (as a singleton)

At the moment my tendency is towards the second bullet (a rough sketch follows below).
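
A rough sketch of bullet two, assuming Microsoft.Extensions.Caching.Memory; ParserMemoryCache is a made-up placeholder name here, not the final API:

using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;

// Placeholder wrapper type, so the parser's cache can't collide with
// the shared IMemoryCache that other code resolves from DI.
public sealed class ParserMemoryCache
{
    public IMemoryCache Cache { get; } = new MemoryCache(new MemoryCacheOptions
    {
        SizeLimit = 256 // our own limit; doesn't affect other cache users
    });
}

// Registration: the parser depends on ParserMemoryCache, not on IMemoryCache.
services.AddSingleton<ParserMemoryCache>();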

For bullet one we would need to document this "trap", but the behavior is a bit astonishing and a lot of developers don't read docs, so our library may cause crashes and get blamed for them, while actually being innocent.

An entry will not be cached if the sum of the cached entry sizes exceeds the value specified by SizeLimit

So if the cache is "full", no more items will be cached. Is this the behavior we want?

I'm not sure. Ideally it would be something like "if the cache is full, remove the entry with the fewest hits".
As this is hard to manage, "if the cache is full, remove the least recently used one" is more practicable.

Let's assume the app restarts at 02:00 with no user online. The caches are empty now, and we have a size limit of 256.
Then 300 different bots access the site, filling the cache. At 06:00 the first users / browsers start to hit the site, but the cache is full --> no more UAs will be stored, and the parsing needs to be done for each browser request. This makes the (nice) cache solution irrelevant.

MemoryCache.Compact provides a suitable implementation, but

  • it's a method on the concrete type, not on the abstraction, so we would need a type test (see the sketch below)
  • when is the right time to call this method?

These downsides make it less practical to use.
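
For reference, the type test from the first bullet would look roughly like this:

// Compact only exists on the concrete MemoryCache, not on IMemoryCache,
// so a type test is needed first.
if (memoryCache is MemoryCache concreteCache)
{
    concreteCache.Compact(percentage: 0.25); // evict ~25 % of the entries
}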

So taking these two points into account, should we add an LRU (least recently used) solution that allows specifying the size and some kind of expiration?

I didn't evaluate whether there are any libraries for this, but an LRU cache is quite simple to implement (I did some investigation on this for the portal some time ago).
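
For illustration, a minimal sketch of such an LRU cache (not thread-safe and without expiration; a real implementation would need both):

using System.Collections.Generic;

public sealed class LruCache<TKey, TValue> where TKey : notnull
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> _order = new();

    public LruCache(int capacity) => _capacity = capacity;

    public bool TryGetValue(TKey key, out TValue value)
    {
        if (_map.TryGetValue(key, out var node))
        {
            _order.Remove(node);    // move to the front = most recently used
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }

        value = default!;
        return false;
    }

    public void Set(TKey key, TValue value)
    {
        if (_map.TryGetValue(key, out var existing))
        {
            _order.Remove(existing);
            _map.Remove(key);
        }
        else if (_map.Count >= _capacity)
        {
            LinkedListNode<(TKey Key, TValue Value)> lru = _order.Last!;
            _order.RemoveLast();    // evict the least recently used entry
            _map.Remove(lru.Value.Key);
        }

        var node = new LinkedListNode<(TKey Key, TValue Value)>((key, value));
        _order.AddFirst(node);
        _map[key] = node;
    }
}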

@BenjaminAbt
Member

BenjaminAbt commented May 18, 2021

My suggestion:

AddHttpUserAgentMemoryCachedParser() - shared, no additional options

AddHttpUserAgentMemoryCachedParser(o =>
{
    o.AddIsolatedCache(); // our defaults
});

AddHttpUserAgentMemoryCachedParser(o =>
{
    o.AddIsolatedCache(mco => ...); // own options
});

or we drop the shared cache entirely (and wait for feedback)

or shared is just an option:

AddHttpUserAgentMemoryCachedParser()         // our cache, our defaults

AddHttpUserAgentMemoryCachedParser(o => ...) // own options

AddHttpUserAgentSharedMemoryCachedParser()   // shared cache, explicit opt-in
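
Usage would then look something like this (this is only the proposal, not shipped API; mco is assumed to be a MemoryCacheOptions):

// our cache, our defaults
services.AddHttpUserAgentMemoryCachedParser();

// own options for the isolated cache
services.AddHttpUserAgentMemoryCachedParser(o => o.AddIsolatedCache(mco => mco.SizeLimit = 256));

// explicit opt-in to the shared IMemoryCache from DI
services.AddHttpUserAgentSharedMemoryCachedParser();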
