Revise usage of the memory-cache and size-limits #7
I learnt a bit about this. Key points listed below.

- "If the cache size limit is set, all entries must specify size." This means that if an entry is added without specifying a size, an exception is thrown (which could crash the app). So one has to be very careful when using the default `MemoryCache` (= shared cache instance) provided by ASP.NET Core DI, as other usage of that instance may cause the exception.
- Also, "All users of a cache instance should use the same unit system", which can't be guaranteed with the shared cache instance.

We have two (?) choices:

- keep the shared cache instance and document the "trap", or
- give the parser its own isolated cache instance.
My tendency ATM is towards the second bullet. For bullet one we would need to document this "trap", but it's a bit astonishing, and a lot of developers don't read docs, so our library may cause crashes and be blamed for them while being innocent. Also note, from the docs: "An entry will not be cached if the sum of the cached entry sizes exceeds the value specified by `SizeLimit`."
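For illustration, a minimal sketch of that behaviour with plain `Microsoft.Extensions.Caching.Memory` (the keys, values, and limit here are made up, not anything from our library):

```csharp
using Microsoft.Extensions.Caching.Memory;

// Once SizeLimit is set, *every* entry added to this cache must declare a Size.
var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 1024 // in whatever unit the cache's users agree on
});

// Fine: the entry declares its size (1 unit).
cache.Set("ua:firefox", "parsed result", new MemoryCacheEntryOptions { Size = 1 });

// Throws InvalidOperationException at runtime, because no Size is specified.
cache.Set("ua:chrome", "parsed result");
```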
So the registration could look like this (shared stays the default, an isolated cache is opt-in):

```csharp
AddHttpUserAgentMemoryCachedParser()       // shared, no additional options

AddHttpUserAgentMemoryCachedParser(o =>
{
    o.AddIsolatedCache();                  // our defaults
});

AddHttpUserAgentMemoryCachedParser(o =>
{
    o.AddIsolatedCache(mco => ...);        // own options
});
```

Or we drop the shared cache entirely (and wait for feedback), or shared is just an option:

```csharp
AddHttpUserAgentMemoryCachedParser()         // our cache, our defaults
AddHttpUserAgentMemoryCachedParser(o => ...) // own options
AddHttpUserAgentSharedMemoryCachedParser()
```
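As a rough sketch of what "isolated cache" could mean internally (hypothetical type and method names, placeholder limits, not the library's actual implementation): the parser would own its own `MemoryCache` instance instead of resolving the shared `IMemoryCache` from DI, so its `SizeLimit` and unit system can't clash with other cache users.

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical sketch: the parser owns a private MemoryCache instead of
// the DI-shared IMemoryCache, so size limit and unit system stay under our control.
public sealed class IsolatedUserAgentCache
{
    private readonly MemoryCache _cache;

    public IsolatedUserAgentCache(MemoryCacheOptions? options = null)
    {
        _cache = new MemoryCache(options ?? new MemoryCacheOptions
        {
            SizeLimit = 256 // placeholder limit; every entry below counts as 1 unit
        });
    }

    public TResult GetOrParse<TResult>(string userAgent, Func<string, TResult> parse)
    {
        return _cache.GetOrCreate(userAgent, entry =>
        {
            entry.Size = 1;                                 // same unit system for all entries
            entry.SlidingExpiration = TimeSpan.FromDays(1); // placeholder eviction policy
            return parse(userAgent);
        })!;
    }
}
```

With something like this, the shared `IMemoryCache` would only come into play via the explicit "shared" registration above.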
Cf. #1 (comment) and #1 (comment)
Use `IMemoryCache`.