Add support for IDistributedCache to the SDK #196

Closed
xantari opened this issue Feb 25, 2020 · 13 comments

@xantari

xantari commented Feb 25, 2020

Motivation

Currently, the SDK provides a CacheManager that uses IMemoryCache.

To support more advanced scenarios built on commonly used distributed caching implementations such as:

  • Microsoft.Extensions.Caching.Distributed.MemoryDistributedCache
  • Microsoft.Extensions.Caching.Redis.RedisCache
  • Microsoft.Extensions.Caching.SqlServer.SqlServerCache
  • Microsoft.Extensions.Caching.StackExchangeRedis.RedisCache

we'd like to introduce support for the IDistributedCache interface.
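
For context, each of the implementations above plugs in through the standard Microsoft.Extensions.Caching registration extensions; a rough sketch (the connection string name, schema, and table below are assumptions):

```csharp
// Illustrative ConfigureServices snippet - an app would register a single IDistributedCache provider.
public void ConfigureServices(IServiceCollection services)
{
    // In-memory MemoryDistributedCache - handy for local development.
    services.AddDistributedMemoryCache();

    // Or the SQL Server-backed cache (Microsoft.Extensions.Caching.SqlServer).
    services.AddDistributedSqlServerCache(options =>
    {
        options.ConnectionString = Configuration.GetConnectionString("CacheDb"); // assumed name
        options.SchemaName = "dbo";
        options.TableName = "Cache";
    });
}
```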

Proposed solution

There are several approaches that we can take:

  1. Create a new implementation of the IDeliveryCacheManager that would be able to serialize the Delivery*Response objects. Depending on whether the implementation is byte[]-based or string-based, it will require some changes in the Delivery*Response and underlying objects (such as Taxonomy) in terms of decorating them with certain attributes, implementing serializers, etc.
    If we choose to go this way, I'd vote for:
  • moving all the logic from the Delivery*Response objects to the DeliveryClient (everything that's related to IModelProvider or Newtonsoft.Json)
  • removing Kentico.Kontent.Delivery.Abstractions's dependency on Newtonsoft.Json.Linq
  • getting rid of dynamic properties - I think we can replace them with strongly typed collections of objects (not sure why we haven't done so yet)
  • making all the models POCOs with public constructors
    This approach is the best performance-wise, as it stores the objects after the strong typing facilitated by IModelProvider. It also aligns with the caching model we laid out when introducing the IDeliveryCacheManager.
  2. Implement HTTP response-level caching. This can be achieved e.g. using Polly.
  3. Similar to (2) - introduce caching at the level of DeliveryClient.GetDeliverResponseAsync.

Let's try approach no. 1 first.
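
A rough sketch of what approach no. 1 could look like, assuming a simplified GetOrAddAsync-style contract and plain JSON string serialization (the actual IDeliveryCacheManager signature and the eventual serialization format may differ):

```csharp
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

// Simplified sketch of approach no. 1: a cache manager backed by IDistributedCache.
// The GetOrAddAsync shape and the JSON serialization are assumptions for illustration;
// a byte[]/BSON-based format and the real IDeliveryCacheManager contract may differ.
public class DistributedCacheManager
{
    private readonly IDistributedCache _cache;
    private readonly DistributedCacheEntryOptions _entryOptions =
        new DistributedCacheEntryOptions { SlidingExpiration = TimeSpan.FromMinutes(10) };

    public DistributedCacheManager(IDistributedCache cache) => _cache = cache;

    public async Task<T> GetOrAddAsync<T>(string key, Func<Task<T>> valueFactory)
    {
        var cached = await _cache.GetStringAsync(key);
        if (cached != null)
        {
            // Deserialization is what requires the Delivery*Response objects to be
            // POCOs with public constructors, as proposed above.
            return JsonSerializer.Deserialize<T>(cached);
        }

        var value = await valueFactory();
        await _cache.SetStringAsync(key, JsonSerializer.Serialize(value), _entryOptions);
        return value;
    }
}
```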

Additional context


@petrsvihlik petrsvihlik transferred this issue from kontent-ai/boilerplate-net Mar 12, 2020
@petrsvihlik petrsvihlik added this to the 13.0.0 milestone Mar 16, 2020
@petrsvihlik petrsvihlik changed the title Add DistributedCacheManager to project Add support for IDistributedCache to the SDK Mar 16, 2020
@petrsvihlik petrsvihlik modified the milestones: 13.0.0, vNext Mar 18, 2020
@petrsvihlik
Contributor

we'll probably use a DI-based contract resolver and load classes AsImplementedInterfaces

the question is whether to use Autofac or something simpler such as Scrutor: https://github.com/khellang/Scrutor
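
For illustration, the Scrutor-based registration could look roughly like this (the marker type and lifetime below are assumptions):

```csharp
// Hypothetical Scrutor registration: scan the SDK assembly and register every
// concrete class against the interfaces it implements.
services.Scan(scan => scan
    .FromAssemblyOf<DeliveryClient>()   // assumed marker type from the SDK assembly
    .AddClasses()
    .AsImplementedInterfaces()
    .WithSingletonLifetime());          // lifetime picked for illustration only
```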

@petrsvihlik
Contributor

done in #226

@petrsvihlik
Contributor

Hey @xantari
would you like to give it a try? Any feedback would help us a ton!

@xantari
Author

xantari commented Aug 17, 2020

@petrsvihlik Opened bug #227 that I found while testing this.

@xantari
Author

xantari commented Aug 17, 2020

@petrsvihlik Found another one: #228

@xantari
Author

xantari commented Aug 17, 2020

@petrsvihlik So far it seems to work well except that I've found a different issue I hadn't anticipated.

Everyone has been working from home for a few months now, so we are operating over VPN. Our VPN tunnel maxes out at around 1 MB/s (which is another issue altogether), and I don't have a super fast upload speed.

For some reason, the BSON-serialized data is fairly massive when it is placed into the cache, which sends/receives the data over the VPN to our SQL Server (it uses in-memory SQL tables, so it's supposed to be fast).

I have no idea why the BSON objects are so massive, but my home page now takes about 99 seconds to load, as it has to store/fetch the cached data over the VPN.

See here for length of data:

[screenshot: cache table entries with their data lengths]

You'll see our mega menu is 14 megabytes!

I'm trying to figure out why the BSON values are so large.

@xantari
Author

xantari commented Aug 17, 2020

BTW, I tested a bit with some of the objects as plain JSON, and they are just as big. So it's really a matter of figuring out how to get a local distributed cache set up for each developer that doesn't go over a slow data link...
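
One illustrative way to keep each developer on a local cache (a sketch with assumed names, not something from this thread) is to switch the IDistributedCache provider per environment:

```csharp
// Hypothetical per-environment switch inside Startup.ConfigureServices;
// 'env' is an injected IWebHostEnvironment and the connection string name is assumed.
if (env.IsDevelopment())
{
    // Local in-memory cache for developers - nothing goes over the VPN.
    services.AddDistributedMemoryCache();
}
else
{
    // Shared SQL Server-backed cache for deployed environments.
    services.AddDistributedSqlServerCache(options =>
    {
        options.ConnectionString = Configuration.GetConnectionString("CacheDb");
        options.SchemaName = "dbo";
        options.TableName = "Cache";
    });
}
```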

@xantari
Author

xantari commented Aug 17, 2020

BTW, Thanks so much for this feature!!!

@petrsvihlik
Contributor

@xantari thank you for testing it!

I added an example of how to make it work with a local Redis instance on Windows: https://github.com/Kentico/kontent-delivery-sdk-net/wiki/Caching-responses#distributed-caching---example-from-v1400-rc1
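
For reference, pointing the cache at a local Redis instance is roughly the following (the wiki page above has the actual example; the instance name here is made up):

```csharp
// Rough sketch of registering a local Redis-backed IDistributedCache
// (Microsoft.Extensions.Caching.StackExchangeRedis).
services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // default local Redis endpoint
    options.InstanceName = "KontentCache";    // assumed key prefix
});
```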

@petrsvihlik
Contributor

@xantari btw, I tested the Redis cache (.\bombardier.exe -l -c 125 -r 5000 -d 10s https://localhost:5001) on my local Windows machine against the .NET boilerplate project, and the results are pretty impressive: response times around 50 ms and throughput around 4.7K requests/s.
