
feat(event cache): avoid storing useless prev-batch tokens #4427

Open · wants to merge 7 commits into main
@bnjbvr (Member) commented on Dec 17, 2024

We currently have a problem with the event cache storage: as soon as we receive a previous-batch token, either from sync or from a previous back-pagination, we consider that there may be more events to retrieve in the past. Later back-paginations may then reveal that those events are duplicates. But since each back-pagination returns yet another previous-batch token until we hit the start of the timeline, we get stuck in back-pagination mode until we've reached the start of the timeline again.

That's bad, because it makes the event cache store basically useless: every time you restore a session, you may receive a previous-batch token from sync (even more so with sliding sync, which gives one previous-batch token when timeline_limit=1, then another when the timeline limit expands to 20). As a result, you back-paginate all the way back to the start of the history.

This series of commits fixes that, by introducing two new rules:

  • in back-pagination, don't assume the absence of a gap always means we're waiting for a prev-batch token from sync. This is only true if there were no events stored in the linked chunk before; if there are any, then we don't need to restart back-pagination from the end of the timeline.
  • in back-pagination or sync, don't store a previous-batch token if all the events we've received (at least one) were already known (i.e., they're duplicates).
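The second rule can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the actual matrix-rust-sdk API: the type and function names (`PaginationResponse`, `token_to_store`) are made up for the example.

```rust
use std::collections::HashSet;

/// Hypothetical shape of a sync or back-pagination response: the event
/// IDs it carried, and the previous-batch token it returned, if any.
struct PaginationResponse {
    event_ids: Vec<String>,
    prev_batch: Option<String>,
}

/// Rule 2: if the response contained at least one event and all of them
/// are already known to the cache, the token points at history we already
/// have, so don't store it. Otherwise, keep the token as usual.
fn token_to_store(
    known_ids: &HashSet<String>,
    response: &PaginationResponse,
) -> Option<String> {
    let all_duplicated = !response.event_ids.is_empty()
        && response.event_ids.iter().all(|id| known_ids.contains(id));
    if all_duplicated {
        None
    } else {
        response.prev_batch.clone()
    }
}
```

Note the "at least one" guard: an empty response tells us nothing about duplication, so its token is kept.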

With these two rules, we no longer store useless previous-batch tokens, and we'll back-paginate at most once for a given room. Interestingly, this might uncover some bugs related to back-pagination orderings, so we'll desperately want #4408 <3

Part of #3280.

@bnjbvr requested a review from a team as a code owner on December 17, 2024.
@bnjbvr requested a review from stefanceriu and removed the request for a team on December 17, 2024.