Problems with large result sets #12
I also noticed longer delays when applying filter constraints in bigger page trees. Filtering for …

The idea of limiting results also came to my mind. Imagine you could use the currently selected page as context, so that you only search in its subpages. From the UI's point of view I am thinking of a checkbox in the modal wizard, "Search in subpages only". That may be helpful too. The question here is: does building the uid list of subpages maybe take longer than the actual search? :-) It really depends on the filter constraints and on whether the constraint fields have an index. I'm open for such a context option.

Limiting the result with "give me only 20 pages" seems fiddly to implement because of editors and their mount points. In a system with 10k pages I have seen response times of ~15s. IMHO this is okay. Even ~45s may be okay, but if it is going to take minutes, then something needs to change.

Question about your …
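The "Search in subpages only" idea above hinges on building the uid list of subpages first. As a hypothetical sketch (not the extension's actual code), the traversal over a `pid` → children map could look like this; the in-memory `pages` list stands in for the TYPO3 `pages` table, which a real implementation would query via SQL:

```python
# Hypothetical sketch: collect the uids of all subpages of the currently
# selected page by traversing a pid -> children map. The `pages` list is
# stand-in data; a real implementation would use recursive SQL or a
# TYPO3 repository instead.
from collections import defaultdict

pages = [
    {"uid": 1, "pid": 0},
    {"uid": 2, "pid": 1},
    {"uid": 3, "pid": 1},
    {"uid": 4, "pid": 2},
    {"uid": 5, "pid": 0},
]

def collect_subpage_uids(pages, root_uid):
    """Return the uids of all pages below root_uid (root itself excluded)."""
    children = defaultdict(list)
    for page in pages:
        children[page["pid"]].append(page["uid"])
    result, queue = [], [root_uid]
    while queue:
        uid = queue.pop()
        for child_uid in children[uid]:
            result.append(child_uid)
            queue.append(child_uid)
    return sorted(result)

print(collect_subpage_uids(pages, 1))  # [2, 3, 4]
```

Whether this is cheaper than the filtered search itself depends, as noted above, on tree size and on the indexes available for the constraint fields.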
Sorry, I did not see the question before. Answer: …

About the core feature: is there a core issue for this? (You mentioned it was mentioned in one of the issues, but it might be helpful to address this directly.) Making things configurable always has some drawbacks, but here I think it would be helpful. Alternatively, create an extension which implements this, or add it to this extension?
* A click on a wizard item adds the filter to the input field. There it could be extended.
* A double click on a wizard item applies the filter immediately.

Relates: #12
I meant this comment: https://forge.typo3.org/issues/92036#note-11

Would you like to beta test and check whether https://github.com/christophlehmann/pagetreefilter/tree/filter-input-in-wizard fits your needs?
Once I apply my fix in #19, this works nicely. It also seems to handle some result sets better. However, if I have a very large result set (e.g. table=pages, doktype=1), this may still fail, resulting in the result set of the previous request being displayed. However, I tested with …

(Also, the behaviour in v11 is a little different than before / in TYPO3 v10: I no longer see an error message if a request fails.)
If a constraint has a large result set, fetching the page tree takes much longer than without constraints and may result in errors.
I am assuming the extension may be fetching too many results. In the core, to handle the earlier performance problems with the page tree, the entire page tree is no longer fetched, only the expanded part (after several changes to the page tree mechanism).
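The core's lazy-loading behaviour described above can be illustrated with a hypothetical sketch (not the core's actual code): only children of expanded nodes are returned, so the response size stays bounded by what the tree actually renders:

```python
# Hypothetical sketch of lazy page tree loading: return only the nodes the
# tree needs to render, descending only into pages the user has expanded.
# `children_by_pid` and `expanded_uids` are assumed stand-in structures.

def fetch_visible_nodes(children_by_pid, expanded_uids, root=0):
    """Return the uids of all nodes visible in the tree, sorted."""
    visible, stack = [], [root]
    while stack:
        pid = stack.pop()
        for uid in children_by_pid.get(pid, []):
            visible.append(uid)
            if uid in expanded_uids:  # descend only into expanded nodes
                stack.append(uid)
    return sorted(visible)

tree = {0: [1, 5], 1: [2, 3], 2: [4]}
print(fetch_visible_nodes(tree, expanded_uids={1}))  # [1, 2, 3, 5]
print(fetch_visible_nodes(tree, expanded_uids=set()))  # [1, 5]
```

A filter that matches many pages effectively forces large parts of the tree to be expanded at once, which is where the long response times come from.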
If I filter with this extension "pagetreefilter" using a constraint with a large result set, I get one of the error messages listed below.
I still find the extension very nice and quite useful. The editors on our site will most likely not run into this problem, as they only see a much smaller subset of pages (due to the mount points used).
I am not using this in production yet; it would be nice to find a way to solve these problems. Will upgrade to v11 soon.
System
I am using …

The site has about 40,000 pages.
Reproduce
Watching the response times in the Network tab of the developer tools, I also see long response times for the filterData requests, e.g. ~29s.
Error messages
Browser console:
No message in the TYPO3 log (log file with ERROR log level).
This may be related to these core issues:
Possible solution
A possible solution might be to get the number of results first and, if it is too large, reject the constraint and fall back to the original TYPO3 behaviour, or abort the current filter with an error message ("result set too large"). (I have not really thought this through, just tossing out ideas. One difficulty is determining what kind of result set is "too large".)
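The count-first idea above could be sketched as follows. This is a hypothetical illustration, not the extension's code; `RESULT_LIMIT` and the `count_matches`/`fetch_matches` callables are assumptions standing in for a cheap `COUNT(*)` query and the real row fetch with the same constraints:

```python
# Hypothetical sketch of the proposed guard: count matches first and bail
# out before fetching when the result set exceeds a limit. The limit value
# and the two callables are illustrative assumptions.

RESULT_LIMIT = 500  # assumed threshold; what counts as "too large" is open

class ResultSetTooLarge(Exception):
    """Raised so the UI can show 'result set too large' or fall back."""

def filtered_rows(count_matches, fetch_matches, limit=RESULT_LIMIT):
    n = count_matches()  # cheap COUNT(*) using the same filter constraints
    if n > limit:
        raise ResultSetTooLarge(f"{n} matches exceed the limit of {limit}")
    return fetch_matches()  # only now fetch the full rows

# usage with stand-in callables instead of real database queries
rows = list(range(100))
print(filtered_rows(lambda: len(rows), lambda: rows)[:3])  # [0, 1, 2]
```

The count query adds one round trip, but it is usually far cheaper than materialising and serialising an oversized tree, and it gives the UI a chance to show a clear message instead of silently timing out.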