What would you like to be added:
Optimize the store.List behavior for FileStore by increasing the limit argument when listing files during sync operations.
Why is this needed:
Currently, when syncing large directories with FileStore, there are significant performance issues:
Current behavior:
listCommonPrefix uses maxResult=1000 for each list operation
FileStore reads the entire directory (say, over 10 million entries) to return just 1,000 files
This process repeats multiple times for large directories
Problems:
Extremely inefficient for directories with millions of files
Each list operation unnecessarily re-scans the entire directory
Makes syncing large directories (e.g., 10 million files) practically impossible
I hit this case recently: reading the entire directory takes around 90 seconds, but FileStore's List returns only 1,000 entries per call, which makes fetching all 10 million files through listCommonPrefix terrible.
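To make the cost concrete, here is a back-of-the-envelope estimate using the numbers above (90 s per full directory scan, 1,000 results per list call, 10 million files); the figures are from this report, not a benchmark:

```go
package main

import "fmt"

func main() {
	const totalFiles = 10_000_000
	const pageSize = 1000   // maxResult used by listCommonPrefix
	const scanSeconds = 90.0 // observed time for one full directory read

	// Each List call re-reads the whole directory, so the number of
	// full scans equals the number of pages.
	calls := totalFiles / pageSize
	totalHours := float64(calls) * scanSeconds / 3600
	fmt.Printf("%d list calls, ~%.0f hours of scanning\n", calls, totalHours)
}
```

With these inputs that is 10,000 full scans, roughly 250 hours of pure directory reading, which is why the sync is practically impossible.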
Proposed solution:
In listCommonPrefix, increase the limit to a higher value when the store is a FileStore.
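The proposed change can be sketched as follows. This is a minimal, self-contained illustration, not the project's actual code: the `ObjectStorage` interface, the `"file"` store name, and the `fileStoreLimit` value are all hypothetical stand-ins for whatever the real implementation uses.

```go
package main

import "fmt"

// ObjectStorage is a hypothetical stand-in for the store interface.
type ObjectStorage interface {
	Name() string
	List(prefix string, limit int64) ([]string, error)
}

// fileStore simulates a local file-backed store.
type fileStore struct{ keys []string }

func (f *fileStore) Name() string { return "file" }

func (f *fileStore) List(prefix string, limit int64) ([]string, error) {
	// A real FileStore walks the whole directory tree on every call,
	// which is why a small limit forces repeated full scans.
	if int64(len(f.keys)) < limit {
		limit = int64(len(f.keys))
	}
	return f.keys[:limit], nil
}

// listLimit picks a much larger page size for file-backed stores, where
// every List call re-reads the directory, while keeping the conservative
// 1000 default for object stores that page server-side.
func listLimit(store ObjectStorage) int64 {
	const defaultLimit = 1000
	const fileStoreLimit = 1_000_000 // hypothetical tuning value
	if store.Name() == "file" {
		return fileStoreLimit
	}
	return defaultLimit
}

func main() {
	s := &fileStore{keys: []string{"a", "b", "c"}}
	fmt.Println(listLimit(s)) // 1000000 for the file store
	keys, _ := s.List("", listLimit(s))
	fmt.Println(len(keys)) // 3
}
```

With a limit of 1,000,000 instead of 1,000, the 10-million-file case needs 10 full scans instead of 10,000, turning ~250 hours of scanning into minutes.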