Issue
Users experience that file listings hang when they request them.
In the worst case the listing simply times out.
Desired solution
As a user I want (1) fast listing of at least some files, and (2) that file listings do not time out, even when there is a very, very large number of files.
For the migrid instance we want correct, efficient and well-performing file listings.
In the code base we ideally want tests and benchmarks on very large file listings.
Assumed reasons for the issue
Our assumption is that it happens in sharelink listings when there are many links.
In internal ticket #33432 we got a report from a user that ls was also hanging when run on a network drive.
From Apache logs (ssl-sid-error.log) we get warnings and errors like:
[cgi-warn] <...> Timeout waiting for output from CGI script /home/mig/mig/cgi-sid/ls.py
[cgi-error] <...> Script timed out before returning headers: ls.py
The above were triggered by ls.py when a user opened a sharelink on the web.
The assumed reason is that the 300s timeout occurs before the list to be displayed has been built.
The performance of the file system is part of the problem.
Suggestions for fix
Reuse the file manager we use in Files and in the file chooser on createfreeze.
The task involves (1) server-side implementation of pagination in the Python code, and (2) front-end pagination with AJAX, so that we load smaller chunks to reduce initial load time and prevent timeouts.
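The server-side half of that task could look roughly like the sketch below: one page of entries per request, plus a total count so the front end knows how many pages exist. The function name, signature, and return shape are assumptions for illustration, not the actual migrid API.

```python
import os


def list_directory_page(path, offset=0, limit=100):
    """Return one page of directory entries plus the total count.

    Hypothetical helper; an AJAX front end would call it with
    increasing offsets instead of fetching the whole listing at once.
    """
    # Sorting requires materialising the names once, but only the
    # requested slice is serialised and sent to the client.
    names = sorted(entry.name for entry in os.scandir(path))
    return {
        "total": len(names),
        "offset": offset,
        "entries": names[offset:offset + limit],
    }
```

Each request then stays small and fast, so the 300s CGI timeout is no longer hit by a single huge response.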
AI tips, for what they are worth
WebSocket for real-time updates: Use WebSockets to establish a persistent connection between the client and server. This allows you to stream file listings in real-time as they're processed.
Background processing with task queue: Implement a task queue system (like Celery for Python) to process file listing in the background. The web interface can then poll for results or use WebSockets to get updates.
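As a minimal stand-in for a real task queue like Celery, the same idea can be sketched with the standard library alone: a daemon thread consumes listing jobs so the slow work runs off the request path. All names here are illustrative, not migrid code.

```python
import os
import queue
import threading


def start_listing_worker():
    """Start one daemon thread that serves directory-listing jobs.

    A job is a (path, result_dict, done_event) tuple; the web layer
    would poll the event (or be notified) instead of blocking.
    """
    jobs = queue.Queue()

    def worker():
        while True:
            path, result, done = jobs.get()
            try:
                # The slow part runs in the background thread.
                result["entries"] = sorted(os.listdir(path))
            except OSError as exc:
                result["error"] = str(exc)
            done.set()
            jobs.task_done()

    threading.Thread(target=worker, daemon=True).start()
    return jobs
```

A production setup would use a proper broker and persistent results, but the control flow is the same.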
Caching: Implement caching mechanisms to store file listings for a certain period. This can significantly reduce load times for frequently accessed directories.
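A simple in-process TTL cache illustrates the caching idea; the cache structure, the 30-second default, and the `loader` callback are all assumptions for the sketch, not existing migrid behaviour.

```python
import time

# Hypothetical in-process cache: path -> (timestamp, listing)
_CACHE = {}


def cached_listing(path, loader, ttl=30.0):
    """Return a cached listing if younger than ttl seconds, else reload.

    `loader` is any callable that produces the listing for `path`;
    repeated hits within the TTL skip the expensive file system walk.
    """
    now = time.time()
    hit = _CACHE.get(path)
    if hit is not None and now - hit[0] < ttl:
        return hit[1]
    listing = loader(path)
    _CACHE[path] = (now, listing)
    return listing
```

The trade-off is staleness: a short TTL keeps listings fresh while still absorbing bursts of repeated requests to the same sharelink.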
Optimize the file listing process: Instead of using os.listdir(), consider using more efficient methods like os.scandir() for large directories. You can also implement multi-threading to process large directories faster.
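The os.scandir() point can be made concrete: because it yields entries lazily, the first few names can be returned without ever materialising the full directory, unlike os.listdir(). The helper name and limit are illustrative.

```python
import os
from itertools import islice


def first_entries(path, limit=100):
    """Yield at most `limit` names without building the full listing.

    os.scandir streams entries lazily, so a directory with millions
    of files does not force a full up-front materialisation.
    """
    with os.scandir(path) as it:
        for entry in islice(it, limit):
            yield entry.name
```

Note that the streamed order is the file system's, not sorted; sorting would again require reading everything.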
Progressive loading with virtual scrolling: Implement virtual scrolling in your front-end, where only the visible items are rendered. This can handle extremely large lists efficiently.
Optimize the file system itself through various strategies.