
tardis_portal add large Datafile times out worker #2

Open
avrljk opened this issue Feb 13, 2018 · 1 comment
avrljk commented Feb 13, 2018

Adding a real-life 3.2 GB .PvDatasets file fails about halfway through on my PC.

From the log:

django_1    | [2018-02-13 06:39:07 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:97)
django_1    | [2018-02-13 06:39:09 +0000] [310] [INFO] Booting worker with pid: 310
django_1    | [2018-02-13 06:39:39 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:75)
django_1    | [2018-02-13 06:39:41 +0000] [314] [INFO] Booting worker with pid: 314

I've asked Andrew for a list of recent file sizes generated by instrument users. A histogram should help us choose a suitable timeout.
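
In the meantime, a back-of-envelope lower bound for the timeout can be taken from the largest file mentioned above and an assumed upload rate (the throughput figure is an assumption, not a measurement):

    # Rough sketch: the worker timeout must at least cover the slowest
    # plausible upload. Only the 3.2 GB figure comes from this issue; the
    # 20 MB/s sustained rate is assumed.
    max_size_bytes = 3.2 * 1024**3
    throughput_bps = 20 * 1024**2

    worst_case_seconds = max_size_bytes / throughput_bps
    print(f"timeout should exceed ~{worst_case_seconds:.0f} s")  # ~164 s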

dean-taylor (Member) commented:

This could be related to:

  1. An incorrect worker type. The MyTardis front end that handles file uploads (in development there is only one) should be set to the Gunicorn gevent worker class, which allows large streamed file uploads; the default worker is designed for standard web content and times out after 30 seconds by default. Please ensure the environment variable GUNICORN_WORKER_CLASS=gevent is passed to the Django application responsible for this type of request. You should be able to verify the worker type from the startup logs; see the configuration sketch after this list.
  2. A known limitation in the current version of MyTardis: large file uploads using this method have been reported to cause excessively high process usage, and this upload path is discouraged for large file sets. The recommendation is to use a staging area with SCP or some other file transfer method, then a post-upload worker to ingest the files into their permanent data location (a sketch follows the configuration example below). Running this on a development host may have hit this code limitation.
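
A minimal sketch of point 1, assuming the image wires GUNICORN_WORKER_CLASS through a gunicorn.conf.py (the actual MyTardis entrypoint may consume the variable differently):

    # gunicorn.conf.py -- illustrative sketch; how the real MyTardis image
    # consumes GUNICORN_WORKER_CLASS in its entrypoint may differ.
    import os

    # gevent workers handle long streamed uploads cooperatively; the default
    # sync worker is killed after `timeout` seconds of handling one request.
    worker_class = os.environ.get("GUNICORN_WORKER_CLASS", "sync")

    # Raising the timeout is a secondary safeguard for any sync workers.
    timeout = int(os.environ.get("GUNICORN_TIMEOUT", "30"))

Started with something like GUNICORN_WORKER_CLASS=gevent gunicorn -c gunicorn.conf.py tardis.wsgi (WSGI module path assumed), the boot messages should then name the gevent worker class. Note that the gevent package must be installed for this worker to load.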
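
For point 2, a bare-bones sketch of the staged-ingest idea; the paths and the register_datafile helper are hypothetical placeholders, not MyTardis API:

    # ingest_staged.py -- illustrative only; register_datafile stands in for
    # whatever actually creates the DataFile records in MyTardis.
    import shutil
    from pathlib import Path

    STAGING = Path("/srv/staging")      # assumed SCP drop-off directory
    PERMANENT = Path("/srv/store")      # assumed permanent data location

    def register_datafile(path: Path) -> None:
        """Hypothetical stand-in for recording the file in MyTardis."""
        print(f"registered {path}")

    def ingest_staged_files() -> None:
        # The bytes arrived out-of-band (e.g. via SCP), so no web worker is
        # tied up during the transfer and no Gunicorn timeout applies.
        for src in STAGING.iterdir():
            if src.is_file():
                dest = PERMANENT / src.name
                shutil.move(str(src), dest)
                register_datafile(dest)

    if __name__ == "__main__":
        ingest_staged_files()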
