Task/WG-383: add queue for heavy tasks (#220)
* Rework worker Dockerfile and bump PotreeConverter

* Fix docker compose commands

* Update and improve custom html template

* Activate conda when starting bash on running container

* Add an additional test

* Improve nginx.conf to allow range requests for potree bin files

* Fix adding of nsf_logo.png

* Fix unit test

* Add vim package

* Do not need to install laszip

* Clean up dockerfile

* Update deployed nginx.conf to allow range requests for potree bin files

* Simplify entrypoint script

* Refactor order of Dockerfile

* Fix nginx.conf so range requests work on Firefox

* Remove unused background image in template

Also fixes how the nsf logo snippet is applied.

* Improve comment

* Fix PYTHONPATH

* Fix .bin settings

* Unify how some settings are set

* Lower max body size

This was set high because we previously supported direct file upload instead of going through TAPIS.

* Improve error handling to log memory issues

* Add heavy queue for computationally intensive tasks.
nathanfranklin authored Oct 31, 2024
1 parent f50e8e9 commit 9027c0f
Showing 5 changed files with 22 additions and 4 deletions.
7 changes: 6 additions & 1 deletion devops/docker-compose.local.yml
@@ -59,7 +59,12 @@ services:
     tty: true
     container_name: geoapiworkers
     hostname: geoapiworkers
-    command: "celery -A geoapi.celery_app worker -l info"
+    command: >
+      sh -c '
+      celery -A geoapi.celery_app worker -l info -Q default -n default_worker@geoapi &
+      celery -A geoapi.celery_app worker -l info -Q heavy --concurrency=6 -n heavy_worker@geoapi &
+      wait
+      '
 
   celerybeat:
     image: taccaci/geoapi-workers:local
7 changes: 6 additions & 1 deletion devops/geoapi-workers/docker-compose.yml
@@ -18,7 +18,12 @@ services:
       driver: syslog
       options:
         tag: geoapi_workers
-    command: "celery -A geoapi.celery_app worker -l info"
+    command: >
+      sh -c '
+      celery -A geoapi.celery_app worker -l info -Q default -n default_worker@geoapi &
+      celery -A geoapi.celery_app worker -l info -Q heavy --concurrency=6 -n heavy_worker@geoapi &
+      wait
+      '
 
   watchtower:
     image: containrrr/watchtower:1.7.1
8 changes: 8 additions & 0 deletions geoapi/celery_app.py
@@ -15,6 +15,14 @@
           broker=CELERY_CONNECTION_STRING,
           include=['geoapi.tasks'])
 
+# Define the queues
+app.conf.task_queues = {
+    'default': {'exchange': 'default', 'routing_key': 'default'},
+    'heavy': {'exchange': 'heavy', 'routing_key': 'heavy'}
+}
+
+app.conf.task_default_queue = 'default'
+
 app.conf.beat_schedule = {
     'refresh_projects_watch_content': {
         'task': 'geoapi.tasks.external_data.refresh_projects_watch_content',
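
As an editorial illustration of how these queue settings behave (the task names below are hypothetical and not part of this commit): a task registered without an explicit queue falls back to task_default_queue ('default'), a task registered with queue='heavy' is published to the heavy queue that only the heavy worker consumes, and callers can still override the queue per call.

    from geoapi.celery_app import app

    @app.task  # no queue specified, so it is published to app.conf.task_default_queue ('default')
    def light_example_task(x):
        return x + 1

    @app.task(queue='heavy')  # published to the 'heavy' queue consumed by the heavy worker
    def heavy_example_task(path):
        return path

    # The queue can still be overridden for a single invocation if needed:
    heavy_example_task.apply_async(args=['/tmp/example.laz'], queue='default')
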
2 changes: 1 addition & 1 deletion geoapi/tasks/external_data.py
@@ -167,7 +167,7 @@ def _handle_point_cloud_conversion_error(pointCloudId, userId, files, error_desc
                      f"Processing failed for point cloud ({pointCloudId})!")
 
 
-@app.task(rate_limit="1/s")
+@app.task(queue='heavy')
 def import_point_clouds_from_agave(userId: int, files, pointCloudId: int):
     with create_task_session() as session:
         user = session.query(User).get(userId)
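
A hedged usage sketch for the change above: enqueuing import_point_clouds_from_agave as before should now publish it to the 'heavy' queue, which only the worker started with -Q heavy --concurrency=6 consumes; the argument values here are placeholders, not real data.

    from geoapi.tasks.external_data import import_point_clouds_from_agave

    # .delay() publishes to the queue configured on the task ('heavy'), so the
    # default worker (started with -Q default) will not pick this job up.
    import_point_clouds_from_agave.delay(userId=1, files=[], pointCloudId=42)
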
2 changes: 1 addition & 1 deletion geoapi/tasks/streetview.py
@@ -199,7 +199,7 @@ def check_existing_upload(session, user, streetview_service, task_uuid, system_i
 
 
 # TODO: Ensure that just user works and not userid (previously took userid)
-@app.task(rate_limit="5/s")
+@app.task(queue='heavy')
 def from_tapis_to_streetview(user_id: int,
                              streetview_service_id: int,
                              system_id: str,
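
To sanity-check that both workers consume the expected queues after deploying this change, one option (a sketch assuming the Celery broker is reachable from wherever this snippet runs) is Celery's inspection API:

    from geoapi.celery_app import app

    # Ask each running worker which queues it is currently consuming from.
    inspector = app.control.inspect()
    for worker, queues in (inspector.active_queues() or {}).items():
        print(worker, [q['name'] for q in queues])
    # Expected, roughly: default_worker@geoapi -> ['default'], heavy_worker@geoapi -> ['heavy']
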
