
Upload Ingestion Server's TSV files to AWS S3 (skip tags) #4529

Merged: 8 commits, Jun 27, 2024

Conversation

@krysal (Member) commented Jun 20, 2024

Fixes

Fixes #3912 by @krysal

Description

Adds a function that uploads the produced files of cleaned data to AWS S3 and deletes them locally if the upload succeeds.

Testing Instructions

  1. Set the required environment variables in your `.env` file:

     ```
     AWS_REGION="us-east-1"
     AWS_ACCESS_KEY_ID="test_key"
     AWS_SECRET_ACCESS_KEY="test_secret"
     AWS_S3_ENDPOINT="http://s3:5000"
     ```

  2. Spin up the services:

     ```
     just a && just c
     ```

  3. Make sure the `openverse-catalog` bucket is created in MinIO (or whichever bucket you'll be using; it can be configured with `OPENVERSE_BUCKET`): http://localhost:5011/browser/

  4. Make some rows of the `image` table in the catalog "dirty" by removing the protocol from one or more of the URL fields (`url`, `creator_url`, or `foreign_landing_url`).

  5. Run an image data refresh:

     ```
     # Optionally, delete the image index first to avoid an "index already exists" error
     just docker/es/delete-index image-init

     just ingestion_server/ingest-upstream "image" "init"
     ```

  6. Check that the files are in the expected path in MinIO (http://localhost:5011/browser/openverse-catalog) and that the URL values are not surrounded by quotation marks (they are unnecessary).

Checklist

  • My pull request has a descriptive title (not a vague title like `Update index.md`).
  • My pull request targets the default branch of the repository (main) or a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible errors.
  • I ran the DAG documentation generator (./ov just catalog/generate-docs for catalog PRs) or the media properties generator (./ov just catalog/generate-docs media-props for the catalog or ./ov just api/generate-docs for the API) where applicable.

Developer Certificate of Origin

Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

@openverse-bot added the 🚦 status: awaiting triage (Has not been triaged & therefore, not ready for work) label on Jun 20, 2024
@krysal added the 🟧 priority: high, 💻 aspect: code, 🧰 goal: internal improvement, and 🧱 stack: ingestion server labels, and removed the 🚦 status: awaiting triage label, on Jun 20, 2024
@krysal marked this pull request as ready for review June 20, 2024 21:44
@krysal requested review from a team as code owners June 20, 2024 21:44
@krysal requested reviews from zackkrida and sarayourfriend and removed the request for a team June 20, 2024 21:44
@AetherUnbound (Collaborator) left a comment

This LGTM! Not commenting on the other functionality from the split PR, but the code here makes sense and I have verified that the files get uploaded to minio appropriately 🚀

@openverse-bot (Collaborator) commented:

Based on the high urgency of this PR, the following reviewers are being gently reminded to review this PR:

@zackkrida
@sarayourfriend
This reminder is being automatically generated due to the urgency configuration.

Excluding weekend¹ days, this PR was ready for review 2 day(s) ago. PRs labelled with high urgency are expected to be reviewed within 2 weekday(s)².

@krysal, if this PR is not ready for a review, please draft it to prevent reviewers from getting further unnecessary pings.

Footnotes

  1. Specifically, Saturday and Sunday.

  2. For the purpose of these reminders we treat Monday - Friday as weekdays. Please note that the operation that generates these reminders runs at midnight UTC on Monday - Friday. This means that depending on your timezone, you may be pinged outside of the expected range.

@sarayourfriend (Collaborator) left a comment

LGTM 👍 Just one non-blocking comment to request inline explanation of the skipped cases.

Comment on lines 331 to 332:

```python
if not file_path.exists():
    continue
```
@sarayourfriend commented Jun 25, 2024:

Can you add a comment with the explanation you shared in the other PR for why this is an expected and good condition (not an error)? Otherwise, it's definitely not clear.

Although, I'm actually still not clear.

I understand tags won't have a file. That's fine, but then can we explicitly skip it in the list of files we are looking at? That would be much clearer for that case and implicitly document that we know the tags file does not exist.

However, I still don't understand the bit about how these files will, over time, no longer exist. Why would that happen? Once we've applied the fix upstream, is there a point where the data refresh would have "half fixed" data and some of the cleaning would stop? That's fine, but I wanted to clarify: it isn't clear from this code alone, yet the condition is a significant one with no explanation, and it apparently matters quite a bit, with many implementation details behind the reason for it.

@krysal (Member, Author) replied:

> I don't understand the bit about how these files will over time no longer exist. Why would that happen? Once we've applied the fix upstream, is there a point where the data refresh would have "half fixed" data, and some of the cleaning would stop?

The cleaning steps will remain for a time, say one or two data refresh processes, until we're confident that nothing is left pending in that ETL. The cleaning won't stop midway by itself, so we'd need to catch these cases once the files are no longer produced.

Thanks for raising the flag here. It was clear to me, but now I see I was assuming many things. I added a comment that I hope gives more context.
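The explicit-skip approach suggested in the review above could be sketched roughly like this (the field names, file naming, and helper are illustrative assumptions, not the PR's actual cleanup code):

```python
from pathlib import Path

# Fields whose cleanup writes a replacement TSV. "tags" is cleaned in
# place and never produces a file, so it is excluded up front rather
# than silently skipped when its file turns out to be missing.
CLEANED_FIELDS = ("url", "creator_url", "foreign_landing_url", "tags")
FIELDS_WITHOUT_FILES = {"tags"}


def files_to_upload(tmp_dir):
    """Return the TSV paths expected to exist after a cleanup run."""
    return [
        Path(tmp_dir) / f"{field}.tsv"
        for field in CLEANED_FIELDS
        if field not in FIELDS_WITHOUT_FILES
    ]
```

Listing the excluded fields by name documents the expectation in code, instead of leaving a bare `continue` whose reason readers have to reconstruct.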

@krysal force-pushed the feat/ing_server_s3_upload branch from b4172e4 to 08ab020 on June 26, 2024 16:57
Base automatically changed from feat/ing_server_s3_upload to main June 26, 2024 22:43
@krysal merged commit 97ff97d into main Jun 27, 2024
45 checks passed
@krysal deleted the feat/ing_server_s3_upload_2 branch June 27, 2024 22:11
Successfully merging this pull request may close these issues.

Upload Ingestion Server's TSV files to AWS S3
4 participants