All large files in cohort folders should be placed on Archive Instant tiering #719
Comments
Do these come in as …
Please do run this past Flo to check. Is this managed by the bucket lifecycle? @reisingerf
The bucket (byob prefix) is configured to push everything into Intelligent-Tiering (see here). AFAIK we can't decide on the storage tier once it's under Intelligent-Tiering (that is then handled automatically).
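For context, "push everything into Intelligent-Tiering" is typically implemented with a catch-all lifecycle rule like the one below. This is a minimal boto3 sketch, not the actual BYOB configuration linked above; the bucket name and prefix are placeholders.

```python
# Minimal sketch (assumed names, not the actual BYOB config) of a catch-all
# lifecycle rule that moves every object under a prefix into Intelligent-Tiering.
# Once objects are in Intelligent-Tiering, the access tier is managed by S3.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-pipeline-cache-bucket",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "byob-to-intelligent-tiering",
                "Status": "Enabled",
                "Filter": {"Prefix": "byob/"},  # placeholder prefix
                "Transitions": [
                    # Days=0 transitions objects shortly after they are written
                    {"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}
                ],
            }
        ]
    },
)
```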
Hang on. I recall from early discussions that we didn't want to put any objects in an operational-ready store (like the pipeline-cache bucket) into archive tier classes, but rather to move them into a dedicated archive bucket. Are we changing this view now due to the complexity / current situation?
Adding to this, I think we can specify …
What about 'S3 Glacier Instant Retrieval', which we could force .bam files into after, say, one week? The API / retrieval pricing of S3 Glacier Instant Retrieval is $0.03 per GB, so a 100 GB BAM would cost $3 to retrieve. The same BAM would cost $5.10 for its first 90 days of storage on Intelligent-Tiering.
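If we went that way, one caveat: as far as I know, S3 lifecycle filters match on prefix, object tags, or object size rather than on file extension, so "only .bam files" would likely need a tag applied when the BAM is written. A hedged boto3 sketch, with the bucket name and the tag key/value as illustrative assumptions:

```python
# Hedged sketch: transition objects tagged as BAMs to Glacier Instant Retrieval
# after one week. The bucket name and the filetype=bam tag are assumptions;
# the tag would have to be set when the BAM is written to the bucket.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-pipeline-cache-bucket",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "bam-to-glacier-instant-retrieval",
                "Status": "Enabled",
                "Filter": {"Tag": {"Key": "filetype", "Value": "bam"}},  # assumed tag
                "Transitions": [
                    {"Days": 7, "StorageClass": "GLACIER_IR"}
                ],
            }
        ]
    },
)
```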
All good points, but those are optimisations in my view... Ultimately, I'd like to get to a point where we have different storage back-ends, with different retention / tiering options, and can choose between them based on use case (project, research, clinical, ...) and potentially cost attribution. Having said that: yes, for well-known use cases / projects, we could start by changing the lifecycle configuration and managing it on a per-cohort/project prefix rather than for the whole BYOB share.
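A sketch of what the per-cohort/project option could look like; the bucket name, cohort prefixes, target storage class, and transition window are all assumptions for illustration, not existing configuration:

```python
# Hedged sketch: one lifecycle rule per cohort/project prefix instead of a
# single rule for the whole BYOB share. All names below are placeholders.
import boto3

s3 = boto3.client("s3")

cohort_prefixes = ["byob/cohort-alpha/", "byob/cohort-beta/"]  # placeholders

rules = [
    {
        "ID": f"archive-{prefix.rstrip('/').split('/')[-1]}",
        "Status": "Enabled",
        "Filter": {"Prefix": prefix},
        "Transitions": [{"Days": 7, "StorageClass": "GLACIER_IR"}],
    }
    for prefix in cohort_prefixes
]

# Note: this call replaces the bucket's whole lifecycle configuration, so any
# existing catch-all Intelligent-Tiering rule would need to be re-included.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-pipeline-cache-bucket",  # placeholder bucket name
    LifecycleConfiguration={"Rules": rules},
)
```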
We have a set of cohort data split into projects. Any BAM files in the `cohort-*` directories should be sent to the Intelligent-Tiering 'Archive Instant' tier.