Populate from gregor metadata #3990
Conversation
```python
    return family_counts


def _add_tag_type_counts(project_guid, project_variant_tags):
```
Copy-pasted from `project_api`; does not need review.
```python
    return num_new, len(update_tags)


def _set_updated_tags(key: tuple[int, str], metadata: dict[str, dict], support_var_ids: list[str],
```
Copy-pasted from `summary_data_api`; does not need review.
```python
    }, update_json


def bulk_create_tagged_variants(family_variant_data, tag_name, get_metadata, user, project=None, load_new_variant_data=None):
```
Adapted from `summary_data_api`; needs a less thorough review.
```diff
     if not is_google_bucket_file_path(gs_path):
         raise Exception('A Google Storage path is expected.')

     # Anvil buckets are requester-pays and we bill them to the anvil project
-    google_project = get_google_project(gs_path)
+    google_project = get_google_project(gs_path) if not no_project else None
```
why aren't we including a billing project when importing the gregor metadata file?
Because that workspace is somehow a special snowflake, and including the billing project causes it to fail 🤷
```diff
@@ -167,6 +167,8 @@ export const updateIndividuals = values => (dispatch, getState) => {
     action = `save_individuals_table/${values.uploadedFileId}`
   } else if (values.delete) {
     action = 'delete_individuals'
+  } else if (values.workspaceName) {
```
Can you make `uploadedFileId`, `delete`, and `workspaceName` constants and use them wherever they're being defined originally? I know that `workspaceName` comes from a field on the uploaded Gregor metadata form, but I haven't figured out yet where `uploadedFileId` comes from.
I think fully refactoring a helper function used in several different places, plus updating a whole lot of forms unrelated to this change to use those constants, is outside the scope of this PR, but I am happy to add comments to make it clearer.
I agree that's out of scope; comments about where those fields come from would help too.
```python
def _iter_metadata_table(file_path, table_name, user, filter_row):
    file_name = f'{file_path}/{table_name}.tsv'
```
Do these files come from the gregor report generated by seqr?
Kind of. All the centers submit the same reports, and then the DCC combines them into one big multi-site report. But in general, we want the schema for the report we generate to match the schema for the metadata we import here.
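A minimal sketch of the kind of helper `_iter_metadata_table` appears to be, based on the signature in the diff above: iterate rows of a per-table TSV file and keep only those matching a caller-supplied filter. The `csv.DictReader`-based body is an assumption (the real helper reads from a Google bucket, not a local file object):

```python
import csv
import io

def iter_metadata_table(file_obj, filter_row):
    # Hypothetical stand-in for _iter_metadata_table: read a tab-separated
    # metadata table and yield only the rows that pass the caller's filter.
    for row in csv.DictReader(file_obj, delimiter='\t'):
        if filter_row(row):
            yield row

# Demo: filter a tiny in-memory stand-in for a participant.tsv table
demo = io.StringIO('participant_id\tsex\np1\tF\np2\tM\n')
females = list(iter_metadata_table(demo, lambda row: row['sex'] == 'F'))
```

Keeping the filter as a callable lets each metadata table (participant, analyte, etc.) supply its own row predicate without duplicating the TSV-parsing loop.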
```diff
@@ -1087,6 +1342,10 @@ class LocalIndividualAPITest(AuthenticationTestCase, IndividualAPITest):
     fixtures = ['users', '1kg_project', 'reference_data']
     HAS_EXTERNAL_PROJECT_ACCESS = False

+    def test_import_gregor_metadata(self, *args):
+        # Importing gregor metadata does not work in local environment
```
Should this be removed? It looks like you do have another `test_import_gregor_metadata`.
So the way this test file is structured, it has a base `IndividualAPITest` object that's not actually a runnable test case, and then it has `LocalIndividualAPITest` and `AnvilIndividualAPITest` that both inherit from that object plus from a different auth base class, so that every test in `IndividualAPITest` is by default run under both local-install and AnVIL auth conditions. However, this test fails under local conditions because you can't import metadata from AnVIL buckets without AnVIL, so for `LocalIndividualAPITest` we override this function so it doesn't run.
Oh, I missed that `test_import_gregor_metadata` is in a different `*APITest` class. Thanks for explaining that!
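The inheritance pattern described in that reply (a shared test mixin run under two auth setups, with one subclass overriding a test to opt out) can be sketched like this; the class and method names echo the thread, but the assertion bodies are purely illustrative:

```python
import unittest

class IndividualAPITest:
    # Shared test definitions. Not collected as a test case on its own
    # because it does not subclass unittest.TestCase.
    def test_import_gregor_metadata(self):
        self.assertTrue(self.HAS_EXTERNAL_PROJECT_ACCESS)

class AnvilIndividualAPITest(IndividualAPITest, unittest.TestCase):
    # Runs every inherited test under AnVIL-style conditions.
    HAS_EXTERNAL_PROJECT_ACCESS = True

class LocalIndividualAPITest(IndividualAPITest, unittest.TestCase):
    HAS_EXTERNAL_PROJECT_ACCESS = False

    def test_import_gregor_metadata(self, *args):
        # Importing gregor metadata does not work in a local environment,
        # so override the inherited test with a no-op.
        pass
```

Each subclass mixes the shared tests with its own auth setup, so one test definition is exercised under both environments, and the local subclass can shadow any test that only makes sense with AnVIL access.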