
Not showing meta-analysis experiments in the score set creation dropdown menu. #310

Open

EstelleDa wants to merge 5 commits into release-2024.4.4
Conversation

EstelleDa
Member

No description provided.

Collaborator

bencap left a comment


My sense with this particular issue is that we should just filter the list of experiments returned to us by the API such that they exclude meta-analysis experiments. That way we don't need to make any API changes. The filter options here are a little wonky, since they don't actually serve any purpose at the moment other than denoting whether to show all experiments or just a user's experiments.
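For illustration only, here is a minimal TypeScript sketch of the client-side filtering described above. The Experiment shape and the metaAnalyzesScoreSetUrns field are assumptions made for the example, not the actual mavedb-ui or mavedb-api types.

```typescript
// Hypothetical, simplified experiment shape; only the fields used below are shown.
// `metaAnalyzesScoreSetUrns` is an assumed marker for meta-analysis experiments.
interface Experiment {
  urn: string
  title: string
  metaAnalyzesScoreSetUrns?: string[]
}

// Filter the experiments returned by the API on the client side,
// so no API changes are required.
function excludeMetaAnalyses(experiments: Experiment[]): Experiment[] {
  return experiments.filter(
    (experiment) =>
      !experiment.metaAnalyzesScoreSetUrns ||
      experiment.metaAnalyzesScoreSetUrns.length === 0
  )
}
```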

@EstelleDa
Member Author

EstelleDa commented Nov 13, 2024

My sense with this particular issue is that we should just filter the list of experiments returned to us by the API such that they exclude meta-analysis experiments. That way we don't need to make any API changes. The filter options here are a little wonky, since they don't actually serve any purpose at the moment other than denoting whether to show all experiments or just a user's experiments.

I changed the back-end code too, so meta-analysis experiments are already excluded before they reach the front end. The main filter function is in VariantEffect/mavedb-api#347.

@bencap
Collaborator

bencap commented Nov 14, 2024

I changed the back-end code too, so meta-analysis experiments are already excluded before they reach the front end. The main filter function is in VariantEffect/mavedb-api#347.

Yeah, I guess what I'm saying is that either we should make this query field hew closely to the score set structure (passing meta_analyzes_score_set: false would include only score sets that are not meta-analysis score sets, and passing user_id: <current_user_id> would return only the current user's score sets, instead of the weird status quo where passing some: value is what does that), or we should just leave the query JSON as something we fix later on and do the filtration on the front end.
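For clarity, a hedged sketch of the two query shapes being contrasted above; the keys are taken from the comment (meta_analyzes_score_set, user_id, some) rather than from a confirmed API schema, and the user id is a placeholder.

```typescript
// Placeholder for the signed-in user's id; hypothetical value for the example.
const currentUserId = '<current_user_id>'

// Status quo: an opaque flag whose only effect is "all experiments" vs. "mine".
const statusQuoQuery = { some: 'value' }

// Proposal: explicit filters that hew to the score set structure.
const proposedQuery = {
  meta_analyzes_score_set: false, // include only non-meta-analysis records
  user_id: currentUserId          // restrict to the current user's records
}

console.log(statusQuoQuery, proposedQuery)
```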

@EstelleDa
Member Author

EstelleDa commented Nov 15, 2024

@bencap Sorry, I don't quite follow what you mean, or maybe I'm misunderstanding something. This request is used to set up the experiment list. Why do we need to make this query field follow the score set structure for an experiment request? Do you mean I should change the query to something like query: { user_id: <current_user_id>, meta_analysis_experiment: 'false' }?
I think doing the filtering on the back end is better. One main reason is that I don't know how to filter out unpublished URNs for meta-analysis experiments on the front end. Or do we only allow meta-analyses of published score sets? I agree the original query was weird and meaningless to me too.

bencap changed the base branch from release-2024.4.2 to release-2024.4.3 November 21, 2024 22:58
bencap changed the base branch from release-2024.4.3 to release-2024.4.4 November 22, 2024 23:10
EstelleDa requested a review from jstone-dev December 3, 2024 06:31

Successfully merging this pull request may close these issues.

Experiment dropdown during score set creation should exclude meta-analysis experiments