Try to reconstruct the exp #246

Open
YOU-k opened this issue Sep 2, 2024 · 1 comment

Comments

YOU-k commented Sep 2, 2024

Hi there,
I was trying to test the model's performance on a reconstruction task, using the TransformerModel. When I load the weights with the load_pretrained function, all layers whose names start with 'encoder', 'value_encoder', 'transformer_encoder', and 'decoder' are loaded.

Below is the scatter plot of the averaged expression of each gene, where the x-axis shows the values predicted by the model and the y-axis shows the observed values:
[image: scatter plot of per-gene averaged expression with the pretrained decoder loaded]
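For reference, here is a minimal sketch (plain PyTorch, not necessarily the repo's own load_pretrained utility) of how I load every checkpoint tensor whose name and shape match the model. The checkpoint path, the model construction, and the assumption that the file stores a plain state dict are placeholders:

```python
import torch

def load_matching_weights(model, ckpt_path):
    """Copy every checkpoint tensor whose name and shape match the model."""
    ckpt = torch.load(ckpt_path, map_location="cpu")  # assumed to be a plain state dict
    model_dict = model.state_dict()
    matched = {
        k: v for k, v in ckpt.items()
        if k in model_dict and v.shape == model_dict[k].shape
    }
    model_dict.update(matched)
    model.load_state_dict(model_dict)
    # Report which top-level modules received pretrained weights,
    # e.g. ['decoder', 'encoder', 'transformer_encoder', 'value_encoder'].
    return sorted({k.split(".")[0] for k in matched})

# Usage (hypothetical): build TransformerModel with the same config as the checkpoint, then
# print(load_matching_weights(model, "best_model.pt"))
```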

I noticed that in the perturbation fine-tuning task the decoder weights are not loaded. So I reloaded the model with only 'encoder', 'value_encoder', and 'transformer_encoder' loaded, leaving the decoder randomly initialized, and this time the scatter plot looks like this:
[image: scatter plot of per-gene averaged expression with a randomly initialized decoder]
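For comparison, a sketch of how I load only the encoder-side weights by name prefix so that the decoder keeps its random initialization; the prefix names come from the layer names above, everything else is an assumption:

```python
import torch

KEEP_PREFIXES = {"encoder", "value_encoder", "transformer_encoder"}

def load_by_prefix(model, ckpt_path, prefixes=KEEP_PREFIXES):
    """Load only parameters whose top-level module name is in `prefixes`."""
    state = torch.load(ckpt_path, map_location="cpu")  # assumed to be a plain state dict
    filtered = {k: v for k, v in state.items() if k.split(".")[0] in prefixes}
    # strict=False leaves every parameter not in `filtered` (here the decoder)
    # untouched, i.e. at its random initialization.
    missing, unexpected = model.load_state_dict(filtered, strict=False)
    return missing, unexpected
```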

Could you please help me understand why loading the decoder weights ends up with this result? Am I using the weights in the wrong way, and how can I reconstruct the data?

Cheers,
Yue


a93sokol commented Nov 5, 2024

I'm having the same issue; it is still relevant. My question is rather: when nothing is masked, why is the reconstruction quality/ranking so far off?
