I'm currently studying your model and find it fascinating. I've learned a lot, but I have a question about the pre-training stage.

In the example pre-training script `dev-temp/examples/pretrain.py`, the model calculates several types of loss from different perspectives. From my understanding:

- `loss_mse`: the gene-prompt task for generative prediction.
  https://github.com/bowang-lab/scGPT/blob/4068d67caaac1e28d56964da68e0214817e38428/examples/pretrain.py#L886C1-L890C43
- `loss_mvc`: the cell-prompt task for generative prediction.
  https://github.com/bowang-lab/scGPT/blob/4068d67caaac1e28d56964da68e0214817e38428/examples/pretrain.py#L886C1-L890C43

However, I'm unclear about `loss_gen`. I see it is added to the total loss, but I'm unsure of its specific role. How does `loss_gen` differ from `loss_mse`? I've noticed differences, such as it only being calculated after 1000 iterations and the `output_dict["cell_emb"]` gradient being detached. Could you clarify its purpose?
https://github.com/bowang-lab/scGPT/blob/4068d67caaac1e28d56964da68e0214817e38428/examples/pretrain.py#L894C1-L908C39
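For context, the pattern I'm asking about (an auxiliary loss computed on a detached embedding, only after a warm-up number of iterations) could be sketched like this. This is my own minimal toy example, not the scGPT code: `encoder`, `decoder`, and `warmup_iters` are illustrative names I made up.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins for the model components (illustrative names, not scGPT's).
encoder = nn.Linear(8, 4)   # produces a "cell embedding"
decoder = nn.Linear(4, 8)   # generates expression values from the embedding

x, target = torch.randn(2, 8), torch.randn(2, 8)
cell_emb = encoder(x)

# Auxiliary loss on a *detached* embedding: .detach() cuts the autograd
# graph, so this term trains the decoder without updating the encoder.
loss_aux = nn.functional.mse_loss(decoder(cell_emb.detach()), target)

# Hypothetical warm-up gate: only apply the term after some iterations.
iteration, warmup_iters = 1500, 1000
if iteration > warmup_iters:
    loss_aux.backward()

print(encoder.weight.grad)              # None: gradient blocked by detach
print(decoder.weight.grad is not None)  # True: the decoder still learns
```

If that reading is right, `loss_gen` would shape the generative head on top of a frozen-per-step cell embedding rather than pushing gradients back into the embedding itself, but I'd appreciate confirmation.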
Thank you for your help!