
Descendant Accuracy for each batch #15

Open
yaldashbz opened this issue Sep 28, 2022 · 3 comments

Comments

@yaldashbz

Hi,

I am using your dataset and pretrained model and training mean-teacher with the hyper-parameters unchanged. Still, the accuracy in the validation phase decreases over the course of each batch (acc_re, which you print in a Bar), and then for the next batch it suddenly jumps back to the starting accuracy of the previous batch, so the curve looks periodic. Could you please explain the reason?

Thanks.

@chaneyddtt
Owner

Hi @yaldashbz, I have not encountered this before. Could you provide more details about the phenomenon?

@yaldashbz
Author

I was wrong about the jump after each batch; sorry about that. But I still have a question.
In the training phase, the accuracy is calculated in the validation method after each epoch, and the accuracy graph has a descending form (for example, in the second epoch it starts at 1.0, 0.86, 0.81, ..., then oscillates between 0.69 and 0.78, and ends at about 0.70). Note that I'm not using mixup for training. I don't understand the reason.

I've attached the graph for ACC and the loss of a model trained for one epoch.

[Screenshot from 2022-09-29 15-12-31: accuracy and loss curves for one training epoch]

@chaneyddtt
Owner

We compute the average accuracy over the test data during validation. At the first iteration only one batch of data has been seen, so the accuracy can be very high or very low depending on the difficulty of that first batch. As the iterations proceed, the accuracy is computed over more and more batches of data, so the value changes and gradually settles toward the average over the whole test set.
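
A minimal sketch of this running-average behavior, assuming a standard AverageMeter-style accumulator (the class and variable names here are illustrative, not the repository's actual code): the first printed value equals the accuracy of the first batch alone, and each later value averages in more batches, so the curve drifts toward the dataset-wide accuracy.

```python
class AverageMeter:
    """Keeps a running average over all batches seen so far."""
    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, value, n=1):
        self.sum += value * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)


def validate(per_batch_results):
    # `per_batch_results` is a stand-in for the (accuracy, batch_size) pairs
    # a real validation loop would compute for each test batch.
    acc_meter = AverageMeter()
    for i, (batch_acc, batch_size) in enumerate(per_batch_results):
        acc_meter.update(batch_acc, batch_size)
        # The value printed each iteration is the average so far ("acc_re"),
        # not the accuracy of the current batch alone.
        print(f"iter {i}: acc_re = {acc_meter.avg:.3f}")


# Example: an easy first batch makes the curve start near 1.0 and then decay
# toward the overall test accuracy, which matches the descending graph above.
validate([(1.00, 32), (0.86, 32), (0.81, 32), (0.72, 32), (0.70, 32)])
```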
