
[FIX] Fix incorrect PR comment generation #540

Merged

Conversation

pranavrajpal
Contributor

Please prefix your pull request with one of the following: [FEATURE] [FIX] [IMPROVEMENT].

In raising this pull request, I confirm the following (please check boxes):

  • I have read and understood the contributors guide.
  • I have checked that another pull request for this purpose does not exist.
  • I have considered, and confirmed that this submission will be valuable to others.
  • I accept that this submission may not be used, and the pull request closed at the will of the maintainer.
  • I give this submission freely, and claim no ownership to its content.

My familiarity with the project is as follows (check one):

  • I have never used the project.
  • I have used the project briefly.
  • I have used the project extensively, but have not contributed previously.
  • I am an active contributor to the project.

Fixes (part of) #535 (see below for more info)

This fixes a bug in the generation of the PR comment that caused regression tests to be marked as failing when they produced one of the allowed variants of the expected output file, instead of correctly marking them as passing. I also added a few tests for this bug.

I don't think this is a complete fix for #535, because it doesn't completely account for the difference between CCExtractor/ccextractor#1329 (comment) and https://sampleplatform.ccextractor.org/test/3224. In that comment, the table and the list of failed tests show different results, but the bug I found (and the changes I made) affect the results for both the table and the list of failed tests in the same way.

Judging from the template for the PR comment, that discrepancy is probably caused by `tests` and `failed_tests` getting out of sync in the template context, but I can't tell what causes that just from looking at the code. I also don't know how to debug it, because I can't tell which regression tests were incorrectly marked as failing.

Add a test that the info used to generate a PR comment handles variant
output files correctly: regression tests that output one of the allowed
variants of the output file should be treated as correct, in addition to
tests that output the original version of the output file.

Refactor the code for getting the info about a test run for a PR comment
into a separate function. This makes the function easier to test in
isolation, instead of testing the entire process of creating a PR comment
at once.

Currently, when generating stats and a list of failed tests for PR
comments, tests are only marked as passed if they exactly match the
original output file. This commit changes that by also treating a
regression test as passed if it produces an acceptable variant of the
output file.

Add another test to make sure that the change in 4e78b55 didn't
accidentally cause the PR comment handling to mark all tests as passing
even if they result in an invalid variant file.
@pranavrajpal
Contributor Author

I don't think the CI failure is related to this PR, seeing as `sudo apt-get install libvirt-dev` seems to be getting a 404 when trying to install the packages.

From what I can tell, this appears to be related to actions/runner-images#2104 and there seems to be a relatively simple solution in actions/runner-images#2104 (comment). I'll send a PR with that fix.

@pranavrajpal
Contributor Author

I created #541 just now to fix that error. The CI run for that PR seems to have gotten past installing dependencies successfully, so I think that should fix the installation error.

@canihavesomecoffee
Member

@pranavrajpal can you take a look at the CI? Some tests fail.

nosetests appears to be treating `get_info_about_test_for_pr_comment` as a
test to run because it has 'test' in the name. That causes the test
suite to fail because nosetests doesn't pass in the required argument.

This renames the function so that nosetests no longer treats it as a test.
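The collection behavior this commit message describes can be checked against nose's documented default `testMatch` regex. The regex below is nose's documented default (an assumption worth verifying against your installed version), and the renamed function shown is a hypothetical example, not the actual name chosen in the PR:

```python
import re

# nose's documented default testMatch pattern: any callable whose name
# contains 'test'/'Test' at a word-ish boundary is collected as a test.
TEST_MATCH = re.compile(r'(?:^|[\b_\./-])[Tt]est')

# '_test' in the name matches, so nose collects it (and fails on the
# required argument):
print(bool(TEST_MATCH.search('get_info_about_test_for_pr_comment')))  # True
# A hypothetical rename without 'test' in the name is ignored by nose:
print(bool(TEST_MATCH.search('get_info_for_pr_comment')))             # False
```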
@codecov

codecov bot commented May 11, 2021

Codecov Report

Merging #540 (5e2ae9c) into master (7809700) will increase coverage by 0.07%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #540      +/-   ##
==========================================
+ Coverage   80.98%   81.05%   +0.07%     
==========================================
  Files          36       36              
  Lines        3591     3605      +14     
  Branches      408      410       +2     
==========================================
+ Hits         2908     2922      +14     
  Misses        549      549              
  Partials      134      134              
Flag        Coverage Δ
unittests   81.05% <100.00%> (+0.07%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files          Coverage Δ
mod_ci/controllers.py   61.68% <100.00%> (+0.14%) ⬆️
mod_ci/models.py        94.00% <100.00%> (+1.69%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7809700...5e2ae9c.

@pranavrajpal
Contributor Author

I think I understand why the tests are failing, but I don't think it's related to this PR.

When the test suite (CompleteSignUp) is set up, it records the current time, which is later used as the expiry time for the HMAC authentication:

self.time_of_hash = int(time.time())

Later, during the test run, that time is compared to the current time to see if the HMAC has expired yet:

if int(time.time()) <= expires:

which fails if it runs any later than one second after the setup code ran. That would cause it to never reach template rendering, which in turn causes the `Template auth/complete_signup.html not used` error.

If you add a sleep to one of the tests in CompleteSignUp, you should be able to reproduce this more reliably. I'm honestly surprised this hasn't appeared before, since it seems unlikely that the tests have always finished within one second.

The fix should be pretty simple (just set the expiry date to `float('+inf')` or some other large number). Do you want me to add it to this PR or create a new one?
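The race described above can be modeled in a few lines. This is a hypothetical minimal sketch, not the project's actual test code; `make_token` and `is_valid` stand in for the setup code and the HMAC expiry check:

```python
import time

def make_token(now=None):
    """Simulate test setup: the expiry time is simply the time at setup."""
    return int(now if now is not None else time.time())

def is_valid(expires, now=None):
    """Simulate the HMAC check: the token is valid only while now <= expires."""
    return int(now if now is not None else time.time()) <= expires

expires = make_token(now=1000)
print(is_valid(expires, now=1000))               # True: checked within the same second
print(is_valid(expires, now=1002))               # False: checked later, token "expired"
print(is_valid(expires=float('inf'), now=1002))  # True: a huge expiry avoids the flake
```

With real clocks, the second call corresponds to a test run that takes more than a second between setup and the request, which is why the failure is intermittent.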

@canihavesomecoffee
Member

If it's not caused by your PR then I'd prefer a separate PR to fix it :)

@pranavrajpal
Contributor Author

I created #543 to fix that bug and one other related bug I found while fixing it, but, for reasons I don't understand, the CI seems to be failing when trying to generate the coverage report. Could you take a look and see if you know what's going wrong?

@pranavrajpal
Contributor Author

The CI appears to be passing now. Is there anything else you want me to change/fix?

@kvshravan
Contributor

Nice work 👍 Can you move the data classes to models.py and then import them as required?

Move dataclasses used for representing info about PR comments to
models.py so that all classes for storing data are placed together.
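The kind of move this commit describes can be pictured with a sketch like the following. The class and field names are illustrative, not the PR's actual dataclass definitions:

```python
# Sketch of dataclasses of the sort moved into mod_ci/models.py;
# names and fields are illustrative, not the PR's actual definitions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestResult:
    name: str
    passed: bool

@dataclass
class PrCommentInfo:
    passed_count: int = 0
    failed_tests: List[TestResult] = field(default_factory=list)

# controllers.py would then import these instead of defining them:
info = PrCommentInfo(passed_count=2, failed_tests=[TestResult('test 1', False)])
print(info.failed_tests[0].name)  # test 1
```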
@pranavrajpal
Contributor Author

I've moved the data classes to models.py.

I'm slightly confused on how you decide what goes in controllers.py and what goes in models.py. I originally thought models.py was meant for database models and controllers.py was for everything else. Is models.py meant for any classes for storing data, and controllers.py for the rest of the actual logic, or is there some other distinction between what goes in each file?

@kvshravan
Contributor

I think you can go through mod_sample, afaik Mediainfoparser.py contains custom classes. It's not much of a deal, it's better if the classes aren't defined in controllers.py. @canihavesomecoffee has a say on this

@pranavrajpal
Contributor Author

Ok, keeping all of the classes out of controllers.py so that all of the actual logic is kept together makes sense.

The data classes are in models.py and the CI seems to be passing, so let me know if you want me to change anything else.

@kvshravan
Contributor

LGTM.

@canihavesomecoffee
Member

In general the models were for database (ORM) purposes, but I'd rather see the classes under model than tacked to the controller file. The controller file should be reserved for logic wrt handling requests :)

@canihavesomecoffee canihavesomecoffee merged commit 7fcb343 into CCExtractor:master May 17, 2021
@pranavrajpal pranavrajpal deleted the fix-incorrect-pr-comment branch May 17, 2021 13:11