
Improve skipTest handling #177

Closed
ewu63 opened this issue Oct 23, 2020 · 5 comments
Labels
stale *crickets chirp*

Comments

@ewu63
Collaborator

ewu63 commented Oct 23, 2020

Description of feature

Sometimes when testing we expect all the optimizers to be available (such as Travis tests on master), and even on PR builds we want specific tests to pass and others to skip. With the current setup, tests can essentially fail silently: if some optimizer wasn't installed properly, it's simply skipped instead of being reported as a failure.

Potential solution

This is currently done with unittest's skipTest mechanism. Maybe we want some sort of conditional skip with the skipIf decorator, plus an environment variable to specify that all tests must pass.
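A minimal sketch of the skipIf-plus-environment-variable idea. The variable name PYOPTSPARSE_REQUIRE_ALL and the SNOPT import are assumptions for illustration, not actual pyoptsparse settings:

```python
import os
import unittest

# Assumption: when PYOPTSPARSE_REQUIRE_ALL=1, skips become hard failures.
REQUIRE_ALL = os.environ.get("PYOPTSPARSE_REQUIRE_ALL") == "1"

try:
    from pyoptsparse import SNOPT  # optional optimizer, used here as an example
    HAS_SNOPT = True
except ImportError:
    HAS_SNOPT = False


class TestSNOPT(unittest.TestCase):
    # Skip only when SNOPT is missing AND skips are allowed; with
    # REQUIRE_ALL set, the test runs anyway and fails loudly below.
    @unittest.skipIf(not HAS_SNOPT and not REQUIRE_ALL, "SNOPT not installed")
    def test_snopt_available(self):
        self.assertTrue(HAS_SNOPT, "SNOPT was expected to be available")
```

On a regular PR build the test skips quietly; on a build that sets the variable, a missing optimizer shows up as a genuine failure.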

A harder task would be to specify a list of optimizers that are expected to be available, so that we can test the PR builds correctly as well without manually checking the list of skipped tests.
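One way to sketch that harder task: let CI declare the optimizers it promises to provide, and check the list up front. The variable name OPTIMIZERS_EXPECTED and the module path pattern are assumptions for illustration:

```python
import importlib
import os


def missing_expected_optimizers():
    """Return the subset of a comma-separated OPTIMIZERS_EXPECTED env var
    (name assumed) whose pyoptsparse wrapper modules fail to import.

    The pyoptsparse.py<NAME>.py<NAME> path pattern is an assumption here.
    """
    expected = os.environ.get("OPTIMIZERS_EXPECTED", "")
    missing = []
    for name in filter(None, (s.strip() for s in expected.split(","))):
        try:
            importlib.import_module(f"pyoptsparse.py{name}.py{name}")
        except ImportError:
            missing.append(name)
    return missing
```

A CI job could fail fast if this returns a non-empty list, instead of relying on someone manually reading the skipped-test summary.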

@stale

stale bot commented Apr 21, 2021

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity.

@stale stale bot added the stale *crickets chirp* label Apr 21, 2021
@marcomangano
Contributor

@nwu63, this baseclasses PR is paving the way to address this issue, right? Is it something we want to prioritize?

@ewu63
Collaborator Author

ewu63 commented Apr 21, 2021

That PR is directly related to #224; this issue is more about how we organize our tests. For example, tests involving SLSQP should always pass, and SNOPT should pass in some cases. Right now, everything just skips if an import fails, and we will want to change this:

  • no skips for default optimizers
  • skip for optional optimizers (perhaps using the skipTest mechanism we currently have)
  • use the newly added testflo flag (--disallow_skipped) on runs where we expect all tests to pass

Something like that? Keep in mind that I am planning to do a major refactor of pyoptsparse tests in May, so I will take care of this then.

@marcomangano
Contributor

Did the test refactoring fix this? I don't recall now, but I don't want to lose track of this again.

@ewu63
Collaborator Author

ewu63 commented Aug 30, 2021

This was addressed in #237.

@ewu63 ewu63 closed this as completed Aug 30, 2021