
test_bad_stdout_specs vs writeable / #3160

Closed
benclifford opened this issue Mar 6, 2024 · 5 comments

Comments

@benclifford
Collaborator

benclifford commented Mar 6, 2024

Describe the bug

One of our Outreachy participants, @SobiaAman, discovered that on WSL2, this test fails for them in make test:

===================================================================== FAILURES ======================================================================
______________________________________________________ test_bad_stdout_specs[nonexistent_dir] _______________________________________________________

spec = '/bad/dir/t.out'

    @pytest.mark.issue363
    @pytest.mark.parametrize('spec', speclist, ids=testids)
    def test_bad_stdout_specs(spec):
        """Testing bad stdout spec cases"""
    
        fn = echo_to_streams("Hello world", stdout=spec, stderr='t.err')
    
        try:
            fn.result()
        except Exception as e:
            # This tests for TypeCheckError by string matching on the type name
            # because that class does not exist in typeguard 2.x - it is new in
            # typeguard 4.x. When typeguard 2.x support is dropped, this test can
            # become an isinstance check.
            assert "TypeCheckError" in str(type(e)) or isinstance(e, TypeError) or isinstance(e, perror.BadStdStreamFile), "Exception is wrong type"
        else:
>           assert False, "Did not raise expected exception"
E           AssertionError: Did not raise expected exception
E           assert False

parsl/tests/test_bash_apps/test_stdout.py:54: AssertionError

This test assumes that / is not writeable and a permissions error can be tested by attempting to write to /.

This is not true in this user's case (and I have over the years experienced other unix environments where / is writeable, such as single-user-targeted OS X laptops).

To Reproduce
Run this test in an environment where / is writeable - for example, in a suitably disposable filesystem image, chmod a+w / and then run the test suite.

Expected behavior
This test should not be reliant on configuration of the / directory of the hosting environment.

Environment
WSL2
Recreatable in my containerized Linux dev environment

@SobiaAman
Contributor

Hello @benclifford
I have tried to bypass it for a while by replacing the assert failure with a message instead, so the code continues testing without throwing an AssertionError.

@benclifford changed the title from "make test vs WSL2" to "test_bad_stdout_specs vs writeable /" on Mar 7, 2024
@benclifford
Collaborator Author

this issue was also encountered in a container running flux, see issue #3159, so I changed the title away from WSL

@benclifford
Collaborator Author

a possible approach to fixing this test:

the test needs a directory that cannot be written. the assumption so far (which is wrong) is that / would be such a directory.

a different way to get such a directory would be to use the tmpd_cwd fixture to give you a temporary directory, then use os.chmod to make it non-writeable.

I'm not sure how that would fit into the way that the test is laid out - it might be better to move the non-writeable directory test out of the parameterized test test_bad_stdout_specs into its own separate test in the same file, so that the non-writeable directory mentioned above can be created there.
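As a rough sketch of that idea (using a plain stdlib temporary directory here rather than Parsl's tmpd_cwd fixture, and a hypothetical helper name), the non-writeable directory could be produced and exercised like this:

```python
import os
import stat
import tempfile


def open_in_unwritable_dir() -> bool:
    """Create a temp dir, strip its write bit, and try to open a file
    inside it for writing. Returns True if PermissionError was raised.

    Note: when running as effective uid 0, POSIX permission bits are not
    enforced, so the open succeeds and this returns False.
    """
    with tempfile.TemporaryDirectory() as d:
        target = os.path.join(d, "no_write")
        os.mkdir(target)
        # r-x for the owner only: directory is listable but not writeable
        os.chmod(target, stat.S_IRUSR | stat.S_IXUSR)
        try:
            with open(os.path.join(target, "t.out"), "w"):
                pass
        except PermissionError:
            raised = True
        else:
            raised = False
        finally:
            # restore write permission so TemporaryDirectory cleanup works
            os.chmod(target, stat.S_IRWXU)
        return raised
```

A separate test could assert that the parameterized app raises when given a stdout path under such a directory, rather than relying on the permissions of /.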

@benclifford
Collaborator Author

the approach I wrote in the immediately preceding message does not work when running as root, because permissions are not enforced for root at all - in the case of #3159 @mercybassey tried out this approach inside a flux-provided container running as root, for example.

So something more interesting needs to happen to make this test either pass or be controllably skipped.
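One way to make the skip controllable (a sketch; the predicate name is hypothetical, and this only covers the effective-uid case, not other mechanisms like capabilities) is to detect up front whether permission bits will be enforced:

```python
import os


def permissions_enforced() -> bool:
    """Return True when POSIX file permission bits will actually be
    enforced for this process. Root (effective uid 0) bypasses them,
    so write-permission failures cannot be provoked reliably."""
    return os.geteuid() != 0


# A test suite could then skip permission-dependent tests, e.g. with
# pytest: @pytest.mark.skipif(not permissions_enforced(),
#                             reason="permission bits not enforced for root")
```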

@benclifford
Collaborator Author

PR #3483 adds a pytest marker to disable tests affected by this issue
