Test result is "Fail" when all testSteps completed #470

Open
dhouhamaa opened this issue Apr 19, 2023 · 3 comments

Comments

@dhouhamaa

What happened:
I use KUTTL to validate the creation of certain k8s custom resources. The first three test cases run in the host cluster and succeed. In the last test step I generate another kubeconfig for a vcluster and run the final TestAssert against it using --kubeconfig. Running the test I see every step succeed, and for each of them I see "test step completed", but the end result of kuttl is:
"case.go:114: timed out waiting for the condition
=== CONT kuttl
harness.go:405: run tests finished
harness.go:513: cleaning up
harness.go:570: removing temp folder: ""
--- FAIL: kuttl (544.85s)
--- FAIL: kuttl/harness (0.00s)
--- FAIL: kuttl/harness/create (537.77s)
FAIL"

What you expected to happen:
As long as all assert conditions are met and all test steps complete without failing, I expect the end result of the test to be "Pass".
How to reproduce it (as minimally and precisely as possible):
Create another kubeconfig during one of the test steps and run the assert against it using --kubeconfig (a sketch of such a step is shown below).
Anything else we need to know?:
The result becomes "Pass" if I omit the last step, which uses the other kubeconfig.
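A rough sketch of what such a step can look like. The file names, the script, and the kubeconfig path here are placeholders, and the per-step kubeconfig attribute assumes a KUTTL release that supports it:

    # 03-generate-kubeconfig.yaml (hypothetical file name)
    apiVersion: kuttl.dev/v1beta1
    kind: TestStep
    commands:
      # placeholder: write a kubeconfig for the vcluster to a known path
      - script: ./generate-vcluster-kubeconfig.sh > /tmp/vcluster.kubeconfig

    # 04-step.yaml (hypothetical): point the final step at the vcluster
    apiVersion: kuttl.dev/v1beta1
    kind: TestStep
    kubeconfig: /tmp/vcluster.kubeconfig

The expected objects for the final assert then live in 04-assert.yaml as usual.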
Environment:

  • Kubernetes version (use kubectl version):
  • KUTTL version (use kubectl kuttl version):
  • Cloud provider or hardware configuration:
  • OS (e.g. from /etc/os-release):
  • Kernel (e.g. uname -a):
  • Install tools:
  • Others:
@dvgitit

dvgitit commented May 19, 2023

I can confirm this error as well. Even the example from this page (https://kuttl.dev/docs/kuttl-test-harness.html#run-the-tests) results in an error in a live cluster:

    logger.go:42: 09:43:07 | example-test | Deleting namespace: kuttl-test-grown-chow
    case.go:114: timed out waiting for the condition
=== CONT  kuttl
    harness.go:405: run tests finished
    harness.go:513: cleaning up
    harness.go:570: removing temp folder: ""
--- FAIL: kuttl (541.82s)
    --- FAIL: kuttl/harness (0.00s)
        --- FAIL: kuttl/harness/example-test (527.91s)
FAIL

@CaffeineDaemon

I have the same problem, but observed that the test namespace generated by kuttl takes some time to delete. In my case this is due to finalizers running to clean up the resources deployed during my tests. When I remove the resources with a TestStep using the delete property and then sleep long enough for the finalizers to finish before kuttl deletes the namespace, my tests succeed (rough sketch below).
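Roughly what that workaround looks like, with the resource kinds, names, and sleep duration as placeholders (two extra test steps at the end of the case):

    # 98-cleanup.yaml (hypothetical): delete the custom resources explicitly
    apiVersion: kuttl.dev/v1beta1
    kind: TestStep
    delete:
      - apiVersion: example.com/v1   # placeholder group/version
        kind: MyCustomResource       # placeholder kind
        name: my-resource            # placeholder name

    # 99-wait.yaml (hypothetical): give the finalizers time to finish
    # before kuttl starts deleting the test namespace
    apiVersion: kuttl.dev/v1beta1
    kind: TestStep
    commands:
      - command: sleep 60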

I suppose the Deleting namespace: kuttl-test-dashing-buffalo step performed by kuttl fails because it hits a narrow timeout and then logs case.go:114: timed out waiting for the condition. I am no Go developer, but https://github.com/kudobuilder/kuttl/blob/7e783766c9b15837934f8f98137140cf87929f2c/pkg/test/case.go#L115C53-L115C53 seems to match what I think happens here. Adding --skip-delete does indeed let the test pass, but then I have to clean up myself.

I would suggest that a configurable timeout for namespace deletion could help here, perhaps as an optional property on the kuttl TestSuite and as a command-line flag.

@WassimKallel

According to this line, the same timeout value from the TestSuite configuration is also used for the cleanup process.
You can have a look at the docs here.

I can confirm this works: I had the exact same issue, and creating a kuttl-tests.yaml and specifying the timeout value solved it.
Note: Don't forget to reference your kuttl-tests.yaml file with the flag --config kuttl-tests.yaml in your command.
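As a concrete sketch, a minimal kuttl-tests.yaml along these lines (the test directory path and the timeout value are placeholders to adjust):

    apiVersion: kuttl.dev/v1beta1
    kind: TestSuite
    testDirs:
      - ./tests/e2e/   # placeholder path to your test cases
    # global timeout in seconds; it also bounds the namespace cleanup at the end
    timeout: 600

and then run the suite with it:

    kubectl kuttl test --config kuttl-tests.yaml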
