Support Maven Surefire Plugin rerunning flaky tests #539
Thanks for the detailed description of your use case. There is no option available currently. Rather than changing the logic, I would specifically handle the "flaky" annotations in the test result files (see https://maven.apache.org/surefire/maven-surefire-plugin/examples/rerun-failing-tests.html). Can you provide an example test result with one failed test that succeeds on rerun, as well as one result file where all reruns fail?
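For reference, reruns are enabled via Surefire's documented rerunFailingTestsCount property, for example:

mvn -Dsurefire.rerunFailingTestsCount=2 test
|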
Sure, for completeness, I made 4 tests fail/error by either throwing an exception or doing a false assertion:
Output of the resulting Surefire XML report:
<?xml version="1.0" encoding="UTF-8"?>
<testsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="https://maven.apache.org/surefire/maven-surefire-plugin/xsd/surefire-test-report-3.0.xsd" version="3.0" name="example.ExampleTests" time="0.002" tests="2" errors="1" skipped="0" failures="1">
<testcase name="throwsAlways" classname="example.ExampleTests" time="0.01">
<error message="Always throws exception" type="java.lang.RuntimeException"><![CDATA[java.lang.RuntimeException: Always throws exception
at example.ExampleTests.throwsAlways(ExampleTests.java:26)
]]></error>
<rerunError message="Always throws exception" type="java.lang.RuntimeException">
<stackTrace><![CDATA[java.lang.RuntimeException: Always throws exception
at example.ExampleTests.throwsAlways(ExampleTests.java:26)
]]></stackTrace>
</rerunError>
<rerunError message="Always throws exception" type="java.lang.RuntimeException">
<stackTrace><![CDATA[java.lang.RuntimeException: Always throws exception
at example.ExampleTests.throwsAlways(ExampleTests.java:26)
]]></stackTrace>
</rerunError>
<rerunError message="Always throws exception" type="java.lang.RuntimeException">
<stackTrace><![CDATA[java.lang.RuntimeException: Always throws exception
at example.ExampleTests.throwsAlways(ExampleTests.java:26)
]]></stackTrace>
</rerunError>
</testcase>
<testcase name="failsAssertionAlways" classname="example.ExampleTests" time="0.003">
<failure message="Always fails assertion ==> expected: <1> but was: <2>" type="org.opentest4j.AssertionFailedError"><![CDATA[org.opentest4j.AssertionFailedError: Always fails assertion ==> expected: <1> but was: <2>
at example.ExampleTests.failsAssertionAlways(ExampleTests.java:40)
]]></failure>
<rerunFailure message="Always fails assertion ==> expected: <1> but was: <2>" type="org.opentest4j.AssertionFailedError">
<stackTrace><![CDATA[org.opentest4j.AssertionFailedError: Always fails assertion ==> expected: <1> but was: <2>
at example.ExampleTests.failsAssertionAlways(ExampleTests.java:40)
]]></stackTrace>
</rerunFailure>
<rerunFailure message="Always fails assertion ==> expected: <1> but was: <2>" type="org.opentest4j.AssertionFailedError">
<stackTrace><![CDATA[org.opentest4j.AssertionFailedError: Always fails assertion ==> expected: <1> but was: <2>
at example.ExampleTests.failsAssertionAlways(ExampleTests.java:40)
]]></stackTrace>
</rerunFailure>
<rerunFailure message="Always fails assertion ==> expected: <1> but was: <2>" type="org.opentest4j.AssertionFailedError">
<stackTrace><![CDATA[org.opentest4j.AssertionFailedError: Always fails assertion ==> expected: <1> but was: <2>
at example.ExampleTests.failsAssertionAlways(ExampleTests.java:40)
]]></stackTrace>
</rerunFailure>
</testcase>
<testcase name="failsAssertionOnlyFirstAttempt" classname="example.ExampleTests" time="0.0">
<flakyFailure message="First attempt fails assertion ==> expected: <1> but was: <2>" type="org.opentest4j.AssertionFailedError">
<stackTrace><![CDATA[org.opentest4j.AssertionFailedError: First attempt fails assertion ==> expected: <1> but was: <2>
at example.ExampleTests.failsAssertionOnlyFirstAttempt(ExampleTests.java:32)
]]></stackTrace>
</flakyFailure>
</testcase>
<testcase name="throwsOnlyFirstAttempt" classname="example.ExampleTests" time="0.001">
<flakyError message="First attempt throws exception" type="java.lang.RuntimeException">
<stackTrace><![CDATA[java.lang.RuntimeException: First attempt throws exception
at example.ExampleTests.throwsOnlyFirstAttempt(ExampleTests.java:18)
]]></stackTrace>
</flakyError>
</testcase>
</testsuite>
|
Thanks for the input. With the provided XML, the action reports one test as an error, one as a failure, and the last two as passed. Can you add another two tests that fail/error twice and pass on the third attempt? |
Sure, see below for these 7 tests so far:
Output of the resulting Surefire XML report: [elided]
|
The action only considers known tags like <failure>, <error> and <skipped>. The <rerunFailure>, <rerunError>, <flakyFailure> and <flakyError> elements are not interpreted, so a testcase that eventually passed is reported as a pass. |
I think you're right and I agree with the current behaviour. After reading your comment, I've dug a bit deeper and found that I had not accurately reproduced the actual problem. Some context: we run ~5000 tests spread over many classes, and each class gets its own XML report file. Though our build ultimately succeeds, the action still reports failed tests. I'll focus my efforts now on how the XML reports are generated; it might be related to the fact that we're running our tests in parallel. Thanks for helping along and for the quick responses. |
Thanks for the clarification and for raising this! |
Let's reopen this to track supporting the flaky annotation for eventually successful tests. That would be a great addition to the summary. |
So, currently, the only way is to use "fail_on: nothing" and rely on the failure of the build job, right? |
Depends on what you want. Do you want to fail on flakiness? Or do you want to fail on consistent test failure? This action fails on the latter (with default settings). |
We don't want it to fail on flakiness, but it does. It looks like it interprets the first failed attempts of flaky tests as failures, and the corresponding job is marked as failed. We don't want that. |
@EnricoMi I've also tried with fail_on: nothing (eclipse-xtext/xtext#3164), but the job for publishing test results is still marked with a failure (due to flakes). Am I missing something? |
Can you point me to an example workflow run? |
Many of the latest runs in https://github.com/eclipse/xtext/ are affected. |
In the XML files of those runs, a testcase that contains only flakyFailure elements is still counted in the failures attribute of its testsuite. Looking at the Surefire documentation on rerunning failing tests [1], a test that passes on a rerun should be reported as a pass, so the XML files are inconsistent with [1].

Footnotes:
[1] https://maven.apache.org/surefire/maven-surefire-plugin/examples/rerun-failing-tests.html |
We use maven-surefire and for Eclipse tests (the flaky ones) we use tycho-surefire, which uses maven-surefire under the hood I think. So we don't do anything specific... |
Thank you for investigating this. Although it is correct that Xtext uses the tycho-surefire-plugin for the flaky Eclipse tests, I can indeed reproduce the described behavior with a very simple Maven project that only has the following test class:
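(The class itself is collapsed in the thread. A minimal sketch of such a class, assuming JUnit 4 and Surefire reruns enabled; names are hypothetical, not the author's actual code:)

package example;

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class FlakyTest {

    // Shared across reruns, since Surefire re-invokes the failing test
    // method in the same JVM: the first attempt fails, every rerun passes.
    private static int attempts = 0;

    @Test
    public void failsOnlyOnFirstAttempt() {
        attempts++;
        assertTrue("first attempt always fails", attempts > 1);
    }
}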
and a very simple pom.xml:
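(Also collapsed; a minimal sketch of the relevant part, with an illustrative plugin version:)

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.0.0-M7</version>
      <configuration>
        <!-- rerun each failing test up to 2 more times -->
        <rerunFailingTestsCount>2</rerunFailingTestsCount>
      </configuration>
    </plugin>
  </plugins>
</build>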
The resulting test report XML has the described content:
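(The report itself is collapsed in the thread. Judging from the suite counters discussed below, it presumably looked something like this; the values are my reconstruction, not the author's actual file. Note the over-counted tests="2" and failures="1" although the single test eventually passed:)

<testsuite name="example.FlakyTest" time="0.01" tests="2" errors="0" skipped="0" failures="1">
  <testcase name="failsOnlyOnFirstAttempt" classname="example.FlakyTest" time="0.01">
    <flakyFailure message="first attempt always fails" type="java.lang.AssertionError">
      <stackTrace><![CDATA[...]]></stackTrace>
    </flakyFailure>
  </testcase>
</testsuite>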
I cannot see that this page makes any statements regarding the summary counters in the headline of the report. But I agree with you that the content is indeed surprising; I would also expect the flaky test not to be counted in the failures. I'll ask the Maven devs if this is intended. I don't see anything regarding flakes in JUnit itself, so I assume it's a Maven-Surefire feature. |
I think the testsuite in question is:

<testsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="https://maven.apache.org/surefire/maven-surefire-plugin/xsd/surefire-test-report.xsd"
version="3.0"
name="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest"
time="8.454" tests="8" errors="0" skipped="1" failures="1">
<properties>
...
</properties>
<testcase name="testOnRemoveTwoProjects" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="0.177"/>
<testcase name="testBug574908_Performance" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="3.674"/>
<testcase name="testBug574908_Performance_500" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="0.0">
<skipped message="Can be enabled for performance test purposes"/>
</testcase>
<testcase name="testGetStorages_Performance" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="3.771"/>
<testcase name="testOnClasspathChange" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="0.452"/>
<testcase name="testBug574908" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="0.107"/>
<testcase name="testOnCloseOpenRemoveProject" classname="org.eclipse.xtext.ui.tests.core.resource.Storage2UriMapperJavaImplTest" time="0.152">
<flakyFailure message="{} expected:<1> but was:<0>" type="java.lang.AssertionError">
<stackTrace><![CDATA[...]]></stackTrace>
</flakyFailure>
</testcase>
</testsuite>

We have 7 testcases, so you would expect tests="7", but the suite claims tests="8" (and failures="1", even though the flaky test eventually passed). I have a workaround to derive suite counts from test cases, which will make flaky tests count as successful tests and not over-count the number of tests. Supporting flaky tests natively will require extending JunitParser. Then this action will mark flaky tests as ❄️ instead of ✅.
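(A minimal sketch of that derivation idea, using only Python's standard library rather than the action's actual JunitParser-based implementation; file name and tag handling are illustrative:)

import xml.etree.ElementTree as ET

def derive_counts(path):
    # Recompute the suite counters from the testcase children instead of
    # trusting the testsuite element's own (inconsistent) attributes.
    suite = ET.parse(path).getroot()
    counts = {"tests": 0, "errors": 0, "failures": 0, "skipped": 0, "flaky": 0}
    for case in suite.iter("testcase"):
        counts["tests"] += 1
        tags = {child.tag for child in case}
        if "error" in tags:                          # errored on every attempt
            counts["errors"] += 1
        elif "failure" in tags:                      # failed on every attempt
            counts["failures"] += 1
        elif "skipped" in tags:
            counts["skipped"] += 1
        elif tags & {"flakyFailure", "flakyError"}:  # eventually passed
            counts["flaky"] += 1
    return counts

print(derive_counts("TEST-example.ExampleTests.xml"))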
Thanks for the pointer! But from your report I'm not sure whether it's just about how a given test-report XML is processed or about how that file is created. I have the impression it's the former.
Yes, I think that's exactly what's happening. |
Yes, I wanted to suggest that as well, but I first wanted to know whether the current content is intended. |
That would be nice :) Thank you! |
Just wanted to share that this issue exists already in some form. |
Hi,
First of all, I think this plugin works great and provides many GitHub reporting options. 👌
I'm currently in the process of running my tests in parallel to speed up my test run. Some of my tests now fail because of concurrency issues, and that's why I also configured the Maven Surefire Plugin to rerun failed tests.
However, I notice a few things:
- my regular mvn test command ultimately succeeds (i.e. after rerunning some tests, all my tests pass); [screenshot elided]
- however, publish-unit-test-result-action creates a check with a failed status (❌ Maven Test Results);
- the annotations show failed tests, since on the first attempt they did indeed fail (succeeding attempts pass, however); [screenshot elided]
- the duration shown in the summary (57m) seems to be the sum of the parallel runs instead of the wall clock time.

Note that my mvn test command completes with output in which "Flakes" signifies the reruns: [console output elided]
My simplified workflow:
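(The YAML is collapsed in the thread. A minimal sketch of such a workflow, with action inputs as I recall them from the action's README; treat names and versions as illustrative:)

name: CI
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-java@v3
        with:
          distribution: temurin
          java-version: '17'
      - run: mvn test
      - name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          check_name: Maven Test Results
          files: target/surefire-reports/TEST-*.xml
          # fail_on: nothing  # discussed below; stops the check failing on test failures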
Log output of the Publish Test Results step (sensitive parts removed): [log elided]

Example of annotations of 2 different tests where in both cases only the first attempt failed: [screenshot elided]
Just for completeness (probably not necessary), the relevant parts of pom.xml: [elided]

Example of (part of) an XML report file that has a passing test after a rerun: [elided]
Is there a configuration parameter I missed that would create a succeeding check (✅ Maven Test Results) and make the report show only succeeding tests (optionally after retries)? Or is rerunning simply not supported?

The repository I'm working on is private, so unfortunately I can't give any direct links, but let me know if more information is needed.