Limitation on how many tests can be excluded or included #78
Comments
Did you try a local deployment using https://github.com/cta-wave/WMAS-deploy? The hosted instance is intended for demo usage, not for production.
Thanks @FritzHeiden. I have already tried a local deployment of the wmas2020 version, and it doesn't work for a long list of tests either.
@FritzHeiden @louaybassbouss Could we please get an update on this issue? @yanj-github is seeing a problem when trying to run any subset of more than 100 tests, regardless of where the WMAS TR is installed. Can you please confirm whether you also see this issue and, if so, whether there is a planned approach to resolve it? We are assuming an HbbTV subset will be run using the JSON file output from the HbbTV activities as input; if not, can you please confirm what approach needs to be taken? How do you currently run a large subset of tests successfully (more than 100 tests either added or excluded)?
@yanj-github @pshorrock I pushed a fix for this issue to the wmas2021 branch.
We are running large subsets by setting them up in the file system using shell scripts. This, however, is only applicable because the desired subset is fixed for a single wmas release. The exclude list was intended for removing problematic tests, which never exceeded 100, so we never came across this performance issue. Thanks for reporting.
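For anyone wanting to reproduce that workflow, here is a minimal sketch of the idea (written in Python rather than shell; the file names and directory layout are assumptions, not part of the WMAS tooling): the desired subset is materialised on disk before the runner starts, so no long include/exclude list has to be entered through the UI.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: prepare a fixed test subset on disk before starting
the runner. Paths, file names and the subset file format are assumptions."""
import shutil
from pathlib import Path

SUBSET_FILE = Path("wmas_subset.txt")   # assumed format: one test path per line
SOURCE_TESTS = Path("tests-full")       # assumed: unmodified full test checkout
TARGET_TESTS = Path("tests")            # assumed: directory the runner is pointed at

def build_subset() -> None:
    wanted = {line.strip() for line in SUBSET_FILE.read_text().splitlines() if line.strip()}
    if TARGET_TESTS.exists():
        shutil.rmtree(TARGET_TESTS)          # start from a clean target directory
    for rel in sorted(wanted):
        src = SOURCE_TESTS / rel
        dst = TARGET_TESTS / rel
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)               # copy only the tests in the subset

if __name__ == "__main__":
    build_subset()
```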
Tested and working on a local setup.
@FritzHeiden Is this fixed on all other versions and applied to the cloud-hosted version, please?
I applied the fix to the wmas2021, wmas2020 and wmas2019 code bases and redeployed the new versions. Applying the fix to wmas2018 requires some more testing.
Hi, what is the ETA on this ticket being resolved? HbbTV 2.0.3 references WMAS 2018, so receivers that would be expected to pass a subset defined by HbbTV need to exclude the tests in that list. It won't be a daily variable list, but it will change outside the cycle of WMAS updates, because HbbTV itself hasn't yet finally approved/agreed the official subset, and I anticipate it will be updated from time to time. The same problem applies to ATSC: they'll use the same subset, but it's conceivable they'll identify APIs/tests that need to be included or excluded independently of HbbTV. And as a manufacturer I might then want to run a longer list, but not all tests in areas I know I don't support. So for lots of reasons, users of the WMAS test tools need to be able to define an exclude list that isn't tied to the version of WMAS and isn't too complex to apply to their running instance. I guess it's fine if that level of complexity isn't supported on the "non-production" hosted versions, but it needs to be supported somehow in deployed versions. Thanks
I created a PR to update the test runner of wmas2018: #85. As wmas2018 is rather old and the intention is to bring it up to date with the latest runner, there is quite a diff between the versions. I will have to perform a few test runs to see if everything works as expected.
This is how we will proceed with this issue: The newest version of the test runner will be used to run the 2018 subset of tests. This makes it easy to apply fixes from the other versions. |
Ref: #77
As far as I understand, there shouldn't be any limitation on how many tests can be excluded.
However, when I enter a longer list of excluded tests (more than 100), the WMAS server hangs and won't process further.
Is there a way for a user to configure a test session with a longer excluded or included test list, please?
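For illustration, here is a minimal sketch of one way such a list could be prepared offline, assuming the subset is published as a JSON file of included tests (as mentioned above for HbbTV) and that the resulting exclude list can then be supplied to a deployed instance rather than typed into the UI; the file names, JSON shape and output key are hypothetical, not part of the WMAS tooling.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: derive a long exclude list from a subset definition
and write it to a file, instead of entering 100+ paths manually.
File names, JSON shape and the output key are assumptions."""
import json
from pathlib import Path

SUBSET_JSON = Path("hbbtv_subset.json")   # assumed shape: {"tests": ["<test path>", ...]}
MANIFEST = Path("all_tests.txt")          # assumed: full test list, one path per line
EXCLUDE_OUT = Path("excluded_tests.json")

def main() -> None:
    included = set(json.loads(SUBSET_JSON.read_text())["tests"])
    all_tests = {line.strip() for line in MANIFEST.read_text().splitlines() if line.strip()}
    excluded = sorted(all_tests - included)              # everything not in the subset
    EXCLUDE_OUT.write_text(json.dumps({"excluded": excluded}, indent=2))
    print(f"{len(excluded)} excluded tests written to {EXCLUDE_OUT}")

if __name__ == "__main__":
    main()
```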