[META] Automate performance testing #126
Comments
@anasalkouz - Can you update the status of this issue? Will this be ready by 1.3.0? Thanks!
@dblock are we also looking to gate PR merges if they cause a regression? This is important since we want to discover regressions earlier. We do not want to wait for a release only to discover a regression, at which point it becomes harder to pinpoint the exact commit(s).
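A merge gate like the one this comment asks about is often implemented by comparing a PR's benchmark numbers against a stored baseline with a tolerance. A minimal sketch of that check, assuming a latency metric and a 10% threshold (both are illustrative choices, not details from this thread):

```python
def regression_gate(baseline_ms: float, current_ms: float, tolerance: float = 0.10) -> bool:
    """Return True if the current latency is within tolerance of the baseline.

    Hypothetical helper for illustration: the metric (latency in ms) and
    the 10% default threshold are assumptions, not part of the issue.
    """
    return current_ms <= baseline_ms * (1 + tolerance)


# A 5% slowdown passes the gate; a 20% slowdown fails it.
print(regression_gate(100.0, 105.0))  # → True
print(regression_gate(100.0, 120.0))  # → False
```

In a CI setup, a failing gate would mark the PR check red so the offending commit is identified before merge rather than after a release.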
@kavilla @tianleh @seanneumann Do you have plans to execute performance testing for Dashboards soon?
Closing this META issue as we have automated performance testing for OpenSearch. We will track the automation of performance testing for OpenSearch Dashboards in this issue.
Coming from #123, this is a meta issue for defining the performance testing framework as part of the automated test infra. This issue focuses on performance regression testing, not scalability benchmarking.
Tasks:
- Add a `PerformanceTestSuite` class and wire it into `test.py` #215

Bugs:

Exit Criteria:
- The `perf-test` job on the Jenkins cluster completes without issue.
- When the `test-orchestration-pipeline` is triggered, it triggers the performance tests.
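To illustrate the first task, here is a hedged sketch of what a `PerformanceTestSuite` class wired into `test.py` might look like. Only the class name comes from this issue; the constructor arguments, `execute` method, result format, and the placeholder benchmark step are all assumptions for illustration, not the actual implementation tracked in #215:

```python
import time


class PerformanceTestSuite:
    """Sketch of a performance test suite entry point.

    Hypothetical: the real suite would invoke a benchmarking tool
    against a running cluster; here a placeholder operation is timed
    so the sketch is runnable on its own.
    """

    def __init__(self, endpoint: str = "http://localhost:9200"):
        # Assumed target cluster endpoint for the benchmark run.
        self.endpoint = endpoint

    def execute(self, workload: str = "example-workload") -> dict:
        start = time.monotonic()
        # Placeholder for the actual benchmark invocation, e.g. a
        # subprocess call to an external benchmarking tool.
        elapsed = time.monotonic() - start
        return {"workload": workload, "succeeded": True, "took_s": elapsed}


def main() -> None:
    # How test.py might dispatch to the suite when the perf-test
    # job (or the test-orchestration-pipeline) triggers it.
    suite = PerformanceTestSuite()
    result = suite.execute()
    print("PASS" if result["succeeded"] else "FAIL")


if __name__ == "__main__":
    main()
```

The exit criteria above would then amount to this entry point running to completion on the Jenkins cluster and being invoked whenever the orchestration pipeline fires.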