
Result sheet of the Network Perf tests is missing metrics #428

Closed

SachinNinganure-zz opened this issue Jun 10, 2022 · 5 comments

SachinNinganure-zz commented Jun 10, 2022

The result sheet coming back from the network performance tests doesn't include the metadata and is missing some of the metrics. Examples of missing metrics: avg(norm_ltcy), avg(norm_ops), max(norm_byte).

Please see this report, which is missing the metrics: https://docs.google.com/spreadsheets/d/1yMA1DStv6GbC0t0LZ0VOYO9rTRxi6Ym5_nGWYEk23bA/edit#gid=1690151223

paigerube14 (Collaborator) commented Jun 21, 2022

@SachinNinganure are you able to share the Google Sheet with everyone in Red Hat and show what it looked like before versus what the output is now? I am not able to see the sheet at all, so it's hard to help.

SachinNinganure-zz (Author) commented Jun 22, 2022

@paigerube14 This is what it looked like earlier: https://docs.google.com/spreadsheets/d/1VqHxJE7aLoXBcOaWzaSBibVz5AvI_zeROAoCo5zPWFY/edit#gid=2073637806
I have given you access to both spreadsheets.

paigerube14 (Collaborator) commented Jun 23, 2022

I have started working on this in relation to issue #395.

I think that if we can have the same code currently in kube-burner be used by all the workloads that rely on benchmark-comparison (i.e. kube-burner, network-perf, and router-perf), we could add these metrics back.

This will allow each user to get the data they want into their Google Sheets by setting their specific touchstone-config files per run.

Right now I have only tested from the utils folder directly, not in the workloads themselves:
https://github.com/paigerube14/e2e-benchmarking/tree/compare_combine
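
A minimal sketch of what "setting their specific touchstone-config files per run" could look like from a workload's side. The COMPARISON_CONFIG variable comes from the suggestion later in this thread; the ES_SERVER variable, the workload path, and the run.sh script name are assumptions for illustration and may not match the compare_combine branch exactly:

```sh
# Sketch only: each workload exports its own touchstone config(s) before
# running, so the shared benchmark-comparison step picks them up per run.
export ES_SERVER="https://my-es-instance.example.com:9200"   # assumed Elasticsearch endpoint variable
export COMPARISON_CONFIG="uperf-touchstone.json uperf-touchstone-norm.json"

# Launch the network-perf workload; the script path below is illustrative.
cd workloads/network-perf
./run.sh
```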

paigerube14 (Collaborator)

@SachinNinganure-zz you should be able to use the configuration below going forward to get all the data you need. You should also be able to add any variables to the aggregations section for anything that's still missing.

COMPARISON_CONFIG="uperf-touchstone.json uperf-touchstone-norm.json"
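
For completeness, a rough sketch of an aggregations entry that would cover the metrics reported missing at the top of this issue. Only the aggregations section is confirmed by the comment above; the surrounding structure (index name, compute, buckets) is an assumption modeled on benchmark-comparison style configs, so compare against an existing uperf-touchstone.json before relying on it:

```sh
# Illustrative only: write a touchstone config whose aggregations section
# includes the metrics this issue reports as missing.
cat > uperf-touchstone-norm.json <<'EOF'
{
  "elasticsearch": {
    "ripsaw-uperf-results": {
      "compute": [
        {
          "buckets": ["protocol.keyword", "message_size", "num_threads"],
          "aggregations": {
            "norm_ltcy": ["avg"],
            "norm_ops": ["avg"],
            "norm_byte": ["max"]
          }
        }
      ]
    }
  }
}
EOF
```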
