feat: Support for benchmarks #67
Sounds reasonable. Thinking out loud: if we detect that at least one package has benchmark results, then we expand the table to include the columns you suggested.
Am I missing any other fields? This does mean a test from a package that doesn't have benchmarks would have those columns blanked out.
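For reference, a raw benchmark result line from `go test -bench` carries these fields: name (with the GOMAXPROCS suffix), iteration count, ns/op, and, only when `-benchmem` is passed or the benchmark calls `b.ReportAllocs()`, B/op and allocs/op. Illustrative values only:

```
BenchmarkMapAssign-8    13897075    86.17 ns/op    64 B/op    1 allocs/op
```

With `go test -json`, these lines arrive wrapped in events; the test2json docs list a dedicated `bench` action for benchmark log output.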
One concern I have is the length of each row. With 3 added columns the table may not look so pretty, especially if there is wrapping for longer lines. Not sure what the right solution is here; maybe a different format, or just let it run long.
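As a rough sketch of what the widened table could look like (hypothetical names and numbers, not actual tparse output):

```
| Benchmark          | Elapsed | ns/op | B/op | allocs/op |
|--------------------|---------|-------|------|-----------|
| BenchmarkMapAssign | 1.21s   | 86.17 | 64   | 1         |
| BenchmarkMapRange  | 0.98s   | 41.30 | 0    | 0         |
```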
The only other one I see in the raw text output is the benchmark's iteration count, but I don't think that's necessarily a useful number by itself.
I'm not sure if you can run benchmarks and regular unit tests at the same time.
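(For what it's worth, the standard go tooling does allow both in one invocation; a sketch of the usual flag combinations:)

```sh
go test -bench=. ./...           # runs unit tests first, then benchmarks
go test -run='^$' -bench=. ./... # -run matching nothing => benchmarks only
go test -bench=. -benchmem ./... # adds the B/op and allocs/op columns
```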
Well, a benchmark doesn't pass or fail on its own (unless it calls b.Fatal or similar), so that column can be skipped. The package may also be redundant if the specific benchmark name is listed. That would leave elapsed, then probably the cpu/memory/allocations columns. Note that the memory columns only appear when -benchmem is passed or the benchmark calls b.ReportAllocs(); otherwise only the timing is reported.
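A minimal benchmark sketch showing where each reported number comes from (BenchmarkAppend is a made-up example, not from this repo):

```go
package demo

import "testing"

// Benchmarks report iterations and ns/op by default; B/op and
// allocs/op appear only with -benchmem or an explicit ReportAllocs.
// There is no pass/fail unless the benchmark calls b.Fatal/b.Error.
func BenchmarkAppend(b *testing.B) {
	b.ReportAllocs() // opt in to the memory columns
	var s []int
	for i := 0; i < b.N; i++ {
		s = append(s, i)
	}
	_ = s
}
```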
Took a first pass at this; just need to figure out how to display it nicely so it makes sense. I'm thinking this might be a net new table. Sample output from running benchmarks in github.com/cilium/cilium:
Adding myself to the list here as we would love to have this too :)
I should have some cycles in December, so I'll try to bang this out.
When running a benchmark through tparse (v0.10.3), it appears that the benchmark information is omitted from the table. It would be neat to also display benchmark results in a table format.
Today:
Desired:
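For reproduction, the usual pipeline (tparse reads `go test -json` output on stdin) would be something like:

```sh
go test -bench=. -benchmem -json ./... | tparse -all
```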