
Add Fortran code coverage analysis #91

Closed
SeanBryan51 opened this issue Jun 9, 2023 · 9 comments · Fixed by #298
Labels: enhancement (New feature or request), priority:high (High priority issues that should be included in the next release)

@SeanBryan51
Collaborator

Code coverage analysis would allow us to see which parts of the CABLE code base are being executed for a given set of science configurations. This would help users (and us) choose an optimal set of science configurations to use for testing.

Resources:

@SeanBryan51 SeanBryan51 added the enhancement New feature or request label Jun 9, 2023
@MartinDix

Code coverage was added to the NCI JULES rose stem tests in https://code.metoffice.gov.uk/trac/jules/ticket/1266. There might be something useful to copy from there.

@bschroeter
Collaborator

Hey team! Please add your planning poker estimate with Zenhub @ccarouge @SeanBryan51

@SeanBryan51 SeanBryan51 self-assigned this Dec 4, 2023
@abhaasgoyal abhaasgoyal self-assigned this Feb 15, 2024
@SeanBryan51 SeanBryan51 removed their assignment Mar 18, 2024
@SeanBryan51
Collaborator Author

Unassigning myself from this for now

@ccarouge
Collaborator

We should wait for #258 before looking into this.

@SeanBryan51 SeanBryan51 added the priority:medium Medium priority issues to become high priority issues after a release. label Mar 21, 2024
@ccarouge ccarouge added priority:high High priority issues that should be included in the next release. and removed priority:medium Medium priority issues to become high priority issues after a release. labels Apr 24, 2024
@abhaasgoyal

To write the profile files generated during a code coverage compilation to a specific directory, an additional flag prof-dir=<profile-generated-files> should be set (see the sketch after the list below). I was thinking of setting it to runs/fluxsite/analysis/codecov. The catch is that the directory already has to exist at build time (https://www.intel.com/content/www/us/en/docs/fortran-compiler/developer-guide-reference/2023-2/prof-dir-qprof-dir.html). We have a few options here:

  1. Generate the profile information in the same folder as the build-dir, then move it into codecov after runtime for all experiments.
  2. Change the directory structure and move analysis to the top level (along with runs and src), creating the analysis dir before building. This would change the bitwise-comparisons a bit, but not by much. This seems like the cleanest option to me, but do we want to keep the information that the analysis was for fluxsite/spatial/etc.?
  3. Some other option I haven't thought of (a shared folder, perhaps).
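
For illustration, a minimal sketch of the build side (assuming Python, since benchcab drives the build, and assuming the Intel compiler's -prof-gen=srcpos / -prof-dir flags; the coverage path and source file name are hypothetical):

from pathlib import Path
import subprocess

# Hypothetical coverage directory; -prof-dir requires it to exist at build time.
prof_dir = Path("runs/fluxsite/analysis/codecov")
prof_dir.mkdir(parents=True, exist_ok=True)

# Sketch: append the Intel coverage flags to an otherwise ordinary compile command.
coverage_flags = ["-prof-gen=srcpos", f"-prof-dir={prof_dir}"]
subprocess.run(["ifort", *coverage_flags, "-c", "example_module.F90"], check=True)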

@SeanBryan51
Collaborator Author

I was thinking of setting it to runs/fluxsite/analysis/codecov. The catch is that the directory already has to exist at build time

The runs/fluxsite/analysis/codecov directory seems like a good spot for it. Just remember to create separate coverage directories for each model version in realisations. Regarding the directory needing to exist before build time, can we create the directory (parents included) before invoking the build?
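
A minimal sketch of what that could look like (assuming Python's pathlib; the realisation names and base path are hypothetical):

from pathlib import Path

# Hypothetical realisation names as they might appear in the benchcab config.
realisations = ["CABLE_trunk", "CABLE_branch"]

for name in realisations:
    # One coverage directory per model version, created (parents included)
    # before the build is invoked so that -prof-dir can point at an existing path.
    coverage_dir = Path("runs/fluxsite/analysis/codecov") / name
    coverage_dir.mkdir(parents=True, exist_ok=True)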

@ccarouge
Collaborator

ccarouge commented Jul 5, 2024

I think creating the run directories before the build should work fine.
The only problem is having the codecov/ directory underneath the fluxsite directory. What happens if we want the code coverage only for the spatial runs (benchcab spatial)? Should the serial compilation use codecov/ under fluxsite while the MPI compilation uses codecov/ under spatial?

Or should we bring analysis up directly under runs? But then why not move outputs?

@abhaasgoyal

abhaasgoyal commented Jul 8, 2024

I think codecov/ shouldn't be under fluxsite/, because unlike outputs/ we want the code coverage analysis grouped by each binary, irrespective of the type of run (fluxsite/spatial), for a given set of science configurations. We aggregate the results for all runs done by a single binary into a single folder using profmerge.
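
As an illustration of that aggregation step, a hedged sketch using the Intel tools (the directory name is hypothetical; profmerge merges the per-run .dyn files into a pgopti.dpi database, and codecov then builds the coverage report from the .dpi and .spi files):

import subprocess

# Hypothetical per-binary coverage directory holding the .dyn and .spi files.
prof_dir = "runs/analysis/codecov/R0_serial"

# Merge the .dyn files from all runs of this binary into pgopti.dpi.
subprocess.run(["profmerge", "-prof_dir", prof_dir], check=True)

# Produce the coverage report from the merged .dpi and the static .spi file.
subprocess.run(
    [
        "codecov",
        "-prj", "CABLE",
        "-dpi", f"{prof_dir}/pgopti.dpi",
        "-spi", f"{prof_dir}/pgopti.spi",
    ],
    check=True,
)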

I propose the following hierarchy

.
├── runs
│   ├── analysis
│   │   ├── bitwise-comparisons
│   │   │   ├── spatial (not yet implemented; if not needed, just name the folder fluxsite-bitwise-comparisons)
│   │   │   └── fluxsite
│   │   └── codecov
│   │       ├── <realisation_no>_<serial/parallel>
│   │       │   ├── .dpi, .spi and .dyn files for each run, based on met forcing and science configurations
│   │       │   └── ...
│   │       └── ...
│   ├── fluxsite
│   │   ├── logs
│   │   ├── outputs
│   │   └── tasks
│   ├── spatial
│   └── payu-laboratory
Note (unrelated to the issue): I also wanted to ask why we have payu-laboratory as a separate directory from spatial. Is it because payu can also run fluxsite tasks and is independent of spatial? If so, then for clarity, runs could be divided in a stage-wise manner (like setup / output / analysis) at the top level.

@SeanBryan51
Collaborator Author

I think codecov/ shouldn't be under fluxsite/, because unlike outputs/ we want the code coverage analysis grouped by each binary, irrespective of the type of run (fluxsite/spatial), for a given set of science configurations. We aggregate the results for all runs done by a single binary into a single folder using profmerge.

Yep, I think that's a good approach. To keep changes minimal, can we put the codecov directory at the root level (i.e. the same level as runs) and keep the existing structure in the runs directory?
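
For illustration only, that alternative might look something like this (keeping the existing runs/ structure and reusing the codecov layout proposed above):

.
├── codecov
│   ├── <realisation_no>_<serial/parallel>
│   │   └── ...
│   └── ...
├── runs
│   ├── fluxsite
│   ├── spatial
│   └── payu-laboratory
└── src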
