
Hpc rework #289

Merged
merged 26 commits into main from hpc_rework on Jan 31, 2024
Conversation

@Gregtom3 (Contributor) commented Dec 19, 2023

Briefly, what does this PR introduce?

Parallel computing capabilities on JLab's ifarm cluster. Allows multiple ROOT files to be analyzed per job while maintaining proper Q2 weighting across the entire pipeline.

A new script, hpc/run-local-slurm-pipeline.rb, creates the full pipeline, which is run with a single button press. The script generates a new "run" script in hpc/project_scripts that does the following for each campaign and beam energy:

  1. Creates a default S3 .config file for the chosen campaign, detector configuration, and beam energy
  2. Splits the .config file into multiple batches, with the number of ROOT files per batch specified by the user
  3. Calculates the number of events for each Q2 range studied, either by reading the TTrees directly or by looking up cached values in the CSVs under hpc/nevents_database. New files have their event counts automatically added to the database for faster access in the future.
  4. Manually calculates the Q2 weights for each range and inserts them into the batched .config files
  5. Runs each .config file through the analysis macro specified by the user
  6. Merges the output ROOT files
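The batching and weighting in steps 2–4 can be sketched roughly as follows. This is a hypothetical illustration, not the PR's actual code: the file names, Q2 ranges, cross-sections, and cached event counts are made-up values, and the weight formula (cross-section per generated event, normalized to the first range) is an assumption about how such weights are typically computed.

```ruby
# Illustrative sketch only: split a file list into batches and compute
# per-Q2-range weights. All names and numbers below are invented.

files = (1..7).map { |i| "batch_input_#{i}.root" }
files_per_batch = 3

# Step 2: split the file list into batches of at most `files_per_batch`
batches = files.each_slice(files_per_batch).to_a

# Step 3 (assumed shape): Q2 ranges with cross-sections and cached event counts
q2_ranges = [
  { q2min: 1.0,  xsec: 0.50, nevents: 1_000_000 },
  { q2min: 10.0, xsec: 0.05, nevents: 500_000 },
]

# Step 4 (assumed formula): weight each range by cross-section per event,
# normalized to the first range
ref = q2_ranges.first
weights = q2_ranges.map do |r|
  (r[:xsec] / r[:nevents]) / (ref[:xsec] / ref[:nevents])
end

batches.each_with_index do |batch, i|
  puts "batch #{i}: #{batch.size} files, weights = #{weights.map { |w| w.round(3) }}"
end
```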

What kind of change does this PR introduce?

  • Bug fix (issue #__)
  • New feature (issue #236: Support ifarm Slurm)
  • Documentation update
  • Other: Addressed a Q2weight issue where beamCrossing!=0 changes the trueQ2 of the event, causing its Q2weight to be assigned from the wrong range.
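The beamCrossing fix can be illustrated with a hypothetical lookup that keys the weight on the event's trueQ2 rather than on the file's nominal generated range. The ranges and weight values below are invented for illustration and do not come from the PR.

```ruby
# Hypothetical illustration: pick the Q2 weight from the event's *true* Q2
# (which can shift when beamCrossing != 0), not the file's nominal range.
Q2_WEIGHTS = [
  { q2min: 1.0,  q2max: 10.0,            weight: 1.0 },
  { q2min: 10.0, q2max: Float::INFINITY, weight: 0.2 },
].freeze

def q2_weight(true_q2)
  range = Q2_WEIGHTS.find { |r| true_q2 >= r[:q2min] && true_q2 < r[:q2max] }
  range ? range[:weight] : 0.0  # events below every range get zero weight
end
```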

Please check if this PR fulfills the following:

  • Tests for the changes have been added
  • Documentation has been added / updated
  • Changes have been communicated to collaborators

Does this PR introduce breaking changes? What changes might users need to make to their code?

Does this PR change default behavior?

Yes. See Q2weight issue above.

Commit message excerpts:

…lurm script, and reworked S3 reading on slurm with xrootd
…nEvents. Now calculates the Q2weights, which are manually inserted into the split config files. Also handles XsTotal, WeightTrackTotal, and TotalEvents in the merge.
…gns and append the nevents to a cached database for each file
@Gregtom3 Gregtom3 requested a review from cpecar December 19, 2023 15:05
@cpecar (Contributor) commented Dec 21, 2023

Testing it on ifarm now, but it might be nice to have an option, similar to the energy settings, to select only the Q2 ranges you're interested in (not sure if this would make the Q2 weighting a pain, though).

@cpecar (Contributor) commented Dec 21, 2023

[Screenshot: failed Slurm jobs] May want to increase the memory request for the parallel analysis jobs; they all failed for me due to running out of memory.
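One way to handle this in a generated submission script is to scale the Slurm memory request with the number of files per job. The sketch below is an assumption about how the pipeline's Ruby generator could do it; the per-file memory estimate and other flag values are illustrative, not the PR's actual defaults.

```ruby
# Illustrative sketch: scale the Slurm --mem request with files per job.
# The 2000 MB-per-file estimate is an invented placeholder.
files_per_job = 3
mem_mb = 2000 * files_per_job

slurm_header = <<~SLURM
  #!/bin/bash
  #SBATCH --job-name=hpc-analysis
  #SBATCH --mem=#{mem_mb}
  #SBATCH --time=24:00:00
SLURM

puts slurm_header
```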

@cpecar (Contributor) commented Jan 4, 2024

Tested this again while reading all your comments in the readme more carefully, and it seems your default memory request should be okay for up to 3 files per job.

@cpecar merged commit d72c69e into main on Jan 31, 2024
111 of 125 checks passed
@cpecar deleted the hpc_rework branch on January 31, 2024 16:46