submission: automatically identify "pyhf" file_type of resource files #163
More HEPData records have appeared with attached `additional_resources` entries:

```yaml
additional_resources:
- description: Archive of full likelihoods in the HistFactory JSON format described
    in ATL-PHYS-PUB-2019-029. In the sub-directory the statistical models SR-lowMass,
    SR-highMass and SR-combined are provided. The SR-combined is a combined fit of
    the SR-lowMass and SR-highMass. For each model the background-only model is found
    in the file named 'BkgOnly.json'. For each model a set of patches for various
    signal points is provided.
  location: SUSY-2018-04_likelihoods.tar.gz
```

```yaml
additional_resources:
- description: Archive of full likelihoods in the HistFactory JSON format described
    in SUSY-2018-06. The background-only fit is found in the file named 'BkgOnly.json'.
    For each model a set of patches for various signal points is provided.
  location: likelihoods_ANA-SUSY-2018-06_3L-RJ-mimic.tar.gz
```

The proposed convention of a `location` ending in `workspaces.tar.gz` has therefore not been followed.
Agreement with Louie Corpe for HEPData recommendations for ATLAS:

```python
if location.endswith('.tar.gz') and \
        ('histfactory json' in description.lower() or 'pyhf' in description.lower()):
    file_type = 'pyhf_tarball'
```

whereas individual pyhf JSON files (#164) would have `file_type = 'pyhf_json'`. For individual pyhf JSON files, when we provide native support, we would probably identify them by requiring a …
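As a rough illustration of the agreed heuristic (a sketch only, not the actual HEPData implementation), the check could be wrapped in a helper applied to each `additional_resources` entry; the function name and the `None` fallback are assumptions:

```python
# Sketch of the agreed detection heuristic; function name and fallback are assumptions.
PYHF_TRIGGER_PHRASES = ('histfactory json', 'pyhf')

def identify_file_type(location, description):
    """Guess a file_type for an additional_resources entry."""
    text = description.lower()
    if location.endswith('.tar.gz') and any(p in text for p in PYHF_TRIGGER_PHRASES):
        return 'pyhf_tarball'  # archive of HistFactory JSON workspaces
    return None  # leave file_type unset for everything else

# The SUSY-2018-04 entry quoted above would be matched:
print(identify_file_type(
    'SUSY-2018-04_likelihoods.tar.gz',
    'Archive of full likelihoods in the HistFactory JSON format ...',
))  # -> 'pyhf_tarball'
```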
Can we add an option that allows you to override the automated detection? E.g. something like …
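The concrete suggestion was truncated above; one hypothetical form (the `file_type` key shown here is an assumption for illustration, not an agreed part of the submission schema) would be an optional field on the resource entry that takes precedence over the heuristic:

```python
import yaml  # PyYAML, assumed available just for this illustration

# Hypothetical explicit override; the 'file_type' key is an assumption, not an agreed schema.
entry = yaml.safe_load("""
description: Archive of full likelihoods in the HistFactory JSON format
location: SUSY-2018-04_likelihoods.tar.gz
file_type: pyhf_tarball
""")

# An explicit value would win over the automated detection.
heuristic_guess = 'pyhf_tarball' if entry['location'].endswith('.tar.gz') else None
file_type = entry.get('file_type') or heuristic_guess
print(file_type)  # -> 'pyhf_tarball'
```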
Hopefully, we'll get around to tackling this issue soon. But I noticed that the latest HEPData record (released today) with an attached pyhf archive has:

```yaml
additional_resources:
- description: Archive of full statistical likelihoods and README
  location: FullLikelihoods_sm.tar.gz
```

@ldcorpe, is the agreement above not being followed by ATLAS? Should we add "likelihoods" as an additional trigger phrase in addition to "histfactory json" or "pyhf"? @kratsg, yes, we should probably allow something like …
Hi @GraemeWatt, thanks for flagging this. I can talk with the SUSY conveners and try to get this sorted out. I'm realizing that this is likely just a miscommunication somewhere. One thing I'm worried about is relying on the location filename/description for these sorts of things. I would definitely prefer a …
Plan is: …
A minor nitpick (admittedly), but I think we should call it "HistFactory" or "HistFactory JSON" or similar, rather than `pyhf`.
@kratsg @lukasheinrich @matthewfeickert @cranmer: do you get notifications from the HEPData Zulip instance? I posted a message two days ago (tagging …).
Hi @GraemeWatt, yes I did and it looks great. This past week was particularly busy, but I'll try to go over it next week. Thanks.
I've commented on Zulip. Do you want me to copy/paste it into GitHub as well?
Thanks for the comments. We can continue the discussion on Zulip, so no need to repeat comments here.
Now deployed in production and sent tweets to advertise new search options: …
Thank you so much HEPData team — this is amazing! 🚀

Consider the pyhf JSON files attached to https://www.hepdata.net/record/ins1748602?version=1 as an additional resource:
It would be good if we could automatically identify `pyhf` tarballs from the `description` and `location` of the `additional_resources` in the `submission.yaml` file, then we can make these resource files more prominent (as we already do for links to Rivet analyses). Can we agree on some convention, e.g. a `location` ending in `workspaces.tar.gz`, that will be used for future `pyhf` uploads to HEPData? The code can then check for this convention to write `file_type` as `pyhf` in the `dataresource` table of the database.

Cc: @lukasheinrich
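A minimal sketch of this originally proposed convention is below (later superseded in the thread by the broader `.tar.gz`-plus-keywords heuristic); the helper name is an assumption for illustration only:

```python
# Sketch of the proposed naming convention: a location ending in 'workspaces.tar.gz'.
def file_type_from_location(location):
    """Return 'pyhf' for resources following the proposed naming convention."""
    return 'pyhf' if location.endswith('workspaces.tar.gz') else None

print(file_type_from_location('RegionA/workspaces.tar.gz'))        # -> 'pyhf'
print(file_type_from_location('SUSY-2018-04_likelihoods.tar.gz'))  # -> None
```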