
[INSTALL]: METcalcpy, METplotpy, METdataio #1112

Open
14 tasks
mkavulich opened this issue May 14, 2024 · 35 comments
Labels
question Further information is requested

Comments

@mkavulich

mkavulich commented May 14, 2024

Package name

METcalcpy, METplotpy, METdataio

Package version/tag

v2.1

Build options

none

Installation timeframe

Please install on Hera for testing, and then include in the next release.

Other information

The DTC Agile Framework group is seeking to add plotting of verification statistics to the Short-Range Weather App workflow. This work relies on the three packages of the METplus Analysis Suite: METcalcpy, METplotpy, and METdataio. The latest versions of all these tools are v2.1, as they are released on the same cadence as all MET products.

We do not need these tools installed on WCOSS2, though as I understand it they have already been installed independently for use by EMC, so they should pass all necessary checks if that was desired eventually.

WCOSS2

  • Check this box if and only if your package should be installed on WCOSS2 Cactus and Dogwood (all spack-stack packages will be installed on Acorn). If not, you may disregard the rest of the items below and submit this request.

WCOSS2: General questions

No response

WCOSS2: Installation and testing

No response

WCOSS2: Technical & security review list

  • The code is mature, stable, and production ready
  • The code does not and cannot use the internet, and does not contain URLs (http, https, ftp, etc.) except in comments
  • The package does not contain prebuilt binary files that have not been approved by NCO security review
  • The code has no publicly disclosed cybersecurity vulnerabilities and exposures (search https://cve.mitre.org/cve/)
  • The code is not prohibited by DHS, DOC, NOAA, or NWS
  • The code comes from a trusted source. Trusted sources include other NWS, NOAA, or DOC, agencies, or other Federal agencies that operate at a FISMA high or equivalent level. Additionally, trusted sources could be third-party agencies through which there is an existing SLA on file (such as RedHat).
  • The code is actively maintained and supported (it continues to get updates, patches, etc.)
  • The code is not maintained by a private entity operating in a foreign country (if it is, make a note below)
  • There is sufficient documentation to support maintenance
  • There are no known security vulnerabilities or weaknesses
  • Installing and running the code does not require privileged processes/users
  • There are no software dependencies that are unapproved or have security concerns (if there are, make a note below)
  • There are no concerns related to SA, SI, and SC NIST control families

WCOSS2: Additional comments

No response

@mkavulich
Author

Just checking in on this issue in case there's any more action or information needed from our end for this installation request. We would like to have a test install ready on Hera (or any other UFS tier-1 platform aside from WCOSS) in the next few weeks so we can demonstrate at least a test capability by the end of our period of performance (end of June). Is this a realistic timeline?

@AlexanderRichert-NOAA
Collaborator

FWIW it looks like there are not currently recipes for these three packages in Spack. Thankfully, they all have setup.py scripts, so creating recipes should be pretty straightforward. Since this is for SRW, can someone from EPIC take the lead on this? @ulmononian @natalie-perlin @RatkoVasic-NOAA ?

@RatkoVasic-NOAA
Collaborator

> FWIW it looks like there are not currently recipes for these three packages in Spack. Thankfully, they all have setup.py scripts, so creating recipes should be pretty straightforward. Since this is for SRW, can someone from EPIC take the lead on this? @ulmononian @natalie-perlin @RatkoVasic-NOAA ?

Sure, since I never did that, I'm asking Rick Grubin to show me how to do it.

@climbfuji
Collaborator

I believe metcalpy is there - I just stumbled over it when working on an NRL plotting package.

@rickgrubin-noaa
Collaborator

> I believe metcalpy is there - I just stumbled over it when working on an NRL plotting package.

Which spack repo / release or branch? Not showing up in the definitive and JCSDA repos. Apologies if I'm being thick and missing it.

@climbfuji
Collaborator

I mixed it up with metpy - my bad

@mkavulich
Author

@AlexanderRichert-NOAA @RatkoVasic-NOAA Thanks for your work so far. Let me know if there's any help or info our team can provide to help the process along.

@rickgrubin-noaa
Collaborator

@mkavulich must the components called out in the respective requirements.txt files be pinned to the specific versions stated? E.g.:

metcalcpy
numpy==1.24.2

It seems 'yes' in that, for the simple example stated here, numpy versions differ across versions of metcalcpy. These were generated from pip freeze perhaps?
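If these are indeed meant as minimum versions, the `pip freeze`-style exact pins could be mechanically relaxed to lower bounds; a tiny illustrative helper (hypothetical, not part of METplus tooling):

```python
# Relax pip-freeze-style exact pins ("==") to minimum-version bounds (">=").
# Purely illustrative; does not handle extras, environment markers, or
# multi-clause specifiers.
def relax_pins(requirements):
    return [r.replace("==", ">=", 1) if "==" in r else r for r in requirements]

print(relax_pins(["numpy==1.24.2", "pandas"]))  # ['numpy>=1.24.2', 'pandas']
```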

Thanks.

@mkavulich
Author

@rickgrubin-noaa Our group doesn't maintain the MET code so I'm not sure, but I am pretty sure these are minimum versions, not hard requirements. Is the problem that this numpy version is in conflict with other python packages in spack-stack?

@rickgrubin-noaa
Collaborator

> @rickgrubin-noaa Our group doesn't maintain the MET code so I'm not sure, but I am pretty sure these are minimum versions, not hard requirements. Is the problem that this numpy version is in conflict with other python packages in spack-stack?

Thanks; after consultation with spack-stack folks, we'll work toward relaxing the numpy version hard requirement, as other packages run into the same collision.

@JeffBeck-NOAA

@RatkoVasic-NOAA, @rickgrubin-noaa, @mkavulich, @climbfuji, just checking to see if there is an update on inclusion of METplotpy, METcalcpy, and METdataio into spack-stack. Thank you!

@climbfuji
Collaborator

The blocker is gone (update of versions in packages.yaml), my understanding is that @rickgrubin-noaa is still working on this. Once we've got the updates in spack-stack develop, they will be slated for roll-out in spack-stack-1.8.0 (around early September).

@rickgrubin-noaa
Collaborator

> The blocker is gone (update of versions in packages.yaml), my understanding is that @rickgrubin-noaa is still working on this. Once we've got the updates in spack-stack develop, they will be slated for roll-out in spack-stack-1.8.0 (around early September).

Hi @JeffBeck-NOAA -- was away for ~3wks.

Will be merging the latest changes to spack-stack into my branch and updating the recipes for required components that do not have existing recipes and rebuilding test envs, updates to follow.

@JeffBeck-NOAA

Thanks, @rickgrubin-noaa!

@climbfuji
Collaborator

We should target numpy 1.25.x if at all possible. That's the latest version that works for all our Python packages - 1.26 breaks many of them.
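Expressed as spack-stack configuration, such a cap would look roughly like the fragment below in configs/common/packages.yaml (illustrative sketch; the exact bound used in the repo may differ):

```yaml
  # keep numpy in the 1.25 series or older; 1.26 breaks many Python packages
  py-numpy:
    require:
    - '@:1.25'
```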

@rickgrubin-noaa
Collaborator

On hera, as user myself (not the EPIC role account), and with latest commits that enforce numpy 1.25.x and other requirements per configs/common/packages.yaml:

metcalcpy and metplotpy require scikit-image, per their package requirements:

metcalcpy: scikit-image@0.19.3 or above
metplotpy: scikit-image@0.19.3

As configured, concretization settles on py-setuptools@63.4.3 and scikit-image@0.18.3 for a spack env that upstream-chains to the Intel-based stack v1.7.0. Trying to use newer versions of py-setuptools results in concretization errors combined with py-cython.

When attempting an install, scikit-image@0.18.3 not only doesn't meet requirements for metplotpy, it also fails to install at all:

     961       INFO: C compiler: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack/lib/spack/env/intel/icc -Wsign-compare -Wunreachable-code -DNDEBUG -g -O3 -Wall -fPIC -fPIC
     962     
     963       INFO: compile options: '-I/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-venv-1.0-ty237wr/in
             clude -I/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-3.10.13-mldz2f2/include/python3.10 -c'
     964       extra options: '-xKNM -Werror'
     965       WARN: CCompilerOpt.dist_test[636] : CCompilerOpt._dist_test_spawn[770] : Command (/scratch1/NCEPDEV/nems/Richard.Grubin/git/spa
             ck-stack/spack/lib/spack/env/intel/icc -Wsign-compare -Wunreachable-code -DNDEBUG -g -O3 -Wall -fPIC -fPIC -I/scratch1/NCEPDEV/ne
             ms/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/python-venv-1.0-ty237wr/include -I/scratch1/NCEPDEV/nems/Richard.Grubin/en
             vs/met.hera/install/intel/2021.5.0/python-3.10.13-mldz2f2/include/python3.10 -c /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.he
             ra/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.c -o /tmp/tm
             pk8t6sb52/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-p
             ackages/numpy/distutils/checks/cpu_avx512_knm.o -MMD -MF /tmp/tmpk8t6sb52/scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/inst
             all/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.o.d -xKNM -Werror) 
             failed with exit status 4 output ->
     966       icc: command line warning #10121: overriding '-march=haswell' with '-xKNM'
  >> 967     ": internal error: IERROR_MODULE_ID_1102
     968     
     969       compilation aborted for /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/install/intel/2021.5.0/py-numpy-1.25.2-2rmiq6b/lib/
             python3.10/site-packages/numpy/distutils/checks/cpu_avx512_knm.c (code 4)
     970     
     971       WARN: CCompilerOpt.feature_test[1575] : testing failed
     972       INFO: CCompilerOpt.__init__[1815] : skip features (SSE SSE2 SSE3) since its part of baseline
     973       INFO: CCompilerOpt.__init__[1819] : initialize targets groups

     ...

     1375                ^
     1376    
     1377      skimage/feature/_cascade.cpp(112214): warning #1292: unknown attribute "fallthrough"
     1378                CYTHON_FALLTHROUGH;
     1379                ^
     1380    
  >> 1381    ": internal error: ** The compiler has encountered an unexpected problem.
     1382      ** Segmentation violation signal raised. **
     1383      Access violation or stack overflow. Please contact Intel Support for assistance.

Need assistance / discussion as to whether or not this work requires using the role account, and the accepted method(s) for explicitly requiring certain package versions / range of versions for requirements.

@rickgrubin-noaa
Collaborator

Marking as blocked; the issue is the versions of various python packages, in particular py-setuptools, upon which a great deal depends:

==> Concretized py-setuptools%intel
 -   kld7qgg  [email protected]%[email protected] build_system=generic arch=linux-rocky8-haswell
[e]  mxvoe7u      ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
 -   jcmbfvi      ^[email protected]%[email protected] build_system=generic arch=linux-rocky8-haswell
 -   blg2gbg      ^[email protected]%[email protected]+bz2+crypt+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tkinter+uuid+zlib build_system=generic patches=0d98e93,7d40923,ebdca64,f2fd060 arch=linux-rocky8-haswell
[e]  lpyczhf          ^[email protected]%[email protected]~debug~pic+shared build_system=generic arch=linux-rocky8-haswell
 -   isjrtpq          ^[email protected]%[email protected]+libbsd build_system=autotools arch=linux-rocky8-haswell
 -   wecsnwm              ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
 -   eupt4k5                  ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
 -   r3dqdul          ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
[e]  xqgzbbl          ^[email protected]%[email protected]+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools patches=9acdb4e arch=linux-rocky8-haswell
[e]  fesaeeg          ^[email protected]%[email protected]~guile build_system=generic patches=ca60bd9 arch=linux-rocky8-haswell
 -   3vzglds          ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
 -   3toxiab          ^[email protected]%[email protected]~obsolete_api build_system=autotools patches=4885da3 arch=linux-rocky8-haswell
[e]  hzm46vt              ^[email protected]%[email protected]~cpanm+opcode+open+shared+threads build_system=generic patches=0eac10e,3bbd7d6 arch=linux-rocky8-haswell
[e]  akhzxhm          ^[email protected]%[email protected]~symlinks+termlib abi=5 build_system=autotools patches=daee321,f84b270 arch=linux-rocky8-haswell
 -   3opswd6          ^[email protected]%[email protected]~docs+shared build_system=generic certs=mozilla arch=linux-rocky8-haswell
 -   sgoeeip              ^ca-certificates-mozilla@2023-05-30%[email protected] build_system=generic arch=linux-rocky8-haswell
[e]  ijo4xqm          ^[email protected]%[email protected]+internal_glib build_system=autotools patches=49ffcd6 arch=linux-rocky8-haswell
 -   slunwfa          ^[email protected]%[email protected] build_system=autotools patches=bbf97f1 arch=linux-rocky8-haswell
 -   draficb          ^[email protected]%[email protected]+column_metadata+dynamic_extensions+fts~functions+rtree build_system=autotools arch=linux-rocky8-haswell
 -   xnwdjcw          ^[email protected]%[email protected] build_system=autotools arch=linux-rocky8-haswell
[e]  xin7ooq          ^[email protected]%[email protected]~pic build_system=autotools libs=shared,static arch=linux-rocky8-haswell
 -   ks4wym4          ^[email protected]%[email protected]+compat+new_strategies+opt+pic~shared build_system=autotools arch=linux-rocky8-haswell
 -   r6lyonh      ^[email protected]%[email protected] build_system=generic arch=linux-rocky8-haswell

This concretizes to versions that won't support the minimum package version requirements for:

metcalcpy: scikit-image@0.19.3 or above
metplotpy: scikit-image@0.19.3

With the above concretization, scikit-image will concretize to v0.18.3 – other python packages must be upgraded in order to satisfy (at least) metplotpy, thus this is a bigger issue requiring a broader discussion.

@climbfuji climbfuji added the question Further information is requested label Jul 26, 2024
@rickgrubin-tomorrow

Steps to demonstrate package dependencies are concretized to versions that are insufficient to meet requirements for metcalcpy / metdataio / metplotpy:

  • Add new package files for py-eofs / py-imutils / py-opencv-python in addition to py-metcalcpy / py-metdataio / py-metplotpy. These are straightforward as they'll install from pypi.
  • Create an env with specs for py-metcalcpy / py-metdataio / py-metplotpy
grubin@hera-hfe06% spack stack create env --site hera --template empty --name met.hera --dir $SCRATCH/envs --overwrite
Configuring basic directory information ...
  ... script directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack-ext/lib/jcsda-emc/spack-stack/stack
  ... base directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack-ext/lib/jcsda-emc/spack-stack
  ... spack directory: /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/spack
==> Environment /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera exists. Overwriting...
Creating environment from command-line args
Copying site includes from /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/configs/sites/tier1/hera ...
  ... to /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/site
Copying common includes from /scratch1/NCEPDEV/nems/Richard.Grubin/git/spack-stack/configs/common/modules_lmod.yaml ...
  ... to /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/common/modules.yaml
Successfully wrote environment at /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera/spack.yaml

Checked user umask and found no issues (0022)

==> Created environment /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera

grubin@hera-hfe06% spack env activate /scratch1/NCEPDEV/nems/Richard.Grubin/envs/met.hera

# edit spack.yaml to add the following specs:

  - py-metcalcpy%intel
  - py-metdataio%intel
  - py-metplotpy%intel

grubin@hera-hfe06% spack concretize 2>&1 | tee $SCRATCH/envs/met.hera/log.concretize

With the above concretization, scikit-image (for example; it also fails to compile, see above) will concretize to v0.18.3 (minimum required version is v0.19.3); other python packages, including at least metpy / netcdf4 / pint, must also be upgraded in order to satisfy (at least) metplotpy.

I suspect that part of the problem is linked to:

    py-setuptools:
      require: '@63.4.3'

which seems to necessarily downgrade some package versions in order to be satisfied.

File log.concretize is attached.

met.hera.concretize.log

@mkavulich
Author

I see that this was removed from the list of spack-stack 1.8 features. Can we get an update on the status of this issue? If I am following correctly, the holdup is that there are other packages already in spack-stack that would need to be upgraded in order to accommodate the requirements of metplotpy?

@rickgrubin-noaa
Collaborator

> I see that this was removed from the list of spack-stack 1.8 features. Can we get an update on the status of this issue? If I am following correctly, the holdup is that there are other packages already in spack-stack that would need to be upgraded in order to accommodate the requirements of metplotpy?

As described in the comment from August 6, there is at least this issue for attempting to integrate into spack-stack v1.7.0:

> scikit-image (for example) will concretize to v0.18.3 (minimum required version is v0.19.3); other python packages, including at least metpy / netcdf4 / pint, must also be upgraded in order to satisfy (at least) metplotpy.

Further, the pinning of py-setuptools

    py-setuptools:
      require: '@63.4.3'

appears to necessarily downgrade some other package versions -- which then do not satisfy other package requirements -- in order for said pinned version's requirements to be met.

Per today's biweekly spack-stack meeting, this issue will be deferred until v1.8.0 is released, at which time a newer version of python (v3.11.7) will be the default. At that time, package version pinning will be relaxed in an attempt to satisfy requirements for METcalcpy / METplotpy / METdataio while still building a viable and valid stack.

@climbfuji
Collaborator

@mkavulich This doesn't necessarily mean that you have to wait until the next release. We can provide a test stack on one platform once we've figured out the dependencies, and if that works we can make add-on environments to the existing 1.8.0 stack available on selected platforms (likely not all).

@rickgrubin-noaa
Collaborator

rickgrubin-noaa commented Oct 21, 2024

With spack-stack @ release/1.8.0

Created a unified-dev env and unpinned versions for

  • py-setuptools
  • py-numpy

This yielded a concretized environment that successfully built metcalcpy / metdataio / metplotpy with adequately versioned dependent packages.

However, it also creates duplicate concretizations: a second py-setuptools is chosen for py-matplotlib and py-numpy (and it seems we must stay at py-setuptools@63.4.3 per this issue), along with duplicate concretizations for many other packages (as identified by show_duplicate_packages.py).

The entire env successfully installs.

It's not clear to me how to rectify this duplicate concretization -- suggestions / help appreciated, thanks.

@AlexanderRichert-NOAA
Collaborator

Spack allows duplicates for dependencies with type=build, even with unify:true. This happens a lot with py-setuptools and other packages. One solution might be to just ignore it, then blacklist the py-setuptools module (though this is still not an ideal solution, as it can lead to problems when reconcretizing an environment partway through installation). Are you able to pin a single version of py-setuptools? I'm a bit confused as to why a second py-setuptools version is being concretized at all given the version requirement...
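For reference, hiding the py-setuptools module as suggested would look roughly like this in the environment's modules config (illustrative sketch; newer Spack spells this `exclude`, older versions `blacklist`):

```yaml
modules:
  default:
    lmod:
      exclude:
      - py-setuptools
```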

@rickgrubin-noaa
Collaborator

I misspoke above (fixed the comment, I apologize for the confusion).

With spack-stack @ release/1.8.0

For a simple environment for only metcalcpy / metdataio / metplotpy:

Pinning py-setuptools@63.4.3 and not py-numpy eliminates duplicates, but concretizes to py-numpy@1.26.x which, it seems, is bad juju for other python packages.

Pinning py-numpy@:1.24.2 (note the leading colon) and py-setuptools@63.4.3 yields

==> Error: concretization failed for the following reasons:

   1. cannot satisfy a requirement for package 'py-setuptools'.

Pinning py-numpy@1.24.2 (note the absence of a colon) and py-setuptools@63.4.3 yields

==> Error: concretization failed for the following reasons:

   1. cannot satisfy a requirement for package 'py-setuptools'.

Pinning py-numpy@1.24.2: (note the trailing colon) and py-setuptools@63.4.3 concretizes to a newer py-numpy (1.26.x).

Pinning py-numpy at that same newer version and py-setuptools@63.4.3 yields [take note!]

==> Error: concretization failed for the following reasons:

   1. cannot satisfy a requirement for package 'py-setuptools'.

Further, with this simplified metpy env to install only metcalcpy / metdataio / metplotpy, unpinning both py-setuptools and py-numpy results in a concretization:

This is the same as the unified-dev concretization, and the concretized environment successfully built metcalcpy / metdataio / metplotpy with adequately versioned dependent packages.

However, as above, it also creates duplicate concretizations: a second py-setuptools is chosen for py-matplotlib and py-numpy (and it seems we must stay at py-setuptools@63.4.3 per this issue), along with duplicate concretizations for many other packages (as identified by show_duplicate_packages.py).

This confuses me; allowing the concretizer to choose package versions is OK, but then pinning packages to those versions is not OK. I'll compare concretization logs of the two scenarios and try to ascertain what's different.
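As an aside for readers, the colon syntax in the experiments above follows Spack's inclusive version-range convention; a toy illustration of the semantics (not Spack's actual implementation; it ignores prefix matching and non-numeric suffixes like `1.0rc1`):

```python
# Toy model of Spack version ranges: "1.24.2" (exact), ":1.24.2" (up to and
# including), "1.24.2:" (at least), "a:b" (inclusive range).
def _key(v):
    return tuple(int(x) for x in v.split("."))

def in_range(version, spec):
    if ":" not in spec:
        return version == spec  # exact match (real Spack also prefix-matches)
    lo, _, hi = spec.partition(":")
    return (not lo or _key(version) >= _key(lo)) and \
           (not hi or _key(version) <= _key(hi))

print(in_range("1.23.5", ":1.24.2"))  # True
print(in_range("1.25.2", ":1.24.2"))  # False
```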

@rickgrubin-noaa
Collaborator

Following on to the previous comment:

Pinning py-numpy at some version less than 1.26, e.g. py-numpy@1.25.2, and unpinning py-setuptools is viable. There is duplicate concretization with respect to py-setuptools for a few packages (a second py-setuptools version is chosen), notably py-matplotlib and py-numpy (!).

Can't seem to come up with anything that doesn't have duplicate concretizations for py-setuptools.

@climbfuji
Collaborator

The culprit is probably py-shapely@1.8.0. If you look at the package definition, it limits setuptools to @:63. Try a newer py-shapely (and remove the pinning of py-shapely entirely - 1.8.0 is ancient).
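The constraint being referred to lives in Spack's py-shapely recipe; paraphrased, it looks something like this (illustrative sketch, not the actual package.py; the `when` condition is an assumption):

```python
# Sketch of the relevant dependency in a Spack package.py: old shapely 1.x
# releases cap setuptools at major version 63 at build time, which drags the
# whole environment's py-setuptools down when py-shapely@1.8.0 is pinned.
class PyShapely(PythonPackage):
    depends_on("py-setuptools@:63", type="build", when="@:1")  # version condition illustrative
```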

@rickgrubin-noaa
Collaborator

py-numpy@1.24 -- is this comment still applicable? I have been attempting to adhere to it:

configs/common/packages.yaml

  # py-numpy@1.24 causes many build problems with older Python packages
  # also check Nautilus site config when making changes here
  # https://github.com/JCSDA/spack-stack/issues/1276
  py-numpy:
    require:
    - '@:1.23.5'

@climbfuji
Collaborator

I was able to build a decent stack on my dev system (Oracle Linux 9) with gcc@13. It's just a suggestion to see if it helps your problem.

@rickgrubin-noaa
Collaborator

py-shapely@2.x and py-numpy@1.25.x concretizes without duplicates. I'd seen this prior, but was trying to adhere to the warning message in configs/common/packages.yaml

That said, with:

https://github.com/rickgrubin-noaa/spack-stack/tree/SI-1052 which sets up the versions noted above, and

https://github.com/rickgrubin-noaa/spack/tree/SI-1052 which contains spack packages for:

  • py-metcalcpy
  • py-metdataio
  • py-metplotpy
  • py-eofs
  • py-imutils
  • py-opencv-python

envs for py-metcalcpy and py-metplotpy are successfully created:

  • py-metcalcpy | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metcalcpy.hera.intel
  • py-metplotpy | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metplotpy.hera.intel

whereas the env for

  • py-metdataio | /scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel

fails to install metdataio (see log.install there):

==> Installing py-metdataio-2.1.1-rn733io6rdezerbrtv7jiteimvg4wvoa [69/69]
==> No binary for py-metdataio-2.1.1-rn733io6rdezerbrtv7jiteimvg4wvoa found: installing from source
==> Fetching https://github.com/dtcenter/METdataio/archive/refs/tags/v2.1.1.tar.gz
==> No patches needed for py-metdataio
==> py-metdataio: Executing phase: 'install'
[...]
  Running command Preparing metadata (pyproject.toml)
  Preparing metadata (pyproject.toml): finished with status 'done'
ERROR: Exception:
Traceback (most recent call last):
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/cli/base_command.py", line 169, in exc_logging_wrapper
    status = run_func(*args)
             ^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/cli/req_command.py", line 248, in wrapper
    return func(self, options, args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/commands/install.py", line 377, in run
    requirement_set = resolver.resolve(
                      ^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 73, in resolve
    collected = self.factory.collect_root_requirements(root_reqs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 491, in collect_root_requirements
    req = self._make_requirement_from_install_req(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 453, in _make_requirement_from_install_req
    cand = self._make_candidate_from_link(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 206, in _make_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
                                       ^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 293, in __init__
    super().__init__(
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 156, in __init__
    self.dist = self._prepare()
                ^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 225, in _prepare
    dist = self._prepare_distribution()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 304, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 516, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 631, in _prepare_linked_requirement
    dist = _get_prepared_distribution(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 69, in _get_prepared_distribution
    abstract_dist.prepare_distribution_metadata(
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/distributions/sdist.py", line 61, in prepare_distribution_metadata
    self.req.prepare_metadata()
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/req/req_install.py", line 555, in prepare_metadata
    self.metadata_directory = generate_metadata(
                              ^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/operations/build/metadata.py", line 35, in generate_metadata
    distinfo_dir = backend.prepare_metadata_for_build_wheel(metadata_dir)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_internal/utils/misc.py", line 713, in prepare_metadata_for_build_wheel
    return super().prepare_metadata_for_build_wheel(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_impl.py", line 186, in prepare_metadata_for_build_wheel
    return self._call_hook('prepare_metadata_for_build_wheel', {
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_impl.py", line 321, in _call_hook
    raise BackendUnavailable(data.get('traceback', ''))
pip._vendor.pyproject_hooks._impl.BackendUnavailable: Traceback (most recent call last):
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/py-pip-23.1.2-ddjfool/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 77, in _build_backend
    obj = import_module(mod_path)
          ^^^^^^^^^^^^^^^^^^^^^^^
  File "/scratch1/NCEPDEV/nems/Richard.Grubin/envs/metdataio.hera.intel/install/intel/2021.5.0/python-3.11.7-m2gbofx/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'setuptools'

Not sure where to start! setuptools is installed in that env, the package is a simple install (effectively identical to metcalcpy and metplotpy), and successfully installs into a venv from the shell level.

I'm new to creating spack packages, and perhaps have done something incorrectly with metdataio.

@climbfuji
Collaborator

does metdataio have py-setuptools listed as a build-time dependency?
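If it isn't, declaring it would be the minimal fix; an illustrative recipe sketch (the URL comes from the install log above; the docstring, version list, and checksum are placeholders, not the real package.py):

```python
from spack.package import *

class PyMetdataio(PythonPackage):
    """METdataio: load MET verification output into a database (placeholder docstring)."""

    homepage = "https://github.com/dtcenter/METdataio"
    url = "https://github.com/dtcenter/METdataio/archive/refs/tags/v2.1.1.tar.gz"

    version("2.1.1", sha256="<placeholder>")

    # Declaring the build backend as a build-time dependency makes setuptools
    # importable inside pip's isolated pyproject build, avoiding the
    # "No module named 'setuptools'" failure shown in the traceback above.
    depends_on("py-setuptools", type="build")
```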

@rickgrubin-tomorrow

@mkavulich -- a test env can be built for you; questions, please:

is hera an acceptable host?
is a full unified-dev env appropriate?

Thanks.

@rickgrubin-noaa
Collaborator

> @mkavulich -- a test env can be built for you; questions, please:
>
>   • is hera an acceptable host?
>   • is a full unified-dev env appropriate?
>
> Thanks.

@mkavulich -- how would you like to proceed?

@mkavulich
Author

@rickgrubin-noaa Sorry for the delay in getting back to you. Hera is acceptable for testing. I am not sure about the differences between environments; is unified-dev just the standard build of spack-stack used for UFS purposes?

@rickgrubin-noaa
Collaborator

> @rickgrubin-noaa Sorry for the delay in getting back to you. Hera is acceptable for testing. I am not sure about the differences between environments; is unified-dev just the standard build of spack-stack used for UFS purposes?

@mkavulich yes -- unified-dev is the stack against which UFS is built / tested.

@mkavulich
Author

In that case then unified-dev sounds correct. Thanks!
