Updating the Core Conda Packages
This page explains what the corresponding workflows do to update ska3-core packages. The update from scratch also updates ska3-flight and ska3-perl in the process. Be aware that this page might be slightly outdated relative to the code in the workflows.
Updating ska3-core entails:

- assembling the list of all packages installed for each platform (linux-64, osx-64, win-64),
- saving these lists and combining them into a single YAML file,
- collecting the actual packages to place them in our repository,
- collecting the patches that need to be applied to the packages.
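The first two bullets can be pictured with a short sketch: merge the per-platform `conda list --json` dumps into one name-to-versions mapping that a YAML recipe can be built from. This is an illustration only (`combine_platform_lists` is a hypothetical name, not a skare3 script):

```python
from collections import defaultdict

def combine_platform_lists(platform_jsons):
    """Combine per-platform `conda list --json` dumps into one mapping.

    platform_jsons: {"linux-64": [...], "osx-64": [...], "win-64": [...]},
    where each list entry is a dict with at least "name" and "version"
    keys, as produced by `conda list --json`.
    Returns {package_name: {platform: version}}.
    """
    combined = defaultdict(dict)
    for platform, pkgs in platform_jsons.items():
        for pkg in pkgs:
            combined[pkg["name"]][platform] = pkg["version"]
    return dict(combined)

# Toy example (not real ska3 data):
lists = {
    "linux-64": [{"name": "numpy", "version": "1.26.4"}],
    "win-64": [{"name": "numpy", "version": "1.26.4"}],
}
print(combine_platform_lists(lists))
```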
There are two different types of updates:

- From scratch. This starts from the specifications in `ska3-core-latest/meta.yaml`, `ska3-flight-latest/meta.yaml` and `ska3-perl-latest/meta.yaml`, which list direct dependencies without specifying versions. It uses a top-level script called `install_from_scratch.py` for `ska3-flight-latest` and `ska3-core-latest`.
- Incrementally. This starts from an existing `ska3-core` package already in the conda channel, adding/updating some packages on top of it.
If the update includes a Python version increase, then some ska3 packages might need to be rebuilt (most likely any package that lists python in its recipe).
- Edit `pkg_defs/ska3-*-latest/meta.yaml` and optionally edit the environment files in their respective directories. The idea of these environment files is that they can help the dependency resolution by splitting the installation task into stages, and they can also specify alternate channels.
- Assemble the package list and create new `meta.yaml` files (the recipes with specific versions). This proceeds in two stages:
  - Create working environments on each platform and list all packages installed, following this gist.
  - Combine the package lists from all platforms into a final `meta.yaml` file for each meta-project, following this gist.
There is a GitHub workflow that should automate this (described below), but unfortunately it always needs tweaks and it is not working as of this writing. The following Python code triggers the workflow:

```python
from skare3_tools import github

repo = github.Repository('sot/skare3')
repo.dispatch_event(
    event_type='conda-meta-yaml',
    client_payload={'version': '2021.1', 'skare3_branch': '2021.1'},
)
```
After/if the workflow succeeds, it produces three artifacts:

- `conda-meta.zip`: the `meta.yaml` files for ska3-core, ska3-flight and ska3-perl
- `conda-packages.zip`: the built ska3-core, ska3-flight and ska3-perl conda packages
- `json-files.zip`: various JSON files, including the list of all packages downloaded and a few `patch_instructions.json` files.
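If you need to locate these artifacts programmatically, they can be listed through the GitHub Actions REST API (`GET /repos/{owner}/{repo}/actions/artifacts`). A minimal sketch, assuming the documented response shape; `artifact_urls` is a hypothetical helper, not part of skare3_tools:

```python
def artifact_urls(api_response, wanted=("conda-meta", "conda-packages", "json-files")):
    """Map artifact names to their archive download URLs.

    api_response: parsed JSON from the GitHub Actions artifacts endpoint,
    which has a top-level "artifacts" list of records with "name" and
    "archive_download_url" keys.
    """
    return {
        a["name"]: a["archive_download_url"]
        for a in api_response["artifacts"]
        if a["name"] in wanted
    }

# Toy response in the documented shape (not real data):
resp = {"artifacts": [
    {"name": "conda-meta", "archive_download_url": "https://example/1"},
    {"name": "other", "archive_download_url": "https://example/2"},
]}
print(artifact_urls(resp))
```

The actual download requires an authenticated request to each `archive_download_url`.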
- Download the packages into the conda channel (I do this on kady using this gist):

  ```shell
  unzip json-files.zip
  ./fetch_packages.py --channel www/ASPECT/ska3-conda/prime *json
  ```
- Combine the `noarch/patch_instructions.json` files into a tarball named `patch_instructions.tar.bz2` (using this gist):

  ```shell
  ./combine_patch_instructions.py patch_instructions-*/*/*json
  ```
- Index the channel:

  ```shell
  conda index -p patch_instructions.tar.bz2 www/ASPECT/ska3-conda/prime
  ```
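The patch-combination step amounts to merging several `patch_instructions.json` files into one. This is not the actual `combine_patch_instructions.py` script, just an illustration assuming the conda repodata-patch layout (a top-level `packages` mapping plus optional `remove`/`revoke` lists); `merge_patch_instructions` is a hypothetical name:

```python
def merge_patch_instructions(instruction_dicts):
    """Merge several parsed patch_instructions.json dicts into one.

    Assumes each dict follows the conda repodata-patch layout:
    a "packages" mapping plus optional "remove"/"revoke" lists.
    """
    merged = {"patch_instructions_version": 1, "packages": {}, "remove": [], "revoke": []}
    for data in instruction_dicts:
        # Later files win on conflicting package entries.
        merged["packages"].update(data.get("packages", {}))
        for key in ("remove", "revoke"):
            merged[key] = sorted(set(merged[key]) | set(data.get(key, [])))
    return merged

# Toy inputs (not real patch data):
a = {"packages": {"foo-1.0.tar.bz2": {"depends": ["python"]}}, "remove": ["bad-1.tar.bz2"]}
b = {"packages": {"bar-2.0.tar.bz2": {"depends": []}}}
print(merge_patch_instructions([a, b]))
```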
The following procedure produces `meta.yaml` for ska3-core, ska3-flight and ska3-perl, taking the latest versions of all packages available in the conda channels (it should be equivalent to the older create-ska3-meta.sh and combine-meta.sh gists):
- Create/review the ska3-core-latest, ska3-flight-latest, ska3-perl-latest packages. These packages list only explicit package dependencies we want in the environment, and to a great extent they need to be made by hand. No versions are specified in these packages.
- Repeat the following for each architecture in `["ubuntu", "macos", "windows"]` (and remember to change the channel):

  ```shell
  ARCH=ubuntu
  channel=twelve
  git clone https://github.com/sot/skare3.git -b 2025.0-branch
  conda create -y -n 2025.0
  conda activate 2025.0
  conda config --env --add channels https://ska:${CONDA_PASSWORD}@cxc.cfa.harvard.edu/mta/ASPECT/ska3-conda/${channel}
  conda config --env --add channels conda-forge
  conda config --env --remove channels defaults
  conda config --show-sources
  conda env update -f skare3/pkg_defs/ska3-core-latest/base_environment.yml
  python ./skare3/pkg_defs/ska3-core-latest/install_from_scratch.py
  conda list --json > ska3-core-${ARCH}.json
  python ./skare3/pkg_defs/ska3-flight-latest/install_from_scratch.py --ska-channel ${channel}
  conda list --json > ska3-flight-${ARCH}.json
  if [ "$ARCH" != "windows" ]
  then
      mamba install -y -c conda-forge -c https://ska:${CONDA_PASSWORD}@cxc.cfa.harvard.edu/mta/ASPECT/ska3-conda/${channel} ska3-perl-latest
      conda list --json > ska3-perl-${ARCH}.json
  fi
  python ./skare3/conda_fetch.py --no-zip --no-packages -o json/patch_instructions-${ARCH}
  ```
- Combine these into `meta.yaml` files for `ska3-[core/flight/perl]`. Roughly speaking:
  - ska3-flight includes all Ska packages installed by ska3-flight-latest,
  - ska3-core includes all non-Ska packages that get installed when doing `conda install ska3-flight-latest`,
  - ska3-perl includes all packages installed by ska3-perl-latest and not by ska3-flight-latest.

  The steps of the process are:

  ```shell
  ./skare3/combine_arch_meta.py --name ska3-core --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-core/meta.yaml \
      --env linux=json/ska3-flight-ubuntu.json \
      --env osx=json/ska3-flight-macos.json \
      --env win=json/ska3-flight-windows.json \
      --not-in skare3/pkg_defs/ska3-flight-latest/meta.yaml \
      --exclude ska3-flight
  ./skare3/combine_arch_meta.py --name ska3-flight --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-flight/meta.yaml \
      --env linux=json/ska3-flight-ubuntu.json \
      --env osx=json/ska3-flight-macos.json \
      --env win=json/ska3-flight-windows.json \
      --in skare3/pkg_defs/ska3-flight-latest/meta.yaml \
      --include ska3-core \
      --build "noarch: generic"
  ./skare3/combine_arch_meta.py --name ska3-perl --version ${SKA3_VERSION} \
      --out pkg_defs/ska3-perl/meta.yaml \
      --env linux=json/ska3-perl-ubuntu.json \
      --env osx=json/ska3-perl-macos.json \
      --subtract-env linux=json/ska3-flight-ubuntu.json \
      --subtract-env osx=json/ska3-flight-macos.json \
      --build "skip: True # [win]"
  ```
- Combine the patches:

  ```shell
  ./skare3/conda_fetch.py --merge-patches --no-zip json/patch_instructions-* \
      -o patch_instructions/noarch
  cp -fr json/patch_instructions-*/linux-64 patch_instructions/
  cp -fr json/patch_instructions-*/osx-arm64 patch_instructions/
  cp -fr json/patch_instructions-*/win-64 patch_instructions/
  ```
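The rough splitting rules above amount to simple set arithmetic over the per-platform package lists. A minimal illustration of that logic (a hypothetical `split_meta_packages` helper, not part of `combine_arch_meta.py`):

```python
def split_meta_packages(flight_env, perl_env, ska_names):
    """Illustrate how the three meta-package contents are derived.

    flight_env / perl_env: sets of package names from the per-arch
    `conda list --json` dumps; ska_names: the Ska packages listed in
    ska3-flight-latest/meta.yaml.
    """
    ska3_flight = flight_env & ska_names  # Ska packages only
    ska3_core = flight_env - ska_names    # everything else pulled in
    ska3_perl = perl_env - flight_env     # Perl-only additions
    return ska3_flight, ska3_core, ska3_perl

# Toy package sets (not real environments):
flight = {"ska_sun", "numpy", "python"}
perl = flight | {"perl", "perl-pgplot"}
ska = {"ska_sun"}
print(split_meta_packages(flight, perl, ska))
```

The real script additionally reconciles versions across platforms and applies the `--include`/`--exclude` options shown above.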
There is a GitHub workflow to do (part of) this on the three standard platforms. In the following example, we trigger the workflow, which installs a given ska3-core version, applies the requested package updates, and creates artifacts with conda packages (no YAML files yet):
```python
from skare3_tools import github

repo = github.Repository('sot/skare3')
repo.dispatch_event(
    event_type='incremental-conda-meta',
    client_payload={
        'ska3_core_version': '2022.2',
        'update': '-c conda-forge sherpa==4.14.0',
    },
)
```
Most updates to core packages are done incrementally from a base version. In this case it is not convenient to follow the process outlined above, because it would pull in the newest versions of all packages. Instead, we follow a procedure that bumps the versions of only a few selected packages and makes sure all their dependencies are collected and registered in the corresponding `meta.yaml` file.

For now, this procedure assembles ska3-core/meta.yaml only, it assumes packages come from the `defaults` channel, and it is not automated. Adjust as needed. These are the steps on each architecture (windows, macos, ubuntu):
- Create a work environment, install the current ska3-core, and update/install the selected packages:

  ```shell
  conda create -y -n pkg-dev
  conda activate pkg-dev
  mamba install -y --override-channels -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight ska3-core
  # conda install -y --strict-channel-priority --override-channels -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults -c conda-forge mamba
  mamba uninstall -y ska3-core
  mamba update -y [--strict-channel-priority] [--override-channels] -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults [package_spec [package_spec ...]]
  mamba install -y [--strict-channel-priority] [--override-channels] -c https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight -c defaults [package_spec [package_spec ...]]
  conda list --json > ska3-core-${ARCH}.json
  ```
- I currently have a prototype script to gather all packages that are not from our conda channel, so I do:
  ```shell
  git clone --branch improvements https://github.com/sot/skare3_tools.git
  python ./skare3_tools/skare3_tools/conda.py --directory packages/${ARCH} --exclude-channel aspect/ska3-conda/flight
  ```
- Combine the YAML files (the encoding of the Windows file was different and had to be fixed before this):

  ```shell
  ./skare3/combine_arch_meta.py --name ska3-core --version ${SKA3_VERSION} \
      --out skare3/pkg_defs/ska3-core/meta.yaml \
      --env linux=linux/ska3-core-linux.json \
      --env osx=macos/ska3-core-macos.json \
      --env win=win/ska3-core-windows.json
  ```
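The "gather all packages not from our channel" step above boils down to filtering a `conda list --json` dump by channel. A minimal sketch of that filter (a hypothetical `external_packages` helper, not the prototype `conda.py` script):

```python
def external_packages(conda_list, our_channel="aspect/ska3-conda/flight"):
    """Return packages whose channel is not our conda channel.

    conda_list: parsed `conda list --json` output, a list of dicts
    with "name", "version" and "channel" keys.
    """
    return [
        pkg for pkg in conda_list
        if our_channel not in pkg.get("channel", "")
    ]

# Toy records in the `conda list --json` shape (not real data):
pkgs = [
    {"name": "ska3-core", "version": "2022.2",
     "channel": "https://icxc.cfa.harvard.edu/aspect/ska3-conda/flight"},
    {"name": "numpy", "version": "1.21.2", "channel": "defaults"},
]
print([pkg["name"] for pkg in external_packages(pkgs)])
```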