fixed conflicts with main
moustakas committed Oct 23, 2024
2 parents 202af59 + bdec78b commit aae3bd7
Showing 10 changed files with 87 additions and 45 deletions.
2 changes: 2 additions & 0 deletions doc/changes.rst
@@ -5,6 +5,7 @@ Change Log
3.0.0 (not released yet)
------------------------

* Miscellaneous bug fixes [`PR #185`_].
* Use Numba-based Resolution implementation with faster multiply.
Also tweak parameters of continuum least_squares optimizer for
speed. [`PR #181`_]
@@ -18,6 +19,7 @@ Change Log
.. _`PR #179`: https://github.com/desihub/fastspecfit/pull/179
.. _`PR #180`: https://github.com/desihub/fastspecfit/pull/180
.. _`PR #181`: https://github.com/desihub/fastspecfit/pull/181
.. _`PR #185`: https://github.com/desihub/fastspecfit/pull/185

2.5.2 (2024-04-28)
------------------
49 changes: 34 additions & 15 deletions doc/install.rst
@@ -18,20 +18,39 @@ standard dependencies. Here, we describe two different ways of setting up
At `NERSC`_, ``FastSpecFit`` can be loaded trivially on top of the standard DESI
software stack. In a login or interactive `Perlmutter
<https://docs.nersc.gov/systems/perlmutter>`_ node, simply run the following
commands::

source /global/cfs/cdirs/desi/software/desi_environment.sh main
module load fastspecfit/main

Or, to load a specific version or tag of the code simply replace ``main`` with
the desired version, for example::

source /global/cfs/cdirs/desi/software/desi_environment.sh 24.8
module load fastspecfit/2.5.2

JupyterHub
##########

Alternatively, some users may want to access ``FastSpecFit`` through `NERSC`_'s
`JupyterHub`_ notebook server. To set up the kernel first do::
`JupyterHub`_ notebook server. In order to use this service, however, we first
have to install the appropriate *kernel*. To set up the ``main`` kernel do (just
once!)::

mkdir -p ${HOME}/.local/share/jupyter/kernels/fastspecfit
wget -O ${HOME}/.local/share/jupyter/kernels/fastspecfit/kernel.json \
mkdir -p ${HOME}/.local/share/jupyter/kernels/fastspecfit-main
wget -O ${HOME}/.local/share/jupyter/kernels/fastspecfit-main/kernel.json \
https://raw.githubusercontent.com/desihub/fastspecfit/main/etc/jupyter-kernel.json

Then, in `JupyterHub`_, select the *FastSpecFit* kernel and you are all
set!
Or you can install a kernel with the latest version of the code by executing the
following commands (as before, just one time)::

mkdir -p ${HOME}/.local/share/jupyter/kernels/fastspecfit-latest
wget -O ${HOME}/.local/share/jupyter/kernels/fastspecfit-latest/kernel.json \
https://raw.githubusercontent.com/desihub/fastspecfit/main/etc/jupyter-kernel-latest.json

Then, in `JupyterHub`_, select either the *FastSpecFit Main* or *FastSpecFit
Latest* kernel and then have fun coding!


.. _laptop installation:

@@ -44,15 +63,15 @@ been uploaded to `PyPi`_, although there is an `open ticket`_ to do so.)
Therefore, for the time being, the code and its dependencies must be installed
"by hand" in an accessible location (e.g., */path/to/desi/code*) with the
following commands::

conda create -y --name fastspec python numpy scipy numba astropy matplotlib seaborn
conda activate fastspec
pip install fitsio healpy speclite

export DESI_CODE=/path/to/desi/code
mkdir -p $DESI_CODE

pushd $DESI_CODE
for package in desiutil desimodel desitarget desispec fastspecfit; do
git clone https://github.com/desihub/$package.git
export PATH=$DESI_CODE/$package/bin:$PATH
@@ -70,7 +89,7 @@ their own environment variable:

* ``DESI_ROOT``, which specifies the top-level location of the DESI data;
* ``DUST_DIR``, which specifies the location of the `Schlegel, Finkbeiner, &
Davis dust maps`_;
* ``FPHOTO_DIR``, which specifies the location of the `DESI Legacy Imaging
Surveys Data Release 9 (DR9)`_ data; and
* ``FTEMPLATES_DIR``, which indicates the location of the stellar population
@@ -85,11 +104,11 @@ These environment variables can be set with the following commands::

wget -r -np -nH --cut-dirs 5 -A fits -P $DUST_DIR \
https://portal.nersc.gov/project/cosmo/data/dust/v0_1/maps
wget -O $FTEMPLATES_DIR/ftemplates-chabrier-1.3.0.fits \
https://data.desi.lbl.gov/public/external/templates/fastspecfit/1.3.0/ftemplates-chabrier-1.3.0.fits
wget -O $FTEMPLATES_DIR/ftemplates-chabrier-2.0.0.fits \
https://data.desi.lbl.gov/public/external/templates/fastspecfit/2.0.0/ftemplates-chabrier-2.0.0.fits

.. note::

The DESI `Early Data Release (EDR)`_ became publicly available in
June 2023!

@@ -105,7 +124,7 @@ These environment variables can be set with the following commands::

.. _`NERSC`: https://www.nersc.gov/

.. _`JupyterHub`: https://jupyter.nersc.gov/

.. _`DockerHub/desihub`: https://hub.docker.com/u/desihub

9 changes: 9 additions & 0 deletions etc/jupyter-kernel-latest.json
@@ -0,0 +1,9 @@
{
"language": "python",
"argv": [
"/global/cfs/cdirs/desi/spectro/fastspecfit/software/activate_fastspecfit_jupyter_latest.sh",
"FastSpecFit Latest",
"{connection_file}"
],
"display_name": "FastSpecFit Latest"
}
6 changes: 3 additions & 3 deletions etc/jupyter-kernel.json
@@ -1,9 +1,9 @@
{
"language": "python",
"argv": [
"/global/cfs/cdirs/desi/spectro/fastspecfit/software/activate_fastspecfit_jupyter.sh",
"FastSpecFit",
"/global/cfs/cdirs/desi/spectro/fastspecfit/software/activate_fastspecfit_jupyter_main.sh",
"FastSpecFit Main",
"{connection_file}"
],
"display_name": "FastSpecFit"
"display_name": "FastSpecFit Main"
}
11 changes: 8 additions & 3 deletions py/fastspecfit/continuum.py
@@ -1367,9 +1367,14 @@ def continuum_fastspec(redshift, objflam, objflamivar, CTools,
log.critical(errmsg)
raise ValueError(errmsg)

ncam = len(data['snr'])
snrmsgs = [ f'Median S/N_{data["cameras"][icam]}={data["snr"][icam]:.2f}' for icam in range(ncam) ]
log.info(' '.join(snrmsgs))
ncam = len(data['cameras'])
if ncam == 1:
snrmsg = f"Median spectral S/N_{data['cameras']}={data['snr'][0]:.2f}"
else:
snrmsg = f"Median spectral S/N_{data['cameras'][0]}={data['snr'][0]:.2f}"
for icam in np.arange(ncam-1)+1:
snrmsg += f" S/N_{data['cameras'][icam]}={data['snr'][icam]:.2f}"
log.info(snrmsg)

if templates.use_legacy_fitting:
ebv = 0.
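The rewritten logging above assembles a single message covering the S/N in every camera. A minimal standalone sketch of that formatting, with made-up camera names and values:

```python
# Sketch of the per-camera S/N message construction (hypothetical data).
cameras = ['b', 'r', 'z']
snr = [2.31, 3.05, 4.12]

# First camera seeds the message; the rest are appended.
msg = f"Median spectral S/N_{cameras[0]}={snr[0]:.2f}"
for icam in range(1, len(cameras)):
    msg += f" S/N_{cameras[icam]}={snr[icam]:.2f}"
# msg -> 'Median spectral S/N_b=2.31 S/N_r=3.05 S/N_z=4.12'
```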
13 changes: 9 additions & 4 deletions py/fastspecfit/emlines.py
@@ -909,6 +909,8 @@ def get_boundaries(A, v_lo, v_hi):
else:
stats = narrow_stats
stats.append((linesigma, linez))
else:
flux, flux_ivar = 0., 0.
else:
flux, flux_ivar = 0., 0.

@@ -937,10 +939,13 @@ def get_boundaries(A, v_lo, v_hi):

ew, ewivar = 0., 0.
if cmed != 0. and civar != 0.:
if flux > 0. and flux_ivar > 0.:
# add the uncertainties in flux and the continuum in quadrature
ew = flux / cmed / (1. + redshift) # rest frame [A]
ewivar = (1. + redshift)**2 / (1. / (cmed**2 * flux_ivar) + flux**2 / (cmed**4 * civar))
try:
if flux > 0. and flux_ivar > 0.:
# add the uncertainties in flux and the continuum in quadrature
ew = flux / cmed / (1. + redshift) # rest frame [A]
ewivar = (1. + redshift)**2 / (1. / (cmed**2 * flux_ivar) + flux**2 / (cmed**4 * civar))
except:
import pdb ; pdb.set_trace()

# upper limit on the flux is defined by snrcut*cont_err*sqrt(2*pi)*linesigma
fluxlimit = np.sqrt(2. * np.pi) * linesigma_ang / np.sqrt(civar) # * u.erg/(u.second*u.cm**2)
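The guarded block above converts the line flux to a rest-frame equivalent width by dividing by the median continuum level, and its inverse variance adds the flux and continuum uncertainties in quadrature. A small worked sketch with invented numbers:

```python
# Worked example of the rest-frame EW and its inverse variance (made-up values).
redshift = 1.0
flux, flux_ivar = 5.0, 4.0    # line flux and its inverse variance
cmed, civar = 0.5, 100.0      # median continuum level and its inverse variance

ew = flux / cmed / (1. + redshift)   # rest frame [Angstrom] -> 5.0
# flux and continuum uncertainties combined in quadrature:
ewivar = (1. + redshift)**2 / (1. / (cmed**2 * flux_ivar) + flux**2 / (cmed**4 * civar))
# ewivar -> 0.8
```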
10 changes: 5 additions & 5 deletions py/fastspecfit/io.py
@@ -11,7 +11,7 @@
import fitsio
from astropy.table import Table

from fastspecfit.logger import log
from fastspecfit.logger import log, DEBUG
from fastspecfit.singlecopy import sc_data
from fastspecfit.photometry import Photometry
from fastspecfit.util import FLUXNORM, ZWarningMask
@@ -903,7 +903,7 @@ def one_spectrum(iobj, specdata, meta, ebv, fastphot,
for icam, camera in enumerate(specdata['cameras']):
# Check whether the camera is fully masked.
if np.sum(specdata['ivar0'][icam]) == 0:
log.warning(f'Dropping fully masked camera {camera}.')
log.warning(f'Dropping fully masked camera {camera} [{specdata["uniqueid"]}].')
else:
ivar = specdata['ivar0'][icam]
mask = specdata['mask0'][icam]
@@ -917,7 +917,7 @@
ivar[mask != 0] = 0.

if np.all(ivar == 0.):
log.warning(f'Dropping fully masked camera {camera}.')
log.warning(f'Dropping fully masked camera {camera} [{specdata["uniqueid"]}].')
else:
# interpolate over pixels where the resolution matrix is masked
I = (mask != 0)
@@ -1313,7 +1313,7 @@ def one_stacked_spectrum(iobj, specdata, synthphot):
for icam, camera in enumerate(specdata['cameras']):
# Check whether the camera is fully masked.
if np.sum(specdata['ivar0'][icam]) == 0:
log.warning(f'Dropping fully masked camera {camera}.')
log.warning(f'Dropping fully masked camera {camera} [{specdata["uniqueid"]}].')
else:
ivar = specdata['ivar0'][icam]
mask = specdata['mask0'][icam]
@@ -1327,7 +1327,7 @@
ivar[mask != 0] = 0

if np.all(ivar == 0):
log.warning(f'Dropping fully masked camera {camera}.')
log.warning(f'Dropping fully masked camera {camera} [{specdata["uniqueid"]}].')
else:
cameras.append(camera)
npixpercamera.append(len(specdata['wave0'][icam])) # number of pixels in this camera
10 changes: 6 additions & 4 deletions py/fastspecfit/linemasker.py
@@ -76,7 +76,7 @@ def _get_contpix(zlinewaves, sigmas, nsigma_factor=2., linemask=None, lya=False)

pix = {'linepix': {}, 'contpix': {}}
if patchMap is not None:
pix.update({'patch_contpix': {}, 'dropped': [], 'merged': []})
pix.update({'patch_contpix': {}, 'dropped': [], 'merged': [], 'merged_from': []})
patchids = list(patchMap.keys())
npatch = len(patchids)

@@ -177,6 +177,7 @@ def _get_contpix(zlinewaves, sigmas, nsigma_factor=2., linemask=None, lya=False)
del pix['patch_contpix'][patchids[ipatch+1]]
pix['patch_contpix'][newpatchid] = newcontpix
pix['merged'].append(newpatchid)
pix['merged_from'].append([patchid, patchids[ipatch+1]])

if patchMap is not None:
if len(pix['dropped']) > 0:
@@ -298,8 +299,8 @@ def _fit_patches(continuum_patches, patchMap, linemodel, debug_plots=False,
# In the case that patches have been merged, update the
# continuum_patches table and patchMap dictionary.
if len(pix['merged']) > 0:
for newpatchid in np.atleast_1d(pix['merged']):
oldpatchids = list(newpatchid)
for oldpatchids, newpatchid in zip(np.atleast_1d(pix['merged_from']),
np.atleast_1d(pix['merged'])):
O = np.where(np.isin(continuum_patches['patchid'].value, oldpatchids))[0]

# update continuum_patches
@@ -316,6 +317,7 @@
newI.append(patchMap[oldpatchid][1])
newJ.append(patchMap[oldpatchid][2])
del patchMap[oldpatchid]

patchMap[newpatchid] = (np.hstack(newlines), np.hstack(newI), np.hstack(newJ))


@@ -549,7 +551,7 @@ def _niceline(line):
xx.set_ylim(ymin, ymax)
xx.legend(loc='upper left', fontsize=8, ncols=nlegcol)
xx.set_title(f'Patch {patchid}')
for rem in range(ipatch+1, ncols*nrowspatch):
for rem in range(ipatch+1, ncols*nrows):
ax.flat[rem].axis('off')

if ax.ndim == 1:
2 changes: 1 addition & 1 deletion py/fastspecfit/logger.py
@@ -10,6 +10,6 @@
This needs to be in its own file to prevent circular imports with other code.
"""
from desiutil.log import get_logger
from desiutil.log import get_logger, DEBUG

log = get_logger()
20 changes: 10 additions & 10 deletions py/fastspecfit/photometry.py
@@ -10,7 +10,7 @@
import fitsio
from astropy.table import Table

from fastspecfit.logger import log
from fastspecfit.logger import log, DEBUG
from fastspecfit.util import trapz, C_LIGHT, FLUXNORM


@@ -213,7 +213,7 @@ def kcorr_and_absmag(self, nanomaggies, ivar_nanomaggies, redshift, dmod,
synth_maggies_in = np.zeros(len(maggies))
return kcorr, absmag, ivarabsmag, synth_absmag, synth_maggies_in

maggies = nanomaggies * 1e-9
ivarmaggies = (ivar_nanomaggies / 1e-9**2) * self.bands_to_fit

filters_in = self.filters[photsys]
@@ -244,16 +244,16 @@

# m_R = M_Q + DM(z) + K_QR(z) or
# M_Q = m_R - DM(z) - K_QR(z)
absmag = -2.5 * np.log10(maggies[oband]) - dmod - kcorr

C = 0.8483036976765437 # (0.4 * np.log(10.))**2
ivarabsmag = maggies[oband]**2 * ivarmaggies[oband] * C
absmag = np.copy(synth_absmag)
ivarabsmag = np.zeros_like(absmag)

# if we use synthesized photometry then ivarabsmag is zero
# (which should never happen?)
I = (maggies[oband] * np.sqrt(ivarmaggies[oband]) <= snrmin)
absmag[I] = synth_absmag[I]
ivarabsmag[I] = 0.
I = (maggies[oband] * np.sqrt(ivarmaggies[oband]) > snrmin)
if np.any(I):
C = 0.8483036976765437 # (0.4 * np.log(10.))**2
absmag[I] = -2.5 * np.log10(maggies[oband[I]]) - dmod - kcorr[I]
ivarabsmag[I] = maggies[oband[I]]**2 * ivarmaggies[oband[I]] * C

return kcorr, absmag, ivarabsmag, synth_maggies_in
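The rewritten branch above computes a direct absolute magnitude only where the observed band exceeds ``snrmin``, using M_Q = m_R - DM(z) - K_QR(z), with the inverse variance propagated through the logarithm as maggies**2 * ivarmaggies * (0.4 ln 10)**2. A worked sketch with invented numbers:

```python
import numpy as np

# All values are made up for illustration.
maggies = 1e-9 * 10.                  # 10 nanomaggies
ivarmaggies = 1.0 / (1e-9 * 2.)**2    # i.e. S/N = 5 in this band
dmod = 40.0                           # distance modulus DM(z)
kcorr = -0.1                          # K-correction K_QR(z)

m_R = -2.5 * np.log10(maggies)        # apparent AB magnitude -> 20.0
absmag = m_R - dmod - kcorr           # M_Q -> -19.9

# Propagate the flux inverse variance into magnitude space.
C = (0.4 * np.log(10.))**2            # 0.8483036976765437
ivarabsmag = maggies**2 * ivarmaggies * C
```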

@@ -334,7 +334,7 @@ def get_ab_maggies(filters, flux, wave):
maggies = Photometry.get_ab_maggies_unchecked(filters, flux, wave)
except:
# pad in case of an object at very high redshift (z > 5.5)
log.warning('Padding model spectrum due to insufficient wavelength coverage to synthesize photometry.')
log.debug('Padding input spectrum due to insufficient wavelength coverage to synthesize photometry.')
padflux, padwave = filters.pad_spectrum(flux, wave, axis=0, method='edge')

if flux.ndim > 1: