Minor changes to DC loaders for future surveys #1122
Merged
rosteen merged 3 commits into astropy:main from aragilar:support-fallback-more-modifications on Apr 30, 2024
New file added in this PR (+185 lines):
from astropy.nddata import VarianceUncertainty
from astropy.table import QTable
from specutils import SpectrumList
from specutils.io.default_loaders.dc_common import (
    FITS_FILE_EXTS, add_single_spectra_to_map,
)
from specutils.io.parsing_utils import read_fileobj_or_hdulist
from specutils.io.registers import data_loader

# There appears to be nothing which says "this is a SAMI 1D spectra", so guess
# it based on the headers that should be there
SAMI_1D_SPECTRA_HEADER_KEYWORDS = [
    "BUNIT", "CATADEC", "CATARA", "CDELT1", "CRPIX1", "CRVAL1", "CTYPE1",
    "CUNIT1", "DROPFACT", "ELLIP", "GRATID", "IFUPROBE", "KPC_SIZE", "NAME",
    "N_SPAX", "POS_ANG", "PSFALPHA", "PSFBETA", "PSFFWHM", "RADESYS", "RADIUS",
    "RO_GAIN", "RO_NOISE", "STDNAME", "WCSAXES", "Z_TONRY",
]


def identify_sami_cube(origin, *args, **kwargs):
    """
    Identify if the current file is a SAMI cube file
    """
    # TODO check this
    with read_fileobj_or_hdulist(*args, **kwargs) as hdulist:
        header = hdulist[0].header
        data = hdulist[0].data
        if "SAMI" in header.get("INSTRUME", "") and len(data.shape) == 3:
            return True
    return False


def identify_sami_1d_spec(origin, *args, **kwargs):
    """
    Identify if the current file is a SAMI 1d spectra file of some kind
    """
    # TODO check this
    with read_fileobj_or_hdulist(*args, **kwargs) as hdulist:
        header = hdulist[0].header
        for key in SAMI_1D_SPECTRA_HEADER_KEYWORDS:
            if key not in header:
                return False
    return True


@data_loader(
    label="SAMI-cube", extensions=FITS_FILE_EXTS, dtype=SpectrumList,
    identifier=identify_sami_cube, priority=10,
)
def sami_cube_loader(filename):
    spectra_map = {
        "sky": [],
        "combined": [],
        "unreduced": [],
        "reduced": [],
    }
    primary_header = None

    with read_fileobj_or_hdulist(filename) as hdulist:
        for i, hdu in enumerate(hdulist):
            if i == 0:
                # This is the primary extension, and the one with the
                # science data. The header is fairly complete.
                primary_header = hdu.header
                spec = add_single_spectra_to_map(
                    spectra_map,
                    header=primary_header,
                    data=hdu.data,
                    index=None,
                    all_standard_units=True,
                    all_keywords=True,
                    valid_wcs=True,
                )

            elif "VARIANCE" == hdu.header.get("EXTNAME"):
                # This is the variance extension, and is missing wcs and
                # units.
                uncertainty = VarianceUncertainty(
                    hdu.data, unit=spec.flux.unit ** 2
                )
                spec.uncertainty = uncertainty

            elif "WEIGHT" == hdu.header.get("EXTNAME"):
                # This is the weight extension, and is missing wcs. The
                # units are effectively "normalised" (from 0-1 it seems).
                spec.meta["sami_cube_weight_map"] = hdu.data

            elif "COVAR" == hdu.header.get("EXTNAME"):
                # This is the spatial covariance extension. It's not clear
                # as to how best to expose this, so skipping for now.
                pass

            elif "QC" == hdu.header.get("EXTNAME"):
                # This is the QC extension, and is a binary table. This we
                # add to the metadata.
                spec.meta["sami_QC_table"] = QTable.read(hdu)

            elif "DUST" == hdu.header.get("EXTNAME"):
                # This is the dust extension, and is missing wcs and
                # units. This should likely be represented as an array plus
                # the metadata in the header.
                spec.meta["sami_dust_vector_weights"] = hdu.data

            elif "BIN_MASK" == hdu.header.get("EXTNAME"):
                # This is the bin mask extension, where the value of each
                # pixel indicates the bin to which it belongs. The bin mask
                # is used to construct the binned fluxes and variances in
                # the above two extensions from the default cubes.
                # This is not the same as the aperture spectra mask with the
                # same HDU name.
                spec.meta["sami_bin_mask"] = hdu.data

            else:
                raise NotImplementedError(
                    "Extension is not handled: index {}; name {}".format(
                        i, hdu.header.get("EXTNAME")
                    )
                )

    spectra = SpectrumList(
        spectra_map["combined"] +
        spectra_map["reduced"] +
        spectra_map["unreduced"] +
        spectra_map["sky"]
    )
    return spectra


@data_loader(
    label="SAMI-1d-spec", extensions=FITS_FILE_EXTS, dtype=SpectrumList,
    identifier=identify_sami_1d_spec, priority=10,
)
def sami_1d_spec_loader(filename):
    spectra_map = {
        "sky": [],
        "combined": [],
        "unreduced": [],
        "reduced": [],
    }
    primary_header = None

    with read_fileobj_or_hdulist(filename) as hdulist:
        for i, hdu in enumerate(hdulist):
            if i == 0:
                # This is the primary extension, and the one with the
                # science data. The header is fairly complete.
                primary_header = hdu.header
                spec = add_single_spectra_to_map(
                    spectra_map,
                    header=primary_header,
                    data=hdu.data,
                    index=None,
                    all_standard_units=True,
                    all_keywords=True,
                    valid_wcs=True,
                )

            elif "VARIANCE" == hdu.header.get("EXTNAME"):
                # This is the variance extension, and is missing wcs and
                # units.
                uncertainty = VarianceUncertainty(
                    hdu.data, unit=spec.flux.unit ** 2
                )
                spec.uncertainty = uncertainty

            elif "BIN_MASK" == hdu.header.get("EXTNAME"):
                # Contains the bin mask used to construct the aperture
                # spectra. A 1 indicates a spaxel was included in the
                # aperture, a 0 indicates a spaxel was not included.
                spec.meta["sami_aperture_spectra_mask"] = hdu.data

            else:
                raise NotImplementedError(
                    "Extension is not handled: index {}; name {}".format(
                        i, hdu.header.get("EXTNAME")
                    )
                )

    spectra = SpectrumList(
        spectra_map["combined"] +
        spectra_map["reduced"] +
        spectra_map["unreduced"] +
        spectra_map["sky"]
    )
    return spectra
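
For reference, both loaders are registered with specutils under the labels passed to @data_loader, so they would normally be invoked through SpectrumList.read rather than called directly. A minimal usage sketch (the file names are placeholders; passing format= bypasses the identifier functions):

    from specutils import SpectrumList

    # Hypothetical file names; the format labels come from the @data_loader
    # registrations above.
    cube_spectra = SpectrumList.read("example_sami_cube.fits", format="SAMI-cube")
    aperture_spectra = SpectrumList.read("example_sami_1d.fits", format="SAMI-1d-spec")

    # Each entry is a Spectrum1D; extra HDUs end up in .meta, e.g.
    # meta["sami_bin_mask"] (cube) or meta["sami_aperture_spectra_mask"] (1d).
    print(len(cube_spectra), cube_spectra[0].meta.keys())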
Reviewer: Why the change here? It looks like this function acts on an existing spectra_map object and thus, as the comment says, shouldn't return anything.
Author: The original idea (based on what spectra I had access to at the time) was that files were either a single extension (with data represented as a 2d array, which is what most of the GAMA spectra were) or were spread across multiple extensions. We've got files (legacy, so we cannot change the format they were written in) that instead require more manipulation (e.g. the WCS was only in the first extension), so we want to be able to keep a reference to the 1d spectrum independent of the mapping.
There are at least 3 surveys where we're going to use this (MGC, SAMI, S7), so rather than hacking this into each of their loaders, I figured I'd make the change in the place where it needs to be.
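
(To illustrate the point above: a minimal sketch of the pattern this change enables, assuming — as in the SAMI loaders in this diff — that add_single_spectra_to_map now returns the spectrum it appended to the map, so a later extension can be attached to it. primary_hdu and variance_hdu are placeholders for HDUs read from a legacy file.)

    from astropy.nddata import VarianceUncertainty
    from specutils.io.default_loaders.dc_common import add_single_spectra_to_map

    spectra_map = {"sky": [], "combined": [], "unreduced": [], "reduced": []}

    # The return value is the same Spectrum1D object that was stored in
    # spectra_map, so attaching the uncertainty here also updates that entry.
    spec = add_single_spectra_to_map(
        spectra_map,
        header=primary_hdu.header,
        data=primary_hdu.data,
        index=None,
        all_standard_units=True,
        all_keywords=True,
        valid_wcs=True,
    )
    spec.uncertainty = VarianceUncertainty(
        variance_hdu.data, unit=spec.flux.unit ** 2
    )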
Reviewer: Gotcha, seems reasonable enough. Any chance you could add a test for the new fallback header code?
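
(No test appears in this diff. Not the dc_common fallback test being requested here, but purely as an illustration of how the SAMI identifier added in this PR could be exercised with a synthetic header — assuming the new file lands as specutils/io/default_loaders/sami.py and that read_fileobj_or_hdulist accepts an in-memory HDUList — something along these lines might work:)

    import numpy as np
    from astropy.io import fits

    # Hypothetical module path for the new loader file added in this PR.
    from specutils.io.default_loaders.sami import (
        SAMI_1D_SPECTRA_HEADER_KEYWORDS, identify_sami_1d_spec,
    )


    def test_identify_sami_1d_spec_synthetic_header():
        # Build a primary HDU carrying every keyword the identifier checks;
        # the header values themselves are placeholders.
        hdu = fits.PrimaryHDU(data=np.zeros(10))
        for key in SAMI_1D_SPECTRA_HEADER_KEYWORDS:
            hdu.header[key] = 1
        hdulist = fits.HDUList([hdu])

        assert identify_sami_1d_spec("read", hdulist)

        # Removing any expected keyword should make identification fail.
        del hdulist[0].header["Z_TONRY"]
        assert not identify_sami_1d_spec("read", hdulist)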