das3_cdf
das3_cdf - Output das v2 and v3 streams as a CDF (Common Data Format) file
das3_cdf [options] [< DAS_STREAM]
By default das3_cdf reads a das2 or das3 stream from standard input and writes a CDF file to standard output. Unlike other das stream processors, das3_cdf is not a good filter: it does not start writing ANY output until ALL input has been consumed. This is unavoidable because CDF is not a streaming format, so a temporary file must be created whose bytes are then fed to standard output. If the end result is just to generate a local file anyway, use the '--output' option below to avoid this behavior.
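For example (the file names here are only placeholders), the default filter behavior and the direct-to-file alternative look like this:
das3_cdf < my_data.d3b > my_data.cdf       # buffers a temporary CDF, then copies its bytes to stdout
das3_cdf -o my_data.cdf < my_data.d3b      # writes the CDF directly, skipping the temp-file-to-stdout step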
Data values are written to CDF variables and metadata are written to CDF attributes. The mapping of stream properties to CDF attributes follows.
<stream> Properties -> CDF Global Attributes
<dataset> Properties -> CDF Global Attributes (prefix as needed)
<coord>,<data> Properties -> CDF Variable Attributes
During the metadata mapping, common das3 property names are converted to equivalent ISTP metadata names. The property conversions are:
label -> LABLAXIS (with units stripped)
title,description -> FIELDNAM
summary -> CATDESC
notes -> VAR_NOTES
format -> FORMAT
frame -> REFERENCE_FRAME
nominalMin,nominalMax -> LIMITS_NOMINAL_MIN,LIMITS_NOMINAL_MAX
scaleMin,scaleMax -> SCALEMIN,SCALEMAX
scaleType -> SCALETYP
validMin,validMax -> VALIDMIN,VALIDMAX
warnMin,warnMax -> LIMITS_WARN_MIN,LIMITS_WARN_MAX
compLabel -> LABL_PTR_1
Note that a property named 'cdfName' is not written to the CDF; instead its value is used as the name of the corresponding CDF variable.
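As a sketch of the variable-level mapping (the property values below are hypothetical), a <data> variable carrying the properties:
  label   = "Radial B (nT)"
  summary = "Radial component of the magnetic field"
  cdfName = "B_r"
would, per the conversions above, be written as a CDF variable named 'B_r' with:
  LABLAXIS = "Radial B"   (units stripped from the label)
  CATDESC  = "Radial component of the magnetic field"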
Other CDF attributes are also set based on the data structure type. Some examples are:
DasVar.units -> UNITS
DasAry.fill -> FILLVAL
(algorithm) -> DEPEND_N
DasFrame.dir -> LABL_PTR_1 (if compLabel missing)
Note that if the input is a legacy das2 stream, it is upgraded internally to a das3 stream prior to writing the CDF output.
-h,--help
Write this text to standard output and exit.
-t FILE,--template=FILE
Initialize the output CDF with an empty template CDF file first. FILE is not a CDF skeleton, but could be an empty CDF generated from a skeleton file.
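For example (file names are placeholders), the output can be seeded with an empty CDF, such as one generated from a skeleton table using the skeletoncdf utility from the NASA CDF distribution:
das3_cdf -t my_mission_empty.cdf -o ./my_data.cdf < my_data.d3b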
-i URL,--input=URL
Instead of reading from standard input, read from this URL. To read from a local file, prefix it with 'file://'. Only file://, http://, and https:// are supported.
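For instance (the path is a placeholder), a local file can be read without a shell redirect:
das3_cdf -i file:///data/juno/my_data.d3b -o ./my_data.cdf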
-o DEST,--output=DEST
Instead of acting as a poorly performing filter, write data to this location. If DEST is a file then data will be written directly to that file. If DEST is a directory then an auto-generated file name will be used. This is useful when reading from das servers since they provide default file names.
-N,--no-istp
Don't automatically add certain ISTP metadata attributes such as 'Data_version' if they are missing.
-r,--remove
Tired of libcdf refusing to overwrite a file? Use this option along with '-o'.
-s DIR,--scratch=DIR
Scratch space directory for writing temporary files when run as a data stream filter. Ignored if -o is given.
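For example (the directory is a placeholder), temporary files can be sent to a specific location while running as a filter:
das3_cdf -s /var/tmp/das_scratch < my_data.d3b > my_data.cdf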
-l LEVEL,--log=LEVEL
Set the logging level, where LEVEL is one of 'debug', 'info', 'warning', or 'error', in order of decreasing verbosity. All log messages go to the standard error channel; the default level is 'info'.
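For instance, to see detailed diagnostics on standard error (file names are placeholders):
das3_cdf -l debug -o ./my_data.cdf < my_data.d3b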
-c FILE,--credentials=FILE
Set the location where server authentication tokens (if any) are saved. Defaults to HOME/.dasauth
-u,--uncompressed
Disables zlib compression; all variables are written uncompressed. This is needed for any CDF files submitted to the Planetary Data System. Per ISTP rules, Epoch variables are not compressed.
-m MEGS,--memory=MEGS
To avoid constant disk writes, das3_cdf buffers datasets in memory until they are 16 MB or larger and only then writes them to disk. Use this parameter to change the threshold. Using a larger value can increase performance for large datasets. The special values 'inf', 'infinite' or '∞' can be used to only write record data after the stream completes.
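For example (file names are placeholders), the threshold can be raised for a large job:
das3_cdf -m 512 < my_big_data.d3b > my_big_data.cdf      # write datasets to disk at ~512 MB instead of 16 MB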
- Convert a local das stream file to a CDF file.
cat my_data.d3b | das3_cdf -o my_data.cdf
- Read Juno/Waves data over the network and write to the current directory, auto-generating the CDF file name.
das3_cdf -r -i "https://jupiter.physics.uiowa.edu/das/server?server=dataset&dataset=Juno/WAV/Survey&start_time=2020-01-01&end_time=2020-01-02" -o ./
[Figure: plot of the Juno/Waves survey data rendered from the resulting CDF]
- Create a PDS archive file. Compression is disabled and records are buffered in RAM before writing a single continuous block per variable.
cat my_pds_data.d3b | das3_cdf -o my_pds_data.cdf -u -m infinite
ISTP Note:
- Most das2 stream sources do not have sufficient metadata to create an ISTP-compliant CDF. In these cases a skeleton file and a variable name map are recommended.
See Also:
- das3_node
- Wiki page https://github.com/das-developers/das2C/wiki/das3_cdf
- ISTP CDF guidelines: https://spdf.gsfc.nasa.gov/istp_guide/istp_guide.html
Source: das3_cdf.c
Author: C. Piker