das3_cdf
das3_cdf - Output das v2 and v3 streams as a CDF (Common Data Format) file
das3_cdf [options] [< DAS_STREAM]
By default das3_cdf reads a das2 or das3 stream from standard input and writes a CDF file to standard output. Unlike other das stream processors, das3_cdf is not a good filter: it does not start writing any output until all input is consumed. This is unavoidable because CDF is not a streaming format, so a temporary file must be created whose bytes are then fed to standard output. If the end result is just to generate a local file anyway, use the '--output' option below to avoid the default behavior.
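For example (hypothetical file names), both commands below produce the same CDF, but the second avoids the intermediate copy through standard output:

  cat my_data.d3b | das3_cdf > my_data.cdf      # buffers to a temp file, then streams the bytes
  cat my_data.d3b | das3_cdf -o my_data.cdf     # writes the file directly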
Data values are written to CDF variables and metadata are written to CDF attributes. The mapping of stream properties to CDF attributes follows.
<stream> Properties -> CDF Global Attributes
<dataset> Properties -> CDF Global Attributes (prefixed as needed)
<coord>,<data> Properties -> CDF Variable Attributes
During the metadata mapping, common das3 property names are converted to equivalent ISTP metadata names. The property conversions are:
label -> LABLAXIS (with units stripped)
title,description -> FIELDNAM
summary -> CATDESC
notes -> VAR_NOTES
format -> FORMAT
frame -> REFERENCE_FRAME
nominalMin,nominalMax -> LIMITS_NOMINAL_MIN,LIMITS_NOMINAL_MAX
scaleMin,scaleMax -> SCALEMIN,SCALEMAX
scaleType -> SCALETYP
validMin,validMax -> VALIDMIN,VALIDMAX
warnMin,warnMax -> LIMITS_WARN_MIN,LIMITS_WARN_MAX
compLabel -> LABL_PTR_1
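As a sketch of this conversion (the property values here are hypothetical), a <coord> carrying the properties on the left would yield the CDF variable attributes on the right:

  label     = "Frequency (Hz)"   ->   LABLAXIS = "Frequency"   (units stripped)
  scaleType = "log"              ->   SCALETYP = "log"
  validMin  = 1.0                ->   VALIDMIN = 1.0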
Note that if a property is named 'cdfName' it is not written to the CDF; instead it sets the name of the corresponding CDF variable.
Other CDF attributes are also set based on the data structure type. Some examples are:
DasVar.units -> UNITS
DasAry.fill -> FILLVAL
(algorithm) -> DEPEND_N
DasFrame.dir -> LABL_PTR_1 (if compLabel missing)
Note that if the input is a legacy das2 stream, it is upgraded internally to a das3 stream prior to writing the CDF output.
-h,--help
Write this text to standard output and exit.
-l LEVEL,--log=LEVEL
Set the logging level, where LEVEL is one of 'debug', 'info', 'warning', or 'error', in order of decreasing verbosity. All log messages go to the standard error channel; the default level is 'info'.
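For example (hypothetical file names), debug messages can be captured separately from the CDF bytes on standard output:

  cat my_data.d3b | das3_cdf -l debug > my_data.cdf 2> convert.log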
-m MEGS,--memory=MEGS
To avoid constant writes, das3_cdf buffers datasets in memory until they reach 16 MB or more, then writes them to disk. Use this parameter to change the threshold; a larger value can increase performance for large datasets. The special values 'inf', 'infinite' or '∞' cause record data to be written only after the stream completes.
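For instance (hypothetical file names), the threshold can be raised for a large conversion, or deferred until the stream ends:

  cat big_survey.d3b | das3_cdf -m 512 -o big_survey.cdf    # flush datasets at 512 MB
  cat big_survey.d3b | das3_cdf -m inf -o big_survey.cdf    # write records only at end of stream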
-t DIR,--temp-dir=DIR
Directory for writing temporary files when run as a command pipeline filter. Defaults to "$HOME/.dastmp". Ignored if -o is given.
-a FILE,--auth-toks=FILE
Set the location where server authentication tokens (if any) are saved. Defaults to "$HOME/.dasauth".
-i URL,--input=URL
Instead of reading from standard input, read from this URL. To read from a local file, prefix the path with 'file://'. Only file://, http://, and https:// are supported.
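For example (hypothetical path):

  das3_cdf -i "file:///home/user/my_data.d3b" -o ./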
-m FILE,--map-vars=FILE
Provide a mapping from automatic variable names to CDF variables. The map file has one name pair per line and follows the pattern:
OUTPUT_NAME = [INPUT_PKTID] INPUT_DIM [INPUT_ROLE]
Only the input dimension name is required; the variable role and packet ID are only needed for streams with repeated dimension names. Remapping variable names is helpful when using template CDFs. A pound symbol, '#', denotes a comment that runs to the end of the line.
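As a sketch (all names, the packet ID, and the 'center' role below are hypothetical), a map file might read:

  # my_varnames.conf - rename auto-generated variables to match a template CDF
  Epoch     = time                # dimension name only
  Frequency = frequency center    # dimension name plus variable role
  B_Mag     = 2 b_field center    # packet ID given since 'b_field' repeats across packets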
-f,--filter-vars
Only output "data" variables mentioned in the variable map file. Thus a map file with identical input and output names can be used to sub-select das stream inputs. Support variables needed by the selected "data" variables are always emitted.
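For instance (hypothetical variable name 'spec_dens'), a one-line map file keeps just that data variable plus whatever support variables it depends on:

  echo "spec_dens = spec_dens" > keep.conf
  cat my_data.d3b | das3_cdf -f -m keep.conf -o filtered.cdf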
-o DEST,--output=DEST
Instead of acting as a poorly performing filter, write data to this location. If DEST is a file, data are written directly to that file. If DEST is a directory, an auto-generated file name is used. This is useful when reading from das servers, since they provide default filenames.
-N,--no-istp
Don't automatically add certain ISTP meta-data attributes such as 'Data_version' if they are missing.
-s FILE,--skeleton=FILE
Initialize the output CDF with an empty skeleton CDF file first. The program "skeletoncdf" provided by NASA Goddard can be used to generate a binary CDF skeleton from a text file.
-r,--remove
Remove the destination file before writing. By default das3_cdf refuses to overwrite an existing output file. Use with '-o'.
-u,--uncompressed
Disable zlib compression; all variables are written uncompressed. This is needed for any CDF files submitted to the Planetary Data System. Per ISTP rules, Epoch variables are not compressed.
Examples:
- Convert a local das stream file to a CDF file:
  cat my_data.d3b | das3_cdf -o my_data.cdf
- Read Juno/Waves data over the network and write to the current directory, auto-generating the CDF file name:
  das3_cdf -r -i "https://jupiter.physics.uiowa.edu/das/server?server=dataset&dataset=Juno/WAV/Survey&start_time=2020-01-01&end_time=2020-01-02" -o ./
  [Figure: plot rendered from the resulting Juno/Waves Survey CDF]
- Create a CDF-A/PDS archive file. Compression is disabled and records are buffered in RAM, so a single continuous block is written per variable:
  cat my_pds_data.d3b | das3_cdf -o my_pds_data.cdf -u -m infinite
- Create and use a template CDF to add meta-data to the output while renaming output variables.
  Run once to produce the metadata skeleton and variable mappings:
    vim my_metadata.skt
    skeletoncdf my_metadata.skt   # produces an empty CDF for use below
    vim my_varnames.conf
  Run as needed to produce output files:
    cat my_data.d2s | das3_cdf -m my_varnames.conf -s my_metadata.cdf -o ./
ISTP Note:
- Most das2 stream sources do not have sufficient meta-data to create an ISTP compliant CDF. In these cases, using a skeleton file and a variable name map is recommended.
See Also:
- das3_node
- Wiki page https://github.com/das-developers/das2C/wiki/das3_cdf
- ISTP CDF guidelines: https://spdf.gsfc.nasa.gov/istp_guide/istp_guide.html
Source: das3_cdf.c
Author: C. Piker