Release 1.4.0 (#331)
* SDAP-493: Pagination improvements (#282)

* removed resultSizeLimit param from matchup

* Add # of primaries/average secondaries to job output

* rename to executionId

* update changelog

* add totalSecondaryMatched field to /job output

* num unique secondaries addition

* updated docs to use correct sea_water_temperature param name

* bugfix

* fix division by zero bug

* SDAP-500: Improvements to SDAP Asynchronous Jobs (#287)

* removed resultSizeLimit param from matchup

* Add # of primaries/average secondaries to job output

* rename to executionId

* update changelog

* add totalSecondaryMatched field to /job output

* num unique secondaries addition

* updated docs to use correct sea_water_temperature param name

* bugfix

* fix division by zero bug

* pagination improvements

* removed debugging line

* changelog

* Update helm cassandra dependency (#289)

* Update helm cassandra dependency

* Bump default cassandra PV to 4

* Bump default cassandra PV to 4 in tools

* Changelog

* Fixed small documentation issue

---------

Co-authored-by: rileykk <[email protected]>

* update changelog

* rename key to executionID

---------

Co-authored-by: Riley Kuttruff <[email protected]>
Co-authored-by: rileykk <[email protected]>

* SDAP-499: Add page number to default filename for matchup output (#288)

* removed resultSizeLimit param from matchup

* Add # of primaries/average secondaries to job output

* rename to executionId

* update changelog

* add totalSecondaryMatched field to /job output

* num unique secondaries addition

* updated docs to use correct sea_water_temperature param name

* bugfix

* fix division by zero bug

* add page number to default filename for matchup output

* pagination improvements

* removed debugging line

* changelog

* Update helm cassandra dependency (#289)

* Update helm cassandra dependency

* Bump default cassandra PV to 4

* Bump default cassandra PV to 4 in tools

* Changelog

* Fixed small documentation issue

---------

Co-authored-by: rileykk <[email protected]>

* Revert "Update helm cassandra dependency (#289)"

This reverts commit 1e8cc4e.

* changelog

---------

Co-authored-by: Riley Kuttruff <[email protected]>
Co-authored-by: rileykk <[email protected]>

* SDAP-506: STAC Catalog for Matchup outputs (#291)

* removed resultSizeLimit param from matchup

* Add # of primaries/average secondaries to job output

* rename to executionId

* update changelog

* add totalSecondaryMatched field to /job output

* num unique secondaries addition

* updated docs to use correct sea_water_temperature param name

* bugfix

* fix division by zero bug

* add page number to default filename for matchup output

* pagination improvements

* removed debugging line

* changelog

* Update helm cassandra dependency (#289)

* Update helm cassandra dependency

* Bump default cassandra PV to 4

* Bump default cassandra PV to 4 in tools

* Changelog

* Fixed small documentation issue

---------

Co-authored-by: rileykk <[email protected]>

* stac catalog

* Updated openapi spec

* move stac endpoints to matchup tag in openapi spec

* Revert "Update helm cassandra dependency (#289)"

This reverts commit 1e8cc4e.

* fix bug where still-running jobs failed /job endpoint due to missing metadata

* Update .asf.yaml (#293)

Co-authored-by: rileykk <[email protected]>

* update changelog

* re-add removed changelog entry

---------

Co-authored-by: Riley Kuttruff <[email protected]>
Co-authored-by: rileykk <[email protected]>

* SDAP-508 report sat spatial extents (#295)

Co-authored-by: rileykk <[email protected]>

* SDAP-505: Support DOMS insitu API (#299)

* support DOMS insitu API

* update NCAR insitu url

* add endpoint for getting insitu units

* SDAP-472 - data-access overhaul to support multiple simultaneous data backends (#294)

* Separated NTS backends

* n/a

* More nts backend stuff

* Working(?) np backend

* Working(?) np backend

* gitignore ini

* ASF headers

* First functioning test of 2 simultaneous backends

* Removed accidentally committed ini files

* Working zarr backend ds list

+ datasets are no longer case sensitive
+ handling for failed zarr ds opens (bad path, bad creds, etc.)
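
The two behaviors noted above — case-insensitive dataset names and tolerating failed Zarr opens — can be sketched as follows. The helper names and the `opener` callback are illustrative assumptions, not the actual data-access API from this commit:

```python
def resolve_dataset(name, datasets):
    # Case-insensitive lookup: compare configured dataset names and the
    # requested name in lowercase. `datasets` maps names to backend handles.
    lookup = {k.lower(): v for k, v in datasets.items()}
    if name.lower() not in lookup:
        raise KeyError(f'Unknown dataset: {name}')
    return lookup[name.lower()]

def open_zarr_dataset(path, opener):
    # Guarded open: a bad path or bad credentials yields None so the backend
    # can mark the dataset unavailable instead of failing outright.
    try:
        return opener(path)
    except Exception:
        return None

backends = {'avhrr_oi': 'zarr-handle'}
print(resolve_dataset('AVHRR_OI', backends))  # 'zarr-handle'
```

Swallowing every exception in `open_zarr_dataset` is deliberate here: at startup the goal is to record the dataset as unavailable, not to crash the service on one misconfigured store.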

* Capture and handle NTS requests routed to backend that doesn't (yet) support them

* analysis setup fails to find VERSION.txt when building locally

* Implemented more NTS functions in zarr backend

* Added misc backend time metrics record field in NCSH

* fixes

* Dynamic dataset management

* Dynamic dataset management

* Dataset management

* Timeseriesspark support

* Update backend dict on dataset mgmt query

* Fixes and improvements

* Adapted matchup to work with zarr backends

* Zarr support

- Distinct slices of time is now default
- No longer assuming+shaping as multivar tiles unless needed

* DDAS adjustments

* find_tile_by_polygon_and_most_recent_day_of_year impl

* Don't sel by time if neither max nor min time are given

* Fix not calling partial when needed

* Pinned s3fs and fsspec versions

* Fixed some dependencies to ensure image builds properly + s3fs works

* Config override for backends

* Deps update

* Add metadata from Zarr collection to /list

* Zarr: Probe lat order and flip if necessary

* Strip quotes from variable names

CM can sometimes publish with extra quotes resulting in KeyErrors

* removed resultSizeLimit param from matchup

* Add # of primaries/average secondaries to job output

* rename to executionId

* update changelog

* add totalSecondaryMatched field to /job output

* num unique secondaries addition

* updated docs to use correct sea_water_temperature param name

* bugfix

* fix division by zero bug

* add params to dataset management handler classes

* add page number to default filename for matchup output

* pagination improvements

* removed debugging line

* changelog

* Update helm cassandra dependency (#289)

* Update helm cassandra dependency

* Bump default cassandra PV to 4

* Bump default cassandra PV to 4 in tools

* Changelog

* Fixed small documentation issue

---------

Co-authored-by: rileykk <[email protected]>

* stac catalog

* Updated openapi spec

* move stac endpoints to matchup tag in openapi spec

* SDAP-507 - Changes to remove geos sub-dependency

* SDAP-507 - Changelog

* SDAP-507 - Changes to remove geos sub-dependency

* SDAP-507 - Changelog

* delete instead of comment out

* Revert "Update helm cassandra dependency (#289)"

This reverts commit 1e8cc4e.

* deleted disabled endpoint files

* fix bug where still-running jobs failed /job endpoint due to missing metadata

* Update .asf.yaml (#293)

Co-authored-by: rileykk <[email protected]>

* Moved changelog entries

* SDAP-472 changelog entries

---------

Co-authored-by: rileykk <[email protected]>
Co-authored-by: skorper <[email protected]>

* Merge master into develop post-1.2.0 release (#305)

* Update .asf.yaml (#293)

Co-authored-by: rileykk <[email protected]>

* domspurge patch (#280)

* Fix delete query to account for PK update

* Fixed mistake in domspurge readme

---------

Co-authored-by: rileykk <[email protected]>

* Patch: Made conda dep changes accidentally omitted from geos PR

* SDAP-511 - Switch package manager to poetry (#301)

* Redid branch w/ correct base

I tried to just rebase but it didn't work properly

* Changelogs

* Delete old dockerfile

* Removed old conda install files

* Removed version.txt to just use pyproject.toml

---------

Co-authored-by: rileykk <[email protected]>

* pyproject.toml dependency updates for Zarr

---------

Co-authored-by: rileykk <[email protected]>
Co-authored-by: Stepheny Perez <[email protected]>

* SDAP 515 - Improved handling of unreachable remote SDAPs (#308)

* Improved error handling to account for any reason a remote is not reachable

* Further fixes for unreachable remote SDAPs

* Improved logging

* SDAP-513 - Configurable Solr init pod image in helm (#306)

Co-authored-by: rileykk <[email protected]>

* Added some missing ASF headers and removed some unneeded files (#302)

* Remove IDE files

* Added more missing headers

---------

Co-authored-by: rileykk <[email protected]>

* Update requirements.txt (#309)

* Updates to openapi (#310)

* Disable the try it now button

* Added additional spark algorithms

* Updates to openapi to include additional algorithms

* SDAP-518 - Collection Config Docs (#311)

* Initial work on CC docs

* Add collections to index toctree

* YAML highlighting

* NetCDF section

* remove incubation msg from intro.rst

* Added recs for gridded tile size

---------

Co-authored-by: rileykk <[email protected]>

* data-access patch (#313)

* Patches to backend mgmt and zarr backend

- Code cleanup
- Added dask as dependency, so it will be leveraged with Zarr datasets
- Fixed creation of tile times array from Zarr data to ensure it's in seconds since 1970-01-01. The original method has been observed to produce incorrect results due to bad assumptions
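
The time-array fix described above hinges on normalizing whatever datetime resolution the Zarr store uses before casting to integers. A minimal sketch of that idea, assuming numpy `datetime64` coordinates (the helper name is illustrative, not the function actually changed in this commit):

```python
import numpy as np

def times_to_epoch_seconds(time_coord):
    # Hypothetical helper illustrating the fix: cast to second resolution
    # *first*, so the integer values are guaranteed to be seconds since
    # 1970-01-01 rather than whatever unit (often ns) the store uses.
    times = np.asarray(time_coord)
    return times.astype('datetime64[s]').astype('int64')

# Nanosecond-resolution input still yields epoch seconds:
t = np.array(['2024-01-01T00:00:00', '2024-01-01T01:00:00'],
             dtype='datetime64[ns]')
print(times_to_epoch_seconds(t))  # [1704067200 1704070800]
```

Casting a `datetime64[ns]` array directly with `.astype('int64')` would instead return nanosecond counts — the kind of bad assumption the commit message describes.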

* Changelog

---------

Co-authored-by: rileykk <[email protected]>

* Changed SDAP startup behavior to wait for all datasets to be prepared before accepting HTTP requests (#314)

Co-authored-by: rileykk <[email protected]>

* update quickstart with updated solr command (#298)

Co-authored-by: Riley Kuttruff <[email protected]>

* 1.3.0 to develop (#317)

* Update .asf.yaml (#293)

Co-authored-by: rileykk <[email protected]>

* domspurge patch (#280)

* Fix delete query to account for PK update

* Fixed mistake in domspurge readme

---------

Co-authored-by: rileykk <[email protected]>

* Patch: Made conda dep changes accidentally omitted from geos PR

* SDAP-511 - Switch package manager to poetry (#301)

* Redid branch w/ correct base

I tried to just rebase but it didn't work properly

* Changelogs

* Delete old dockerfile

* Removed old conda install files

* Removed version.txt to just use pyproject.toml

---------

Co-authored-by: rileykk <[email protected]>

* Update release files

- 1.3.0 header in changelogs
- Removed incubation DISCLAIMER
- Removed incubation language from NOTICE and updated URLs
- Bumped versions
- Made copyright year current
- Poetry re-lock

* Fix Dockerfile for DISCLAIMER rm

---------

Co-authored-by: rileykk <[email protected]>
Co-authored-by: Stepheny Perez <[email protected]>

* SDAP-469 - Support for data with elevation (#276)

* SDAP-469 - Configured timeseriesspark to have the option to bound by depth

* Elevation fetch & masking

untested

* Elevation subsetting

* fixes

* Check if there's elevation when pulling the tile data

* Elevation masking for tiles with no elevation does nothing

* Add elevation fields to collection creation script

* Name elevation fields to be dynamically typed correctly

* Elevation array shape + add to nexuspoint

* Add elev to data in bounds result

* Improved handling of elev arguments & masking

* tile elev in nexusproto backend

* Docker fix

* Additional algorithm support for SDAP 469 3d data (#10)

* Added support for default elevation value of 0 when parameter not provided

* Added support for setting min and max elevation when using single elevation arg

* Added support for data ingested with elevation

* Fixed logic for elevation clause when min and max elevations are equivalent

* Fixed logic for elevation clause when min and max elevations are equivalent

* Bug fix for missing elevation parameters

* Reverted timeAvgMapSpark to use NexusRequestObject

* Fixed bug with order of arguments

* Commented out saving netcdf files

* Bug fix for how numpy arrays are handled

* Cleaned up logic for handling the different arg cases

* Reworked elevation clause logic and added to polygon search

* Added elevation args to find_tiles_in_polygon

---------

Co-authored-by: rileykk <[email protected]>
Co-authored-by: Kevin <[email protected]>

* SDAP-522 - Fixes for broken endpoints found by work for SDAP-521 (#319)

* Implemented fixes

* Changelog

---------

Co-authored-by: rileykk <[email protected]>

* Error handling on ds listing for zarr (#320)

Co-authored-by: rileykk <[email protected]>

* SDAP-319 - Fixed typo (#322)

* Added forgotten changelog entry (#321)

Forgot from PR #276

* [SDAP-492] STV-FIS tomogram data visualization support (#323)

* SDAP-469 - Configured timeseriesspark to have the option to bound by depth

* Elevation fetch & masking

untested

* Elevation subsetting

* fixes

* Check if there's elevation when pulling the tile data

* Elevation masking for tiles with no elevation does nothing

* SDAP-492 - Added dependencies + alg classes

* Initial subsetting

Still need to org and plot the data

* Image rendering

* Lon + Lat tomo fetch impls

* Add elevation fields to collection creation script

* Name elevation fields to be dynamically typed correctly

* Elevation array shape + add to nexuspoint

* WIP: Optimizations for lateral slicing (lon only so far)

Untested

* Elevation array shape + add to nexuspoint

* Fixes

* bugfix

* Use actual extent in extent param

* Minor improvements

* More tomogram work

* Render profile tomo image results using `plt.pcolormesh` instead of `plt.imshow`

* SIGNIFICANT improvement of tomogram processing time

* Add elev to data in bounds result

* Add elev to data in bounds result

* Added GIF renderer

* 3D tomogram viz endpoint

* Add labels

* Colorbars

* Add basemap below tomo render

* Additional arguments and render types

* Simple CSV renderer - point cloud only

* Improved handling of elev arguments & masking

* tile elev in nexusproto backend

* poetry re-lock after deps update from merge

* minor updates

* Docker fix

* Speed up 3d result build

* poetry relock

* Vertical truncation of tomograms by RH98/GND height maps

* Updates:

- Disabled elev tomo endpoint
- Added param to render tomograms in vertically cumulative percentiles (i.e., cumulative return power for each voxel column)
- Added NetCDF renderer for tomo3d endpoint
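
The cumulative-percentile render option described above can be sketched with numpy: for each voxel column, accumulate return power upward through the elevation bins and normalize by the column total. All names here are illustrative assumptions, not the endpoint's actual code:

```python
import numpy as np

def cumulative_percentiles(power, axis=0):
    # `power` is assumed non-negative, with elevation bins along `axis`
    # (bottom to top). Returns the fraction of the column's total return
    # power at or below each bin.
    csum = np.cumsum(power, axis=axis)
    total = np.take(csum, [-1], axis=axis)  # per-column total power
    with np.errstate(invalid='ignore', divide='ignore'):
        frac = np.where(total > 0, csum / total, 0.0)  # empty columns -> 0
    return frac

col = np.array([[1.0], [3.0], [4.0], [2.0]])  # one column, 4 elevation bins
print(cumulative_percentiles(col).ravel())    # [0.1 0.4 0.8 1. ]
```

Normalizing per column (rather than globally) is what makes the rendering comparable across voxel columns with very different total return power.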

* Tomo improvements

* Fix for tomo 3d basemap being inverted by latitude

* poetry lock

* Simplified profile tomo API + added NC output formatter

* Added multi-slice capability to 2d plotting

* Add cbar min/max option to tomo profile rendering

+ better nodata error handling

* Support for arbitrary line slicing of tomogram data

Currently does not support multislicing

* Remove accidentally committed test code

* Add slicing along line to NTS w/ nexusproto implementation

+ tomogram arb slicing tries to use this with bbox subsetting as a fallback

* Improve performance of elevation binning

* Improve performance of data gridding

* Update CL + remove LGPL 3 indirect dependency

---------

Co-authored-by: rileykk <[email protected]>

* [SDAP-497] - Release build script (#290)

* Build script

* Build script

* Rebased onto 1.2.0 release branch to avoid messing up changelog

* Retry failed builds/pushes once to avoid time loss from transient errors

* Moved changelog entry

* Start of src pull changes

* mv build script

* Better ASF pulling

* Build from GitHub

* Simple docs

* Some cleanup, renaming, deleting unused stuff etc

* Update build.py

* Added build from ASF archive

* Updates and fixes

* Updates for post-grad changes

* Add support for custom nexusproto builds from git

* Build tool updates

* Update changelog

* Implement build from local directory

---------

Co-authored-by: rileykk <[email protected]>

* SDAP-520 - Guide for evaluating SDAP release candidates (#318)

* SDAP-520 Added RC eval guide to RTD

* remove incubator

* add to toctree

* Attempt to fix the many Sphinx warnings on build

* Changelog

---------

Co-authored-by: rileykk <[email protected]>

* added primitive match return function to CDMS reader (#260)

* added primitive match return function

* Updated function to be used with generic use cases

* Updated documentation for assemble functions

* Updated documentation, removed xarray dependency

* merged with develop, updated changelog

* SDAP-470 Fixed CHANGELOG.md (#324)

* SDAP-526 - Upgrade canopy and ground masking (#326)

* Upgrade canopy and ground masking

* Changelog update

---------

Co-authored-by: rileykk <[email protected]>

* Patch to add geopy dep that was forgotten (#328)

Co-authored-by: rileykk <[email protected]>

* SDAP-527: Fixed creation of execution status Cassandra table (#329)

Co-authored-by: rileykk <[email protected]>

* SDAP-521 - Improved SDAP testing suite (#325)

* SDAP-520 Added RC eval guide to RTD

* remove incubator

* add to toctree

* Granule download script for testing

* Move & update test script

Made current with what was deployed for the CDMS project. Will need extensive editing.

* Test script pruning + guide start

* Updates

* Updates

* Guide for install and run

* Attempt to fix the many Sphinx warnings on build

* Fix bad ref

* Fix bad ref

* Fix bad ref (third time's the charm?)

* Removal of ingested test data

* Reduced datainbounds L2 test bbox to ease memory footprint

* Revert "Reduced datainbounds L2 test bbox to ease memory footprint"

This reverts commit 46cf5ad.

* Update docs for missing test collection

* SDAP-521 Updated quickstart and test guide. (#327)

* SDAP-521 Updated quickstart and test guide.

* SDAP-521 Updated solr start up env variables to be consistent with helm chart.

* SDAP-521 Updated README.md

---------

Co-authored-by: rileykk <[email protected]>
Co-authored-by: Nga Chung <[email protected]>

* SDAP-529: Added configuration for verbose logging for collection manager in the Helm chart (#330)

Co-authored-by: rileykk <[email protected]>

* Bump versions for release

* Some more version fixes

* Some more version fixes

* Fix doc typo

* I'm having issues with the venv. This needs more testing but shouldn't be a release blocker

* Final release dates to changelog

* Moved incorrect CL entry to proper section

---------

Co-authored-by: Stepheny Perez <[email protected]>
Co-authored-by: rileykk <[email protected]>
Co-authored-by: skorper <[email protected]>
Co-authored-by: Kevin <[email protected]>
Co-authored-by: alovett-COAPS <[email protected]>
Co-authored-by: Nga Chung <[email protected]>
7 people authored Nov 4, 2024
1 parent fa359e4 commit ff51bf7
Showing 67 changed files with 7,608 additions and 1,743 deletions.
21 changes: 21 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,27 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.4.0] - 2024-11-04
### Added
- SDAP-469: Support for three dimensional data. 2D data defaults to a layer at 0m elevation.
- SDAP-492: Added some demo algorithms for working with and visualizing tomography data. Currently designed for data from airborne SAR campaigns, but can be readily generalized.
- SDAP-526: Upgrade 2D tomography endpoint canopy and ground masking feature to allow for primary and backup datasets
- SDAP-497: Added tool to ease building of releases. Can build from ASF distributions, git repos, and local
- SDAP-520: (Documentation) Added guide to docs for evaluating official release candidates.
- SDAP-529: Added configuration for verbose logging for collection manager in the Helm chart
### Changed
- SDAP-470: Modified `cdms-reader` tool to support primary to secondary matchups
### Deprecated
### Removed
### Fixed
- SDAP-525: Fixed expired AWS creds for Zarr datasets breaking list endpoint
- SDAP-527: Fixed incorrect initialization of `doms.doms_executions` Cassandra table, which broke `/matchup` endpoint for new installations.
- SDAP-522: Fixed several broken endpoints discovered by SDAP-521 work
- Fixed `/version` by updating to correct NEXUS package name
- Fixed `/heartbeat` by moving heartbeat evaluations to all backends
- Fixed CDMS STAC catalog pagination
### Security

## [1.3.0] - 2024-06-10
### Added
- SDAP-506:
2 changes: 1 addition & 1 deletion README
@@ -1,4 +1,4 @@
Apache SDAP release 1.3.0
Apache SDAP release 1.4.0

This is a source distribution of Apache SDAP - NEXUS.

96 changes: 85 additions & 11 deletions analysis/webservice/algorithms/DataInBoundsSearch.py
@@ -14,8 +14,11 @@
# limitations under the License.


import io
import gzip
import json
import numpy
import logging

from datetime import datetime
from pytz import timezone
@@ -68,6 +71,9 @@ class DataInBoundsSearchCalcHandlerImpl(NexusCalcHandler):
}
singleton = True

def __init__(self, tile_service_factory, **kwargs):
NexusCalcHandler.__init__(self, tile_service_factory, desired_projection='swath')

def parse_arguments(self, request):
# Parse input arguments

@@ -114,27 +120,68 @@ def parse_arguments(self, request):
"Maximum (Northern) Latitude. 'metadataFilter' must be in the form key:value",
code=400)

return ds, parameter_s, start_time, end_time, bounding_polygon, metadata_filter
min_elevation, max_elevation = request.get_elevation_args()

if (min_elevation and max_elevation) and min_elevation > max_elevation:
raise NexusProcessingException(
reason='Min elevation must be less than or equal to max elevation',
code=400
)

compact_result = request.get_boolean_arg('compact')

return ds, parameter_s, start_time, end_time, bounding_polygon, metadata_filter, min_elevation, max_elevation, compact_result

def calc(self, computeOptions, **args):
ds, parameter, start_time, end_time, bounding_polygon, metadata_filter = self.parse_arguments(computeOptions)
ds, parameter, start_time, end_time, bounding_polygon,\
metadata_filter, min_elevation, max_elevation, compact = self.parse_arguments(computeOptions)

includemeta = computeOptions.get_include_meta()

log = logging.getLogger(__name__)

min_lat = max_lat = min_lon = max_lon = None
tile_service = self._get_tile_service()

if bounding_polygon:
min_lat = bounding_polygon.bounds[1]
max_lat = bounding_polygon.bounds[3]
min_lon = bounding_polygon.bounds[0]
max_lon = bounding_polygon.bounds[2]

tiles = self._get_tile_service().get_tiles_bounded_by_box(min_lat, max_lat, min_lon, max_lon, ds, start_time,
end_time)
tiles = tile_service.find_tiles_in_box(min_lat, max_lat, min_lon, max_lon, ds, start_time, end_time,
min_elevation=min_elevation, max_elevation=max_elevation, fetch_data=False)

need_to_fetch = True
else:
tiles = self._get_tile_service().get_tiles_by_metadata(metadata_filter, ds, start_time, end_time)
need_to_fetch = False

data = []
for tile in tiles:

log.info(f'Matched {len(tiles):,} tiles.')

for i in range(len(tiles)-1, -1, -1): # tile in tiles:
tile = tiles.pop(i)

tile_id = tile.tile_id

log.info(f'Processing tile {tile_id} | {i=}')

if need_to_fetch:
tile = tile_service.fetch_data_for_tiles(tile)[0]
tile = tile_service.mask_tiles_to_bbox(min_lat, max_lat, min_lon, max_lon, [tile])
tile = tile_service.mask_tiles_to_time_range(start_time, end_time, tile)

if min_elevation is not None and max_elevation is not None:
tile = tile_service.mask_tiles_to_elevation(min_elevation, max_elevation, tile)

if len(tile) == 0:
log.info(f'Skipping empty tile {tile_id}')
continue

tile = tile[0]

for nexus_point in tile.nexus_point_generator():

point = dict()
@@ -159,15 +206,26 @@ def calc(self, computeOptions, **args):
except (KeyError, IndexError):
pass
else:
point['variable'] = nexus_point.data_vals
variables = []

data_vals = nexus_point.data_vals if tile.is_multi else [nexus_point.data_vals]

for value, variable in zip(data_vals, tile.variables):
if variable.standard_name:
var_name = variable.standard_name
else:
var_name = variable.variable_name

variables.append({var_name: value})

point['variables'] = variables

data.append({
'latitude': nexus_point.latitude,
'longitude': nexus_point.longitude,
'time': nexus_point.time,
'data': [
point
]
'elevation': nexus_point.depth,
'data': point
})

if includemeta and len(tiles) > 0:
Expand All @@ -178,14 +236,22 @@ def calc(self, computeOptions, **args):
result = DataInBoundsResult(
results=data,
stats={},
meta=meta)
meta=meta,
compact=compact
)

result.extendMeta(min_lat, max_lat, min_lon, max_lon, "", start_time, end_time)

log.info(f'Finished subsetting. Generated {len(data):,} points')

return result


class DataInBoundsResult(NexusResults):
def __init__(self, results=None, meta=None, stats=None, computeOptions=None, status_code=200, compact=False, **args):
NexusResults.__init__(self, results, meta, stats, computeOptions, status_code, **args)
self.__compact = compact

def toCSV(self):
rows = []

Expand Down Expand Up @@ -229,7 +295,15 @@ def toCSV(self):
return "\r\n".join(rows)

def toJson(self):
return json.dumps(self.results(), indent=4, cls=NpEncoder)
if not self.__compact:
return json.dumps(self.results(), indent=4, cls=NpEncoder)
else:
buffer = io.BytesIO()
with gzip.open(buffer, 'wt', encoding='ascii') as zip:
json.dump(self.results(), zip, cls=NpEncoder)

buffer.seek(0)
return buffer.read()

class NpEncoder(json.JSONEncoder):
def default(self, obj):
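
The `compact` branch added to `toJson` above gzips the serialized results instead of returning pretty-printed JSON. A sketch of the round trip, including a client-side decoder that is not part of this commit:

```python
import gzip
import io
import json

def compact_json_bytes(results):
    # Mirrors the compact branch shown in the diff: write ASCII JSON
    # through a gzip wrapper around an in-memory buffer.
    buffer = io.BytesIO()
    with gzip.open(buffer, 'wt', encoding='ascii') as zf:
        json.dump(results, zf)
    buffer.seek(0)
    return buffer.read()

def read_compact(payload):
    # Hypothetical client-side helper: decompress and parse the payload.
    return json.loads(gzip.decompress(payload).decode('ascii'))

data = [{'latitude': 10.0, 'longitude': -120.0, 'time': 0}]
payload = compact_json_bytes(data)
assert read_compact(payload) == data
```

For large subsetting results the gzipped body is typically a small fraction of the indented JSON it replaces, which is presumably the point of the option.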
10 changes: 1 addition & 9 deletions analysis/webservice/algorithms/Heartbeat.py
@@ -29,15 +29,7 @@ class HeartbeatCalcHandlerImpl(NexusCalcHandler):
singleton = True

def calc(self, computeOptions, **args):
solrOnline = self._get_tile_service().pingSolr()

# Not sure how to best check cassandra cluster status so just return True for now
cassOnline = True

if solrOnline and cassOnline:
status = {"online": True}
else:
status = {"online": False}
status = self._get_tile_service().heartbeat()

class SimpleResult(object):
def __init__(self, result):
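
The `heartbeat()` delegation above moves health evaluation into the tile service so every backend is checked, not just Solr. A minimal sketch of that aggregation, assuming each backend exposes a boolean check (names are illustrative; the real NEXUS method may differ):

```python
def heartbeat(backends):
    # Run each backend's health check and report overall status:
    # online only if every configured backend reports healthy.
    status = {name: check() for name, check in backends.items()}
    status['online'] = all(status.values())
    return status

print(heartbeat({'solr': lambda: True, 'cassandra': lambda: True}))
# {'solr': True, 'cassandra': True, 'online': True}
```

Compared with the removed code — which pinged Solr and hard-coded Cassandra as online — this shape degrades honestly when any single backend is down.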
