The Vault backend now supports Vault Enterprise namespaces through the namespace property in its backend configuration properties.
The Vault, CredHub, and OpsMan backends now support specifying CA certificates to trust when connecting, via the ca_certs property in each of these backends' configuration properties.
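For illustration, a Vault backend entry using both new properties might look something like this. The surrounding keys (backends, type, name) are assumptions sketched from the descriptions above, not the exact schema, so check the configuration docs before copying:

```yaml
# Hypothetical sketch of a backend entry; only the namespace and
# ca_certs properties are taken from these release notes.
backends:
  - type: vault
    name: production-vault
    properties:
      namespace: my-team/my-namespace   # Vault Enterprise namespace (new)
      ca_certs:                         # PEM CA certs to trust (new)
        - |
          -----BEGIN CERTIFICATE-----
          ...
          -----END CERTIFICATE-----
```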
Improvements
The CredHub backend now parses some additional information: the ca key of certificate-type credentials, and all of the keys of value and json types. Those certificates can't hide from me forever.
The scheduler is now multithreaded, so more than one backend can be scraped at a time. Large, slow backends early in the queue order should no longer completely stop up smaller, quicker backends waiting their turn.
The logic for authentication scheduling has been tweaked so that backends get multiple potential chances to reauthenticate before their auth expires; previously, they effectively got only one chance per auth cycle.
tlsclient should no longer discard all fetched information when a single host fails to be gathered.
Bug Fixes
tlsclient should no longer explode when you configure a host that is an IPv4 address.
Doomsday should no longer crash if the CredHub backend stumbles upon a certificate key that is null. I don't even know how you make a certificate key null, but now we handle it.
The "Show All" button is no longer missing from the Web UI dashboard when there are no certificates expiring soon.
Additional Stuff
Queuing an adhoc scrape through doomsday refresh no longer pushes back the normally scheduled scrape.
Adhoc scrapes are no longer dropped simply for falling within a certain time window of a previous scrape on that backend. Instead, an adhoc scrape is skipped only if a scrape for that backend is already running, or one is scheduled to run before the adhoc scrape would.