550af23fe6334a2a9e65ede07658bb0f-a-partner-or-redhat-associate-submits-an-error-free-chart-source-with-report-70b26bbb-partners-hashicorp-vault-0.17.0 #99053

Merged
@@ -0,0 +1,104 @@
apiversion: v1
kind: verify-report
metadata:
    tool:
        verifier-version: 1.12.2
        profile:
            VendorType: partner
            version: v1.2
        reportDigest: uint64:7407049884959662603
        chart-uri: https://github.com/openshift-helm-charts/development/blob/main/tests/data/vault-0.17.0.tgz?raw=true
        digests:
            chart: sha256:d075d16bb33bb33f66cc6fb8af5cce935b647972de4b4af49be385fcce7bed05
            package: 641f6c03aac5117eb679b341426adaa612829acc546c18e59c58ea1b45b2e513
        lastCertifiedTimestamp: "2023-09-11T12:22:29.724879-04:00"
        testedOpenShiftVersion: "4.13"
        supportedOpenShiftVersions: '>=4.2'
        webCatalogOnly: false
    chart:
        name: vault
        home: https://www.vaultproject.io
        sources:
            - https://github.com/hashicorp/vault
            - https://github.com/hashicorp/vault-helm
            - https://github.com/hashicorp/vault-k8s
            - https://github.com/hashicorp/vault-csi-provider
        version: 0.17.0
        description: Official HashiCorp Vault Chart
        keywords:
            - vault
            - security
            - encryption
            - secrets
            - management
            - automation
            - infrastructure
        maintainers: []
        icon: https://github.com/hashicorp/vault/raw/f22d202cde2018f9455dec755118a9b84586e082/Vault_PrimaryLogo_Black.png
        apiversion: v2
        condition: ""
        tags: ""
        appversion: 1.8.4
        deprecated: false
        annotations:
            charts.openshift.io/name: vault
        kubeversion: '>= 1.14.0-0'
        dependencies: []
        type: ""
    chart-overrides: ""
results:
    - check: v1.0/helm-lint
      type: Mandatory
      outcome: PASS
      reason: Helm lint successful
    - check: v1.0/not-contain-csi-objects
      type: Mandatory
      outcome: PASS
      reason: CSI objects do not exist
    - check: v1.0/contains-values-schema
      type: Mandatory
      outcome: PASS
      reason: Values schema file exist
    - check: v1.0/contains-values
      type: Mandatory
      outcome: PASS
      reason: Values file exist
    - check: v1.1/images-are-certified
      type: Mandatory
      outcome: PASS
      reason: |-
        Image is Red Hat certified : registry.connect.redhat.com/hashicorp/vault:1.8.4-ubi
        Image is Red Hat certified : registry.connect.redhat.com/hashicorp/vault-k8s:0.14.0-ubi
    - check: v1.0/is-helm-v3
      type: Mandatory
      outcome: PASS
      reason: API version is V2, used in Helm 3
    - check: v1.0/signature-is-valid
      type: Mandatory
      outcome: SKIPPED
      reason: 'Chart is not signed : Signature verification not required'
    - check: v1.1/has-kubeversion
      type: Mandatory
      outcome: PASS
      reason: Kubernetes version specified
    - check: v1.0/not-contains-crds
      type: Mandatory
      outcome: PASS
      reason: Chart does not contain CRDs
    - check: v1.0/contains-test
      type: Mandatory
      outcome: PASS
      reason: Chart test files exist
    - check: v1.0/required-annotations-present
      type: Mandatory
      outcome: PASS
      reason: All required annotations present
    - check: v1.0/chart-testing
      type: Mandatory
      outcome: PASS
      reason: Chart tests have passed
    - check: v1.0/has-readme
      type: Mandatory
      outcome: PASS
      reason: Chart has a README

@@ -0,0 +1,4 @@
.git/
.terraform/
bin/
test/

@@ -0,0 +1,239 @@
# Contributing to Vault Helm

**Please note:** We take Vault's security and our users' trust very seriously.
If you believe you have found a security issue in Vault, please responsibly
disclose by contacting us at [email protected].

**First:** if you're unsure or afraid of _anything_, just ask or submit the
issue or pull request anyway. You won't be yelled at for giving it your best
effort. The worst that can happen is that you'll be politely asked to change
something. We appreciate any sort of contributions, and don't want a wall of
rules to get in the way of that.

That said, if you want to ensure that a pull request is likely to be merged,
talk to us! You can find out our thoughts and ensure that your contribution
won't clash or be obviated by Vault's normal direction. A great way to do this
is via the [Vault Google Group][2]. Sometimes Vault devs are in `#vault-tool`
on Freenode, too.

This document covers what we're looking for when you report issues. Addressing
all of the points we're looking for raises the chances that we can quickly
merge or address your contributions.

## Issues

### Reporting an Issue

* Make sure you test against the latest released version. It is possible
we already fixed the bug you're experiencing. Even better is if you can test
against `main`, as bugs are fixed regularly but new versions are only
released every few months.

* Provide steps to reproduce the issue, and if possible include the expected
results as well as the actual results. Please provide text, not screenshots!

* Respond as promptly as possible to any questions raised by the Vault
team on your issue. Stale issues will be closed periodically.

### Issue Lifecycle

1. The issue is reported.

2. The issue is verified and categorized by a Vault Helm collaborator.
Categorization is done via tags. For example, bugs are marked as "bugs".

3. Unless it is critical, the issue may be left for a period of time (sometimes
many weeks), giving outside contributors -- maybe you!? -- a chance to
address the issue.

4. The issue is addressed in a pull request or commit. The issue will be
referenced in the commit message so that the code that fixes it is clearly
linked.

5. The issue is closed. Sometimes, valid issues will be closed to keep
the issue tracker clean. The issue is still indexed and available for
future viewers, or can be re-opened if necessary.

## Testing

The Helm chart ships with both unit and acceptance tests.

The unit tests don't require any active Kubernetes cluster and complete
very quickly. These should be used for fast feedback during development.
The acceptance tests require a Kubernetes cluster with a configured `kubectl`.

### Test Using Docker Container

The following are the instructions for running bats tests using a Docker container.

#### Prerequisites

* Docker installed
* `vault-helm` checked out locally

#### Test

**Note:** the following commands should be run from the `vault-helm` directory.

First, build the Docker image for running the tests:

```shell
docker build -f ${PWD}/test/docker/Test.dockerfile ${PWD}/test/docker/ -t vault-helm-test
```
Next, execute the tests with the following commands:
```shell
docker run -it --rm -v "${PWD}:/test" vault-helm-test bats /test/test/unit
```
It's possible to only run specific bats tests using regular expressions.
For example, the following will run only tests with "injector" in the name:
```shell
docker run -it --rm -v "${PWD}:/test" vault-helm-test bats /test/test/unit -f "injector"
```

### Test Manually

The following are the instructions for running bats tests on your workstation.

#### Prerequisites
* [Bats](https://github.com/bats-core/bats-core)
```bash
brew install bats-core
```
* [yq](https://pypi.org/project/yq/)
```bash
brew install python-yq
```
* [helm](https://helm.sh)
```bash
brew install kubernetes-helm
```

#### Test

To run the unit tests:

    bats ./test/unit

To run the acceptance tests:

    bats ./test/acceptance

If the acceptance tests fail, deployed resources in the Kubernetes cluster
may not be properly cleaned up. We recommend recycling the Kubernetes cluster to
start from a clean slate.
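
If you do need to clean up by hand after a failed run, something along these lines usually removes the leftovers. This is a rough sketch that assumes the default `vault` release name and the chart's standard labels, not a documented procedure:

```shell
# Remove the Helm release installed by the acceptance tests (release name assumed).
helm uninstall vault

# Delete any persistent volume claims the server StatefulSet left behind.
kubectl delete pvc -l app.kubernetes.io/name=vault
```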

**Note:** There is a Terraform configuration in the
[`test/terraform/`](https://github.com/hashicorp/vault-helm/tree/main/test/terraform) directory
that can be used to quickly bring up a GKE cluster and configure
`kubectl` and `helm` locally. This can be used to quickly spin up a test
cluster for acceptance tests. Unit tests _do not_ require a running Kubernetes
cluster.
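
As a rough sketch, bringing the GKE cluster up and tearing it back down looks something like the following; the input variables the configuration actually expects are defined in `test/terraform/`, so treat the one shown here as a placeholder:

```shell
cd test/terraform

# Provision the GKE cluster (the variable name is illustrative, not taken from the config).
terraform init
terraform apply -var project=<your-gcp-project>

# ... run `bats ./test/acceptance` from the repository root ...

# Destroy the cluster when finished to avoid ongoing costs.
terraform destroy -var project=<your-gcp-project>
```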

### Writing Unit Tests

Changes to the Helm chart should be accompanied by appropriate unit tests.

#### Formatting

- Put tests in the test file in the same order as the variables appear in the `values.yaml`.
- Start tests for a chart value with a header that says what is being tested, like this:
```
#--------------------------------------------------------------------
# annotations
```

- Name the test based on what it's testing in the following format (this will be its first line):
```
@test "<section being tested>: <short description of the test case>" {
```
When adding tests to an existing file, the first section will be the same as the other tests in the file.

#### Test Details

[Bats](https://github.com/bats-core/bats-core) provides a way to run commands in a shell and inspect the output in an automated way.
In all of the tests in this repo, the base command being run is [helm template](https://docs.helm.sh/helm/#helm-template) which turns the templated files into straight yaml output.
In this way, we're able to test that the various conditionals in the templates render as we would expect.

Each test defines the files that should be rendered using the `--show-only` flag, then it might adjust chart values by adding `--set` flags as well.
The output from this `helm template` command is then piped to [yq](https://pypi.org/project/yq/).
`yq` allows us to pull out just the information we're interested in, either by referencing its position in the yaml file directly or giving information about it (like its length).
The `-r` flag can be used with `yq` to return a raw string instead of a quoted one which is especially useful when looking for an exact match.
The test passes or fails based on the conditional at the end that is in square brackets, which is a comparison of our expected value and the output of `helm template` piped to `yq`.
The `| tee /dev/stderr ` pieces direct any terminal output of the `helm template` and `yq` commands to stderr so that it doesn't interfere with `bats`.
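
Outside of a bats test, the same pipeline can be run by hand while developing a test, which is often the quickest way to work out the right `yq` expression. A minimal sketch, where the template path, `--set` flag, and key are only examples:

```shell
# Render one template from the chart root and pull a single field out of the YAML.
helm template . \
  --show-only templates/server-statefulset.yaml \
  --set 'server.standalone.enabled=true' \
  | yq -r '.spec.template.spec.containers[0].name'
```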

#### Test Examples

Here are some examples of common test patterns:

- Check that a value is disabled by default

```
@test "ui/Service: no type by default" {
cd `chart_dir`
local actual=$(helm template \
--show-only templates/ui-service.yaml \
. | tee /dev/stderr |
yq -r '.spec.type' | tee /dev/stderr)
[ "${actual}" = "null" ]
}
```

In this example, nothing is changed from the default templates (no `--set` flags), then we use `yq` to retrieve the value we're checking, `.spec.type`.
This output is then compared against our expected value (`null` in this case) in the assertion `[ "${actual}" = "null" ]`.
- Check that a template value is rendered to a specific value
```
@test "ui/Service: specified type" {
cd `chart_dir`
local actual=$(helm template \
--show-only templates/ui-service.yaml \
--set 'ui.serviceType=LoadBalancer' \
. | tee /dev/stderr |
yq -r '.spec.type' | tee /dev/stderr)
[ "${actual}" = "LoadBalancer" ]
}
```
This is very similar to the last example, except we've changed a default value with the `--set` flag and correspondingly changed the expected value.

- Check that a template value contains several values
```
@test "server/standalone-StatefulSet: custom resources" {
cd `chart_dir`
local actual=$(helm template \
--show-only templates/server-statefulset.yaml \
--set 'server.standalone.enabled=true' \
--set 'server.resources.requests.memory=256Mi' \
--set 'server.resources.requests.cpu=250m' \
. | tee /dev/stderr |
yq -r '.spec.template.spec.containers[0].resources.requests.memory' | tee /dev/stderr)
[ "${actual}" = "256Mi" ]

local actual=$(helm template \
--show-only templates/server-statefulset.yaml \
--set 'server.standalone.enabled=true' \
--set 'server.resources.limits.memory=256Mi' \
--set 'server.resources.limits.cpu=250m' \
. | tee /dev/stderr |
yq -r '.spec.template.spec.containers[0].resources.limits.memory' | tee /dev/stderr)
[ "${actual}" = "256Mi" ]
```
*Note:* If testing more than two conditions, it is better to run the `helm template` part of the command once and reuse its output for each `yq` check to reduce redundant work; see the sketch after these examples.
- Check that an entire template file is not rendered
```
@test "syncCatalog/Deployment: disabled by default" {
cd `chart_dir`
local actual=$( (helm template \
--show-only templates/server-statefulset.yaml \
--set 'global.enabled=false' \
. || echo "---") | tee /dev/stderr |
yq 'length > 0' | tee /dev/stderr)
[ "${actual}" = "false" ]
}
```
Here we check the length of the command output to see whether anything is rendered at all.
This style can easily be switched to check that a file is rendered instead.
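
For the multi-assertion case called out in the note above, one way to avoid re-rendering the chart for every assertion is to run `helm template` once, capture the output, and feed it to `yq` for each check. The following is a sketch of that refactor under assumed values, not an existing test in this repo:

```
@test "server/standalone-StatefulSet: custom resources (single render)" {
  cd `chart_dir`
  # Render the template once and reuse the output for every assertion below.
  local object=$(helm template \
      --show-only templates/server-statefulset.yaml \
      --set 'server.standalone.enabled=true' \
      --set 'server.resources.requests.memory=256Mi' \
      --set 'server.resources.limits.memory=512Mi' \
      . | tee /dev/stderr)

  local actual=$(echo "$object" |
      yq -r '.spec.template.spec.containers[0].resources.requests.memory' | tee /dev/stderr)
  [ "${actual}" = "256Mi" ]

  local actual=$(echo "$object" |
      yq -r '.spec.template.spec.containers[0].resources.limits.memory' | tee /dev/stderr)
  [ "${actual}" = "512Mi" ]
}
```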
@@ -0,0 +1,23 @@
annotations:
  charts.openshift.io/name: vault
apiVersion: v2
appVersion: 1.8.4
description: Official HashiCorp Vault Chart
home: https://www.vaultproject.io
icon: https://github.com/hashicorp/vault/raw/f22d202cde2018f9455dec755118a9b84586e082/Vault_PrimaryLogo_Black.png
keywords:
- vault
- security
- encryption
- secrets
- management
- automation
- infrastructure
kubeVersion: '>= 1.14.0-0'
name: vault
sources:
- https://github.com/hashicorp/vault
- https://github.com/hashicorp/vault-helm
- https://github.com/hashicorp/vault-k8s
- https://github.com/hashicorp/vault-csi-provider
version: 0.17.0