Description

When using a private endpoint for the OKE cluster in the `kubernetes` provider configuration in Terraform, I encounter the following error during the `terraform plan` phase:

```
Error: Provider configuration: cannot load Kubernetes client config

  with provider["registry.terraform.io/hashicorp/kubernetes"],
  on providers.tf line 24, in provider "kubernetes":
  24: provider "kubernetes" {

invalid configuration: default cluster has no server defined
```
Steps to Reproduce

1. Configure a private endpoint for the Kubernetes cluster in the Terraform setup.
2. Use the following provider configuration for `kubernetes`:
3. Run `terraform plan`.

Expected Behavior

Terraform should successfully connect to the private endpoint and validate the configuration.
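The provider configuration mentioned above did not survive formatting here. For reference, a minimal `kubernetes` provider block for an OKE cluster might look like the following sketch; the `local.*` names and the exec-based token auth are assumptions for illustration, not the quickstart's actual code:

```hcl
# Sketch of a kubernetes provider wired to an OKE cluster endpoint.
# local.cluster_endpoint, local.cluster_ca_certificate, and local.cluster_id
# are placeholders for whatever the module actually exposes.
provider "kubernetes" {
  host                   = local.cluster_endpoint          # e.g. "https://10.0.0.5:6443"
  cluster_ca_certificate = base64decode(local.cluster_ca_certificate)

  # OKE kubeconfigs use short-lived tokens generated by the OCI CLI.
  exec {
    api_version = "client.authentication.k8s.io/v1beta1"
    command     = "oci"
    args        = ["ce", "cluster", "generate-token", "--cluster-id", local.cluster_id, "--region", var.region]
  }
}
```

The reported error ("default cluster has no server defined") suggests the provider fell back to loading a kubeconfig whose `server` field was empty, i.e. `host` was never resolved for the private endpoint.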
Actual Behavior
The following error is raised:
```
Error: Provider configuration: cannot load Kubernetes client config

invalid configuration: default cluster has no server defined
```
Additional Information
The following comment in the Terraform configuration file highlights important considerations when using private endpoints:
```hcl
### Important Notice ###
# OCI Resource Manager Private Endpoint is only available when using Resource Manager.
# If you use local Terraform, you will need to setup an OCI Bastion for connectivity to the Private OKE.
# If using OCI CloudShell, you need to activate the OCI Private Endpoint for OCI CloudShell.

resource "oci_resourcemanager_private_endpoint" "private_kubernetes_endpoint" { .. }

# Resolves the private IP of the customer's private endpoint to a NAT IP.
data "oci_resourcemanager_private_endpoint_reachable_ip" "private_kubernetes_endpoint" {
  private_endpoint_id = var.create_new_oke_cluster ? oci_resourcemanager_private_endpoint.private_kubernetes_endpoint[0].id : var.existent_oke_cluster_private_endpoint
  private_ip          = trimsuffix(oci_containerengine_cluster.oke_cluster[0].endpoints.0.private_endpoint, ":6443") # TODO: Pending rule when has existent cluster
  count               = (var.cluster_endpoint_visibility == "Private") ? 1 : 0
}
```
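Since the reachable-IP data source only works when Terraform runs inside OCI Resource Manager, its result would typically be substituted into the provider's `host`, roughly like this sketch (the `count` indexing and attribute name `ip_address` are assumptions based on the data source above):

```hcl
# Sketch: feed the NAT IP resolved by Resource Manager into the kubernetes
# provider. Outside Resource Manager this data source has count = 0, so
# there is no IP to substitute and the provider is left without a server.
provider "kubernetes" {
  host = format(
    "https://%s:6443",
    data.oci_resourcemanager_private_endpoint_reachable_ip.private_kubernetes_endpoint[0].ip_address,
  )
}
```

This would explain why local runs fail at `terraform plan`: with `cluster_endpoint_visibility == "Private"` but no Resource Manager private endpoint, the provider's `host` ends up unset.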
Possible Causes

1. Misalignment between the provider's expected configuration and the way OCI OKE private endpoints are handled.
2. Lack of documentation or automation for scenarios involving private endpoints with Terraform outside OCI Resource Manager.
Request
1. Could you confirm whether the current terraform-oci-oke-quickstart supports private endpoints for the Kubernetes provider when using local Terraform?
2. If supported, could you provide guidance on properly configuring Terraform with a private endpoint, considering the above comments?
3. If this is a bug, could you suggest a workaround or plan for resolution?
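One possible workaround for local Terraform, consistent with the Bastion note above, is to open an SSH tunnel to the private API endpoint (e.g. `ssh -N -L 6443:<private-endpoint-ip>:6443 ...` through an OCI Bastion session) and point the provider at the tunnel. A hedged sketch; the variable names are placeholders and `insecure = true` is needed because `127.0.0.1` is not in the API server certificate's SANs:

```hcl
# Workaround sketch: reach the private OKE API server through a local SSH
# tunnel opened via an OCI Bastion, then target the tunnel endpoint.
provider "kubernetes" {
  host     = "https://127.0.0.1:6443"
  insecure = true # the server cert does not list 127.0.0.1; acceptable only for a trusted tunnel

  exec {
    api_version = "client.authentication.k8s.io/v1beta1"
    command     = "oci"
    args        = ["ce", "cluster", "generate-token", "--cluster-id", var.oke_cluster_id, "--region", var.region]
  }
}
```

This keeps `terraform plan` working locally without Resource Manager, at the cost of managing the tunnel out of band.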