Hello! I'm using the gitops-bridge to bootstrap my EKS clusters with Velero and external-dns, and I noticed some syncs are failing because the output from the bootstrap-addons doesn't include certain annotations. This is also using a fork of the example apprepo.
For example (external-dns appset):

```yaml
provider: aws
serviceAccount:
  name: {{.metadata.annotations.external_dns_service_account}}
  annotations:
    eks.amazonaws.com/role-arn: '{{.metadata.annotations.external_dns_iam_role_arn}}'
domainFilters: {{.metadata.annotations.external_dns_domain_filters}}
txtOwnerId: {{.metadata.annotations.aws_cluster_name}}
policy: {{default "upsert-only" .metadata.annotations.external_dns_policy}}
```
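As I understand it, these values come from annotations the bridge writes onto the Argo CD cluster secret, so for the template above to render I'd expect annotations roughly like the following on that secret (values are made up for illustration):

```yaml
metadata:
  annotations:
    aws_cluster_name: my-cluster
    external_dns_service_account: external-dns
    external_dns_iam_role_arn: arn:aws:iam::111111111111:role/external-dns  # illustrative ARN
    external_dns_domain_filters: example.com  # this is the key that never shows up for me
```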
domainFilters is passed into the blueprint module from our terraform configuration, but doesn't make it into the metadata output consumed by the bridge, so the template renders an empty value that the chart can't iterate over, and I receive the following error:

```
Failed to load target state: failed to generate manifest for source 2 of 2: rpc error: code = Unknown desc =
helm template . --name-template external-dns --namespace external-dns --kube-version 1.31 --values /tmp/356309be-fd1c-48ea-b0ea-33396de4e076 <api versions removed> --include-crds
failed exit status 1: Error: template: external-dns/templates/deployment.yaml:111:29: executing "external-dns/templates/deployment.yaml" at <.Values.domainFilters>: range can't iterate over
Use --debug flag to render out invalid YAML
```
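For context, the failing spot in the chart is just a range over that value, so an empty or non-list value is enough to break rendering. Paraphrasing from the error rather than quoting the exact chart source, deployment.yaml is doing something along these lines:

```yaml
# paraphrased from the error message, not the exact chart source
args:
  {{- range .Values.domainFilters }}
  - --domain-filter={{ . }}
  {{- end }}
```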
Checking out the blueprint module repo, I wonder if this might be related (in output.tf)? It looks like only a fixed set of keys gets mapped into the metadata, so anything beyond the IAM role, namespace, and service account never reaches the cluster secret. I've also noticed a similar issue with Velero, where the IAM roles make it in but the S3 resources are omitted.

```hcl
{ for k, v in {
    iam_role_arn    = module.external_secrets.iam_role_arn
    namespace       = local.external_secrets_namespace
    service_account = local.external_secrets_service_account
  } : "external_secrets_${k}" => v if var.enable_external_secrets },
```
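If that is the root cause, I'd guess the fix is just adding the missing keys to the corresponding maps. Purely as a sketch (the variable and local names here are my guesses, not necessarily what the module actually uses), something like:

```hcl
# Sketch only -- names are assumed, and annotation values are plain strings,
# so list values need to be encoded (e.g. comma-joined) on the way out.
{ for k, v in {
    iam_role_arn    = module.external_dns.iam_role_arn
    namespace       = local.external_dns_namespace
    service_account = local.external_dns_service_account
    domain_filters  = join(",", var.external_dns_domain_filters)
  } : "external_dns_${k}" => v if var.enable_external_dns },
```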
Please pardon me if I'm creating an issue on the wrong repo; it seems like a bit of a cross-cutting issue 😛