
[BUG] Importing ovh_vrack forces replacement #734

Open
maxime1907 opened this issue Sep 30, 2024 · 7 comments

@maxime1907

maxime1907 commented Sep 30, 2024

Describe the bug

We are trying to import an existing vRack into Terraform. We have the correct vRack ID and the import succeeds; however, running terraform plan shows that the vRack and everything inside it must be replaced, because the plan and ovh_subsidiary attributes are added. I assume this is not intended behaviour.

Terraform Version

v1.9.6

OVH Terraform Provider Version

v0.50.0

Affected Resource(s)


  • ovh_vrack

Terraform Configuration Files

data "ovh_me" "account" {}

data "ovh_order_cart" "cart" {
  ovh_subsidiary = data.ovh_me.account.ovh_subsidiary
}

data "ovh_order_cart_product_plan" "vrack" {
  cart_id        = data.ovh_order_cart.cart.id
  price_capacity = "renew"
  product        = "vrack"
  plan_code      = "vrack"
}

resource "ovh_vrack" "vrack" {
  ovh_subsidiary = data.ovh_order_cart.cart.ovh_subsidiary
  name           = var.vrack_name
  description    = var.vrack_description

  plan {
    duration     = data.ovh_order_cart_product_plan.vrack.selected_price.0.duration
    plan_code    = data.ovh_order_cart_product_plan.vrack.plan_code
    pricing_mode = data.ovh_order_cart_product_plan.vrack.selected_price.0.pricing_mode
  }
}

Expected Behavior

VRack is imported and next terraform plan only updates its fields.

Actual Behavior

After importing the resource it is forced to be replaced.

Steps to Reproduce

  1. terraform import 'module.ovh_managed_kubernetes.ovh_vrack.vrack' 'pn-XXXXXX'
  2. terraform plan -out=terraform.plan

Temporary workaround

  lifecycle {
    ignore_changes = [
      ovh_subsidiary,
    ]
  }
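For completeness, a sketch of where this workaround goes: the lifecycle block sits inside the ovh_vrack resource from the configuration above (attribute values unchanged), telling Terraform to ignore the server-side ovh_subsidiary value so the import no longer forces a replacement:

```hcl
resource "ovh_vrack" "vrack" {
  ovh_subsidiary = data.ovh_order_cart.cart.ovh_subsidiary
  name           = var.vrack_name
  description    = var.vrack_description

  plan {
    duration     = data.ovh_order_cart_product_plan.vrack.selected_price.0.duration
    plan_code    = data.ovh_order_cart_product_plan.vrack.plan_code
    pricing_mode = data.ovh_order_cart_product_plan.vrack.selected_price.0.pricing_mode
  }

  # Workaround: ignore drift on ovh_subsidiary after import
  lifecycle {
    ignore_changes = [
      ovh_subsidiary,
    ]
  }
}
```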


@amstuta
Contributor

amstuta commented Oct 2, 2024

Hello @maxime1907, thanks for opening this issue.

I took your exact configuration and I'm not able to reproduce the issue (ovh_subsidiary is fetched from the same API call when reading a vrack resource and when using datasource ovh_me, so it should not be different when importing the resource).

What are the values of ovh_subsidiary before and after the import? Additionally, it would help us debug if you could provide the debug log of the import, using TF_LOG=debug terraform import ....

@maxime1907
Author

maxime1907 commented Oct 2, 2024

Hello, thanks for your answer.

Here is the debug output log of this command:

TF_LOG=debug terraform import 'module.ovh_managed_kubernetes.ovh_vrack.vrack' 'pn-REDACTED' > /tmp/output.log 2>&1

output_redacted.log

Here is the output of terraform plan after the import:

  # module.ovh_managed_kubernetes.ovh_vrack.vrack must be replaced
-/+ resource "ovh_vrack" "vrack" {
      ~ id             = "pn-REDACTED" -> (known after apply)
        name           = "REDACTED"
      ~ ovh_subsidiary = "FR" -> (known after apply) # forces replacement
      ~ service_name   = "pn-REDACTED" -> (known after apply)
      ~ urn            = "urn:v1:eu:resource:vrack:pn-REDACTED" -> (known after apply)
        # (1 unchanged attribute hidden)

      ~ order (known after apply)

      ~ plan {
          ~ duration     = "P1M" -> (known after apply)
          ~ pricing_mode = "default" -> (known after apply)
            # (2 unchanged attributes hidden)
        }
    }

@amstuta
Contributor

amstuta commented Oct 3, 2024

I just re-tested two things:

  1. A "classic" import

I used exactly the config you provided and used a name and description that are different from the ones of the already created vRack:

data "ovh_me" "account" {}

data "ovh_order_cart" "cart" {
  ovh_subsidiary = data.ovh_me.account.ovh_subsidiary
}

data "ovh_order_cart_product_plan" "vrack" {
  cart_id        = data.ovh_order_cart.cart.id
  price_capacity = "renew"
  product        = "vrack"
  plan_code      = "vrack"
}

resource "ovh_vrack" "vrack" {
  ovh_subsidiary = data.ovh_order_cart.cart.ovh_subsidiary
  name           = "new name"
  description    = "new description"

  plan {
    duration     = data.ovh_order_cart_product_plan.vrack.selected_price.0.duration
    plan_code    = data.ovh_order_cart_product_plan.vrack.plan_code
    pricing_mode = data.ovh_order_cart_product_plan.vrack.selected_price.0.pricing_mode
  }
}

Then I imported the resource: terraform import ovh_vrack.vrack <vRackID>
And ran a terraform plan that just tries to update the resource in-place:

  ~ update in-place

Terraform will perform the following actions:

  # ovh_vrack.vrack will be updated in-place
  ~ resource "ovh_vrack" "vrack" {
      ~ description = "new description"
        id          = "vRackID"
      ~ name        = "old name" -> "new name"
        # (3 unchanged attributes hidden)

        # (1 unchanged block hidden)
    }

Plan: 0 to add, 1 to change, 0 to destroy.
  2. Import with config file generation

I also tried an import with the import block like the following:

import {
  to = ovh_vrack.vrack
  id = "vRackID"
}

And ran terraform plan -generate-config-out=vrack.tf, which generated the following content in vrack.tf:

resource "ovh_vrack" "vrack" {
  description    = null
  name           = "old name"
  ovh_subsidiary = "FR"
  plan {
    catalog_name = null
    duration     = "P1M"
    plan_code    = "vrack"
    pricing_mode = "default"
  }
}

The next terraform apply imported the resource into the state, and subsequent applies don't detect any change.
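The flow described above can be summarized as commands (a sketch; the resource address and vRack ID are placeholders you would replace with your own):

```hcl
# In a .tf file: declare the import target
import {
  to = ovh_vrack.vrack
  id = "pn-XXXXXX" # placeholder vRack ID
}

# Then, on the command line:
#   terraform plan -generate-config-out=vrack.tf   # generates a matching resource block
#   terraform apply                                # performs the import
#   terraform plan                                 # should report no changes
```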

I think there is another issue in your configuration that triggers this replacement, or maybe a version mismatch (are you sure you're using v0.50.0? Maybe you need to re-run terraform init?).

@maxime1907
Author

maxime1907 commented Oct 3, 2024

Maybe it's because I am trying to import the vRack that gets automatically created when you create a Public Cloud project through the OVH Manager interface.

Yes, I am using the latest version, v0.50.0, and re-ran terraform init without success.

What I did:

  1. Tried to order a new cloud project with Terraform, but we hit the SEPA direct debit bug with old accounts
  2. Manually created the cloud project through the OVH Manager interface, which automatically creates a vRack
  3. Imported the cloud project with its order ID, which worked
  4. Finally, tried to import the vRack with its unique ID, which triggers a replacement on ovh_subsidiary = "FR"

Note that I also have an attach resource that wants to be replaced; I don't know if that helps:

  # module.ovh_managed_kubernetes.ovh_vrack_cloudproject.attach must be replaced
-/+ resource "ovh_vrack_cloudproject" "attach" {
      ~ id           = "vrack_pn-REDACTED-cloudproject_REDACTED" -> (known after apply)
      ~ service_name = "pn-REDACTED" -> (known after apply) # forces replacement
        # (1 unchanged attribute hidden)
    }

Also, my OVH API keys are restricted to a specific list of routes as a security measure, so it might have something to do with it:

GET:/me*
POST:/me*
PUT:/me*
DELETE:/me*

GET:/order/cart*
POST:/order/cart*
PUT:/order/cart*
DELETE:/order/cart*

GET:/vrack*
POST:/vrack*
PUT:/vrack*
DELETE:/vrack*

GET:/services*
POST:/services*
PUT:/services*
DELETE:/services*

GET:/cloud/project*
POST:/cloud/project*
PUT:/cloud/project*
DELETE:/cloud/project*

@amstuta
Contributor

amstuta commented Oct 8, 2024

I cannot reproduce the issue, even with a vRack automatically created via a cloud project. I'm not sure how to dig further; I think you should try the import in a separate Terraform workspace to verify that the issue doesn't come from another resource in your plan. Using only the config from my previous comment, I never get a replacement.

@GeryDeMerciYanis

GeryDeMerciYanis commented Oct 11, 2024

Hello,

I'm encountering an issue where Terraform is prompting me to recreate a vRack, even though it was successfully applied about 4 weeks ago.

Details:
• The VRack already exists on the OVH platform.
• It is also correctly reflected in my Terraform state file (tfstate).

  # ovh_vrack.vrack will be created
  + resource "ovh_vrack" "vrack" {
      + description    = "VPC for staging environment"
      + id             = (known after apply)
      + name           = "xxxxxx"
      + ovh_subsidiary = "FR"
      + service_name   = (known after apply)
      + urn            = (known after apply)

      + plan {
          + duration     = "P1M"
          + plan_code    = "vrack"
          + pricing_mode = "default"
        }
    }

  # ovh_vrack_cloudproject.vcp must be replaced
-/+ resource "ovh_vrack_cloudproject" "vcp" {
      ~ id           = "vrack_pn-xxx-cloudproject_xxxx" -> (known after apply)
      ~ service_name = "pn-xxx" -> (known after apply) # forces replacement
        # (1 unchanged attribute hidden)
    }


OVH provider version: 0.48.0

@amstuta
Contributor

amstuta commented Oct 11, 2024

Hello @GeryDeMerciYanis, from the output you sent it seems that the resource ovh_vrack.vrack is not present in your state; otherwise Terraform would not plan to create it.
Could you provide us the content of your state file with sensitive values redacted?
