We’re using Terraform to manage our AWS infrastructure, and the state itself also lives in AWS. We have two separate accounts for test and prod, and each has an S3 bucket holding the state files for that account.
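
For context, each project points at that bucket with a standard S3 backend block, roughly like this (all the names here are placeholders, not our real ones):

```hcl
terraform {
  backend "s3" {
    bucket         = "example-prod-terraform-state" # per-account state bucket (placeholder)
    key            = "network/terraform.tfstate"    # one key per project (placeholder)
    region         = "eu-west-1"                    # the single region I'm worried about (placeholder)
    dynamodb_table = "terraform-locks"              # state locking table (placeholder)
    encrypt        = true
  }
}
```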

We’re not setting up alternate regions for disaster recovery, and it’s got me wondering: if the region the Terraform state bucket is in goes down, we won’t be able to deploy anything with Terraform.

So what’s the best practice here? Should we have a bucket in every region holding the state files for the projects in that region? But then that doesn’t work for multi-region deployments.

  • @nomecks
    3 · 1 year ago

    Using TF Cloud or TF Enterprise is best practice. They keep all the states secure in one place.
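
    In the root module it’s just a cloud block instead of an S3 backend, something like this (org and workspace names are made up):

    ```hcl
    terraform {
      cloud {
        organization = "example-org"     # placeholder organization
        workspaces {
          name = "prod-infrastructure"   # placeholder workspace
        }
      }
    }
    ```

    After that, terraform init offers to migrate your existing state over.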

    • dbx12
      1 · 1 year ago

      That’s just moving the problem somewhere else, isn’t it? Unless TF Cloud keeps multi-region backups of the state files.

  • z3r0
    2 · edited 1 year ago

    If an entire region goes down, the Terraform state file stored there won’t be much use anyway: it only describes the resources you deployed in that particular region, and those resources will be down too.

    Replicating the state file to another region won’t help either, because it would still only describe the resources that are down in the original region.

    The state file is an inventory of all the resources you have deployed to your cloud provider. Terraform uses it to know which resources are managed by the current Terraform code and to stay idempotent.
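
    You can see that inventory directly: terraform state list prints the address of everything the state file tracks, something like this (the addresses below are just examples):

    ```
    $ terraform state list
    aws_vpc.main
    aws_subnet.private[0]
    aws_instance.web
    ```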

    If you want to set up another region for disaster recovery (active-passive), you can use the same Terraform code with a different configuration (i.e. different tfvars files) to deploy the resources to another region (not necessarily another account). Just make sure all your data is replicated into the passive region.
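
    A minimal sketch of what I mean, assuming a single aws_region variable and made-up region names:

    ```hcl
    # variables.tf: the target region is a variable instead of being hard-coded
    variable "aws_region" {
      type = string
    }

    # providers.tf: every resource in the module follows this region
    provider "aws" {
      region = var.aws_region
    }

    # prod-primary.tfvars (example values):
    #   aws_region = "eu-west-1"
    #
    # prod-dr.tfvars:
    #   aws_region = "eu-central-1"
    ```

    Deploying the passive region is then terraform apply -var-file=prod-dr.tfvars, pointed at its own state (a separate workspace or backend key) so the two regions don’t clobber each other’s state.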