
Storing your Terraform state file locally is a bad idea.

Every time you run terraform apply, Terraform writes the current state of your infrastructure to a file called terraform.tfstate. If this file is already present, the old state is moved to a file called terraform.tfstate.backup and a new terraform.tfstate is created. This presents three problems.

  • When stored locally, the state files can be lost or deleted.
  • Only the last two versions are available.
  • Working in a team is difficult or impossible.
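To make the first two problems concrete, this is roughly what a working directory looks like after a couple of applies; only the current state and a single backup survive locally (a sketch, the file names other than the state files are hypothetical):

[markb@feddy demo.2] $ ls
main.tf  terraform.tfstate  terraform.tfstate.backup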

One solution is to use Amazon Web Services S3 to store the state files. S3 supports versioning, which means you can version control the state of your infrastructure.
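As a rough sketch of what versioning buys you, once the state file lives in a versioned bucket every saved revision can be listed (and restored) with the AWS CLI. The bucket and key names here are the ones created later in this post:

aws s3api list-object-versions --bucket terraformtraining-7538 --prefix terraform-state/project5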

I will show you how to set this up. The first thing we need to do is install the AWS CLI tools.

sudo yum install python-pip -y
pip install --user awscli
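To confirm the CLI is installed and on your PATH before continuing, check its version (the exact version string will differ):

aws --version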

Then run aws configure and enter your credentials:

[markb@feddy demo.2] $ aws configure
AWS Access Key ID [None]: ENTER-YOUR-ACCESS-KEY-HERE
AWS Secret Access Key [None]: ENTER-YOUR-SECRET-KEY-HERE
Default region name [None]: us-west-2
Default output format [None]: 
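If you want to confirm the credentials actually work, you can ask AWS which account and user they belong to:

aws sts get-caller-identity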

The next step is to create an Amazon Web Services S3 bucket to store your state files. You can do this via the GUI; here I have used Terraform. The bucket name is terraformtraining-7538, and I have enabled versioning so you can roll back to any previous state. I have also protected the bucket against accidental deletion.

resource "aws_s3_bucket" "tfstate" {
bucket = "terraformtraining-7538"
acl    = "private"

  versioning {
    enabled = true
  }

  lifecycle {
    prevent_destroy = true
  }
}
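The resource above assumes an AWS provider is already configured in the project. If it is not, a minimal provider block along these lines (region matching the aws configure step) is needed before you can terraform apply the bucket:

provider "aws" {
  region = "us-west-2"
}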

All that is left to do now is to create a new file called backend.tf, store it with your new project, and run terraform init. If you already have a local state file, terraform init will offer to copy it into the new S3 backend (you can also push it manually with terraform state push).

terraform {
  backend "s3" {
    bucket = "terraformtraining-7538"
    key    = "terraform-state/project5"
    region = "us-west-2"
  }
}
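With backend.tf saved, initialising the project points Terraform at the bucket. A sketch of the two routes for getting existing local state into S3:

# initialise the S3 backend; Terraform will offer to migrate any existing local state
terraform init

# or push a local state file by hand after the backend is initialised
terraform state push terraform.tfstate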