Using a Makefile with Terraform and a split project layout


I have a Terraform project layout similar to:

stage
  └ makefile
  └ terraform.tfvars
  └ vpc
  └ services
      └ frontend-app
      └ backend-app
          └ vars.tf
          └ outputs.tf
          └ main.tf
  └ data-storage
      └ mysql
      └ redis

where the contents of the makefile are similar to:

.PHONY: plan apply destroy

all: plan

plan:
    terraform plan -var-file terraform.tfvars -out terraform.tfplan

apply:
    terraform apply -var-file terraform.tfvars

destroy:
    terraform plan -destroy -var-file terraform.tfvars -out terraform.tfplan
    terraform apply terraform.tfplan

As far as I understand it, Terraform runs on the templates in the current directory, so I need to cd into stage/services/backend-app and run terraform apply there.

However, I would like to be able to manage the whole stack from a single makefile, and I have not seen a clean way to pass arguments to make.

My goal is to have targets such as:

make s3 plan  # verify syntax
make s3 apply # apply plan

Unless there is a better way to run Terraform from a parent directory? Is there something similar to:

make plan  # create stage plan
make apply # apply stage plan
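One way to approach this (a sketch, not from the original post) is a small shell helper that runs a terraform subcommand inside a component folder, so make can stay at the repository root. The helper name `tf_in` and the `DRY_RUN` switch are assumptions for illustration; terraform itself must be installed for the non-dry-run path:

```shell
#!/bin/bash
# Sketch only: run a terraform subcommand inside a component folder so that
# make targets can be driven from the repository root.
# tf_in and DRY_RUN are hypothetical names, not from the original post.
tf_in() {
    local component=$1
    shift
    if [ ! -d "$component" ]; then
        echo "error: no such component folder: $component" >&2
        return 1
    fi
    if [ "${DRY_RUN:-0}" = "1" ]; then
        # Print what would run, for inspection without terraform installed.
        echo "would run in $component: terraform $*"
    else
        # Subshell so the caller's working directory is untouched.
        ( cd "$component" && terraform "$@" )
    fi
}
```

A root makefile target could then call this from a wrapper script, with the component supplied as a make variable, e.g. `make plan COMP=stage/services/backend-app`.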

Another solution is to create a tmp folder on each run and use terraform init ... and terraform get ... (the example shows remote state management using partial configuration):

readonly orig_path=$(pwd) && \
mkdir tmp && \
cd tmp && \
terraform init -backend=true -backend-config="$tf_backend_config" \
  -backend-config="key=${account}/${envir}/${project}.json" $project_path && \
terraform get $project_path && \
terraform apply && \
cd $orig_path && \
rm -fr tmp

Or maybe wrap the above in a shell script and call it from the makefile under the "apply" target, etc.

-- Adding a section to address the comment/question from Sam Hammamy --

In general, given the way current versions of Terraform process projects, we want to think ahead of time about how to structure our projects and how to break them down into manageable but still functional pieces. That is why we split them into "foundational" projects (vpc, vpn, security-groups, iam-policies, bastions, etc.) vs. "functional" ones ("db", "web-cluster", etc.). We run/deploy/modify the "foundational" pieces once or only occasionally, while the "functional" pieces might be re-deployed several times a day.

This means that fragmenting our IaC code this way ends up fragmenting our remote state accordingly, and the execution of our project deployments as well.

Our project structure reflects this "philosophy", so we end up with a layout similar to the following (common modules not shown):

├── projects
│   └── application-name
│       ├── dev
│       │   ├── bastion
│       │   ├── db
│       │   ├── vpc
│       │   └── web-cluster
│       ├── prod
│       │   ├── bastion
│       │   ├── db
│       │   ├── vpc
│       │   └── web-cluster
│       └── backend.config
└── run-tf.sh

where each project is a subfolder, and to each application_name/env/component folder (e.g. dev/vpc) we added a placeholder backend configuration file, backend.tf:

terraform {
    backend "s3" {
    }
}

The folder content for each component contains files similar to:

│       ├── prod
│       │   ├── vpc
│       │   │   ├── backend.tf
│       │   │   ├── main.tf
│       │   │   ├── outputs.tf
│       │   │   └── variables.tf

at "application_name/" or "application_name/env" level added backend.config file, content:

bucket     = "bucket_name" region     = "region_name" lock       = true lock_table = "lock_table_name" encrypt    = true 

Our wrapper shell script expects as parameters the application name, environment, component, and the actual terraform command to run.

The content of the run-tf.sh script (simplified):

#!/bin/bash

application=$1
envir=$2
component=$3
cmd=$4

tf_backend_config="root_path/$application/$envir/$component/backend.config"

terraform init -backend=true -backend-config="$tf_backend_config" \
  -backend-config="key=tfstate/${application}/${envir}/${component}.json"

terraform $cmd

Here is how a typical run-tf.sh invocation looks (to be executed from the makefile):

$ run-tf.sh application_name dev vpc plan
$ run-tf.sh application_name prod bastion apply
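Tying this back to the original make-driven goal, a makefile at the repository root can delegate to the wrapper. A minimal sketch, assuming GNU make; the variable names APP, ENV, and COMP are illustrative assumptions, not from the original post:

```
APP  ?= application_name
ENV  ?= dev
COMP ?= vpc

.PHONY: plan apply destroy

plan apply destroy:
	./run-tf.sh $(APP) $(ENV) $(COMP) $@
```

Invoked as, for example, `make plan ENV=prod COMP=bastion`; since `$@` expands to the target name, the same rule serves all three terraform commands.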
