By: Saikrishna Madupu – Sr. DevOps Engineer
This blog describes how to import a large amount of existing AWS infrastructure into Terraform. The process is relevant to any organization that built its AWS infrastructure manually and now wants to manage it as Terraform code for improved automation and cost savings. It is a substantial undertaking, but one with a number of benefits.
- First, a brief description of TerraCognita. TerraCognita is a tool that imports existing infrastructure from the AWS, Google Cloud, AzureRM, and VMware vSphere providers as Terraform (v1.1.9) resources/state.
While researching, I found that Terraform's native import only allows importing a single resource at a time. There is also no way to import resources from several accounts at once, so if you want to bring your organization's cloud infrastructure under Terraform, you have to work through one account at a time.
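For comparison, a single native import looks like the following; the resource block must be written by hand first, and the bucket name here is only a hypothetical example:
resource "aws_s3_bucket" "legacy" {
  # attributes are filled in by hand after inspecting the imported state
}
terraform import aws_s3_bucket.legacy my-manually-created-bucket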
TerraCognita, by contrast, can import a large set of resources in a single run, along with the variables they use. Using its filter flags you can limit the import to particular resource types, and the same filtering restricts what is generated into the variables.tf file.
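As an illustration, a filtered run might look like this (the output folder and resource types are placeholders; the -i/--include flag is the same one used in the command later in this post, and terracognita aws --help lists the full set of filter options):
# Import only S3 buckets and IAM roles; everything else is skipped
terracognita aws --hcl generated --tfstate terraform.tfstate --aws-default-region us-east-1 -i aws_s3_bucket -i aws_iam_role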
Installation:
Go Libraries:
You can build and install from the latest sources. TerraCognita uses Go Modules, so Go 1.17+ is required.
- git clone https://github.com/cycloidio/terracognita
- cd terracognita
- make install
Linux:
curl -L https://github.com/cycloidio/terracognita/releases/latest/download/terracognita-linux-amd64.tar.gz -o terracognita-linux-amd64.tar.gz
tar -xf terracognita-linux-amd64.tar.gz
chmod u+x terracognita-linux-amd64
sudo mv terracognita-linux-amd64 /usr/local/bin/terracognita
macOS:
brew install terracognita
Prerequisites:
- A cloud account
- The CLI for your cloud provider installed and configured (the AWS CLI in this example)
- Terraform installed; a quick way to verify this setup is shown after this list
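A minimal sanity check of this setup might look like the following (assuming the AWS CLI is configured with credentials for the target account; the commands only verify credentials and tool availability):
aws sts get-caller-identity   # shows which account and role your credentials resolve to
terraform -version            # confirms Terraform is on the PATH
terracognita aws --help       # confirms TerraCognita is installed and lists its AWS flags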
Use Cases:
Sample CLI command to import all S3 buckets from AWS:
- Importing an S3 bucket that was created manually and managing its configuration with Terraform IaC using TerraCognita:
terracognita aws --hcl s3 --tfstate terraform.tfstate --aws-default-region us-east-1 -i aws_s3_bucket
Output:
terracognita-s3 % terracognita aws --hcl s3 --tfstate terraform.tfstate --aws-default-region us-east-1 -i aws_s3_bucket
We are about to remove all content from "s3", are you sure? Yes/No (Y/N):
y
Starting Terracognita with version v0.8.1
Importing with filters:
Tags: [],
Include: [aws_s3_bucket],
Exclude: [],
Targets: [],
Importing aws_s3_bucket [6/6] Done!
Writing HCL Done!
Writing TFState Done!
saikrishnamadupu@Administrators-MacBook-Pro terracognita-s3 % ls -ltr
total 32
-rw-r--r-- 1 saikrishnamadupu staff 13164 Feb 12 00:07 terraform.tfstate
drwx------ 4 saikrishnamadupu staff 128 Feb 12 00:07 s3
TerraCognita writes all of the bucket definitions into a folder called s3, which contains an s3_storage.tf file with the full list of buckets and their configuration, and records them in terraform.tfstate.
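The generated HCL looks roughly like the following (an illustrative sketch with a hypothetical bucket name; the exact attributes depend on how each bucket is configured and on the provider version):
resource "aws_s3_bucket" "my_example_bucket" {
  bucket = "my-example-bucket"   # hypothetical bucket name
  tags = {
    Environment = "dev"
  }
}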
Post-import manual work:
- TerraCognita does not generate a versions.tf file on its own; if you need one, copy the version information from the automatically generated module file into a versions.tf file yourself (see the sketch after this list).
- TerraCognita also does not produce any outputs.tf files; you are responsible for creating them if they are necessary.
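A minimal versions.tf and outputs.tf could look like the following sketch, which pins the Terraform and AWS provider versions mentioned in this post and references a hypothetical bucket resource:
# versions.tf
terraform {
  required_version = ">= 1.1.9"
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.9.0"
    }
  }
}
# outputs.tf
output "bucket_arn" {
  value = aws_s3_bucket.my_example_bucket.arn   # hypothetical resource name
}
With these files in place, running terraform init and terraform plan inside the generated folder should report no changes if the import captured everything correctly.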
Supported providers:
- AWS: v4.9.0
- AzureRM: v3.6.0
- Google: v4.9.0
- vSphere: v2.2.0
Ref: Terracognita (https://github.com/cycloidio/terracognita)
About the Author
Saikrishna Madupu, Sr. DevOps Engineer. Sai is an IT professional with experience in DevOps automation, configuration management tools, container workloads, and orchestration of those workloads via Kubernetes. He is a self-starter and passionate problem-solver with a flair for innovative design and a drive to automate wherever possible. He is an experienced Linux, cloud, and data center operations and infrastructure engineer. He has worked as a DevOps cloud consultant, helping clients migrate on-prem applications to the cloud, and holds certifications for AWS, Terraform, GitLab, and other technologies.