HumanGov: Deployment of a Reusable SaaS Multi-Tenant AWS Infrastructure Using Terraform Modules, Securely Storing Terraform Configuration Files on AWS CodeCommit

Akram Rayri
3 min read · Aug 15, 2023


In this project based on a real-world scenario, I designed and deployed a reusable, multi-tenant SaaS infrastructure on AWS using Terraform modules.

The infrastructure was built on AWS services such as EC2 instances, DynamoDB tables, and S3 buckets. To store the Terraform configuration files securely and keep the workflow under version control, I used AWS CodeCommit as the remote repository.

As presented in the architecture above, the HumanGov project consists of creating an application for the public education sector across all 50 states of the USA. Hence, as a DevOps engineer, I had to create the same set of resources 50 times, once per state. To do so, I leveraged the power of Terraform (Infrastructure as Code) to provision the infrastructure automatically and simultaneously within a few hours.
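The per-state provisioning described above can be sketched with a module call repeated via `for_each`. This is an illustrative sketch only: the module path, variable names, and the abbreviated state list are assumptions, not the project's exact code.

```hcl
# Hypothetical root main.tf: instantiate the reusable module once per state.
variable "states" {
  description = "States to provision HumanGov infrastructure for"
  type        = list(string)
  default     = ["california", "florida", "texas"] # abbreviated; the real list has 50
}

module "aws_humangov_infrastructure" {
  source     = "./modules/aws_humangov_infrastructure" # assumed module path
  for_each   = toset(var.states)
  state_name = each.value
}
```

With `for_each`, adding a new state is a one-line change to the list rather than a copy-pasted block of resources.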

First, I created the root directory where the *.tf files that define the infrastructure would live. Then, under the modules directory, I created the main.tf, variables.tf, and outputs.tf files that contain the Infrastructure as Code scripts. While writing these scripts, I used several Terraform concepts: the “random” and “aws” providers; resources to create the EC2 instances, S3 buckets, and DynamoDB tables that store NoSQL data; and variables to keep the code efficient and dynamic. Provisioners can be used to bootstrap or configure the EC2 instances repeatedly with only a few lines of code. I also leveraged outputs so that, after the infrastructure is created, we can verify that the intended resources were provisioned for each state with the appropriate tags and names.
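A minimal sketch of what the module's main.tf and outputs.tf could look like, assuming a `state_name` variable is declared in variables.tf; the AMI ID, resource names, and tags are placeholders, not the project's actual values.

```hcl
# modules/aws_humangov_infrastructure/main.tf (illustrative sketch)

# EC2 instance for the state's application server
resource "aws_instance" "state_ec2" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "humangov-${var.state_name}"
  }
}

# DynamoDB table to store the state's NoSQL data
resource "aws_dynamodb_table" "state_dynamodb" {
  name         = "humangov-${var.state_name}-dynamodb"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

# Random suffix so each state's bucket name is globally unique
resource "random_string" "bucket_suffix" {
  length  = 4
  special = false
  upper   = false
}

resource "aws_s3_bucket" "state_s3" {
  bucket = "humangov-${var.state_name}-s3-${random_string.bucket_suffix.result}"
}

# modules/aws_humangov_infrastructure/outputs.tf (illustrative sketch)
output "state_ec2_public_dns" {
  value = aws_instance.state_ec2.public_dns
}
```

The outputs surface each instance's DNS name after `terraform apply`, which is how the per-state resources can be verified at a glance.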

Second, after creating the infrastructure, Terraform automatically generates the terraform.tfstate file, which is crucial to preserve in case an unexpected incident causes the loss of the project's resources. It is therefore important to store the Terraform state file remotely, not only to guard against file loss but also because it may contain sensitive information such as passwords and authentication keys. For that reason, I created a backend.tf file to automatically store the Terraform state in an S3 bucket protected by ACLs, along with a dedicated DynamoDB table to hold the state lock, which records metadata about who is making changes to the infrastructure at any given moment.
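The remote backend setup can be sketched as below; the bucket name, key, region, and lock-table name are placeholders standing in for the project's actual values.

```hcl
# backend.tf (illustrative sketch): store state in S3 with DynamoDB locking
terraform {
  backend "s3" {
    bucket         = "humangov-terraform-state"            # pre-created, ACL-protected bucket
    key            = "terraform.tfstate"                   # object key for the state file
    region         = "us-east-1"                           # placeholder region
    dynamodb_table = "humangov-terraform-state-lock-table" # table holding the state lock
  }
}
```

After adding the backend block, `terraform init` migrates the local state to S3; subsequent `apply` runs acquire the lock in DynamoDB so two engineers cannot modify the infrastructure concurrently.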

Third, using a .gitignore file, I made sure that all the files except the sensitive ones were uploaded to the CodeCommit remote repository, which I had also pre-authenticated from AWS Cloud9.
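A typical .gitignore for this setup might look like the following; the exact entries are an assumption based on common Terraform practice, not the project's literal file.

```
# .gitignore (illustrative): keep state and secrets out of CodeCommit
.terraform/
*.tfstate
*.tfstate.*
*.tfvars
crash.log
```

State files and .tfvars are excluded because they can contain credentials and other sensitive values; the .terraform/ directory holds locally downloaded providers that should not be versioned.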

Written by Akram Rayri

ICT Consultant and Engineer with focus on Cloud & DevOps | AWS | Microsoft Azure | Google Cloud | Oracle Cloud