Task 1 — Hybrid Multi-Cloud
This task was given by Mr. Vimal Daga sir. In this task, I create multiple AWS resources such as EC2, EBS, S3, CloudFront, a private key pair, and a security group with the help of Terraform code.
Prerequisites to create the AWS services with Terraform code:
1. An AWS account.
2. A base OS to work from.
3. The AWS CLI configured in the OS (a minimal sketch is given after this list).
4. Terraform downloaded and configured.
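A minimal sketch of the setup, assuming the AWS CLI and Terraform are already installed and that the profile name (Hitesh) matches the one used in the provider block in Step 1:
aws configure --profile Hitesh    # enter the access key, secret key, and default region for this profile
aws sts get-caller-identity --profile Hitesh    # confirm the credentials work
terraform version    # confirm Terraform is on the PATH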
Now let's see how to write the Terraform code for these AWS services.
Step 1: Create a private key pair in a specific AWS account, as shown in the code below.
// Specific AWS account and region
provider "aws" {
  region  = "ap-south-1"
  profile = "Hitesh"
}
// Private key pair created
resource "tls_private_key" "task1_key" {
  algorithm = "RSA"
}
module "key_pair" {
  source     = "terraform-aws-modules/key-pair/aws"
  key_name   = "task1"
  public_key = tls_private_key.task1_key.public_key_openssh
}
Step 2: Create a security group with Terraform code.
// Creating a security group that allows HTTP and SSH in, and all traffic out
resource "aws_security_group" "task1_security_group" {
  name        = "task1_work"
  description = "Project webserver"
  vpc_id      = "vpc-48e1fd20"
  ingress {
    description = "HTTP from anywhere"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }
  ingress {
    description = "SSH from anywhere"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
  tags = {
    Name = "task1_security_group"
  }
}
Step 3: Create an AWS EC2 instance and install the required software (httpd, php, git) on it with Terraform code.
resource "aws_instance" "task1_ec2" {
  ami             = "ami-005956c5f0f757d37"
  instance_type   = "t2.micro"
  key_name        = module.key_pair.this_key_pair_key_name
  security_groups = [aws_security_group.task1_security_group.name]
  // SSH into the instance using the private key generated above
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = self.public_ip
  }
  // Install and enable the web server, PHP, and git
  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd php git -y",
      "sudo service httpd start",
      "sudo chkconfig httpd on",
    ]
  }
  tags = {
    Name = "task1_Os1"
  }
}
Step 4: Create an EBS volume with Terraform code.
resource "aws_ebs_volume" "task1_volume" {
  availability_zone = aws_instance.task1_ec2.availability_zone
  size              = 1
  tags = {
    Name = "task1-ebs_volume"
  }
}
Step 5: Attach the EBS volume to the EC2 instance, format it, mount it on /var/www/html, and then clone the code from GitHub into /var/www/html, all with Terraform code. The volume must be mounted before the GitHub clone; otherwise the cloned files would land on the root volume and be hidden once /var/www/html is mounted over them, effectively removing all data from the /var/www/html folder.
// Attach the EBS volume to the instance, format it, mount it, and pull the web content
resource "aws_volume_attachment" "task1_ebs_att" {
  depends_on   = [aws_ebs_volume.task1_volume, aws_instance.task1_ec2]
  device_name  = "/dev/sdh"
  volume_id    = aws_ebs_volume.task1_volume.id
  instance_id  = aws_instance.task1_ec2.id
  force_detach = true
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = aws_instance.task1_ec2.public_ip
  }
  provisioner "remote-exec" {
    inline = [
      // the device requested as /dev/sdh appears inside the instance as /dev/xvdh
      "sudo mkfs.ext4 /dev/xvdh",
      "sudo mount /dev/xvdh /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/hitkool38279/task1_hybdrid_multi_cloud.git /var/www/html/"
    ]
  }
}
Step 6: Create an S3 bucket with the help of Terraform code.
// Create an S3 bucket with public read access
resource "aws_s3_bucket" "task1cloudbucket1" {
  bucket = "task1bucket1"
  acl    = "public-read"
  tags = {
    Name = "my-bucket"
  }
}
Step 7: Upload the data (an image) to the S3 bucket with Terraform code.
// Upload the local image as an object in the S3 bucket
resource "aws_s3_bucket_object" "task1" {
  bucket = aws_s3_bucket.task1cloudbucket1.bucket
  key    = "a.png"
  source = "C:/Users/Devil/Desktop/terraform.jpg"
  acl    = "public-read"
}
Step 8: Create a CloudFront distribution and attach the S3 bucket as its origin to provide the Content Delivery Network (CDN) service, with the help of Terraform code.
// Create the AWS CloudFront distribution with the S3 bucket as origin
resource "aws_cloudfront_distribution" "s3_task_distribution" {
  origin {
    domain_name = aws_s3_bucket.task1cloudbucket1.bucket_regional_domain_name
    origin_id   = aws_s3_bucket.task1cloudbucket1.id
  }
  enabled         = true
  is_ipv6_enabled = true
  comment         = "mytaskcloudfront"
  default_cache_behavior {
    allowed_methods  = ["DELETE", "GET", "HEAD", "OPTIONS", "PATCH", "POST", "PUT"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = aws_s3_bucket.task1cloudbucket1.id
    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
    viewer_protocol_policy = "allow-all"
  }
  price_class = "PriceClass_200"
  restrictions {
    geo_restriction {
      restriction_type = "whitelist"
      locations        = ["US", "CA", "IN"]
    }
  }
  viewer_certificate {
    cloudfront_default_certificate = true
  }
Step 9: Now the site is updated automatically: the CloudFront DNS name that becomes available once the distribution is created is placed into the web page by the code. The connection and provisioner blocks below sit inside the same aws_cloudfront_distribution resource, so self.domain_name refers to the distribution's DNS name.
  // Append an <img> tag that points to the S3 object through CloudFront
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = aws_instance.task1_ec2.public_ip
  }
  provisioner "remote-exec" {
    inline = [
      "sudo su << EOF",
      "echo \"<img src='http://${self.domain_name}/${aws_s3_bucket_object.task1.key}'>\" >> /var/www/html/index.html",
      "EOF"
    ]
  }
}
Step 10: Now open the site automatically in Google Chrome with the help of Terraform code (a sketch is given below).
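A minimal sketch of this step, assuming a separate null_resource with a local-exec provisioner that launches Chrome on the local machine against the instance's public IP; the resource name open_in_chrome and the exact command are illustrative, not part of the original code, and Chrome must be on the local PATH:
// Hypothetical helper resource: open the deployed site in Chrome once everything is ready
resource "null_resource" "open_in_chrome" {
  depends_on = [aws_volume_attachment.task1_ebs_att, aws_cloudfront_distribution.s3_task_distribution]
  provisioner "local-exec" {
    // adjust the command for your OS if "chrome" is not directly on the PATH
    command = "chrome http://${aws_instance.task1_ec2.public_ip}/index.html"
  }
}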
Finally, the output of the code. With a single Terraform command you can create and destroy everything. First of all, download the plugins for whichever provider's services you want to use in the Terraform code (AWS, OpenStack, etc.).
terraform init // downloads the provider plugins
terraform apply -auto-approve // runs the code; behind the scenes all the provider services are created
terraform destroy // destroys all the services created by the Terraform code in a single go
Thank you.
If you have any queries, please contact me.