The Future of Cloud: Serverless CI/CD and ECS Containerization with a Load Balancer
Introduction
This era is regarded as the Cloud Era. Cloud platforms are used in every industry: they are extremely flexible across all types of environments and provide easy integrations. So, in an ever-evolving field, what comes next for cloud platforms? The answer is serverless. The place where systems most often fail is the server side: integrating a server, managing it, and keeping it available 24x7 is a task of its own. What if you didn't have to worry about any of that and could focus only on the quality of the content being served? This is where serverless architecture comes into play. Let's do a small project on how a serverless architecture can be put together.
Serverless Architecture
AWS services used:
AWS CodePipeline
AWS ECR (Elastic Container Registry)
AWS Fargate (serverless compute for containers)
AWS ECS (Elastic Container Service)
If you follow along with this project as a DevOps engineer, you will get hands-on experience with some of the most sought-after tools and their integrations. We are going to pull the code from GitHub, build a Docker image, store it in ECR, and then deploy that image to ECS Fargate.
Step 1: Writing a Dockerfile
For this project we are going to use an Nginx image, copy an index.html file from the source directory, and start the container in ECS Fargate. You can use my GitHub repo; it has the source code and Dockerfile needed for this project. Fork or clone it from https://github.com/2402199/CiCD-ECS
FROM nginx
COPY index.html /usr/share/nginx/html/index.html
# The nginx base image serves on port 80 by default
EXPOSE 80
Here is the index.html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Document</title>
</head>
<body>
<h1>Codepipeline successful</h1>
</body>
</html>
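Before wiring the pipeline, it is worth sanity-checking the image locally. This is a sketch that assumes Docker is installed; the tag `ecs-demo` and host port 8080 are arbitrary names chosen for illustration:

```shell
# Build the image from the Dockerfile in the repo root
docker build -t ecs-demo .

# Run it, mapping local port 8080 to nginx's port 80 inside the container
docker run -d --name ecs-demo-test -p 8080:80 ecs-demo

# Fetch the page; it should return the contents of index.html
curl -s http://localhost:8080

# Clean up the test container
docker rm -f ecs-demo-test
```

If the page renders locally, any later pipeline failure is in the AWS wiring, not in the image itself.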
Step 2: Integrating CodePipeline
I have posted a detailed blog on Mastering CI/CD for JavaScript (React & Vite) Applications in CodePipeline; refer to it if you have any doubts.
Source stage:
Note: IAM roles are the most important part of this project. Ensure that the AmazonEC2ContainerRegistryFullAccess and AWS ECS policies are attached to the roles used by CodePipeline and CodeBuild.
Navigate to CodePipeline in the AWS console and click “Create Pipeline”.
In the source stage, choose an existing IAM role or create a new one.
The IAM role selected must have access to S3, Elastic Container Registry (ECR), and ECS.
Then choose a way to connect to GitHub (version 1 is easier). You need to authorize GitHub, then select the repo and branch.
Choose webhooks or CodePipeline polling to monitor GitHub activity.
Build stage:
After the source stage, continue to the build stage. Create a new build stage, enter a name for the CodeBuild project, and attach a new or existing IAM role (note: the attached role must have access to S3 and AWS ECR).
Then choose a managed image, and pick the OS (Linux/Ubuntu) and architecture (x86_64).
The most important part of the build process is the buildspec.yml file; we can enter the commands in the buildspec editor in CodeBuild.
Then proceed to the deploy stage.
Here is the buildspec.yml:
version: 0.2
phases:
  build:
    commands:
      - aws ecr get-login-password --region ap-south-1 | docker login --username AWS --password-stdin <ECR registry URI>
      - docker build -t ecs_codedeploy .
      - docker tag ecs_codedeploy:latest <ECR registry URI>/ecs_codedeploy:latest
      - docker push <ECR registry URI>/ecs_codedeploy:latest
What this does is build the code and push the image into my public ECR repo. You have to create a new ECR repo and change the commands accordingly.
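For reference, a fuller buildspec can log in during pre_build and emit the imagedefinitions.json artifact that the ECS deploy action consumes, so the deploy stage does not fail later. This is a sketch, not the repo's exact file: the account ID (111122223333), region, repository name, and container name (nginx-container) are placeholders you would replace with your own values:

```yaml
version: 0.2
phases:
  pre_build:
    commands:
      # Authenticate Docker with your ECR registry (placeholder account/region)
      - aws ecr get-login-password --region ap-south-1 | docker login --username AWS --password-stdin 111122223333.dkr.ecr.ap-south-1.amazonaws.com
  build:
    commands:
      - docker build -t ecs_codedeploy .
      - docker tag ecs_codedeploy:latest 111122223333.dkr.ecr.ap-south-1.amazonaws.com/ecs_codedeploy:latest
  post_build:
    commands:
      - docker push 111122223333.dkr.ecr.ap-south-1.amazonaws.com/ecs_codedeploy:latest
      # Write the artifact the CodePipeline ECS deploy action expects
      - printf '[{"name":"nginx-container","imageUri":"111122223333.dkr.ecr.ap-south-1.amazonaws.com/ecs_codedeploy:latest"}]' > imagedefinitions.json
artifacts:
  files:
    - imagedefinitions.json
```

The artifacts section is what hands imagedefinitions.json from CodeBuild to the deploy stage.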
Deploy stage:
Choose Amazon ECS as the deploy provider.
Enter the details of the already created cluster and the service name, and create the pipeline.
The pipeline will run and fail at the deploy stage because we haven't yet added the imagedefinitions.json file to the code.
imagedefinitions.json file
The syntax for the file is:
[
  {
    "name": "<container_name>",
    "imageUri": "<image_uri>"
  }
]
This file tells the ECS deploy action which image each container in the task should run.
Now push the code along with this file, and the pipeline will run smoothly.
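Rather than hand-editing the JSON, you can generate it with a couple of shell commands. A sketch, where the container name and image URI are placeholder values you would swap for your own:

```shell
# Placeholder values; use your task definition's container name and your ECR image URI
CONTAINER_NAME="nginx-container"
IMAGE_URI="111122223333.dkr.ecr.ap-south-1.amazonaws.com/ecs_codedeploy:latest"

# Write the single-element JSON array that the ECS deploy action expects
printf '[{"name":"%s","imageUri":"%s"}]' "$CONTAINER_NAME" "$IMAGE_URI" > imagedefinitions.json

# Show the result
cat imagedefinitions.json
```

The "name" here must match the container name in your ECS task definition, or the deploy stage will not know which container to update.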
Step 3: Integrating ECS
Create a cluster in the ECS console and select AWS Fargate as the launch type; this makes our container run serverless, managed and maintained by AWS.
Next, create a task definition for the cluster that tells ECS what the task should run: which image to use, where the image is located, and so on.
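For orientation, a minimal Fargate task definition for this setup might look like the following. This is a sketch: the family name, account ID, execution role ARN, and container name are placeholders, and the container name must match the one in imagedefinitions.json:

```json
{
  "family": "ecs-codedeploy-task",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::111122223333:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "nginx-container",
      "image": "111122223333.dkr.ecr.ap-south-1.amazonaws.com/ecs_codedeploy:latest",
      "essential": true,
      "portMappings": [
        { "containerPort": 80, "protocol": "tcp" }
      ]
    }
  ]
}
```

The containerPort of 80 matches the port nginx serves on inside the container, and it is the port the load balancer's target group will forward to.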
Then create a service and select the task definition created in the previous step. Since we want an Application Load Balancer in our architecture, select a load balancer during service creation and create a basic Application Load Balancer.
If you copy the DNS name of the Application Load Balancer into a browser, you can see your code deployed. From now on, any update pushed to GitHub will trigger the pipeline, and the change will be published automatically.
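If you'd rather not dig through the console, the DNS name can also be fetched with the AWS CLI. A sketch, assuming your load balancer is named, say, ecs-demo-alb:

```shell
# Print the DNS name of the load balancer (the name is a placeholder)
aws elbv2 describe-load-balancers \
  --names ecs-demo-alb \
  --query 'LoadBalancers[0].DNSName' \
  --output text
```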
Happy coding, guys!