
Makers DevOps Bootcamp: Week 2 — CI/CD, Serverless and more AWS

Tuyet Le · Published in CodeX · Mar 16, 2021

(My reflection on week one is here.)

Photo by Billy Huynh on Unsplash

We kicked off week two by diving straight into our new weekly team project. Our goals were:

  • to use AWS S3 Buckets to host a static website app;
  • to build our first CI/CD pipeline using Jenkins;
  • to deploy a static website with AWS Lambda and Amazon API Gateway;
  • to be a better DevOps engineer than the day before;
  • to have fun!

By the end of the week, we should have a simple serverless website that’s deployed on AWS.

The AWS ecosystem is HUGE so I was excited to learn more about other AWS services and continuous integration (CI) and continuous delivery/deployment (CD), which you often hear about in DevOps and modern software development practices.

A summary of CI/CD

CI/CD is a set of practices at the heart of DevOps that enables engineers to deliver better-quality software faster.

CI

CI is a major step in the DevOps lifecycle. It’s the practice of frequently merging developers’ code changes into a shared repository, with automated testing making those frequent merges safe. When a developer opens a pull request, it triggers a pipeline that automatically builds, tests and validates the code changes before they’re merged into the repository. Making test automation part of the CI solution helps identify errors faster and ensures better-quality builds.

CD

CD takes the CI-validated code and releases it. Continuous delivery is often used interchangeably with continuous deployment, but there’s a subtle difference: continuous delivery deploys the code to a staging environment, where team members can play around with the application before it goes live to customers, whereas continuous deployment automatically deploys the code straight to production.

Creating an Amazon S3 Bucket

Project time!

Before building our first CI/CD pipeline, we needed to create an S3 bucket on AWS and configure it to host our static website.

AWS S3 Bucket

What’s an Amazon S3 Bucket exactly?

It’s a simple object storage service from Amazon used to store and protect data for different use cases: images, text files, websites, mobile applications and more. It’s commonly used to host static websites, and it often features in an automation pipeline where the app is deployed to, and served directly from, the S3 bucket.

Thankfully, our coach provided a simple app with a basic HTML file for us to work with, so we didn’t need to spend time building a website and could focus on the main goals of the week.
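For reference, here’s a rough sketch of creating and configuring such a bucket with boto3, AWS’s Python SDK, rather than clicking through the console (the bucket name and region are assumptions, and a bucket policy allowing public reads would also be needed before the site is publicly visible):

```python
import boto3

REGION = "eu-west-2"                 # assumed region
BUCKET = "makers-week2-static-site"  # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# 1. Create the bucket (LocationConstraint is required outside us-east-1)
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# 2. Turn on static website hosting
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# 3. Upload the site's entry page with the right content type
s3.upload_file(
    "index.html", BUCKET, "index.html",
    ExtraArgs={"ContentType": "text/html"},
)
```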

Setting up Jenkins on AWS

Once we’d created our S3 bucket, it was time to learn about Jenkins, the popular CI tool we’d build our CI/CD pipeline with, and get it set up on an AWS remote server.

What is Jenkins?

Jenkins manages all types of automation including software building, application testing and deployment. To help visualise how Jenkins comes into play in CI/CD, I doodled the flow (apologies for the stickmen drawing 😅).

CI/CD pipeline with Jenkins

Creating an Amazon EC2 instance and installing Jenkins on it

Setting up Jenkins on an AWS remote server, otherwise known as an EC2 instance, took a fair amount of time and research to understand and achieve. The Jenkins documentation on installing Jenkins on AWS was a great resource that breaks the steps down clearly.

Here’s an overview of the steps we did on the AWS Management Console to set up an EC2 instance with Jenkins:

  1. Create a key pair — this creates a set of security credentials, a public and a private key, that you use to prove your identity when connecting to an EC2 instance
  2. Create a security group — this creates a virtual firewall for the EC2 instance to control incoming and outgoing traffic
  3. Launch the EC2 instance
  4. Connect to the EC2 instance using SSH from the command line
  5. Download and install Jenkins onto the EC2 instance
  6. Configure the newly installed Jenkins by adding the recommended plugins, connecting it to our GitHub repository and setting up user access for all team members (note: we used the same key pair credentials for every team member)

After completing these steps, we were able to get Jenkins working on our EC2 instance!
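For the curious, here’s roughly what steps 1–3 above would look like scripted with boto3 instead of clicked through the console (the region, names and AMI ID are all placeholders):

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")  # assumed region

# 1. Create a key pair and save the private key locally
key = ec2.create_key_pair(KeyName="jenkins-key")
with open("jenkins-key.pem", "w") as f:
    f.write(key["KeyMaterial"])

# 2. Create a security group and open ports 22 (SSH) and 8080 (Jenkins UI)
sg = ec2.create_security_group(
    GroupName="jenkins-sg", Description="SSH and Jenkins web traffic"
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
        {"IpProtocol": "tcp", "FromPort": 8080, "ToPort": 8080,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    ],
)

# 3. Launch the EC2 instance (the AMI ID below is a placeholder)
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.micro",
    KeyName="jenkins-key",
    SecurityGroupIds=[sg["GroupId"]],
    MinCount=1,
    MaxCount=1,
)
```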

Building our CI/CD pipeline (Jenkinsfile and Webhooks)

You may be wondering now, how does our Jenkins instance know when a developer has updated the repository? Enter… Webhooks!

Webhooks allow you to set up integrations that send other applications real-time information when certain events occur. In our case, when a team member pushes to the repository or merges a pull request, GitHub’s webhook sends an HTTP POST request to our Jenkins instance, which automatically triggers the CI build job.
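To make that concrete, here’s a small sketch that simulates what GitHub does on a push: it POSTs a JSON payload to the Jenkins GitHub plugin’s webhook endpoint (the host and payload below are simplified placeholders):

```python
import requests

# Hypothetical Jenkins host; ours was the EC2 instance's public address
JENKINS_WEBHOOK_URL = "http://my-jenkins-host:8080/github-webhook/"

# A heavily stripped-down version of the payload GitHub sends on a push event
payload = {
    "ref": "refs/heads/main",
    "repository": {"full_name": "our-team/static-site"},
}

# GitHub names the event type in a header; Jenkins uses it to trigger the build job
response = requests.post(
    JENKINS_WEBHOOK_URL,
    json=payload,
    headers={"X-GitHub-Event": "push"},
)
print(response.status_code)
```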

Jenkinsfile

We defined our pipeline by writing scripts in a Jenkinsfile: a text file, written in Groovy (what a fun name!), that lives in the root directory of the project.

We implemented a very basic three-stage continuous delivery pipeline (build, test and deploy), with scripts to print the ID of the build, run the unit tests and deploy the HTML files to our S3 bucket respectively.

Our very simple pipeline written in Groovy
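As a rough sketch, a three-stage Jenkinsfile like ours might look something like this (the test command and bucket name are assumptions):

```groovy
pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                // Print the ID of the current build
                echo "Build ID: ${env.BUILD_ID}"
            }
        }
        stage('Test') {
            steps {
                // Run the unit tests; the exact test command is an assumption
                sh 'npm test'
            }
        }
        stage('Deploy') {
            steps {
                // Copy the HTML files to the S3 bucket (bucket name is a placeholder)
                sh 'aws s3 cp index.html s3://makers-week2-static-site/'
            }
        }
    }
}
```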

The stages are executed sequentially so if our unit tests fail, the pipeline will not move on to the deploy stage.

After lots of configuration, and testing different scripts in the pipeline, our HTML files successfully deployed! 🎉

Home page

Going serverless with AWS Lambda

The last step was to get the button to change the monkey image as well as the text; at this point, the button didn’t do anything when clicked.

We learnt that we could build this functionality with AWS Lambda, a serverless compute service that runs your code in response to incoming requests or events. A huge advantage of many serverless solutions, including AWS Lambda, is their support for event triggers, which makes them great for pipelines.

We had a workshop where we diagrammed what a serverless architecture looks like and how Amazon S3 can work with Amazon API Gateway and AWS Lambda to achieve this.

Minus the user authentication part of the diagram above, Amazon API Gateway and AWS Lambda were the last pieces we needed to set up to create a serverless website.

We created a Lambda function to change the text of the HTML body when the user clicks on the button.

Our Lambda function written in Python
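As a rough sketch, a handler along those lines might look like this, assuming an API Gateway proxy integration and a made-up response shape:

```python
import json

def lambda_handler(event, context):
    # Invoked via API Gateway when the button is clicked;
    # returns the replacement text (and image name) as JSON.
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # Allow the S3-hosted page to call this endpoint from the browser
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps({
            "text": "Hello from Lambda!",
            "image": "monkey2.jpg",
        }),
    }
```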

To let the button invoke the Lambda function, we first configured a trigger for the function in the Lambda console on AWS. Next, we set up an HTTP API in Amazon API Gateway and added its endpoint URL to our HTML file, so that the button’s request reaches the Lambda function and the function’s response is returned to the client.
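On the client side, wiring that endpoint URL into the HTML might look something like this (the element IDs, invoke URL and response fields are all hypothetical):

```html
<button id="change-btn">Change the monkey!</button>
<script>
  // Invoke URL from the HTTP API we created in API Gateway (placeholder)
  const API_URL = "https://abc123.execute-api.eu-west-2.amazonaws.com/change-text";

  document.getElementById("change-btn").addEventListener("click", async () => {
    const response = await fetch(API_URL);  // request goes to Lambda via API Gateway
    const data = await response.json();     // e.g. { "text": "...", "image": "..." }
    document.getElementById("message").textContent = data.text;
    document.getElementById("monkey").src = data.image;
  });
</script>
```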

Voila! Our button now works — it displays the new text and monkey photo!

Reflection

The AWS ecosystem is huge, and learning how pipelines actually work was mind-blowing 🤯

I enjoyed learning more about other popular AWS services, including Lambda, EC2 and S3, and seeing how they all work together to build and host a simple website.

A lot of steps and time went into setting up and configuring Jenkins before we could even start building our pipeline, so it will be interesting to see how working with GitHub Actions, another popular CI tool, differs from working with Jenkins in the coming week.
