
Makers DevOps Bootcamp: Week 1 — DevOps, Containers and AWS Elastic Beanstalk

Tuyet Le · Published in CodeX · 7 min read · Mar 7, 2021


Last week, we wrapped up the final week of the Makers software development apprenticeship bootcamp and presented our final group project to our peers and employers. The last 12 weeks have been intense but incredibly rewarding: I learnt to code alongside an amazing cohort who I now call friends. We’ve grown in confidence, we can present like a boss, and we have new technical skills under our belts.

It’s not all over yet, though! Some of us still have another six weeks to go, in which we will complete Makers’ very first DevOps course, and I couldn’t be more excited to be a part of it!

Photo by Roman Synkevych on Unsplash

Makers is all about self-directed learning, and the structure of the DevOps bootcamp is no different. The course is six weeks of self-led learning, workshops and team projects, with a curriculum that covers DevOps best practices, containerisation, serverless, CI/CD, AWS, Azure, Infrastructure as Code and common issues in Site Reliability Engineering.

We kicked off the first week by learning what DevOps is, why it matters in the software development world and how it works, by using Docker to containerise a web application, and with an introduction to AWS.

What’s DevOps exactly?

It’s been a buzzword recently, but not everyone really knows what it actually is (myself included), so I was super excited to spend the first day learning exactly that.

DevOps is not tied to a particular technology in software development. At a high level, it’s a culture that combines new practices, new tools and new ways of thinking to bridge the gap between IT operations, software development and quality assurance. It brings agile software development practices and automation to the world of IT operations. With this approach, DevOps helps to shorten the software development life cycle (SDLC), enabling organisations to roll out new products and services more quickly, in a secure and reliable way, while also improving operations performance.

Traditionally, IT operations and software developers work separately, but by combining the two teams, DevOps engineers get exposure to the application’s entire lifecycle: from the planning stage to development and testing, to releasing the product, to customer feedback and to application enhancements based on that feedback.

Some of the main benefits of adopting the DevOps model:

  • Speed — automation is one of the pillars of DevOps meaning new features and updates are released faster enabling businesses to adapt to market changes more quickly.
  • Rapid software delivery — with the increase in frequency of releasing new software, businesses are able to iterate on customer feedback more often, allowing for innovation and improved product releases.
  • Reliability — automated tests run before every release of an application, catching problems early and making it less likely that things go wrong in production, which improves reliability. Monitoring and logging practices also keep businesses informed of the application’s performance in real time.
  • Scale — applications can easily be scaled up or down as and when needed, according to changes in user traffic over time.
  • Improved collaboration and communication — teamwork is improved thanks to the combined practices and workflow of operations and development. With team members collaborating on the whole SDLC, there is a greater sense of ownership and sharing of tasks across the team, resulting in the improvement of the quality of services they provide to customers.
  • Security — this is a crucial part of the SDLC. With everyone now working across the entire SDLC and automated security testing replacing manual tests, it’s easier to stay compliant even at a large scale.

Containers — the foundation of DevOps collaboration

When I first heard we were learning about containers, I immediately thought of shipping containers. As it turns out, their purpose is not too far off from that of containers in DevOps.

Photo by Paul Teysen on Unsplash

What are Containers?

Containers are isolated executable packages that contain everything needed to run an application, including its associated dependencies, libraries and configuration files.

So what does DevOps have to do with containers? Containerising an application offers significant benefits to DevOps including the flexibility to manage and deploy applications through portable and lightweight containers (maybe not so lightweight for the actual containers in the above photo).

It also enables developers to share their application more easily with operations and production environments, while solving the typical day-to-day “it doesn’t work on my computer” problem, as a container will always run the same from one computing environment to another.

Docker

Docker is a big player in the container technology world. It’s a popular software platform used in DevOps that allows you to easily share, manage and deploy applications consistently through containers.

The computer that Docker is installed and running on is referred to as a Docker Host or Host for simplicity.

Docker Containers and Docker Images are two additional essential concepts to know when working with Docker.

When you deploy an application on the host, Docker creates a Docker Container. Docker Images are blueprints with instructions for building a container, including the application and its associated dependencies. As such, a Docker Container is a running instance of a Docker Image.
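
As a quick illustration of the difference, here’s the standard Docker CLI flow using Docker’s own hello-world image (any image would do the same job):

```bash
# Pull an image (the blueprint) from a registry, then run it as a container
docker pull hello-world   # download the image to the host
docker run hello-world    # create and start a container from the image
docker ps -a              # list containers; every `docker run` creates a new one
docker images             # list the images stored on the host
```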

Containerising our first application to deploy to the cloud

Photo by Kristi Martin on Unsplash

To understand the benefits of containers using Docker, we were given our first group project of week one: build a simple Node.js/Express application on anything of our choice, then containerise it using Docker, ready for deployment to the cloud on AWS Elastic Beanstalk. We had four days to work on it before demoing to the rest of our cohort on Friday.

My first thought was “holy crap, we’re really diving right into it on day two!”. However, this gave us a real opportunity to immediately apply all of day one’s workshops and reading on containers to an actual project.

Our team of four decided to build a simple one-page app, as we wanted to prioritise our time for learning Docker, its commands for managing containers, and how to deploy our containerised app to AWS Elastic Beanstalk.

Here’s how our project went over the next four days as a team:

Day 1 of the project (research and app-building):

  • Group brainstorming session of what we wanted to build; decided on an interactive drawing app with a chatbox.
  • Created a diagram of our team’s workflow for the week.
  • Learnt how to set up a basic Express file structure for our app.
  • Researched JavaScript libraries for creating an interactive drawing board and a chatbox. We decided to use RoCanvas for the drawing tool and socket.io for the chatbox.
  • Set up our back-end server and created the HTML of our front-end (a minimal sketch of the server follows this list).
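
For anyone curious, a minimal sketch of this kind of server looks something like the following. The file names and event names here are illustrative, not our exact code:

```javascript
// server.js — minimal Express + socket.io setup (illustrative sketch)
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Serve the front-end (HTML, RoCanvas drawing board, chatbox) from /public
app.use(express.static('public'));

// Relay chat messages to every connected client
io.on('connection', (socket) => {
  socket.on('chat message', (msg) => io.emit('chat message', msg));
});

// Default to port 3000, overridable via an environment variable
const port = process.env.PORT || 3000;
server.listen(port, () => console.log(`Listening on port ${port}`));
```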

Day 2 of the project (containerising our app):

  • Built a Docker Image of our web application using a Dockerfile (a representative example follows this list). A Dockerfile is also used by AWS Elastic Beanstalk to deploy the app.
  • Learnt how to run our Docker Image on our local machine as a Docker Container.
  • Committed and published our dockerised Node.js application to Docker Hub (Docker Hub is a repository registry where you can publish and download Container Images — similar to GitHub for source code).
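
Our actual Dockerfile may have differed in the details, but a representative one for a Node.js/Express app looks roughly like this:

```dockerfile
# Start from an official Node.js base image
FROM node:14-alpine

# Install dependencies first so Docker can cache this layer
WORKDIR /app
COPY package*.json ./
RUN npm install

# Copy in the rest of the application code
COPY . .

# The port the Express server listens on
EXPOSE 3000

CMD ["node", "server.js"]
```

Building the image, running it locally and publishing it to Docker Hub then comes down to a handful of commands (the image name here is a placeholder):

```bash
docker build -t <dockerhub-username>/drawing-app .          # build the image from the Dockerfile
docker run -p 3000:3000 <dockerhub-username>/drawing-app    # run it locally on port 3000
docker login                                                # authenticate with Docker Hub
docker push <dockerhub-username>/drawing-app                # publish the image
```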

Day 3 of the project (deploying app to AWS Elastic Beanstalk):

  • Explored the AWS Console Interface and the Elastic Beanstalk documentation to understand how to deploy our containerised app.
  • We realised that there are different ways of configuring a Docker app for deployment on AWS Elastic Beanstalk: you can use a Dockerfile, a Dockerrun.aws.json file, or both!
  • Successfully deployed our first containerised app using both a Dockerfile and a Dockerrun.aws.json file (a minimal example of the latter follows this list).
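
For reference, a minimal single-container Dockerrun.aws.json pointing at an image on Docker Hub looks something like this (the image name and port are placeholders, and ours may have differed):

```json
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "<dockerhub-username>/drawing-app:latest",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "3000"
    }
  ]
}
```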

Day 4 of the project (final touches and presentation):

  • Final styling of the front-end.
  • Rebuilt our Docker Image with our updated code and redeployed it to AWS Elastic Beanstalk (the redeploy loop is sketched after this list).
  • Presentation time and retro!
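
The redeploy loop itself is short. Here’s a sketch, assuming the image lives on Docker Hub and the EB CLI is configured (we did the upload step through the AWS console instead):

```bash
docker build -t <dockerhub-username>/drawing-app:latest .   # rebuild with the updated code
docker push <dockerhub-username>/drawing-app:latest         # publish the new image
eb deploy                                                   # redeploy the Elastic Beanstalk environment
```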

You can read more about our team project here.

Reflection

Our first week of introductions to DevOps, containers and AWS was a steep learning curve, but I’ve thoroughly enjoyed learning more about this exciting part of engineering after being immersed in the development side of things for the last 12 weeks.

During our retro, we learnt that the way we were tasked to complete week one’s project was intentionally manual in order for us to get a feel for working with containers and using AWS for the first time.

I’m looking forward to moving on to Continuous Integration/Continuous Delivery/Deployment (CI/CD) next week to see how we can automate some of the repetitive steps we had to do during the integration and deployment phases of our first app this week.
