Read Time: 6 mins
23 May 2024
Category: Training
Buckle up, this is a little bit of a long one! After passing the Certified Cloud Practitioner exam (CLF-C02) and researching the next steps to get hands-on experience with AWS, I came across The Cloud Resume Challenge (LINK TO IT). One great thing about The Cloud Resume Challenge is that it's not just focused on AWS: the challenge can also be completed with one of the other popular cloud service providers, Azure, GCP or DigitalOcean. The challenge consists of 16 steps, each designed to get you to interact with different AWS services and to implement architecture best practices and CI/CD. Steps 1-4 have you turn your resume (CV) into a static site and host it in a publicly accessible S3 bucket. The challenge only requires you to upload basic HTML and CSS files into an S3 bucket to serve as your static site. However, with my love for frontend development, my knowledge of frameworks, and being someone who's just a little bit extra, I decided to do mine in ReactJS and Tailwind. Did I need to? No. Did I want to? Absolutely! You can check out my Cloud Resume Challenge HERE.
So, because I went above and beyond when it came to uploading my site to an S3 bucket, it wasn't so straightforward. The challenge does not guide you through each step you need to complete, which is great as it requires you to do your own research. Having said that, most of the tutorials I read about uploading files for a static website into an S3 bucket were geared towards basic HTML, CSS and JavaScript files. This meant I had to do a little bit of extra digging to figure out what I should upload to the S3 bucket, as I was using React and Tailwind. Spoiler alert: all I needed to do was run a build and upload the contents of my 'dist' folder. I probably should have already known that!

One thing The Cloud Resume Challenge does not require you to do is enable bucket versioning on your S3 bucket. I had learned about bucket versioning a few days beforehand, and since I knew I wouldn't complete the challenge in one sitting, I would constantly be updating the objects in my bucket. To follow architectural best practice (and to try out another AWS feature), versioning the objects seemed better than simply overwriting the files. It was in doing this that I ran into a little bit of trouble. Because I had not yet implemented CI/CD (that was a later step in the challenge), I had to manually upload the files into my S3 bucket. When I made changes to my files, I rebuilt the application to pick up the changes. However, because I was sometimes only changing certain files, I would reupload just those files instead of the whole folder: objects in your bucket add to storage costs, and with bucket versioning on, the old versions stay in the bucket. What I hadn't spotted was that each rebuild slightly changed the links to the CSS and JavaScript files referenced in the index.html within my dist folder.
But I wasn't uploading the files that index.html was pointing to, which meant that my changes, of course, were not being reflected in the browser, and I was confused and frustrated! Until, of course, I figured it out!
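The manual deploy step described above can be sketched in code. This is a minimal, hedged sketch rather than my exact process: it assumes a Vite-style React build that outputs to a dist folder, and the bucket name here is purely illustrative. The key point is that it uploads the whole build output, so index.html and the renamed CSS/JS bundles it references always travel together; it also shows enabling bucket versioning. The client is injectable so the logic can be exercised without AWS.

```python
# Sketch of a manual "build output -> S3" deploy, assuming a Vite/React
# build in dist/ and an illustrative bucket name. Not the author's exact
# script; the real AWS calls require boto3 and credentials.
import mimetypes
from pathlib import Path


def upload_dist(dist_dir, bucket, s3_client=None):
    """Upload every file in the build output, preserving relative paths.

    Uploading the whole folder avoids the stale-filename problem: each
    rebuild renames the hashed CSS/JS bundles, so index.html and its
    assets must always be uploaded together.
    """
    if s3_client is None:
        import boto3  # only needed when running against real AWS
        s3_client = boto3.client("s3")
    uploaded = []
    for path in sorted(Path(dist_dir).rglob("*")):
        if path.is_file():
            key = path.relative_to(dist_dir).as_posix()
            # Set Content-Type so browsers render HTML/CSS instead of downloading
            content_type = mimetypes.guess_type(path.name)[0] or "binary/octet-stream"
            s3_client.upload_file(str(path), bucket, key,
                                  ExtraArgs={"ContentType": content_type})
            uploaded.append(key)
    return uploaded


def enable_versioning(bucket, s3_client=None):
    """Turn on bucket versioning so previous object versions are retained."""
    if s3_client is None:
        import boto3
        s3_client = boto3.client("s3")
    s3_client.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )
```

With versioning enabled, every reupload keeps the old object version around, which is exactly why uploading only some files left a mismatched index.html in place.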
And then there were bucket policies…oh, bucket policies! I want to go slightly off-topic here for a second. I cannot stress enough how important it is (I am sure you already know) to actually get down and dirty in the AWS Management Console, the AWS CLI, etc., to understand how these AWS services work. Yes, I knew about bucket policies; I had to in order to pass the exam. But had I ever really used them? Well, no. I needed to for The Cloud Resume Challenge, which is also why I love that it doesn't walk you step by step through how to complete it and forces you to research (I love researching anyway, so this was something I thoroughly enjoyed). Anyway, back to the topic: bucket policies! Of course, when I first clicked the link to try and access my site, I was denied access. Why? Because I did not have a bucket policy in place. Below is an example of the bucket policy I used, and just like that, my site was live! A basic understanding of bucket policies will also help later, when you serve your website content through CloudFront.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<YOUR-BUCKET-NAME>/*"
    }
  ]
}
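You don't have to paste that policy in through the console every time. As a rough sketch (and only a sketch, with an illustrative bucket name), the same policy can be built and applied with boto3's `put_bucket_policy`; the client is injectable here so the logic can be checked without touching AWS.

```python
# Hedged sketch: build the public-read policy shown above and apply it
# via boto3. The bucket name is illustrative; real use needs boto3 and
# credentials, plus the bucket's Block Public Access settings relaxed.
import json


def make_public_read_policy(bucket_name):
    """Return the public s3:GetObject policy as a JSON string."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                # Note the /* suffix: the policy targets objects, not the bucket
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    })


def apply_public_read_policy(bucket_name, s3_client=None):
    """Attach the policy to the bucket so the static site is readable."""
    if s3_client is None:
        import boto3  # only needed against real AWS
        s3_client = boto3.client("s3")
    s3_client.put_bucket_policy(
        Bucket=bucket_name,
        Policy=make_public_read_policy(bucket_name),
    )
```

One detail worth noticing in the policy itself: the `Resource` ARN ends in `/*`, because `s3:GetObject` applies to the objects inside the bucket, not the bucket itself.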
Setting up a CloudFront distribution was relatively straightforward, and I will admit that I skipped a step and didn't use a custom domain or interact with Amazon Route 53, but in my defence, I had already used the Route 53 service for my portfolio! Having to create a visitor counter that interacts with AWS DynamoDB, AWS Lambda and AWS API Gateway was another great lesson and learning curve! I'll go into depth in another post about how I created my visitor counter and hooked it up to DynamoDB with API Gateway and AWS Lambda.

The last few steps give you some basic practice in Infrastructure as Code (IaC), source control and CI/CD. All great learning curves for someone who has just passed the CLF-C02: these concepts are mentioned in the training for the exam, but you most likely never actually used them. This is your chance.

This post is going on for a bit too long, so I'll wrap it up here. I cannot stress enough how useful doing The Cloud Resume Challenge was for me. It gives you great hands-on experience with the services and concepts you learned while studying for the Certified Cloud Practitioner exam, and it also gives you the confidence to dive deeper and explore AWS services in your own account!
Photo made on Canva.