<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Rishab in Cloud]]></title><description><![CDATA[Hello, I am Rishab, Staff Developer Evangelist at Twilio and a part-time professor of Cloud at St. Lawrence College, passionate about helping people get into cloud.]]></description><link>https://blog.rishabkumar.com</link><generator>RSS for Node</generator><lastBuildDate>Thu, 16 Apr 2026 08:06:36 GMT</lastBuildDate><atom:link href="https://blog.rishabkumar.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Google Cloud Professional Cloud Security Engineer Study Guide]]></title><description><![CDATA[Hey everyone! If you’ve been following along, you might know that I recently passed the Google Cloud Professional Cloud Security Engineer exam. To be honest, the whole process was a bit of a sprint, and I wanted to share my journey with you, includin...]]></description><link>https://blog.rishabkumar.com/google-cloud-professional-cloud-security-engineer-study-guide</link><guid isPermaLink="true">https://blog.rishabkumar.com/google-cloud-professional-cloud-security-engineer-study-guide</guid><category><![CDATA[google cloud]]></category><category><![CDATA[cloud security]]></category><category><![CDATA[Cloud Computing]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 15 Dec 2025 16:52:10 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1765310517508/869e026e-2ed7-4f66-942b-d267b4804443.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hey everyone! 
If you’ve been following along, you might know that I recently passed the <a target="_blank" href="https://cloud.google.com/learn/certification/cloud-security-engineer/"><strong>Google Cloud Professional Cloud Security Engineer exam</strong></a>. To be honest, the whole process was a bit of a sprint, and I wanted to share my journey with you, including the challenges, resources, and strategies I used to pass the exam.</p>
<h2 id="heading-the-journey-to-certification">The Journey to Certification</h2>
<h3 id="heading-why-i-took-the-exam">Why I Took the Exam</h3>
<p>Now, you might be wondering, "Rishab, you’re a developer advocate at Twilio, why focus on cloud security?" Well, here’s the deal. I’ve always envisioned myself working in some aspect of <strong>cloud security</strong> in the future. Not that I’m planning on leaving developer relations anytime soon, but when the time comes to transition to a more security-focused role, I believe this will be the smoothest route for me. Why? Because I’ve got <strong>five years of cloud engineering and DevOps experience</strong>, and adding security to that mix seems like the most natural progression.</p>
<p>This was my chance to deepen my knowledge of <strong>Google Cloud security</strong>, and I'm glad I took it. The <strong>GCP Cloud Security certification</strong> aligns with my long-term goals of gaining more specialized cloud security skills.</p>
<h3 id="heading-the-tight-timeline">The Tight Timeline</h3>
<p>Now, here’s where it gets interesting. I only had <strong>8 days</strong> to prepare for the exam, which is a pretty short timeline for something as comprehensive as this. Why the rush? Well, as a <a target="_blank" href="https://g.dev/rishabincloud">Google Developer Expert (GDE)</a>, I had a certification voucher that would expire by April 1st. So, I had no choice but to prepare quickly, but I was confident that my hands-on experience would help me move through the material faster.</p>
<p>I started prepping on <strong>March 22nd</strong>, with the exam scheduled for <strong>March 30th</strong>, which left just over a week to go through the material and practice.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabincloud/status/1904168532436287891">https://twitter.com/rishabincloud/status/1904168532436287891</a></div>
<p> </p>
<hr />
<h2 id="heading-exam-overview">Exam Overview</h2>
<p>Before jumping into study materials, let’s talk a bit about what the exam actually covers:</p>
<p>The <strong>Cloud Security Engineer</strong> exam assesses your ability to:</p>
<ul>
<li><p><strong>Configure access and secure communication</strong> within Google Cloud</p>
</li>
<li><p><strong>Establish boundary protection</strong> and manage data protection</p>
</li>
<li><p><strong>Support compliance requirements</strong> and manage operations</p>
</li>
</ul>
<p>You can find all this information in the official <a target="_blank" href="https://services.google.com/fh/files/misc/professional_cloud_security_engineer_exam_guide_english.pdf"><strong>exam guide</strong></a>, and I highly recommend reading it before starting your prep. It provides a solid understanding of the services you’ll need to master, plus the weightage of each domain in the exam.</p>
<p>For this exam, Google recommends that you have <strong>3+ years of industry experience</strong>, with at least <strong>one year of hands-on experience</strong> designing and managing solutions in Google Cloud. While I didn’t have years of <strong>cloud security experience</strong> specifically, my background in <strong>cloud engineering and DevOps</strong> helped a lot. I focused on practicing <strong>hands-on labs</strong> and building projects with a security mindset, which worked well for me.</p>
<h3 id="heading-exam-details">Exam Details</h3>
<ul>
<li><p><strong>Duration</strong>: 2 hours</p>
</li>
<li><p><strong>Questions</strong>: 50-60 multiple-choice questions</p>
</li>
<li><p><strong>Cost</strong>: $200 (I used my <strong>GDE voucher</strong>)</p>
</li>
<li><p><strong>Format</strong>: Multiple-choice questions covering various topics in cloud security.</p>
</li>
</ul>
<hr />
<h2 id="heading-study-resources-amp-strategy">Study Resources &amp; Strategy</h2>
<p>Let’s get into the nitty-gritty of the study materials I used.</p>
<h3 id="heading-hands-on-projects-amp-labs">Hands-on Projects &amp; Labs</h3>
<p>As you know, I’m a big believer in <strong>hands-on learning</strong>; it’s the only way to truly retain skills. That’s why I prioritized <strong>projects</strong> and <strong>labs</strong> over just reading books or watching videos.</p>
<p>One of the projects I worked on was updating my <strong>home lab</strong>. I have a spare <strong>Linux server</strong> running <strong>Docker and Kubernetes</strong> in my home network. I set up a <strong>secure tunnel system</strong> using <strong>Tailscale</strong> to connect my server at home to my <strong>GCP Ubuntu server</strong>, ensuring secure communication between both. This project involved configuring <strong>VPCs</strong>, <strong>firewall rules</strong>, and <strong>networking</strong>, all of which were essential for the exam.</p>
<p>Another project was a <strong>URL shortener</strong> hosted on <strong>Google Cloud</strong> using <strong>Cloud Run</strong>. This project helped me practice security principles like:</p>
<ul>
<li><p><strong>Service accounts</strong> for secure access between GCP resources</p>
</li>
<li><p><strong>Cloud Armor</strong> and <strong>Cloud CDN</strong> for security and performance</p>
</li>
<li><p><strong>Cloud Logging and Monitoring</strong> for auditing and visibility</p>
</li>
</ul>
<p><img src="https://raw.githubusercontent.com/rishabkumar7/url-shortener-gcp/refs/heads/main/url-shortener-gcp-arch.png?token=GHSAT0AAAAAADPMSQNR3RXAHL52VQ454XXW2J24K6Q" alt class="image--center mx-auto" /></p>
<p>These hands-on projects reinforced my understanding of Google Cloud’s security features and were invaluable in preparing for the exam.</p>
<h3 id="heading-books-and-online-courses">Books and Online Courses</h3>
<ul>
<li><p>I used the <strong>Google Cloud Certified Professional Cloud Security Engineer Exam Guide</strong> by Ankush Chadri and Prashant Kulkarni (available on <strong>O'Reilly</strong>). I didn’t complete the entire book but referred to it for specific topics like <strong>Data Loss Prevention (DLP)</strong> and <strong>Cloud Encryption</strong>.</p>
</li>
<li><p>I also used <a target="_blank" href="https://www.skills.google/paths/15"><strong>Google Cloud Skills Boost</strong> <strong>Security Engineer path</strong></a>, which offers interactive labs and courses. This is a great resource if you want a more structured path for your learning.</p>
</li>
<li><p>Additionally, I referenced courses from <strong>Cloud Academy</strong>, which I accessed through my <strong>AWS Community Builder</strong> membership. These courses had <strong>hands-on labs</strong> and were really helpful for reinforcing key concepts.</p>
</li>
</ul>
<h3 id="heading-practice-exams">Practice Exams</h3>
<p>I always recommend taking <strong>practice exams</strong> before the real deal. I used <a target="_blank" href="https://www.whizlabs.com/google-cloud-certified-professional-cloud-security-engineer/"><strong>Whizlabs</strong></a> practice exams to test myself on the specific domains. The exams on Whizlabs are domain-based, so you can focus on specific areas that need more attention. I completed almost all of them, and this helped me identify areas where I was weaker, allowing me to focus my limited study time more effectively.</p>
<hr />
<h2 id="heading-exam-experience">Exam Experience</h2>
<p>When exam day came, I felt ready. The exam took me about <strong>1 hour and 5 minutes</strong> to complete, leaving me with time to review my answers. Google Cloud certifications don’t provide a score breakdown, but I passed, and the <a target="_blank" href="https://www.credly.com/badges/34d1ecd6-b94e-4692-adda-e7f5d2c090ee/public_url"><strong>Credly badge</strong></a> was issued the very next day!</p>
<p><img src="https://images.credly.com/images/4ea0ec5c-6258-4c26-9282-6ed233c0c7ac/image.png" alt /></p>
<p>Here are some key topics to focus on for the exam based on my experience:</p>
<h3 id="heading-key-focus-areas">Key Focus Areas</h3>
<ol>
<li><p><strong>Network Security</strong>: VPCs, subnets, firewalls, load balancing, and private Google access. Understanding how to secure networks and control traffic flow is crucial.</p>
</li>
<li><p><strong>Data Security</strong>: Encryption techniques, protecting data at rest and in transit, <strong>Customer Managed Encryption Keys (CMEK)</strong>, and <strong>Data Loss Prevention (DLP)</strong>.</p>
</li>
<li><p><strong>Detection and Response</strong>: Learn about <strong>Security Command Center</strong>, <strong>Cloud Logging</strong>, and <strong>Cloud Monitoring</strong>. These tools help with visibility and auditing your cloud resources.</p>
</li>
<li><p><strong>Security Best Practices</strong>: Understand <strong>binary authorization</strong> (ensuring only trusted containers are deployed) and <strong>compliance assessments</strong>.</p>
</li>
</ol>
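<p>One concept worth internalizing for the data security domain is envelope encryption, which is what CMEK builds on: data is encrypted with a data encryption key (DEK), and the DEK itself is encrypted ("wrapped") with a key encryption key (KEK) that you control in Cloud KMS. Here is a toy sketch of that flow; XOR stands in for a real cipher purely for illustration:</p>
<pre><code class="lang-python"># Toy envelope encryption: XOR is a stand-in for a real cipher (AES).
# With CMEK, the KEK lives in Cloud KMS and never leaves it.
import secrets

def xor(data, key):
    """Toy 'cipher': XOR each byte with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

kek = secrets.token_bytes(32)   # customer-managed key (held in KMS for CMEK)
dek = secrets.token_bytes(32)   # per-object data encryption key

ciphertext = xor(b"sensitive data", dek)   # encrypt the data with the DEK
wrapped_dek = xor(dek, kek)                # wrap the DEK with the KEK

# To decrypt: unwrap the DEK with the KEK, then decrypt the data with it
recovered = xor(ciphertext, xor(wrapped_dek, kek))
</code></pre>
<p>Only the wrapped DEK is stored next to the data; rotating or revoking the KEK in KMS then controls access to everything it wraps.</p>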
<p>I have also made my notes publicly available for you all to review:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://github.com/rishabkumar7/CloudNotes/blob/master/cloud/GCP-ProfessionalCloudSecurity.md">https://github.com/rishabkumar7/CloudNotes/blob/master/cloud/GCP-ProfessionalCloudSecurity.md</a></div>
<p> </p>
<hr />
<h2 id="heading-final-thoughts">Final Thoughts</h2>
<p>Passing the <strong>Google Cloud Professional Cloud Security Engineer exam</strong> was a challenging but rewarding experience. The key to my success was a balanced approach that combined <strong>hands-on projects</strong>, <strong>lab practice</strong>, and focused <strong>study resources</strong>. Even with a tight timeline, the experience and skills I gained during the preparation process made it all worth it.</p>
<p>If you're preparing for this exam, I wish you the best of luck! Stay focused, dive into hands-on labs, and remember that the best way to solidify your knowledge is through practical experience.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/UYWWCxmcyM8?si=9_8WKhmd2ZjqLjN_">https://youtu.be/UYWWCxmcyM8?si=9_8WKhmd2ZjqLjN_</a></div>
<p> </p>
<p>As for me, I’ve already got my sights set on the <strong>GCP DevOps Professional</strong> certification next, and I’ll be sharing my journey with that soon!</p>
]]></content:encoded></item><item><title><![CDATA[Google Cloud Generative AI Leader Study Guide]]></title><description><![CDATA[Hello amazing people!
I am thrilled to share that I recently passed the new Google Cloud Generative AI Leader Exam! This foundational certification is a great way to validate your knowledge of GenAI concepts and Google Cloud's offerings, even if you ...]]></description><link>https://blog.rishabkumar.com/google-cloud-generative-ai-leader-study-guide</link><guid isPermaLink="true">https://blog.rishabkumar.com/google-cloud-generative-ai-leader-study-guide</guid><category><![CDATA[Google]]></category><category><![CDATA[generative ai]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Wed, 26 Nov 2025 17:01:28 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1764176064417/7d0af4e4-1d5d-40b9-8310-c1b4a36251e3.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hello amazing people!</p>
<p>I am thrilled to share that I recently passed the new <a target="_blank" href="https://cloud.google.com/learn/certification/generative-ai-leader"><strong>Google Cloud Generative AI Leader Exam</strong></a>! This foundational certification is a great way to validate your knowledge of GenAI concepts and Google Cloud's offerings, even if you don't have extensive technical experience.</p>
<p>For my preparation, I focused on the official learning paths and leveraged my existing hands-on experience. I wasn't surprised that the exam content very closely matched the objectives laid out in the study guide, which made the process very straightforward.</p>
<p>Here is my study guide based on the resources I used and my experience sitting for the exam.</p>
<h2 id="heading-exam-overview">Exam Overview</h2>
<p>The Generative AI Leader certification is a foundational exam designed to assess your knowledge across four key areas.</p>
<ul>
<li><p><strong>Format:</strong> Multiple-choice questions (50-60 total)</p>
</li>
<li><p><strong>Time Limit:</strong> 90 minutes</p>
</li>
<li><p><strong>Cost:</strong> $99 USD</p>
</li>
<li><p><strong>Delivery:</strong> Online (proctored) or in-person at a test center</p>
</li>
<li><p><strong>Validity:</strong> 3 years</p>
</li>
</ul>
<h2 id="heading-skills-measured-in-the-exam">Skills Measured in the Exam</h2>
<p>The <a target="_blank" href="https://services.google.com/fh/files/misc/generative_ai_leader_exam_guide_english.pdf">official exam guide</a> provides a clear breakdown of the weightage for each section. I found that the actual exam reflected these percentages very well.</p>
<div class="hn-table">
<table>
<thead>
<tr>
<td><strong>Section</strong></td><td><strong>Weightage</strong></td></tr>
</thead>
<tbody>
<tr>
<td><strong>Fundamentals of GenAI</strong></td><td>30%</td></tr>
<tr>
<td><strong>Google Cloud's GenAI Offering</strong></td><td>35%</td></tr>
<tr>
<td><strong>Techniques to improve GenAI model output</strong></td><td>20%</td></tr>
<tr>
<td><strong>Business strategies for successful GenAI solutions</strong></td><td>15%</td></tr>
</tbody>
</table>
</div><h2 id="heading-resources-i-used">Resources I Used</h2>
<p>I relied on the official learning paths, my own hands-on experience, and my personal notes to prepare for this certification.</p>
<p><strong>1. Cloud Skills Boost Learning Paths</strong></p>
<ul>
<li><p><a target="_blank" href="https://www.skills.google/paths/1951"><strong>Generative AI Leader Path</strong></a><strong>:</strong> This path has five courses and is specifically aligned with the certification.</p>
</li>
<li><p><a target="_blank" href="https://www.skills.google/course_templates/536"><strong>Introduction to Generative AI Path</strong></a><strong>:</strong> If you are new to the field, I highly recommend going through this path as well. I particularly enjoyed the modules on Responsible AI, which are crucial for understanding Google Cloud's offerings.</p>
</li>
</ul>
<p><strong>2. Hands-on Experience</strong></p>
<p>While this is a foundational exam, having practical experience was a huge advantage. My experience working with the Gemini API and Vertex AI offerings, such as implementing RAG (Retrieval-Augmented Generation) with BigQuery, helped me understand the concepts not just theoretically but practically.</p>
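<p>If RAG is new to you, the core loop is simple: retrieve the documents most relevant to a question, then include them in the prompt as grounding context. Here is a deliberately tiny, dependency-free sketch of that flow; the keyword-overlap scoring stands in for a real embedding search, and the corpus and prompt template are made up for illustration:</p>
<pre><code class="lang-python"># Minimal RAG sketch: naive keyword retrieval + prompt assembly.
# In a real setup, retrieval would be an embedding/vector search
# and the assembled prompt would be sent to a model like Gemini.

DOCS = [
    "Cloud Run is a managed platform for running containers.",
    "BigQuery is a serverless data warehouse for analytics.",
    "Vertex AI provides tools for training and serving ML models.",
]

def retrieve(question, docs, k=2):
    """Score each doc by word overlap with the question; return the top k."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words.intersection(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, docs):
    """Assemble the grounded prompt for the model."""
    context = "\n".join(f"- {d}" for d in retrieve(question, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What is BigQuery used for?", DOCS)
</code></pre>
<p>Swapping the toy scoring for a vector search over your data and sending the prompt to an LLM turns this sketch into the real pipeline.</p>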
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://github.com/rishabkumar7/gemini-langchain-rag">https://github.com/rishabkumar7/gemini-langchain-rag</a></div>
<p> </p>
<p><strong>3. Personal Notes</strong></p>
<p>I also took extensive personal notes and created mind maps to help internalize the details. I've cleaned up and shared my personal notes publicly for anyone preparing:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://github.com/rishabkumar7/CloudNotes/blob/master/cloud/GCP-GenAILeader.md">https://github.com/rishabkumar7/CloudNotes/blob/master/cloud/GCP-GenAILeader.md</a></div>
<p> </p>
<h2 id="heading-the-experience">The Experience</h2>
<p>I found the online proctoring process straightforward, though it does require you to ensure your environment is clean and free of notes. I finished the exam in about 65 minutes, well within the 90-minute limit.</p>
<p>You get an immediate pass/fail result, but the official confirmation and your <a target="_blank" href="https://www.credly.com/badges/6162937c-5e6f-48ab-bad5-cd3cd670d832/public_url"><strong>Credly badge</strong></a> usually take about 24 hours to arrive via email.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1764174180340/b3bf4046-7027-460b-abbd-5055f956343a.png" alt class="image--center mx-auto" /></p>
<p>Good luck to anyone planning to take the Google Cloud Generative AI Leader exam! Let me know in the comments if you have any questions.</p>
]]></content:encoded></item><item><title><![CDATA[Early Mornings, Late Ambitions: Building a Tech Career on Discipline]]></title><description><![CDATA[A few years ago(2018), I was working full-time in Tech Support. The job paid the bills, but it wasn’t what I wanted to do long term.
I tried staying back after work to study AWS, but most days I was drained. After troubleshooting tickets for 8 hours,...]]></description><link>https://blog.rishabkumar.com/early-mornings-building-a-tech-career-on-discipline</link><guid isPermaLink="true">https://blog.rishabkumar.com/early-mornings-building-a-tech-career-on-discipline</guid><category><![CDATA[Career]]></category><category><![CDATA[Study ]]></category><category><![CDATA[career advice]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 07 Apr 2025 04:23:52 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1743999718157/4ccc4b8c-8c1c-4d07-9a40-c5facec75d7e.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>A few years ago(2018), I was working full-time in Tech Support. The job paid the bills, but it wasn’t what I wanted to do long term.</p>
<p>I tried staying back after work to study AWS, but most days I was drained. After troubleshooting tickets for 8 hours, my brain was fried. So I made a small shift — I started coming in early. One hour before my shift started, I’d grab a coffee and study.</p>
<p>That one change made all the difference.</p>
<p>No one told me to do it. There was no applause. But it gave me a sense of control. I wasn’t just reacting to life anymore — I was preparing for what I wanted next.</p>
<blockquote>
<p>“You have power over your mind – not outside events. Realize this, and you will find strength.”<br />— <em>Marcus Aurelius</em></p>
</blockquote>
<hr />
<h2 id="heading-why-bother">Why Bother?</h2>
<p>If you already work full-time, it’s easy to justify skipping the extra effort.</p>
<p>“I’m tired.”<br />“There’s no time.”<br />“I’ll start next week.”</p>
<p>I told myself the same things.</p>
<p>But the truth is, you don’t need motivation. You need a reason. Mine was simple: I wanted to do better for my family — and my bank account was a constant reminder of that.</p>
<blockquote>
<p>“First say to yourself what you would be; and then do what you have to do.”<br />— <em>Epictetus</em></p>
</blockquote>
<hr />
<h2 id="heading-study-to-excel-not-just-to-escape">Study to Excel, Not Just to Escape</h2>
<p>When I got promoted to a Cloud Engineer role, I didn’t stop studying.</p>
<p>But the focus shifted. I wasn’t studying to get out — I was studying to get better. I spent time learning architecture patterns, reading about what other companies were building, and figuring out how to apply those ideas in our own environment.</p>
<p>This wasn’t about certifications or titles anymore. It was about doing the job well — with care and discipline.</p>
<hr />
<h2 id="heading-covid-changed-my-routine-again">COVID Changed My Routine Again</h2>
<p>When the pandemic hit, life slowed down. No commuting. No social plans. I got back into a consistent morning routine: wake up at <strong>5 AM</strong>, study until <strong>7</strong>, then start work.</p>
<p>That year, I sat <strong>12 certifications</strong>.</p>
<p>But more importantly, I built momentum. That led to me leading DevOps and SRE projects — and eventually transitioning fully into a DevOps role.</p>
<blockquote>
<p>“Don’t explain your philosophy. Embody it.”<br />— <em>Epictetus</em></p>
</blockquote>
<hr />
<h2 id="heading-the-final-push-big-tech">The Final Push: Big Tech</h2>
<p>At one point, I decided I wanted to break into Big Tech — Amazon, Google, Microsoft.</p>
<p>So I started studying 2–3 hours a day. Not because I had endless energy, but because I made a decision. I blocked out time. I protected that time.</p>
<p>No hacks. Just intention.</p>
<hr />
<h2 id="heading-studying-feels-like-levelling-up">Studying Feels Like Levelling Up</h2>
<p>Eventually, I started enjoying the process. Studying became like levelling up a character in a video game. Every new concept I understood felt like unlocking a new skill tree.</p>
<p>That mindset helped me stay consistent. Even when it felt boring. Even when I didn’t see progress.</p>
<blockquote>
<p>“If a person gave away your body to some passerby, you’d be furious. Yet you hand over your mind to anyone who comes along.”<br />— <em>Epictetus</em></p>
</blockquote>
<hr />
<h2 id="heading-guard-your-focus">Guard Your Focus</h2>
<p>Lately, it’s been harder to focus. Social media — especially short-form content — makes it tough to sit still and learn.</p>
<p>But here’s the thing: the ability to focus is a skill now. And if you can build it, you already have an edge. While everyone else is scrolling, you’re building. I have some more thoughts on <a target="_blank" href="https://blog.rishabkumar.com/how-smartphones-are-robbing-us">how our moms were right about the phone!</a></p>
<blockquote>
<p>“The things you think about determine the quality of your mind.”<br />— <em>Marcus Aurelius</em></p>
</blockquote>
<hr />
<h2 id="heading-so-how-do-you-study-with-a-full-time-job">So, How Do You Study With a Full-Time Job?</h2>
<p>Here’s what worked for me:</p>
<ul>
<li><p><strong>Time-block your mornings.</strong> Even 30–60 minutes is enough if you’re consistent.</p>
</li>
<li><p><strong>Study when your energy is highest.</strong> For me, that was mornings. For you, it might be evenings or lunch breaks.</p>
</li>
<li><p><strong>Have a clear goal.</strong> Not just “learn stuff,” but “get better at X,” or “prepare for Y.”</p>
</li>
<li><p><strong>Treat distractions like enemies.</strong> Your phone isn’t neutral — it’s a constant test of your discipline.</p>
</li>
<li><p><strong>Accept that it’s not supposed to be easy.</strong> It’s supposed to be worth it.</p>
</li>
</ul>
<hr />
<h2 id="heading-final-thoughts">Final Thoughts</h2>
<p>You don’t need a perfect setup to start. You just need to start.</p>
<p>I didn’t go to some fancy school. I worked nights at a gas station and delivered pizzas. I studied in the mornings before work and learned as I went.</p>
<p>The point isn’t to “hustle 24/7.” The point is to carve out <em>some</em> time, consistently, and use it with intention.</p>
<blockquote>
<p>“Waste no more time arguing what a good man should be. Be one.”<br />— <em>Marcus Aurelius</em></p>
</blockquote>
<p>Because studying with a full-time job is hard. But so is staying in a role that doesn’t challenge you.</p>
<p>Choose your hard.</p>
]]></content:encoded></item><item><title><![CDATA[FastAPI as AWS Lambda Function]]></title><description><![CDATA[As a fan of both FastAPI and AWS Lambda, I wanted to share a quick tutorial on how to combine these two awesome technologies. If you're not familiar with AWS Lambda, it's a serverless compute service that lets you run code without managing servers. A...]]></description><link>https://blog.rishabkumar.com/fastapi-as-aws-lambda-function</link><guid isPermaLink="true">https://blog.rishabkumar.com/fastapi-as-aws-lambda-function</guid><category><![CDATA[FastAPI]]></category><category><![CDATA[Python]]></category><category><![CDATA[AWS]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Fri, 21 Feb 2025 21:23:35 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1740172794678/ade6b1eb-37d7-4cca-993d-8c2d200bfc90.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>As a fan of both <a target="_blank" href="https://fastapi.tiangolo.com/">FastAPI</a> and <a target="_blank" href="https://aws.amazon.com/lambda/">AWS Lambda</a>, I wanted to share a quick tutorial on how to combine these two awesome technologies. If you're not familiar with AWS Lambda, it's a serverless compute service that lets you run code without managing servers. And FastAPI? It's one of my favorite Python frameworks for building APIs!</p>
<p>In this article, I'll show you how to deploy a FastAPI application to AWS Lambda using <a target="_blank" href="https://pypi.org/project/mangum/">Mangum</a>. Let's dive in!</p>
<h2 id="heading-prerequisites">Prerequisites</h2>
<ul>
<li><p><a target="_blank" href="https://www.python.org/downloads/">Python 3.10 installed</a></p>
</li>
<li><p><a target="_blank" href="https://signin.aws.amazon.com/signup?request_type=register">AWS account</a></p>
</li>
<li><p><a target="_blank" href="https://youtu.be/LVuxmQfqivA?si=lC-BVcTcRyzM7JOb">Basic understanding of FastAPI</a></p>
</li>
<li><p>VS Code (or your favorite code editor)</p>
</li>
</ul>
<h2 id="heading-project-setup">Project Setup</h2>
<p>First, let's create the directory for our project:</p>
<pre><code class="lang-bash"><span class="hljs-comment"># Create project directory</span>
mkdir fastapi-lambda-function 
<span class="hljs-built_in">cd</span> fastapi-lambda-function
</code></pre>
<p>Create and activate virtual environment</p>
<pre><code class="lang-bash">python -m venv venv 
<span class="hljs-built_in">source</span> venv/bin/activate  <span class="hljs-comment"># On Windows: venv\Scripts\activate</span>
</code></pre>
<p>Create two files in your project:</p>
<ul>
<li><p><code>requirements.txt</code></p>
</li>
<li><p><code>main.py</code></p>
</li>
</ul>
<p>In <code>requirements.txt</code>, add these dependencies:</p>
<pre><code class="lang-bash">fastapi==0.99.0  <span class="hljs-comment"># Using this version due to current compatibility</span>
mangum
</code></pre>
<p>Install the packages:</p>
<pre><code class="lang-bash">pip install -r requirements.txt
</code></pre>
<h2 id="heading-creating-our-fastapi-application">Creating Our FastAPI Application</h2>
<p>In <code>main.py</code>, let's create a basic FastAPI app:</p>
<pre><code class="lang-python">from fastapi import FastAPI
from mangum import Mangum

app = FastAPI()

@app.get("/")
async def read_root():
    return {"message": "Hello, this is my first FastAPI app running on AWS Lambda!"}

# Add another route for demonstration
@app.get("/items/{item_id}")
async def read_item(item_id: int, q: str = None):
    return {"item_id": item_id, "q": q}

# This is the key for Lambda!
handler = Mangum(app)
</code></pre>
<p>This basic FastAPI app has two endpoints: the root and a simple <code>/items</code> endpoint. The important part is the last line: wrapping the app with <code>Mangum</code> produces the <code>handler</code> that AWS Lambda will invoke.</p>
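<p>To make the Mangum line less magical: when a request hits your Function URL, Lambda hands your code a JSON event (HTTP payload format version 2.0), and Mangum translates it into an ASGI request that FastAPI routes as usual. Here is an abbreviated sketch of what that event looks like for <code>GET /items/1?q=test</code> (fields trimmed, and the host is a placeholder):</p>
<pre><code class="lang-python"># Abbreviated Lambda Function URL event (HTTP payload format 2.0).
# Mangum reads fields like rawPath and requestContext.http.method
# to build the ASGI scope that FastAPI routes on.
sample_event = {
    "version": "2.0",
    "routeKey": "$default",
    "rawPath": "/items/1",
    "rawQueryString": "q=test",
    "headers": {"host": "example.lambda-url.us-east-1.on.aws"},
    "requestContext": {
        "http": {"method": "GET", "path": "/items/1", "sourceIp": "203.0.113.10"},
    },
    "isBase64Encoded": False,
}

# Calling handler(sample_event, context) returns a dict with "statusCode"
# and "body", which Lambda converts back into the HTTP response.
</code></pre>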
<h2 id="heading-packaging-for-aws-lambda">Packaging for AWS Lambda</h2>
<p>Now comes the fun part - packaging our application for Lambda! We need to create a ZIP file containing all our dependencies and code.</p>
<p>On Windows you can use PowerShell:</p>
<pre><code class="lang-powershell"><span class="hljs-comment"># Zip dependencies</span>
Compress-Archive -Path <span class="hljs-string">"venv/Lib/site-packages/*"</span> -DestinationPath <span class="hljs-string">"aws-lambda.zip"</span>

<span class="hljs-comment"># Add main.py to the zip</span>
Compress-Archive -Path <span class="hljs-string">"main.py"</span> -Update -DestinationPath <span class="hljs-string">"aws-lambda.zip"</span>
</code></pre>
<p>On Unix/Linux:</p>
<pre><code class="lang-bash"><span class="hljs-built_in">cd</span> venv/lib/python3.10/site-packages
zip -r ../../../../aws-lambda.zip .
<span class="hljs-built_in">cd</span> ../../../../
zip -g aws-lambda.zip main.py
</code></pre>
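<p>If you want a cross-platform alternative to the shell commands above, the same packaging can be scripted with Python's standard library. This sketch builds a demo zip from stand-in files so it runs anywhere; in practice you would point <code>site_packages</code> and <code>main_py</code> at your real paths:</p>
<pre><code class="lang-python"># Cross-platform Lambda packaging sketch using only the stdlib.
# Dependencies must sit at the zip root, with main.py alongside them.
import tempfile
import zipfile
from pathlib import Path

def build_lambda_zip(site_packages, main_py, out_zip):
    """Zip every file under site_packages at the archive root, then add main.py."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in Path(site_packages).rglob("*"):
            if f.is_file():
                # arcname relative to site-packages, so imports resolve at the root
                zf.write(f, f.relative_to(site_packages))
        zf.write(main_py, "main.py")

# Demo with stand-in files (replace with your venv's site-packages and main.py)
tmp = Path(tempfile.mkdtemp())
pkgs = tmp / "site-packages"
pkgs.mkdir()
(pkgs / "fake_dep.py").write_text("VERSION = '1.0'\n")
(tmp / "main.py").write_text("handler = None\n")
build_lambda_zip(pkgs, tmp / "main.py", tmp / "aws-lambda.zip")
names = zipfile.ZipFile(tmp / "aws-lambda.zip").namelist()
</code></pre>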
<h2 id="heading-deploying-to-aws-lambda">Deploying to AWS Lambda</h2>
<ol>
<li><p>Go to <a target="_blank" href="https://console.aws.amazon.com/">AWS Console</a> and search for Lambda</p>
</li>
<li><p>Click "Create function"</p>
<p> <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170443878/0ee11fac-ff56-4f4c-b762-2314ffe68d93.png" alt class="image--center mx-auto" /></p>
</li>
<li><p>Choose "Author from scratch"</p>
</li>
<li><p>Configure the basics:</p>
<ul>
<li><p>Name: <code>fastapi-lambda</code></p>
</li>
<li><p>Runtime: Python 3.10</p>
</li>
<li><p>Architecture: x86_64</p>
</li>
<li><p>Default execution role: Create new basic Lambda role  </p>
<p>  <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170530009/c09ca265-7de0-436d-a51a-ffd4647c5873.png" alt class="image--center mx-auto" /></p>
</li>
</ul>
</li>
<li><p>Enable Function URL:</p>
<ul>
<li><p>Click "Configuration" tab</p>
</li>
<li><p>Find "Function URL" and enable it</p>
</li>
<li><p>Auth type: NONE (for this demo)</p>
</li>
<li><p>Click "Save"</p>
<p>  <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170550891/45c36928-96fb-401b-8e3b-95d00dfd53c3.png" alt class="image--center mx-auto" /></p>
</li>
</ul>
</li>
<li><p>Upload your code:</p>
<ul>
<li><p>Go to "Code" tab</p>
</li>
<li><p>Click "Upload from" and choose ".zip file"</p>
</li>
<li><p>Upload your <code>aws-lambda.zip</code></p>
</li>
<li><p>Click "Save"</p>
<p>  <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170570628/f5dd299d-5f1d-45ba-8886-c44b1dc0d32e.png" alt class="image--center mx-auto" /></p>
</li>
</ul>
</li>
<li><p>Configure the handler:</p>
<ul>
<li><p>Go to "Runtime settings"</p>
</li>
<li><p>Click "Edit"</p>
</li>
<li><p>Change Handler to: <code>main.handler</code></p>
</li>
<li><p>Click "Save"</p>
</li>
</ul>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170667787/1243c2a9-2133-46c8-a7a7-b8e332bd821a.png" alt class="image--center mx-auto" /></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1740170628922/8f5ddb00-4651-4093-85e3-d27c01400983.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-testing-your-api">Testing Your API</h2>
<p>Once deployed, you'll get a Function URL. Try these endpoints:</p>
<ul>
<li><p><code>&lt;your-function-url&gt;/</code> - Should return your welcome message</p>
</li>
<li><p><code>&lt;your-function-url&gt;/items/1</code> - Should return <code>{"item_id": 1, "q": null}</code></p>
</li>
<li><p><code>&lt;your-function-url&gt;/items/100</code> - Should return <code>{"item_id": 100, "q": null}</code></p>
</li>
</ul>
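<p>Beyond the browser, you can smoke-test the endpoints from a short script. Here's a minimal sketch using only the Python standard library; the Function URL below is a placeholder, so substitute the one Lambda gives you:</p>

```python
import json
import urllib.request

# Placeholder -- replace with your actual Lambda Function URL.
BASE_URL = "https://abc123.lambda-url.us-east-1.on.aws"

def item_url(base: str, item_id: int) -> str:
    """Build the /items/{item_id} endpoint URL."""
    return f"{base.rstrip('/')}/items/{item_id}"

def fetch_json(url: str) -> dict:
    """GET a URL and decode its JSON body."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode())

# print(fetch_json(item_url(BASE_URL, 1)))  # expect {"item_id": 1, "q": None}
```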
<h2 id="heading-whats-next">What's Next?</h2>
<p>This is just a basic example of what you can do with FastAPI and AWS Lambda. In production, you might want to consider:</p>
<ul>
<li><p>Setting up proper authentication</p>
</li>
<li><p>Using API Gateway for more control</p>
</li>
<li><p>Implementing proper error handling</p>
</li>
<li><p>Setting up CI/CD pipelines</p>
</li>
</ul>
<p>If you found this helpful, don't forget to follow me for more cloud and DevOps content! Check out my <a target="_blank" href="https://youtube.com/@rishabincloud">YouTube channel</a> where I have a detailed video walkthrough of this tutorial.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/b0XCH04K8eQ?si=dkcxSd0uyucTL1Kz">https://youtu.be/b0XCH04K8eQ?si=dkcxSd0uyucTL1Kz</a></div>
<p> </p>
<h2 id="heading-common-issues">Common Issues</h2>
<ul>
<li><p>If you're getting errors with the latest FastAPI version, stick to version <code>0.99.0</code> as specified in the requirements.</p>
</li>
<li><p>Make sure your handler name matches exactly <code>main.handler</code>.</p>
</li>
<li><p>Check your Lambda execution role has proper permissions.</p>
</li>
</ul>
<p>Happy coding!</p>
<p>P.S. All code from this tutorial is available in my <a target="_blank" href="https://github.com/rishabkumar7/fastapi-aws-lambda">GitHub repo</a>.</p>
]]></content:encoded></item><item><title><![CDATA[Deploying Uptime Kuma to GCP using Terraform]]></title><description><![CDATA[If you've ever needed to monitor your websites or services, you know how expensive monitoring tools can get. I recently have been exploring monitoring solutions, for my personal projects as well as Learn to Cloud and I stumbled upon Uptime Kuma.Uptim...]]></description><link>https://blog.rishabkumar.com/deploy-uptime-kuma-to-gcp-terraform</link><guid isPermaLink="true">https://blog.rishabkumar.com/deploy-uptime-kuma-to-gcp-terraform</guid><category><![CDATA[GCP]]></category><category><![CDATA[Terraform]]></category><category><![CDATA[monitoring]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Tue, 17 Dec 2024 16:48:57 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1734383106290/89795c47-b92d-4629-bb51-ef43b53ed8c2.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>If you've ever needed to monitor your websites or services, you know how expensive monitoring tools can get. I recently have been exploring monitoring solutions, for my personal projects as well as <a target="_blank" href="https://learntocloud.guide">Learn to Cloud</a> and I stumbled upon Uptime Kuma.<br /><a target="_blank" href="https://github.com/louislam/uptime-kuma">Uptime Kuma</a> is a fantastic open-source and self-hosted monitoring tool that lets you track pretty much anything - websites, APIs, DNS records, Docker containers, and even Steam game servers (yes, really!).</p>
<p>What makes Uptime Kuma stand out is its simplicity. You get a clean, responsive UI, monitoring intervals as low as 20 seconds, and notifications through various services like - Telegram, Discord, Slack, email. Plus, it supports multi-language interfaces and lets you create multiple status pages, which is perfect if you're managing different projects.</p>
<p>Today, we're going to deploy Uptime Kuma to Google Cloud Platform using Terraform. Why GCP? Because their free tier is pretty generous, and with the setup we'll use, you can run this without spending a dime. And why Terraform? Because nobody wants to click through cloud console menus every time they need to set up their cloud infrastructure.</p>
<p>Let's get this monitoring system up and running!</p>
<h2 id="heading-prerequisites">Prerequisites</h2>
<p>Before we dive in, make sure you have:</p>
<ul>
<li><p>A <a target="_blank" href="https://console.cloud.google.com/">Google Cloud account</a> with billing enabled</p>
</li>
<li><p><a target="_blank" href="https://www.terraform.io/downloads.html">Terraform</a> installed on your machine</p>
</li>
<li><p><a target="_blank" href="https://cloud.google.com/sdk/docs/install">Google Cloud CLI</a> installed and authenticated</p>
</li>
<li><p>A basic understanding of GCP and Terraform concepts</p>
</li>
</ul>
<p>If you aren’t familiar with Terraform and GCP concepts or need a refresher, check out <a target="_blank" href="https://www.youtube.com/watch?v=VCayKl82Lt8">this free course on freeCodeCamp.</a></p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.youtube.com/watch?v=VCayKl82Lt8">https://www.youtube.com/watch?v=VCayKl82Lt8</a></div>
<p> </p>
<h2 id="heading-project-setup">Project Setup</h2>
<p>Let's start by creating a new directory for our Terraform configuration:</p>
<pre><code class="lang-bash">mkdir uptime-kuma-terraform
<span class="hljs-built_in">cd</span> uptime-kuma-terraform
</code></pre>
<h2 id="heading-the-terraform-configuration">The Terraform Configuration</h2>
<p>We'll need to create a single file named <code>main.tf</code>. This file will contain all our infrastructure configuration:</p>
<pre><code class="lang-bash">terraform {
  required_providers {
    google = {
      <span class="hljs-built_in">source</span>  = <span class="hljs-string">"hashicorp/google"</span>
      version = <span class="hljs-string">"~&gt; 5.0"</span>
    }
  }

  required_version = <span class="hljs-string">"&gt;= 1.0.0"</span>
}

provider <span class="hljs-string">"google"</span> {
  project = var.project_id
  region  = var.region
}

<span class="hljs-comment"># Create a persistent disk for Uptime Kuma data</span>
resource <span class="hljs-string">"google_compute_disk"</span> <span class="hljs-string">"kuma_disk"</span> {
  name = <span class="hljs-string">"kuma-disk"</span>
  <span class="hljs-built_in">type</span> = <span class="hljs-string">"pd-standard"</span>
  size = var.disk_size
  zone = var.zone
}

<span class="hljs-comment"># Create VPC network firewall rule for Uptime Kuma</span>
resource <span class="hljs-string">"google_compute_firewall"</span> <span class="hljs-string">"kuma_firewall"</span> {
  name    = <span class="hljs-string">"allow-kuma-3001"</span>
  network = <span class="hljs-string">"default"</span>

  allow {
    protocol = <span class="hljs-string">"tcp"</span>
    ports    = [<span class="hljs-string">"3001"</span>]
  }

  source_ranges = [<span class="hljs-string">"0.0.0.0/0"</span>]
  target_tags   = [<span class="hljs-string">"kuma-3001"</span>]
}

<span class="hljs-comment"># Create the VM instance</span>
resource <span class="hljs-string">"google_compute_instance"</span> <span class="hljs-string">"uptime_kuma"</span> {
  name         = <span class="hljs-string">"uptime-kuma-vm"</span>
  machine_type = var.machine_type
  zone         = var.zone

  tags = [<span class="hljs-string">"kuma-3001"</span>]

  boot_disk {
    initialize_params {
      image = <span class="hljs-string">"cos-cloud/cos-stable"</span>
      size  = var.boot_disk_size
    }
  }

  attached_disk {
    <span class="hljs-built_in">source</span>      = google_compute_disk.kuma_disk.self_link
    device_name = <span class="hljs-string">"kuma-data"</span>
  }

  network_interface {
    network = <span class="hljs-string">"default"</span>
    access_config {
      // Ephemeral public IP
    }
  }

  metadata = {
    google-logging-enabled = <span class="hljs-string">"true"</span>

    <span class="hljs-comment"># Configure the container</span>
    gce-container-declaration = yamlencode({
      spec = {
        containers = [{
          name = <span class="hljs-string">"uptime-kuma"</span>
          image = <span class="hljs-string">"registry.hub.docker.com/louislam/uptime-kuma:1-debian"</span>
          securityContext = {
            privileged = <span class="hljs-literal">false</span>
          }
          volumeMounts = [{
            name = <span class="hljs-string">"kuma-data"</span>
            mountPath = <span class="hljs-string">"/app/data"</span>
            readOnly = <span class="hljs-literal">false</span>
          }]
          ports = [{
            containerPort = 3001
            hostPort = 3001
          }]
        }]
        volumes = [{
          name = <span class="hljs-string">"kuma-data"</span>
          hostPath = {
            path = <span class="hljs-string">"/mnt/disks/kuma-data"</span>
          }
        }]
        restartPolicy = <span class="hljs-string">"Always"</span>
      }
    })

    <span class="hljs-comment"># Format and mount the persistent disk</span>
    startup-script = &lt;&lt;-EOF
      <span class="hljs-comment">#!/bin/bash</span>
      <span class="hljs-keyword">if</span> [ ! -d <span class="hljs-string">"/mnt/disks/kuma-data"</span> ]; <span class="hljs-keyword">then</span>
        sudo mkdir -p /mnt/disks/kuma-data
        sudo mkfs.ext4 -F /dev/disk/by-id/google-kuma-data
        sudo mount -o discard,defaults /dev/disk/by-id/google-kuma-data /mnt/disks/kuma-data
        sudo chmod a+w /mnt/disks/kuma-data
      <span class="hljs-keyword">fi</span>
    EOF
  }

  service_account {
    scopes = [<span class="hljs-string">"cloud-platform"</span>]
  }
}

<span class="hljs-comment"># outputs.tf</span>
output <span class="hljs-string">"instance_external_ip"</span> {
  value       = google_compute_instance.uptime_kuma.network_interface[0].access_config[0].nat_ip
  description = <span class="hljs-string">"The external IP address of the Uptime Kuma instance"</span>
}

output <span class="hljs-string">"uptime_kuma_url"</span> {
  value       = <span class="hljs-string">"http://<span class="hljs-variable">${google_compute_instance.uptime_kuma.network_interface[0].access_config[0].nat_ip}</span>:3001"</span>
  description = <span class="hljs-string">"The URL to access Uptime Kuma"</span>
}
</code></pre>
<p>Since we reference a few variables in our <code>main.tf</code>, let's define them in <code>variables.tf</code>:</p>
<pre><code class="lang-bash">variable <span class="hljs-string">"project_id"</span> {
  description = <span class="hljs-string">"The ID of the GCP project"</span>
  <span class="hljs-built_in">type</span>        = string
}

variable <span class="hljs-string">"region"</span> {
  description = <span class="hljs-string">"The region to deploy resources to"</span>
  <span class="hljs-built_in">type</span>        = string
  default     = <span class="hljs-string">"us-central1"</span>
}

variable <span class="hljs-string">"zone"</span> {
  description = <span class="hljs-string">"The zone to deploy resources to"</span>
  <span class="hljs-built_in">type</span>        = string
  default     = <span class="hljs-string">"us-central1-a"</span>
}

variable <span class="hljs-string">"machine_type"</span> {
  description = <span class="hljs-string">"The machine type to use for the VM instance"</span>
  <span class="hljs-built_in">type</span>        = string
  default     = <span class="hljs-string">"e2-micro"</span>
}

variable <span class="hljs-string">"disk_size"</span> {
  description = <span class="hljs-string">"Size of the persistent disk in GB"</span>
  <span class="hljs-built_in">type</span>        = number
  default     = 20
}

variable <span class="hljs-string">"boot_disk_size"</span> {
  description = <span class="hljs-string">"Size of the boot disk in GB"</span>
  <span class="hljs-built_in">type</span>        = number
  default     = 10
}
</code></pre>
<p>Now, create a <code>terraform.tfvars</code> file to specify your project ID:</p>
<pre><code class="lang-bash">project_id = <span class="hljs-string">"your-project-id"</span>
<span class="hljs-comment"># Optionally override other variables:</span>
<span class="hljs-comment"># region = "us-west1"</span>
<span class="hljs-comment"># zone = "us-west1-a"</span>
<span class="hljs-comment"># disk_size = 30</span>
<span class="hljs-comment"># machine_type = "e2-small"</span>
</code></pre>
<h2 id="heading-whats-being-created">What's Being Created?</h2>
<p>Let's break down what this Terraform configuration is doing:</p>
<ol>
<li><p><strong>VM Instance</strong>: We're creating an e2-micro instance (free tier eligible) using Container-Optimized OS, which is perfect for running Docker containers.</p>
</li>
<li><p><strong>Persistent Storage</strong>: A 20GB persistent disk is attached to store your Uptime Kuma data, ensuring your monitoring history and settings survive VM restarts.</p>
</li>
<li><p><strong>Container Configuration</strong>: The VM is automatically configured to run the Uptime Kuma Docker container with proper volume mounting for persistent storage.</p>
</li>
<li><p><strong>Networking</strong>: A firewall rule is created to allow traffic to Uptime Kuma's default port (3001).</p>
</li>
</ol>
<h2 id="heading-deployment">Deployment</h2>
<p>Now that our Terraform configuration is ready, let’s deploy our infrastructure.</p>
<ol>
<li>First, initialize Terraform:</li>
</ol>
<pre><code class="lang-bash">terraform init
</code></pre>
<ol start="2">
<li>Preview the changes that will be made:</li>
</ol>
<pre><code class="lang-bash">terraform plan
</code></pre>
<ol start="3">
<li>Deploy the infrastructure:</li>
</ol>
<pre><code class="lang-bash">terraform apply
</code></pre>
<p>When prompted, type <code>yes</code> to confirm. The deployment will take about 2-3 minutes.</p>
<h2 id="heading-accessing-uptime-kuma">Accessing Uptime Kuma</h2>
<p>Once deployment is complete, Terraform will output your instance's IP address and the full URL to access Uptime Kuma.</p>
<pre><code class="lang-bash">Apply complete! Resources: 3 added, 0 changed, 0 destroyed.

Outputs:

instance_external_ip = <span class="hljs-string">"34.71.97.11"</span>
uptime_kuma_url = <span class="hljs-string">"http://34.71.97.11:3001"</span>
</code></pre>
<p>Just open your browser and navigate to: <code>http://&lt;your-instance-ip&gt;:3001</code></p>
<p>You'll see the Uptime Kuma setup screen where you can create your admin account and start monitoring your services.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1734380886514/552327a1-e979-4e03-a0da-7b52a89fe33a.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-cleaning-up">Cleaning Up</h2>
<p>If you ever want to tear down the infrastructure:</p>
<pre><code class="lang-bash">terraform destroy
</code></pre>
<div data-node-type="callout">
<div data-node-type="callout-emoji">⚠</div>
<div data-node-type="callout-text">Remember: This will delete everything, including your monitoring history!</div>
</div>

<h2 id="heading-pro-tips">Pro Tips</h2>
<p>Here are some useful tips for managing your deployment:</p>
<ol>
<li><p><strong>Cost Management</strong>: The configuration uses free-tier eligible resources (e2-micro instance and standard persistent disk), so you shouldn't incur any charges if you're within the free tier limits.</p>
</li>
<li><p><strong>Customization</strong>: You can easily modify the configuration by changing variables in your <code>terraform.tfvars</code> file:</p>
<ul>
<li><p>Need more storage? Adjust <code>disk_size</code></p>
</li>
<li><p>Want a more powerful VM? Change <code>machine_type</code></p>
</li>
<li><p>Different region? Update <code>region</code> and <code>zone</code></p>
</li>
</ul>
</li>
<li><p><strong>State Management</strong>: Keep your <code>terraform.tfstate</code> file safe - it's how Terraform tracks your resources. Consider using <a target="_blank" href="https://www.terraform.io/docs/language/state/remote.html">remote state</a> for team environments.</p>
</li>
</ol>
<h2 id="heading-troubleshooting">Troubleshooting</h2>
<p>If you can't access Uptime Kuma after deployment:</p>
<ol>
<li><p>Wait a few minutes for the container to start up completely</p>
</li>
<li><p>Check if your VM is running in the GCP Console</p>
</li>
<li><p>Verify the firewall rule was created correctly</p>
</li>
<li><p>SSH into the VM to check container logs:</p>
</li>
</ol>
<pre><code class="lang-bash">gcloud compute ssh uptime-kuma-vm
docker ps
docker logs $(docker ps -q)
</code></pre>
<h2 id="heading-conclusion">Conclusion</h2>
<p>And there you have it! You've just automated the deployment of Uptime Kuma on Google Cloud. No more manual configuration or clicking through the console - just clean, repeatable infrastructure as code.</p>
<p>The best part? This setup is completely free-tier eligible and takes care of all the little details like persistent storage and container configuration. You can now deploy and destroy your monitoring environment with a single command!</p>
<p>Let me know in the comments if you have any questions or run into issues. Happy monitoring! 🚀</p>
<h2 id="heading-resources">Resources</h2>
<ul>
<li><p><a target="_blank" href="https://github.com/rishabkumar7/uptime-kuma-gcp-vm">GitHub Repository with Terraform Code</a></p>
</li>
<li><p><a target="_blank" href="https://github.com/louislam/uptime-kuma">Uptime Kuma GitHub Repository</a></p>
</li>
<li><p><a target="_blank" href="https://www.terraform.io/docs">Terraform Documentation</a></p>
</li>
<li><p><a target="_blank" href="https://cloud.google.com/free">Google Cloud Free Tier Details</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[How to Pass the AI-102 Azure AI Engineer Associate Certification]]></title><description><![CDATA[Are you preparing for the AI-102 Azure AI Engineer Associate certification? I recently passed mine on 19th June, 2024. This guide will provide you with valuable insights and tips to help you succeed in your exam preparation.
Exam Overview

Exam Code:...]]></description><link>https://blog.rishabkumar.com/pass-ai-102-azure-ai-engineer-associate</link><guid isPermaLink="true">https://blog.rishabkumar.com/pass-ai-102-azure-ai-engineer-associate</guid><category><![CDATA[Azure]]></category><category><![CDATA[AI]]></category><category><![CDATA[Certification]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Thu, 27 Jun 2024 20:54:54 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1719419679691/7a4eaae0-e5a2-4259-8c3e-f2aee5e1e7a2.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Are you preparing for the AI-102 Azure AI Engineer Associate certification? I recently passed mine on 19th June, 2024. This guide will provide you with valuable insights and tips to help you succeed in your exam preparation.</p>
<h2 id="heading-exam-overview">Exam Overview</h2>
<ul>
<li><p><strong>Exam Code</strong>: AI-102</p>
</li>
<li><p><strong>Level</strong>: Intermediate</p>
</li>
<li><p><strong>Duration</strong>: 2 hours (with 1 hour 40 minutes for the actual exam)</p>
</li>
<li><p><strong>Number of Questions</strong>: ~50 (varies per exam)</p>
</li>
<li><p><strong>Passing Score</strong>: 700 out of 1000</p>
</li>
</ul>
<h2 id="heading-skills-measured">Skills Measured</h2>
<p>The exam covers six main sections:</p>
<ul>
<li><p>Plan and manage an Azure AI solution (15–20%)</p>
</li>
<li><p>Implement content moderation solutions (10–15%)</p>
</li>
<li><p>Implement computer vision solutions (15–20%)</p>
</li>
<li><p>Implement natural language processing solutions (30–35%)</p>
</li>
<li><p>Implement knowledge mining and document intelligence solutions (10–15%)</p>
</li>
<li><p>Implement generative AI solutions (10–15%)</p>
</li>
</ul>
<h2 id="heading-exam-format-and-tips">Exam Format and Tips</h2>
<ul>
<li><p>Choose between Python or C# for SDK-related questions</p>
</li>
<li><p>Access to Microsoft Learn is provided during the exam</p>
</li>
<li><p>Time management is crucial - don't spend too long on any single question</p>
</li>
</ul>
<h2 id="heading-preparation-resources">Preparation Resources</h2>
<ol>
<li><p><strong>Microsoft Learn</strong>: The primary resource for exam preparation</p>
<ul>
<li><p>Course: "Designing and Implementing a Microsoft AI Solution"</p>
</li>
<li><p>Complete all modules, including hands-on labs</p>
</li>
</ul>
</li>
<li><p><strong>Official Study Guide</strong>: Available on <a target="_blank" href="https://learn.microsoft.com/en-us/credentials/certifications/resources/study-guides/ai-102">Microsoft Learn</a></p>
<ul>
<li>Provides a detailed breakdown of exam objectives</li>
</ul>
</li>
<li><p><strong>AI-102 Study Cram Course</strong> by <a target="_blank" href="https://www.youtube.com/watch?v=I7fdWafTcPY&amp;t=0s">John Savill</a></p>
<ul>
<li><p>Offers a concise overview of exam topics and Azure AI services</p>
</li>
</ul>
</li>
<li><p><strong>Hands-on Projects</strong>:</p>
<ul>
<li><p>Build real Azure AI solutions to gain practical experience</p>
</li>
<li><p>Experiment with Azure AI Search, Cognitive Services, and other relevant technologies</p>
</li>
</ul>
</li>
<li><p><strong>Azure AI Services Familiarity</strong>:</p>
<ul>
<li><p>Understand the use cases for various Azure AI services</p>
</li>
<li><p>Learn about recent rebranding (e.g., Cognitive Services now use "AI" in their names)</p>
</li>
</ul>
</li>
<li><p><strong>Docker Deployment</strong>:</p>
<ul>
<li><p>Learn how to deploy AI services using Docker containers</p>
</li>
<li><p>Understand the setup process, including API keys and endpoints</p>
</li>
</ul>
</li>
</ol>
<h2 id="heading-my-personal-exam-experience">My Personal Exam Experience</h2>
<h3 id="heading-time-management">Time Management</h3>
<p>The exam lasted the full two hours, and I used every minute available. In fact, time management turned out to be one of the biggest challenges. Despite my best efforts, I wasn't able to answer all the questions in time; when the clock ran out, I still had 2-3 questions unanswered. This experience taught me the importance of pacing myself throughout the exam and not spending too much time on any single question or on hunting for the right docs in Microsoft Learn.</p>
<h3 id="heading-using-microsoft-learn-during-the-exam">Using Microsoft Learn During the Exam</h3>
<p>As for the access to Microsoft Learn during the test: while it can be helpful, it's also a potential time sink. I found myself looking up information for some questions, which ate into my available time. In retrospect, I should have been more selective about when to use this resource. It's there to help, but relying on it too much can prevent you from finishing all the questions.</p>
<h3 id="heading-the-result">The Result</h3>
<p>Despite the time crunch at the end, I'm happy to report that I passed the exam with a score of 760 out of 1000, close to the passing score of 700. The score report provided a breakdown of my performance in each section, which was helpful for identifying areas where I could improve further.</p>
<h3 id="heading-lessons-learned">Lessons Learned</h3>
<ol>
<li><p><strong>Time management is crucial</strong>: Practice working through sample questions under timed conditions to improve your pacing.</p>
</li>
<li><p><strong>Use Microsoft Learn judiciously</strong>: While it's a valuable resource, be strategic about when you use it during the exam.</p>
</li>
<li><p><strong>Focus on weaker areas</strong>: My score report showed that I performed least well in the "Plan and manage Azure AI solution" section. Identifying such areas can help you focus your continued learning and improvement.</p>
</li>
</ol>
<p>Remember, everyone's experience will be unique, but I hope sharing mine gives you a better idea of what to expect and how to prepare.</p>
<h2 id="heading-additional-tips">Additional Tips</h2>
<ul>
<li><p>Focus on solution-oriented scenarios in your study</p>
</li>
<li><p>Practice identifying the appropriate AI service for different use cases</p>
</li>
<li><p>Understand the differences between services like Azure AI Search and Azure AI Document Intelligence</p>
</li>
<li><p>Familiarize yourself with both Python SDK and REST API interactions and how you can deploy AI services as containers</p>
</li>
<li><p>Get familiar with the important Microsoft Learn docs beforehand; knowing where things live makes them much easier to navigate during the exam</p>
</li>
</ul>
<h2 id="heading-conclusion">Conclusion</h2>
<p>With thorough preparation using the resources mentioned above and a focus on hands-on experience, you'll be well-equipped to pass the AI-102 Azure AI Engineer Associate certification. Good luck with your exam!</p>
]]></content:encoded></item><item><title><![CDATA[The Digital Thief: How Smartphones Are Robbing Us!]]></title><description><![CDATA[Have you ever wondered how many years of your life you'll spend staring at a screen? It's a sobering question, that I have been thinking about since this year started. Hence, I have been researching on this very topic and I truly believe the question...]]></description><link>https://blog.rishabkumar.com/how-smartphones-are-robbing-us</link><guid isPermaLink="true">https://blog.rishabkumar.com/how-smartphones-are-robbing-us</guid><category><![CDATA[Screen Time]]></category><category><![CDATA[smartphone]]></category><category><![CDATA[Productivity]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Fri, 07 Jun 2024 22:50:01 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1717800248636/59119f17-faa1-43f1-b25e-98ef53c16925.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Have you ever wondered how many years of your life you'll spend staring at a screen? It's a sobering question, that I have been thinking about since this year started. Hence, I have been researching on this very topic and I truly believe the question is worth asking as we confront the silent pandemic of smartphone addiction. In this post, we'll explore the alarming statistics (thanks to all the researchers), the impact on our lives, for me it started with productivity, but the impact is more than just productivity, and provide a simple formula to help you calculate your own "screen time lifespan."</p>
<h2 id="heading-a-crisis-of-distraction">A Crisis of Distraction</h2>
<p>The average 18-year-old in the U.S. is on track to spend a staggering 93% of their remaining free time staring at a screen. Imagine reaching the age of 90 and reflecting on a life dominated by digital distraction, missing out on real-world experiences, relationships, and personal growth. This isn't just about wasted time; it's about missed opportunities to shape who we become.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1717525163278/f669c476-6ea8-4e66-ace6-d8699036aca1.png" alt class="image--center mx-auto" /></p>
<p><em>This image is from <a target="_blank" href="https://youtu.be/4TMPXK9tw5U?si=gVzO4h8US1w5SIyY">Dino Ambrosi's TED talk: The Battle for Your Time: Exposing the Costs of Social Media</a>.</em></p>
<p>Here is where my mind was blown: if you take away sleep, work and school, cooking and eating, and bathroom time, an average 18-year-old who lives to 90 has <strong>334 months</strong> of free time left, and they are on track to spend <strong>312 months</strong> of it looking at a screen!</p>
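<p>Those numbers are easy to sanity-check with a quick back-of-the-envelope calculation, using the figures from the talk:</p>

```python
free_months = 334    # projected free time left for an 18-year-old living to 90
screen_months = 312  # of which is projected to be spent looking at a screen

share = screen_months / free_months
print(f"{share:.0%} of remaining free time spent on screens")  # roughly 93%
```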
<h2 id="heading-the-cognitive-cost">The Cognitive Cost</h2>
<p>Our brains are being rewired by the constant bombardment of short-form content and rapid-fire information. With the rise of TikTok, Instagram Reels, and YouTube Shorts, we're training ourselves to be chronically distracted, sacrificing our ability to focus, think deeply, and engage meaningfully with the world around us. This has serious implications for our careers, relationships, and overall well-being.</p>
<h2 id="heading-the-social-media-mirage"><strong>The Social Media Mirage</strong></h2>
<p>Social media platforms aren't just addictive; they subtly shape our values and beliefs. They tell us that our worth is tied to appearances, that relationships are about frequency over depth, and that complex issues can be reduced to sound bites. That's why we call them social media influencers, because they influence us. In contrast, technologies like books and letters encourage empathy, deep thought, and meaningful connection.</p>
<h2 id="heading-paying-with-our-lives"><strong>Paying with Our Lives</strong></h2>
<p>Is social media free? No. Social media companies profit from our attention. We may not pay a monetary fee, but we pay dearly with our time, the most valuable resource we have. By understanding this dynamic, we can start to make more conscious choices about how we spend our digital hours.</p>
<h2 id="heading-the-unseen-addiction"><strong>The Unseen Addiction</strong></h2>
<p>While we grapple with the broader implications of screen time, the addiction itself remains largely unacknowledged. Within a mere 15 years of the smartphone's debut, an astonishing 97% of Americans have come to own one, and surveys indicate that roughly half feel addicted. American teens spend an average of nine hours a day on their phones, a significant portion of their waking lives.</p>
<h2 id="heading-your-smartphone-a-dopamine-dealer"><strong>Your Smartphone: A Dopamine Dealer</strong></h2>
<p>Smartphones are designed to trigger dopamine release through notifications, likes, and endless scrolling. This creates a cycle of craving and reward, making it hard to put the phone down.</p>
<p><img src="https://www.researchgate.net/profile/Hueseyin-Macit/publication/333774040/figure/fig5/AS:769568045801474@1560490728665/Social-media-dopamin-loop.jpg" alt="Social media dopamin loop" /></p>
<p>This image is from <a target="_blank" href="https://www.researchgate.net/publication/333774040_A_Research_On_Social_Media_Addiction_and_Dopamine_Driven_Feedback">Macit, Hüseyin &amp; Macit, Gamze &amp; Güngör, Orhan. (2019). A Research On Social Media Addiction and Dopamine Driven Feedback. 5. 882-897.</a></p>
<h2 id="heading-the-cost-of-connectivity"><strong>The Cost of Connectivity</strong></h2>
<p>This addiction comes at a steep price. Research has linked excessive smartphone use to various mental health issues, including depression, anxiety, and even suicidal thoughts. Moreover, it's eroding our productivity, focus, and creativity. We're constantly distracted, interrupting deep work with a quick glance at our screens.</p>
<p>Read this great research: <a target="_blank" href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10838039/#:~:text=Results%3A,0.328%2C%20p%20%3C%200.05">Sarhan AL: The relationship of smartphone addiction with depression, anxiety, and stress among medical students. SAGE Open Med. 2024 Feb 2;12:20503121241227367. doi: 10.1177/20503121241227367. PMID: 38313469; PMCID: PMC10838039.</a></p>
<h2 id="heading-the-super-addiction"><strong>The Super Addiction</strong></h2>
<p>Smartphones are particularly insidious because they combine multiple addictive elements:</p>
<ul>
<li><p><strong>Unlimited access:</strong> They're always in our pockets, available 24/7.</p>
</li>
<li><p><strong>Instant gratification:</strong> We get immediate rewards from likes, comments, and new content.</p>
</li>
<li><p><strong>Constant novelty:</strong> Apps are continuously updated to keep us engaged, with new ones like TikTok rapidly taking over entire generations.</p>
</li>
<li><p><strong>Social acceptance:</strong> It's the norm to be glued to our phones, making it harder to recognize the problem.</p>
</li>
</ul>
<h3 id="heading-calculate-your-screen-time-lifespan"><strong>Calculate Your Screen Time Lifespan</strong></h3>
<p>Want to know how much of your life you might spend on screens? Try this simple formula:</p>
<pre><code>(Daily Screen Time in Hours / 24) x (90 - Current Age) = Years Spent on Screens
</code></pre>
<p><strong>For example:</strong></p>
<p>If you're 30 years old, have a life expectancy of 90, and spend 5 hours a day on your phone screen:</p>
<p>(5 / 24) x (90 - 30) = 12.5 years on phone screen.</p>
<p>But wait, that number is an underestimate. Did you notice I am using 24 hours? Once you account for basic life tasks like sleep, cooking, and eating, we have far fewer usable hours in the day. To keep things simple, I am just going to subtract 8 hours of sleep from the day, which gives us:</p>
<pre><code>(Daily Screen Time in Hours / 16) x (90 - Current Age) = Years Spent on Screens
</code></pre>
<p>(5 / 16) x (90 - 30) = 18.75 years on screens.</p>
<p>Again, this is still an approximation, but you get the idea: over 18 years scrolling through endless reels/shorts (at least most of it)!</p>
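<p>If you want to play with the numbers yourself, the formula above fits in a few lines of Python. This is just a quick sketch; the function name and the defaults (life expectancy of 90, 16 waking hours) are my own assumptions:</p>
<pre><code class="lang-python">def years_on_screens(daily_hours, current_age, life_expectancy=90, waking_hours=16):
    """Estimate the years of remaining life spent on screens.

    (daily screen time / waking hours) is the fraction of each day spent
    on screens, multiplied by the years left until life_expectancy.
    """
    return (daily_hours / waking_hours) * (life_expectancy - current_age)

# The example from the article: age 30, 5 hours/day of screen time.
print(years_on_screens(5, 30))  # 18.75 years
</code></pre>
<p>Pass <code>waking_hours=24</code> to reproduce the first, more optimistic calculation instead.</p>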
<h2 id="heading-breaking-free-steps-towards-a-healthier-relationship"><strong>Breaking Free: Steps Towards a Healthier Relationship</strong></h2>
<p>Most smartphones these days have Screen Time analytics; both <a target="_blank" href="https://support.apple.com/en-ca/108806#:~:text=To%20see%20the%20report%2C%20go,up%20or%20received%20a%20notification.">iOS</a> and <a target="_blank" href="https://support.google.com/android/answer/9346420?hl=en">Android</a> have these features. Since I have an iOS device, I will show you how I have set up limits on some of the apps. And yes, you can uninstall them entirely, but I have found that I still use some of these platforms to stay connected with my friends and family. The idea is to use the apps and not let the apps use us: I use them as social media apps, not as doom-scrolling apps.</p>
<p>There are two ways you can limit the screen time for these apps:<br />You can group them together; say you want to limit TikTok, IG, and X/Twitter usage to a combined 30 minutes/day.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1717526709801/d784e04d-4e6d-482a-af1d-23081cb855eb.jpeg" alt class="image--center mx-auto" /></p>
<p>Or you can set individual limits for each app; in this example I have 30 minutes/day for each of TikTok, IG, and X/Twitter.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1717526700383/5781b595-dfc7-480b-907c-4332dbfa5370.jpeg" alt class="image--center mx-auto" /></p>
<p>Again, make sure you don't get discouraged by looking at the total hours, as applications like WhatsApp and Signal also come under the social umbrella (at least on iOS), and I use those to call and chat with my family and friends.</p>
<p>Again, the goal is to limit screen-time that is <strong>not valuable</strong>, talking to my mom on video chat is still very valuable to me!</p>
<p>So, keep these points in mind:</p>
<ol>
<li><p><strong>Limit screen time:</strong> Use apps to set boundaries and stick to them.</p>
</li>
<li><p><strong>Digital detox:</strong> Schedule regular breaks from your phone, especially before bed.</p>
</li>
<li><p><strong>Mindful usage:</strong> Pay attention to how your phone makes you feel and use it intentionally.</p>
</li>
</ol>
<h2 id="heading-conclusion"><strong>Conclusion</strong></h2>
<p>Smartphone addiction is not just about lost productivity; it's about losing ourselves in a digital world that's designed to keep us hooked. By recognizing the true cost of our screen time, we can make conscious choices to reclaim our time, attention, and lives.</p>
<p><strong>Call to Action:</strong></p>
<ul>
<li><p><strong>Calculate your screen time lifespan:</strong> How many years will you spend on screens?</p>
</li>
<li><p>Share this post with friends and family to raise awareness.</p>
</li>
<li><p>Track your screen time and set goals for reducing it.</p>
</li>
</ul>
<p>This blog post references a number of research papers and also some talks. Please watch them:</p>
<ul>
<li><p><a target="_blank" href="https://youtu.be/4TMPXK9tw5U?si=_juohCd5-Z-ECmgQ">The Battle for Your Time: Exposing the Costs of Social Media by Dino Ambrosi</a></p>
</li>
<li><p><a target="_blank" href="https://youtu.be/2ldLwkj4dRc?si=pzQykI9N8ryhWt_x">Smartphones: It’s Time to Confront Our Global Addiction by Dr. Justin Romano</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[The Cloud Resume API Challenge - Beginner Cloud Project]]></title><description><![CDATA[A beginner cloud project - The Cloud Resume API
Want to impress potential employers and gain hands-on cloud experience at the same time? The Cloud Resume API Challenge is the perfect way to start! By building a cloud-based API for your resume, you'll...]]></description><link>https://blog.rishabkumar.com/cloud-resume-api-challenge</link><guid isPermaLink="true">https://blog.rishabkumar.com/cloud-resume-api-challenge</guid><category><![CDATA[Cloud]]></category><category><![CDATA[Cloud Computing]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Azure]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Sat, 23 Mar 2024 14:49:24 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1711205119335/037f888d-59b9-4a5f-aff3-eb3b282d1375.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-a-beginner-cloud-project-the-cloud-resume-api">A beginner cloud project - The Cloud Resume API</h2>
<p>Want to impress potential employers and gain hands-on cloud experience at the same time? The Cloud Resume API Challenge is the perfect way to start! By building a cloud-based API for your resume, you'll showcase your cloud computing skills while creating something that has real-world value. And the best part – you don't need to be a cloud guru to get started.</p>
<h2 id="heading-challenge-requirements">Challenge Requirements:</h2>
<ul>
<li><p><strong>NoSQL Database</strong>: These flexible databases, like DynamoDB (AWS), Firestore (GCP), and Cosmos DB (Azure), are a great fit for storing resume data.</p>
</li>
<li><p><strong>Serverless Functions</strong>: This technology lets you run code without worrying about managing servers. It's the heart of your resume API!</p>
</li>
<li><p><strong>CI/CD (GitHub Actions)</strong>: This tool automates the process of updating your API. Every time you change your code, GitHub Actions will repackage and redeploy it for you.</p>
</li>
</ul>
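<p>To make the CI/CD piece concrete, here is a minimal GitHub Actions workflow sketch. The file name, branch, and the <code>./deploy.sh</code> step are placeholders; the actual deploy command depends on which cloud provider and tooling you choose:</p>
<pre><code class="lang-yaml"># .github/workflows/deploy.yml (illustrative sketch)
name: deploy-resume-api
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder: replace with your provider's CLI deploy step,
      # e.g. packaging and updating your serverless function.
      - name: Deploy serverless function
        run: ./deploy.sh
</code></pre>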
<p><img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d1xg9ual3fm8pmoj805v.png" alt="Cloud Resume API Architecture" /></p>
<h2 id="heading-step-by-step-guide">Step-by-Step Guide</h2>
<p><strong>Pick Your Cloud Playground</strong>: The Cloud Resume API Challenge supports AWS, Google Cloud Platform (GCP), or Microsoft Azure. Pick a provider that you like best!</p>
<p><strong>Design Your Data</strong>: Think about what information your resume API will include. Start with the basics like your name, contact details, experience, and skills. You can always expand later!</p>
<p><strong>Store your Resume Data</strong>: Use the NoSQL Database within your selected cloud provider to store your resume data in JSON.</p>
<p><strong>Time to Code</strong>: Your core component will be a serverless function. This function is responsible for fetching your resume data from the database and crafting a response for anyone who uses your API.</p>
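<p>To give you a feel for this step, here is a minimal sketch of such a serverless function in Python. Everything here is illustrative: the resume fields, the AWS-Lambda-style <code>handler(event, context)</code> signature, and the injectable <code>fetch</code> parameter (standing in for a real NoSQL read such as a DynamoDB or Firestore lookup) are my assumptions, not part of the challenge spec:</p>
<pre><code class="lang-python">import json

# Hypothetical resume document, shaped like the item you would store
# in your NoSQL table.
SAMPLE_RESUME = {
    "id": "1",
    "basics": {"name": "Jane Doe", "email": "jane@example.com"},
    "experience": [{"company": "Acme Corp", "role": "Cloud Engineer"}],
    "skills": ["Python", "AWS", "Terraform"],
}

def handler(event, context, fetch=lambda resume_id: SAMPLE_RESUME):
    """Serverless entry point: fetch the resume and return it as JSON.

    `fetch` stands in for the real database call; in production you
    would replace it with, e.g., a DynamoDB GetItem.
    """
    resume = fetch(event.get("pathParameters", {}).get("id", "1"))
    if resume is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(resume),
    }
</code></pre>
<p>Locally, calling <code>handler({}, None)</code> returns a 200 response whose body is the resume serialized as JSON; wired up behind an API gateway, the same function serves it over HTTP.</p>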
<p><strong>Deploy to Cloud</strong>: Deployment is the process that makes your API accessible over the internet.</p>
<p><strong>Test Drive Your API</strong>: Once your API is deployed, you (or anyone else!) can send a request and see your resume displayed in a structured JSON format.</p>
<h2 id="heading-tips-for-beginners">Tips for Beginners</h2>
<ul>
<li><p>Start Simple: Your initial API doesn't have to be fancy. Get the core functions working, then add features over time.</p>
</li>
<li><p>The Power of Community: Visit the <a target="_blank" href="https://cloudresumeapi.dev">Cloud Resume API website</a> to see examples from others who have taken on this challenge. You're not alone!</p>
</li>
</ul>
<p>Ready to build your Cloud Resume API? My YouTube video has the detailed steps of the challenge.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/iZq8aaGMpjM?si=rFTZX9vBhXU-N4h1">https://youtu.be/iZq8aaGMpjM?si=rFTZX9vBhXU-N4h1</a></div>
<p> </p>
<p><strong>Share Your Success</strong>: Did you conquer the challenge? I'd love to hear about it! Share your work and any questions in the comments below. Let's get started on your cloud journey!</p>
]]></content:encoded></item><item><title><![CDATA[How I passed the Azure AZ-204 Certification Exam]]></title><description><![CDATA[Hey everyone! If you're prepping for the Azure AZ-204 Developer exam, you're in the right place. I recently passed, so I want to share my experience and the strategies that worked for me.
My Journey
I was excited and a little nervous heading to the t...]]></description><link>https://blog.rishabkumar.com/azure-az-204-study-guide</link><guid isPermaLink="true">https://blog.rishabkumar.com/azure-az-204-study-guide</guid><category><![CDATA[Azure]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[Cloud Computing]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Thu, 07 Mar 2024 13:00:27 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1709778228725/0a645f30-1c11-48aa-a408-edbc129d04eb.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hey everyone! If you're prepping for the Azure AZ-204 Developer exam, you're in the right place. I recently passed, so I want to share my experience and the strategies that worked for me.</p>
<h2 id="heading-my-journey"><strong>My Journey</strong></h2>
<p>I was excited and a little nervous heading to the test center. The exam definitely challenged me, but walking out with a passing score of 737 felt amazing! I know I barely passed, as 700 is the required passing score, and I also barely finished in time, so managing those minutes is essential!</p>
<h2 id="heading-how-did-i-prepare"><strong>How Did I Prepare?</strong></h2>
<p>The big question, right? Here's the thing: I didn't use any traditional courses. Instead, I've been building projects on Azure for a while now, and that hands-on experience was my main study method. Here's what I focused on:</p>
<ul>
<li><p><strong>Microsoft Learn Challenges:</strong> During the Microsoft Build Cloud Skills Challenge, I got a free exam voucher by completing the <strong>Microsoft Build: Cloud Development Challenge.</strong> It covered some of the AZ-204 modules.</p>
</li>
<li><p><strong>Microsoft Learn Modules:</strong> Outside of the challenge, I went through all the AZ-204 modules on Microsoft Learn. See them <a target="_blank" href="https://learn.microsoft.com/en-us/credentials/certifications/exams/az-204/">here.</a></p>
</li>
<li><p><strong>Hands-On Projects:</strong> This was the MOST important part! Over time, I've built projects on Azure that used the skills on the exam guide. Check out tools like Azure Functions, container apps, and storage accounts.</p>
<p>  <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1709817871485/4e160924-e54f-4f5e-a46c-1a9152767803.gif" alt class="image--center mx-auto" /></p>
</li>
</ul>
<h2 id="heading-important-extras"><strong>Important Extras</strong></h2>
<ul>
<li><p><strong>The AZ-204 Exam Guide:</strong> Find it on the <a target="_blank" href="https://learn.microsoft.com/en-us/credentials/certifications/resources/study-guides/az-204">Microsoft website</a>. Be super familiar with the skills you'll be tested on.</p>
</li>
<li><p><strong>Following are the domains</strong> you will be tested on with weightage:</p>
<ul>
<li><p>Develop Azure compute solutions (25–30%)</p>
</li>
<li><p>Develop for Azure storage (15–20%)</p>
</li>
<li><p>Implement Azure security (20–25%)</p>
</li>
<li><p>Monitor, troubleshoot, and optimize Azure solutions (15–20%)</p>
</li>
<li><p>Connect to and consume Azure services and third-party services (15–20%)</p>
</li>
</ul>
</li>
<li><p><strong>Open Book Option:</strong> Azure now lets you use Microsoft Learn during the exam! I found this useful, but be careful not to spend too much time searching for every answer.</p>
</li>
</ul>
<h2 id="heading-my-advice"><strong>My Advice?</strong></h2>
<ul>
<li><p><strong>Build, build, build!</strong> Nothing replaces hands-on experience.</p>
</li>
<li><p><strong>Know the exam guide</strong> so you're building the right kinds of projects.</p>
</li>
<li><p><strong>Manage your time</strong> during the exam.</p>
</li>
</ul>
<h2 id="heading-not-just-about-the-badge"><strong>Not Just About the Badge</strong></h2>
<p>I'm proud to now be seven-time Azure certified, but it's the knowledge that matters most. I love challenging myself to learn through real-world problem-solving!</p>
<div data-node-type="callout">
<div data-node-type="callout-emoji">💡</div>
<div data-node-type="callout-text">Don't collect certification badges like Pokémon!</div>
</div>

<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://media.giphy.com/media/DRfu7BT8ZK1uo/giphy.gif?cid=790b7611w46ogbfac9a81dlva74xpfozhvgql692esc95n4i&amp;ep=v1_gifs_search&amp;rid=giphy.gif&amp;ct=g">https://media.giphy.com/media/DRfu7BT8ZK1uo/giphy.gif?cid=790b7611w46ogbfac9a81dlva74xpfozhvgql692esc95n4i&amp;ep=v1_gifs_search&amp;rid=giphy.gif&amp;ct=g</a></div>
<p> </p>
<p>It's totally possible to pass the AZ-204 if you're focused on practical learning. If you have the time to invest in building projects, I believe you can do it!</p>
<p>Let me know your thoughts or questions in the comments!</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/4tDxShE9ctQ?si=BIpdYqx7-Ym6xH9A">https://youtu.be/4tDxShE9ctQ?si=BIpdYqx7-Ym6xH9A</a></div>
]]></content:encoded></item><item><title><![CDATA[Should you include .terraform.lock.hcl in .gitignore?]]></title><description><![CDATA[One question that I had in my mind when I first built my IaC (Infrastructure as Code) with Terraform was: what needs to be included in the .gitignore, especially the .terraform.lock.hcl.

So here is the quick answer: Yes, you should commit the .te...]]></description><link>https://blog.rishabkumar.com/terraform-lock-hcl-to-gitignore-or-commit</link><guid isPermaLink="true">https://blog.rishabkumar.com/terraform-lock-hcl-to-gitignore-or-commit</guid><category><![CDATA[Terraform]]></category><category><![CDATA[version control]]></category><category><![CDATA[Git]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Sun, 11 Feb 2024 16:34:14 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707666218132/d66439c0-b5c6-48b0-b954-fae90f887e3f.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>One question that I had in my mind when I first built my IaC (Infrastructure as Code) with Terraform was: what needs to be included in the <code>.gitignore</code>, especially the <code>.terraform.lock.hcl</code>?</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707665594211/3eae31bb-cace-49ff-b2ab-b70cf6c3eb55.png" alt=".terraform.lock.hcl file in VSCode, ready to be committed to repository" class="image--center mx-auto" /></p>
<p>So here is the quick answer: <strong>Yes, you should commit</strong> the <code>.terraform.lock.hcl</code> to your version control (GitHub repository).</p>
<p>Let's first see, what is the <code>.terraform.lock.hcl</code>. According to <a target="_blank" href="https://developer.hashicorp.com/terraform/language/files/dependency-lock">Hashicorp's documentation</a></p>
<blockquote>
<p>Terraform automatically creates or updates the dependency lock file each time you run <a target="_blank" href="https://developer.hashicorp.com/terraform/cli/commands/init">the <code>terraform init</code> command. You should include this</a> file in your version control repository so that you can discuss potential changes to your external dependencies via code review, just as you would discuss potential changes to your configuration itself.</p>
</blockquote>
<p>When <code>terraform init</code> is working on installing all of the providers needed for a configuration, Terraform considers both the version constraints in the configuration <em>and</em> the version selections recorded in the lock file.</p>
<p>If a particular provider has no existing recorded selection, Terraform will select the newest available version that matches the given version constraint, and then update the lock file to include that selection.</p>
<p>If a particular provider already has a selection recorded in the lock file, Terraform will always re-select that version for installation, even if a newer version has become available. You can override that behaviour by adding the <code>-upgrade</code> option when you run <code>terraform init</code>, in which case Terraform will disregard the existing selections and once again select the newest available version matching the version constraint.</p>
<p>Basically, <code>.terraform.lock.hcl</code> lets Terraform keep using the provider version that was selected when you first added that provider. If you don't check in the lock file, every fresh <code>terraform init</code> will automatically upgrade you to the latest version, which may introduce breaking changes.</p>
<h2 id="heading-support-multiple-platforms">Support multiple platforms</h2>
<p>You may have developers who work with your Terraform code on different OS machines: Windows, macOS, or Linux.<br />To verify that all of your providers support all of those platforms, and to pre-populate the lock file with the necessary checksums for each, run <code>terraform providers lock</code> and specify those three platforms:</p>
<pre><code class="lang-bash">terraform providers lock \
  -platform=windows_amd64 \ <span class="hljs-comment"># 64-bit Windows</span>
  -platform=darwin_amd64 \  <span class="hljs-comment"># 64-bit macOS</span>
  -platform=linux_amd64     <span class="hljs-comment"># 64-bit Linux</span>
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707665446997/6a88bb21-c193-44a1-a162-0b1937371c69.png" alt="running the terraform providers lock command to support multiple systems" class="image--center mx-auto" /></p>
<p>After the command has run successfully, you will also see that the output mentions:</p>
<blockquote>
<p>Review the changes in .terraform.lock.hcl and then commit to your version control system to retain the new checksums.</p>
</blockquote>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707665483448/f45f3494-eabf-40e6-b6a9-f6985293ed55.png" alt="Output after the terraform provider lock command is run successfully." class="image--center mx-auto" /></p>
<p>You can read more about the <code>provider lock</code> command <a target="_blank" href="https://www.terraform.io/cli/commands/providers/lock">here.</a></p>
<h2 id="heading-terraform-gitignore-file">Terraform <code>.gitignore</code> file</h2>
<p>And just for reference, here is the standard Terraform <code>.gitignore</code> file:</p>
<pre><code># Local .terraform directories
**/.terraform/*

# .tfstate files
*.tfstate
*.tfstate.*

# Crash log files
crash.log
crash.*.log

# Exclude all .tfvars files, which are likely to contain sensitive data, such as
# password, private keys, and other secrets. These should not be part of version
# control as they are data points which are potentially sensitive and subject
# to change depending on the environment.
*.tfvars
*.tfvars.json

# Ignore override files as they are usually used to override resources locally and so
# are not checked in
override.tf
override.tf.json
*_override.tf
*_override.tf.json

# Include override files you do wish to add to version control using negated pattern
# !example_override.tf

# Include tfplan files to ignore the plan output of command: terraform plan -out=tfplan
# example: *tfplan*

# Ignore CLI configuration files
.terraformrc
terraform.rc
</code></pre>
<p>Hope you found this article helpful. I am Rishab, who loves writing and sharing content around cloud computing and DevOps. I also have a YouTube channel - <a target="_blank" href="https://youtube.com/@rishabincloud">Rishab in Cloud.</a></p>
]]></content:encoded></item><item><title><![CDATA[How I passed the GitHub Foundations Certification Exam]]></title><description><![CDATA[GitHub now has certifications. They launched four certifications to endorse your GitHub skills:

GitHub Foundations

GitHub Actions

GitHub Advanced Security

GitHub Administration


You can find more details on all four here.
GitHub Foundations Exam...]]></description><link>https://blog.rishabkumar.com/github-foundations-certification-exam</link><guid isPermaLink="true">https://blog.rishabkumar.com/github-foundations-certification-exam</guid><category><![CDATA[GitHub]]></category><category><![CDATA[Certification]]></category><category><![CDATA[Git]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Fri, 19 Jan 2024 15:49:08 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1705638541547/2670bbd7-3e2b-4741-b763-d21f91f27cec.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>GitHub now has certifications. They launched four certifications to endorse your GitHub skills:</p>
<ul>
<li><p>GitHub Foundations</p>
</li>
<li><p>GitHub Actions</p>
</li>
<li><p>GitHub Advanced Security</p>
</li>
<li><p>GitHub Administration</p>
</li>
</ul>
<p>You can find more details on all four <a target="_blank" href="https://resources.github.com/learn/certifications/">here.</a></p>
<h2 id="heading-github-foundations-exam">GitHub Foundations Exam</h2>
<p>According to GitHub, with this certification you highlight your understanding of the foundational topics and concepts of collaborating, contributing, and working on GitHub. This exam covers collaboration, GitHub products, Git basics, and working within GitHub repositories.</p>
<p>There are 7 domains in the exam, the official study guide can be found <a target="_blank" href="https://assets.ctfassets.net/wfutmusr1t3h/1kmMx7AwI4qH8yIZgOmQlP/4e60030cc6c76688698652e830ea2a48/github-foundations-exam-study-guide.pdf">here.</a></p>
<ul>
<li><p>Domain 1: Introduction to Git and GitHub</p>
</li>
<li><p>Domain 2: Working with GitHub Repositories</p>
</li>
<li><p>Domain 3: Collaboration Features</p>
</li>
<li><p>Domain 4: Modern Development</p>
</li>
<li><p>Domain 5: Project Management</p>
</li>
<li><p>Domain 6: Privacy, Security, and Administration</p>
</li>
<li><p>Domain 7: Benefits of the GitHub Community</p>
</li>
</ul>
<h2 id="heading-registering-for-the-exam">Registering for the exam</h2>
<p>I came across this launch announcement by a tweet from <a target="_blank" href="https://twitter.com/itsthatladydev">Kedasha</a>:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://x.com/itsthatladydev/status/1745847347186225643?s=20">https://x.com/itsthatladydev/status/1745847347186225643?s=20</a></div>
<p> </p>
<p>And I immediately started looking at all the certifications on offer. All of them are priced at $99 USD. As you can see in the tweet, for a limited time the GitHub Foundations Certification is at a 50% discount, making the fee $49.50 USD.</p>
<p>So I decided to book my exam. You need to register an account on the <a target="_blank" href="https://examregistration.github.com/overview">certification site</a>(you can use your existing GitHub account).</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1705636717822/0ff9ea45-892b-4109-b0a3-69eb85e6fa07.png" alt class="image--center mx-auto" /></p>
<p>After registering, you can book the desired exam. It will ask for your personal info and take you to PSI, the test provider, where you can choose whether to take it online or onsite at a test centre.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1705636881235/6f21abb5-0cbc-4069-b94d-3104eac25d59.png" alt class="image--center mx-auto" /></p>
<p>So I booked mine for January 13th, 2024, which was the very next day.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabincloud/status/1746009108245782731">https://twitter.com/rishabincloud/status/1746009108245782731</a></div>
<p> </p>
<h2 id="heading-preparation">Preparation</h2>
<p>I used Microsoft Learning path "<a target="_blank" href="https://learn.microsoft.com/en-us/collections/o1njfe825p602p">GitHub Foundations</a>" for my preparation. It had 14 modules, out of which I was able to complete 12 of them.</p>
<p>One other thing I would like to point out: my previous knowledge and experience with GitHub helped a lot in sitting this exam within a day of booking. I have been using GitHub for 4-5 years now and have been maintaining and contributing to open-source projects.</p>
<h2 id="heading-exam-experience">Exam experience</h2>
<p>There were about 75 questions, and you have 120 minutes to complete the exam.</p>
<p>I took around 63 minutes to answer all 75 questions. And I passed!</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabincloud/status/1746266194816987643">https://twitter.com/rishabincloud/status/1746266194816987643</a></div>
<p> </p>
<p>You see immediately whether you passed or failed when you end the exam, and you receive a score report and Credly badge a few minutes after you submit. I scored 75% on the exam, although no score was mentioned on the score report itself.</p>
<p>Questions and topics that I found challenging were:</p>
<ul>
<li><p>GitHub Administration and Security</p>
</li>
<li><p>GitHub Enterprise Server and GitHub Enterprise Cloud</p>
</li>
</ul>
<p>There were questions about the Enterprise offerings and which authentication methods are supported on each, which I don't have much experience with, so I definitely recommend going over these offerings.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/2-QiORv6wew">https://youtu.be/2-QiORv6wew</a></div>
<p> </p>
<hr />
<p>Hope this helps you with your preparations and I wish you luck if you are sitting the exam soon. I am thinking of taking the GitHub Actions Certification next, will let you know if I do take and will share my experience. You can find me on X/Twitter <a target="_blank" href="https://x.com/rishabincloud">@rishabincloud</a> or on <a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn</a> if you have any questions.</p>
]]></content:encoded></item><item><title><![CDATA[Learn This Before Diving into Kubernetes]]></title><description><![CDATA[Kubernetes has become a cornerstone in modern software deployment, but mastering it is no small feat. As I discovered in my own journey, jumping straight into Kubernetes without a solid foundation can lead to frustration and setbacks. In this blog, I...]]></description><link>https://blog.rishabkumar.com/learn-this-before-kubernetes</link><guid isPermaLink="true">https://blog.rishabkumar.com/learn-this-before-kubernetes</guid><category><![CDATA[Kubernetes]]></category><category><![CDATA[Devops]]></category><category><![CDATA[containers]]></category><category><![CDATA[learning]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 08 Jan 2024 16:44:05 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1704731370604/400cd22c-5d3f-48aa-a6b7-7d9aef377dc6.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Kubernetes has become a cornerstone in modern software deployment, but mastering it is no small feat. As I discovered in my own journey, jumping straight into Kubernetes without a solid foundation can lead to frustration and setbacks. In this blog, I'll share the essential prerequisites you should master before tackling Kubernetes, ensuring a smoother learning experience.</p>
<h2 id="heading-1-containerization-the-heart-of-kubernetes">1. Containerization: The Heart of Kubernetes</h2>
<p>Before you dive into Kubernetes, understanding containerization is crucial. Containerization is the backbone of Kubernetes. It involves encapsulating an application in a container with its own operating environment. One of the most popular tools for this is Docker.</p>
<p><strong>Understanding Docker:</strong></p>
<ul>
<li><p><strong>Dockerfile</strong>: This is a file containing instructions to build a container image.</p>
</li>
<li><p><strong>Building an Image</strong>: Use <code>docker build</code> to create the image from the Dockerfile.</p>
</li>
<li><p><strong>Running Containers</strong>: These images are blueprints for running your app in isolated environments.</p>
</li>
<li><p><strong>Example:</strong> Here is an example of a Dockerfile to containerize a Python Flask app.</p>
<pre><code class="lang-dockerfile">  <span class="hljs-keyword">FROM</span> python:<span class="hljs-number">3.8</span>-slim-buster

  <span class="hljs-keyword">WORKDIR</span><span class="bash"> /python-docker</span>

  <span class="hljs-keyword">COPY</span><span class="bash"> requirements.txt requirements.txt</span>
  <span class="hljs-keyword">RUN</span><span class="bash"> pip3 install -r requirements.txt</span>

  <span class="hljs-keyword">COPY</span><span class="bash"> . .</span>

  <span class="hljs-keyword">CMD</span><span class="bash"> [ <span class="hljs-string">"python3"</span>, <span class="hljs-string">"-m"</span> , <span class="hljs-string">"flask"</span>, <span class="hljs-string">"run"</span>, <span class="hljs-string">"--host=0.0.0.0"</span>]</span>
</code></pre>
</li>
</ul>
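<p>Assuming the Dockerfile above sits next to your Flask app's source, a typical build-and-run cycle looks like the sketch below (the image name <code>flask-docker</code> and the port mapping are illustrative, not part of the Dockerfile itself):</p>

```bash
# Build an image from the Dockerfile in the current directory,
# tagging it so it is easy to reference later.
docker build -t flask-docker .

# Run a detached container from that image, mapping the host's
# port 5000 to Flask's default port 5000 inside the container.
docker run -d -p 5000:5000 flask-docker
```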
<p>Docker is not just popular but also a standard in many tech stacks, as evidenced by its ranking in <a target="_blank" href="https://survey.stackoverflow.co/2023/">Stack Overflow surveys.</a></p>
<h2 id="heading-2-cloud-basics-the-playground-of-kubernetes">2. Cloud Basics: The Playground of Kubernetes</h2>
<p>Most Kubernetes deployments are on cloud platforms. Familiarize yourself with the basics of cloud computing, especially with services from major providers like <a target="_blank" href="https://aws.amazon.com/eks/">AWS (EKS)</a>, <a target="_blank" href="https://azure.microsoft.com/en-ca/products/kubernetes-service">Azure (AKS)</a>, and <a target="_blank" href="https://cloud.google.com/kubernetes-engine?hl=en">Google Cloud Platform (GKE)</a>.</p>
<h3 id="heading-key-cloud-concepts-to-learn">Key Cloud Concepts to Learn</h3>
<ul>
<li><p>Virtual Machines</p>
</li>
<li><p>Networking</p>
</li>
<li><p>DNS</p>
</li>
<li><p>Load Balancers</p>
</li>
</ul>
<p>Understanding these will help you grasp how Kubernetes operates in a cloud environment.</p>
<h2 id="heading-3-yaml-the-language-of-kubernetes">3. YAML: The Language of Kubernetes</h2>
<p>Kubernetes uses YAML (a recursive acronym for "YAML Ain't Markup Language") to define and manage its configurations.</p>
<h3 id="heading-yaml-in-action">YAML in Action</h3>
<ul>
<li><p><strong>Defining Cluster State</strong>: Learn to write YAML to set up your Kubernetes deployments.</p>
</li>
<li><p><strong>Example</strong>: Here is a YAML manifest that creates a <code>Deployment</code>, which manages a ReplicaSet to bring up 3 <code>nginx</code> pods.</p>
<pre><code class="lang-yaml">  <span class="hljs-attr">apiVersion:</span> <span class="hljs-string">apps/v1</span>
  <span class="hljs-attr">kind:</span> <span class="hljs-string">Deployment</span>
  <span class="hljs-attr">metadata:</span>
    <span class="hljs-attr">name:</span> <span class="hljs-string">nginx-deployment</span>
    <span class="hljs-attr">labels:</span>
      <span class="hljs-attr">app:</span> <span class="hljs-string">nginx</span>
  <span class="hljs-attr">spec:</span>
    <span class="hljs-attr">replicas:</span> <span class="hljs-number">3</span>
    <span class="hljs-attr">selector:</span>
      <span class="hljs-attr">matchLabels:</span>
        <span class="hljs-attr">app:</span> <span class="hljs-string">nginx</span>
    <span class="hljs-attr">template:</span>
      <span class="hljs-attr">metadata:</span>
        <span class="hljs-attr">labels:</span>
          <span class="hljs-attr">app:</span> <span class="hljs-string">nginx</span>
      <span class="hljs-attr">spec:</span>
        <span class="hljs-attr">containers:</span>
        <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">nginx</span>
          <span class="hljs-attr">image:</span> <span class="hljs-string">nginx:1.14.2</span>
          <span class="hljs-attr">ports:</span>
          <span class="hljs-bullet">-</span> <span class="hljs-attr">containerPort:</span> <span class="hljs-number">80</span>
</code></pre>
</li>
</ul>
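<p>With the manifest above saved to a file (the filename <code>nginx-deployment.yaml</code> is a hypothetical example), it can be applied to a cluster with kubectl:</p>

```bash
# Send the manifest to the cluster's API server.
kubectl apply -f nginx-deployment.yaml

# Watch the three replicas come up (matched by the app=nginx label).
kubectl get pods -l app=nginx

# Clean up the Deployment when done.
kubectl delete -f nginx-deployment.yaml
```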
<h2 id="heading-4-networking-basics-the-connectivity-of-kubernetes">4. Networking Basics: The Connectivity of Kubernetes</h2>
<p>Kubernetes is heavily reliant on networking. A basic grasp of networking concepts will significantly ease your Kubernetes journey.</p>
<h3 id="heading-networking-concepts-to-know">Networking Concepts to Know</h3>
<ul>
<li><p>OSI Model</p>
</li>
<li><p>IP Addresses</p>
</li>
<li><p>Networking Protocols</p>
</li>
<li><p>Ports and DNS</p>
</li>
</ul>
<p>Understanding these will clarify how Kubernetes pods, services, and external systems communicate.</p>
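<p>A quick way to see name resolution and well-known ports in action on a Linux machine is <code>getent</code>, which queries the system's name and service databases (exact output varies by system):</p>

```bash
# DNS/hosts resolution: map a hostname to an IP address.
getent hosts localhost

# Well-known ports: look up the port registered for a service name.
getent services http
```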
<h2 id="heading-conclusion-build-your-foundation-first">Conclusion: Build Your Foundation First</h2>
<p>My experience taught me that jumping into Kubernetes without these prerequisites can lead to unnecessary challenges. By first understanding containerization, cloud basics, YAML, and networking, you'll be better equipped to tackle Kubernetes.</p>
<h3 id="heading-support-and-further-learning">Support and Further Learning</h3>
<p>If you found this guide helpful, consider liking and sharing this article. Your likes and shares are greatly appreciated. Also, let me know in the comments if you're interested in a Docker course where I explain its workings and containerize a couple of apps.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/y37cDE_8PiE?si=P-qTUtw3K6NA-HmS">https://youtu.be/y37cDE_8PiE?si=P-qTUtw3K6NA-HmS</a></div>
]]></content:encoded></item><item><title><![CDATA[Deploying Grafana to Azure’s Web Apps for Containers]]></title><description><![CDATA[Hello Cloud adventurers, I have been on a learning journey this year with containerization, specifically docker. Continuing the series of blogs, I wanted to try deploying Grafana to Azure but as a docker container. I know there are different ways and...]]></description><link>https://blog.rishabkumar.com/grafana-on-azure-web-app-containers</link><guid isPermaLink="true">https://blog.rishabkumar.com/grafana-on-azure-web-app-containers</guid><category><![CDATA[Azure]]></category><category><![CDATA[containers]]></category><category><![CDATA[Docker]]></category><category><![CDATA[Grafana]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 21 Aug 2023 15:41:17 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1692283523573/7bfe3341-fcf0-4792-999e-5f3549742bdc.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hello Cloud adventurers, I have been on a learning journey this year with containerization, specifically docker. Continuing the series of blogs, I wanted to try deploying Grafana to Azure but as a docker container. I know there are different ways and different services within Azure that you can use to deploy a container, but I will be using <a target="_blank" href="https://azure.microsoft.com/en-ca/products/app-service/web">Azure Web Apps.</a></p>
<p>Deploying Grafana to Azure’s Web Apps for Containers is a straightforward process. Here’s what we’ll do:</p>
<ul>
<li><p>Create a storage account</p>
</li>
<li><p>Create an Azure Files share</p>
</li>
<li><p>Initiate an empty SQLite database</p>
</li>
<li><p>Create a Web Apps for Containers App Service</p>
</li>
<li><p>Mount the Azure Files share</p>
</li>
<li><p>Set an environment variable</p>
</li>
</ul>
<p>Let’s get started.</p>
<h2 id="heading-create-a-storage-account">Create a Storage Account</h2>
<p>Begin by creating an Azure Storage account. This is where we will create the File Share later. You might ask, why do we need a File Share?</p>
<p>If I deployed Grafana as a container, the SQLite database (and plugins) would be lost as soon as the container restarts. Hence, we will be using the file share as persistent storage for our Grafana database.</p>
<p>I have created <code>grafana-rg</code> as the resource group where all the resources will be deployed for this blog post. For the Storage Account, make sure you use a globally unique name; I went with <code>Standard</code> performance and <code>Geo-Redundant Storage (GRS)</code>, leaving everything else as default.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692279763590/e530bd26-dc26-42c9-b98b-26dae27c92b4.png" alt="Creating a Storage Account in Azure" class="image--center mx-auto" /></p>
<h2 id="heading-create-an-azure-files-share">Create an Azure Files Share</h2>
<p>Navigate to your newly created storage account and scroll down to <code>File Shares</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692281095634/14c01326-0e5e-4407-9904-2d5e7dc2633d.png" alt="Azure Storage Account with File Shares highlighted on the Configuration Blade." class="image--center mx-auto" /></p>
<p>Click on the + File share button.</p>
<p>For the Name, I chose <code>grafana-storage</code>, but you can choose whatever you like. I went with the <code>Transaction optimized</code> tier. Everything else was left to default.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692281117633/994fc10c-0f49-4295-99bc-b1b5e69f20cf.png" alt="Creating new File Share with Transaction optimized Tier" class="image--center mx-auto" /></p>
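<p>If you prefer the command line over the portal, the resource group, storage account, and file share can be sketched with the Azure CLI as below. This is a sketch, not the exact steps from this post: the storage account name is a placeholder that must be globally unique, and flags may vary across CLI versions.</p>

```bash
# Placeholder: pick your own globally unique storage account name.
STORAGE_ACCOUNT=mygrafanastorage123

# Resource group to hold everything for this walkthrough.
az group create --name grafana-rg --location canadacentral

# Storage account: Standard performance, geo-redundant storage.
az storage account create \
  --name "$STORAGE_ACCOUNT" \
  --resource-group grafana-rg \
  --sku Standard_GRS

# File share that will hold Grafana's database and plugins.
az storage share-rm create \
  --storage-account "$STORAGE_ACCOUNT" \
  --name grafana-storage
```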
<h2 id="heading-initiate-an-empty-sqlite-database">Initiate an empty SQLite database</h2>
<p>I found out the hard way that if we let Grafana create the database, it runs into a database lock error. To overcome this, we’re going to create an SQLite database manually and then upload it to the file share using the Azure CLI.</p>
<p>On a Windows machine, you can run the following commands.</p>
<div data-node-type="callout">
<div data-node-type="callout-emoji">💡</div>
<div data-node-type="callout-text">Provided you’ve already installed <a target="_blank" href="https://chocolatey.org/install">choco</a>.</div>
</div>

<p>If you are on macOS, SQLite is already installed, so you can skip the first command.</p>
<p>Also, you’ll need <a target="_blank" href="https://learn.microsoft.com/en-us/cli/azure/install-azure-cli">Azure CLI</a> installed and authenticated with your Azure account.</p>
<pre><code class="lang-bash">choco install sqlite
</code></pre>
<pre><code class="lang-bash">sqlite3 grafana.db <span class="hljs-string">'PRAGMA journal_mode=wal;'</span>
</code></pre>
<p>Now, let's copy the <code>grafana.db</code> to our File Share. We will use the <code>az storage copy</code> command.</p>
<pre><code class="lang-bash">az login
az storage copy -s .\grafana.db -d https://&lt;storage_account&gt;.file.core.windows.net/&lt;file_share&gt; --subscription &lt;subscription_name&gt;
</code></pre>
<p>Replace &lt;storage_account&gt;, &lt;file_share&gt;, and &lt;subscription_name&gt; with the appropriate values: the name of your storage account, the name of your file share, and your subscription’s ID or name.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282141394/deb8fe0f-0e48-4c53-a2a8-95956c8029ab.png" alt="AZ Copy command output, copied the grafana.db over to Azure File Share" class="image--center mx-auto" /></p>
<h2 id="heading-app-service">App Service</h2>
<p>Create a new Azure web application that publishes a Docker Container and choose Linux for the OS. Choose a pre-existing Linux App Service Plan or create a new one.</p>
<p>There is a Free F1 Plan, but I went with Basic B1.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282217748/65f7c12d-138a-4671-8cfb-84fb68904dcc.png" alt="Creating a Web App, first settings page with Docker Container and Pricing plan as Basic B1." class="image--center mx-auto" /></p>
<p>For the Docker options, we’re going to leave the defaults for the moment. We’ll update this later once we have our environment fully configured. All other options can be left at their defaults.</p>
<div data-node-type="callout">
<div data-node-type="callout-emoji">💡</div>
<div data-node-type="callout-text">Make sure the <code>Enable Public Access</code> is checked.</div>
</div>

<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282334460/a243fa3c-bc2b-41fc-b3cb-81a1387a5bd1.png" alt="Using the default settings for Web App Container under Docker tab" class="image--center mx-auto" /></p>
<p>Once the Azure Web App has been successfully configured, you can access it at the App Service URL. You should see the default nginx welcome page.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282346811/68700be8-aa70-4620-b04c-c6ee69a99196.png" alt="Azure Web App page with Default Domain and Configuration highlighted" class="image--center mx-auto" /></p>
<p>Now we will configure the Web App container to use the Azure File Share as storage.</p>
<h2 id="heading-mount-the-azure-files-share">Mount the Azure Files share</h2>
<p>In the Azure portal, on the App Service’s blade, click on <code>Configuration</code>, then <code>Path mappings</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282622040/0b2e3ece-3d5c-4fe1-846a-0117f2c97339.png" alt="Azure Web App Configuration window, with Path mappings highlighted" class="image--center mx-auto" /></p>
<p>Now click on <code>+ New Azure Storage Mount</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282647645/f666631a-e042-4d50-830e-806f3044be63.png" alt="Azure Web App configuration: adding New Azure Storage Mount" class="image--center mx-auto" /></p>
<p>Name the storage mount whatever you’d prefer and choose the <em>Storage Account</em> you created before, <code>Azure Files</code> for <em>Storage Type</em>, and the Azure File share created before for the <em>Storage Container</em>. Finally, and most importantly, for the <em>Mount path</em>, type in <code>/var/lib/grafana</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692282653075/599ff125-67ee-46b9-b58a-2bc267f17028.png" alt="New Azure Storage Mount for Azure Web App with all the parameters filled in" class="image--center mx-auto" /></p>
<p>Click <code>OK</code> then <code>Save</code> at the top.</p>
<p>We just attached our Azure File share to our container and it will be mounted at <code>/var/lib/grafana</code> which is Grafana’s default file path for the database and its plugins. This means, should our container be restarted for any reason, none of our settings will be lost.</p>
<h2 id="heading-set-an-environment-variable"><strong>Set an Environment Variable</strong></h2>
<p>Now we have Grafana storing the database in our <code>/var/lib/grafana</code> mount path. However, it will not use the <code>PRAGMA</code> flag we set earlier, so we would run into <a target="_blank" href="https://github.com/grafana/grafana/issues/16638">database locking</a>.</p>
<p>Grafana allows us to overwrite default configuration variables by leveraging <strong>environment variables</strong>. So, we’re going to use Grafana’s default database connection string, but append the necessary flag.</p>
<p>Click on the <code>Configuration</code> blade once again for your Web App (you should still be on the <strong>Configuration</strong> blade from the previous step).</p>
<p>Now, choose <code>Application settings</code> and add a <code>+ New application setting.</code></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692283877904/1a9cb91f-4ecf-440f-a1e4-96b1a6f97f07.png" alt="Azure Web App Configuration page with Application Settings highlighted" class="image--center mx-auto" /></p>
<p>For the setting, enter the following:</p>
<pre><code class="lang-bash">Name: GF_DATABASE_URL
Value: sqlite3:///var/lib/grafana/grafana.db?cache=private&amp;mode=rwc&amp;_journal_mode=WAL
</code></pre>
<p>Leave the <code>Deployment slot setting</code> unchecked.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284016752/3478b7da-09bb-46b9-994d-f3f5963a3bfd.png" alt="Adding new application setting for Azure Web App" class="image--center mx-auto" /></p>
<p>Make sure you click <code>Save</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284045144/4c2eadd5-65e9-4a88-820e-b3d49ff70307.png" alt="Azure Web App Configuration page with Save button highlighted" class="image--center mx-auto" /></p>
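<p>The same application setting can also be applied from the Azure CLI instead of the portal. This is a sketch under the assumption that your Web App lives in the <code>grafana-rg</code> resource group; the app name is a placeholder:</p>

```bash
# Placeholder: the name of your Web App.
WEB_APP=my-grafana-app

# Append the WAL journal flag to Grafana's SQLite connection string
# via an app setting; single quotes keep the & characters intact.
az webapp config appsettings set \
  --resource-group grafana-rg \
  --name "$WEB_APP" \
  --settings 'GF_DATABASE_URL=sqlite3:///var/lib/grafana/grafana.db?cache=private&mode=rwc&_journal_mode=WAL'
```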
<p>Awesome, now we have the environment configured, but we need to update our container’s image source.</p>
<p>In the App Service blade, under <code>Deployment Center</code>, choose <code>Settings</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284123943/44f79ab4-cc35-4be2-ae70-a76b7c0262ab.png" alt="Azure Web App Deployment Center page: changing the container settings" class="image--center mx-auto" /></p>
<p>By default, the <code>Full Image Name and Tag</code> will be set to <code>nginx</code> (which was set when we created the App Service).</p>
<p>We want to use the Grafana container image. I have my own Docker image for Grafana that I want to use, but you can use the default one provided by Grafana.</p>
<p>If you’re curious about how to create your own Docker image, <a target="_blank" href="https://youtu.be/tvIcZZBvnOk">watch this video on how I did it.</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284234451/3cdab5b0-7e3f-41a8-9ec2-cb349ac0e9c9.png" alt="Docker Hub with my own grafana-container image" class="image--center mx-auto" /></p>
<p>Change the registry source to <code>Docker Hub</code>, and the image to <code>grafana/grafana</code> if you want to use the official <a target="_blank" href="https://hub.docker.com/r/grafana/grafana">Grafana Image</a>.</p>
<p>Click <code>Save</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284241428/f76ffe28-30b8-4c3e-a10f-8d51d2a6a86d.png" alt="Azure Web App Deployment Center Container settings filled in with appropriate values for my grafana image" class="image--center mx-auto" /></p>
<p>It may take a moment for Azure to pick up the changes once you submit them. You can click <code>Refresh</code> a few times and check the logs to ensure the container was deployed correctly.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284479916/3125a882-7c4c-42ef-a885-7f09b60a5dad.png" alt="Azure Web App portal with Refresh and Logs button highlighted" class="image--center mx-auto" /></p>
<p>Now let's visit the Web App URL to see the changes.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1692284537794/f16d8122-1156-4333-8ae2-2a9af3231142.png" alt="Grafana Login screen when accessing the Azure Web App URL" class="image--center mx-auto" /></p>
<h2 id="heading-conclusion"><strong>Conclusion</strong></h2>
<p>We did it! We now have Grafana deployed to Azure Web Apps for Containers with an Azure Files share as storage, so our settings and data survive container restarts and the app can scale to support the necessary load.</p>
]]></content:encoded></item><item><title><![CDATA[How I Passed LPI Linux Essentials Certification]]></title><description><![CDATA[Introduction
In this blog post, I will share my experience and the resources I used to successfully pass the Linux Essentials certification. This certification is a fundamental level exam that provides a solid foundation in Linux. Whether you're new ...]]></description><link>https://blog.rishabkumar.com/how-i-passed-lpi-linux-essentials-certification</link><guid isPermaLink="true">https://blog.rishabkumar.com/how-i-passed-lpi-linux-essentials-certification</guid><category><![CDATA[Linux]]></category><category><![CDATA[Certification]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[#cybersecurity]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Thu, 15 Jun 2023 14:11:07 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1686323523477/eb60d499-2180-4a39-897f-a71bad380cae.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-introduction">Introduction</h2>
<p>In this blog post, I will share my experience and the resources I used to successfully pass the Linux Essentials certification. This certification is a fundamental level exam that provides a solid foundation in Linux. Whether you're new to Linux or pursuing a career in cloud security or cybersecurity, this certification can be highly beneficial. I will discuss the preparation process, the resources I utilized, and my overall exam experience.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabk7/status/1664652268581396480?s=20">https://twitter.com/rishabk7/status/1664652268581396480?s=20</a></div>
<p> </p>
<h2 id="heading-exam-objective">Exam Objective</h2>
<p>Here is the official <a target="_blank" href="https://www.lpi.org/our-certifications/exam-010-objectives">Exam Objective for Linux Essentials</a>:</p>
<ol>
<li><p>The Linux Community and a Career in Open Source</p>
</li>
<li><p>Finding Your Way on a Linux System</p>
</li>
<li><p>The Power of the Command Line</p>
</li>
<li><p>The Linux Operating System</p>
</li>
<li><p>Security and File Permissions</p>
</li>
</ol>
<h2 id="heading-preparation-and-background">Preparation and Background</h2>
<p>I had been aiming to take the Linux Essentials certification exam since 2021. Although I prepared for the exam in 2022, I didn't sit for it until 2023. Having completed college courses on Linux fundamentals and advanced networking with Linux, I was already well-versed in Linux and the command line. Additionally, I had over three years of practical experience working with Linux daily, managing approximately 50 Linux servers in a cloud environment.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabk7/status/1663876403538849794?s=20">https://twitter.com/rishabk7/status/1663876403538849794?s=20</a></div>
<p> </p>
<h2 id="heading-resources-used">Resources Used</h2>
<ol>
<li><p>Jason Dion's Linux Essentials Course: I highly recommend this course, <a target="_blank" href="https://www.udemy.com/course/linux-essentials-010/">available on Udemy</a> and included in the LinkedIn Learning subscription. For North American users, LinkedIn Learning can be accessed for free through public library memberships.</p>
</li>
<li><p>TryHackMe and Linux Journey: These platforms provide hands-on practice and a gamified learning experience. <a target="_blank" href="http://tryhackme.com">TryHackMe</a> offers Linux fundamentals rooms, while <a target="_blank" href="https://linuxjourney.com/">Linux Journey</a> provides an introduction to the command line and important Linux concepts.</p>
</li>
<li><p>Practice Exams: <a target="_blank" href="https://www.udemy.com/course/linux-essentials-practice-exams/">Jason Dion's practice exams</a> come recommended for additional preparation. Although I personally didn't go through any practice exams, they can be beneficial for beginners to assess their knowledge before the actual exam.</p>
</li>
</ol>
<h2 id="heading-exam-experience">Exam Experience</h2>
<p>I chose to take the exam at a nearby test center instead of at home. Having moved to an apartment close to a Pearson VUE test center, I found it more convenient. The exam consisted of <strong>40 questions</strong>, and I had <strong>60 minutes</strong> to complete it. However, I finished the exam in around 25 minutes. The passing score for this exam is <strong>500 out of 800</strong>, and I scored 650. The exam fee was <strong>USD 120</strong>.</p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>Obtaining the LPI Linux Essentials certification has been a valuable achievement. Linux is an essential tool in various fields, particularly in cloud and cybersecurity. Whether you're starting your career journey or looking to enhance your skill set, this certification can provide a solid foundation. In the future, I may explore additional Linux certifications. Stay tuned for more blogs and videos on useful Linux commands, the file system, and other relevant topics. Thank you for reading, and I hope you found this post helpful.</p>
<p>Feel free to reach me at <a target="_blank" href="https://twitter.com/rishabk7">rishabk7 on Twitter</a> or <a target="_blank" href="https://www.linkedin.com/in/rishabkumar7/">Rishab Kumar on LinkedIn.</a></p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/HMr4CvhETZo">https://youtu.be/HMr4CvhETZo</a></div>
]]></content:encoded></item><item><title><![CDATA[Understanding Docker - as an 11 year old]]></title><description><![CDATA[Hello, cloud explorers!
Have you ever played with LEGO blocks? If not, you still might know what they are. Those colorful, versatile bricks that let your imagination run wild? Sure, you have! We all love to build castles, robots, or spacecraft with t...]]></description><link>https://blog.rishabkumar.com/understanding-docker</link><guid isPermaLink="true">https://blog.rishabkumar.com/understanding-docker</guid><category><![CDATA[Docker]]></category><category><![CDATA[Devops]]></category><category><![CDATA[containers]]></category><category><![CDATA[Cloud]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Thu, 08 Jun 2023 13:14:45 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1684936041422/0f179de6-3daf-4108-86dc-3d9136a6f9e6.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hello, cloud explorers!</p>
<p>Have you ever played with LEGO blocks? If not, you still might know what they are. Those colorful, versatile bricks that let your imagination run wild? Sure, you have! We all love to build castles, robots, or spacecraft with them, right? Now, what if I told you there's something quite similar in the world of computers, and it's called Docker!</p>
<p>Hold up! You might be thinking, "What does <strong>Docker</strong> have to do with my LEGO blocks?" Well, let's embark on a journey to unravel this mystery!</p>
<h2 id="heading-introduction">Introduction</h2>
<p>In our day-to-day life, we interact with many applications on our smartphones, tablets, or computers – games, educational apps, and a lot more. But have you ever wondered how these applications are created? They're built by software developers using different tools and languages. It's like constructing a LEGO masterpiece, but instead of physical blocks, they use blocks of code!</p>
<p><img src="https://images.unsplash.com/photo-1585366119957-e9730b6d0f60?ixlib=rb-4.0.3&amp;ixid=M3wxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8fA%3D%3D&amp;auto=format&amp;fit=crop&amp;w=2371&amp;q=80" alt="white and black lego toy by danielkcheung on Unsplash" class="image--center mx-auto" /></p>
<p>However, there's a challenge. Just like your LEGO creation might need a specific type of brick, these applications also need certain elements to work properly. When these applications are moved from one place to another (like from a developer's computer to a server in the cloud), they might not work correctly because some essential elements are missing or not compatible. It's like trying to fit your LEGO masterpiece into different rooms with different settings – some rooms might not have enough space or the right table for your masterpiece.</p>
<p>This is where Docker, the hero of our story, comes into play!</p>
<h2 id="heading-understanding-docker">Understanding Docker</h2>
<p>Docker is like a magical toolbox that provides a perfect environment (a room) for your application (LEGO masterpiece) to run. This toolbox is portable and can be carried anywhere. It ensures that no matter where you take it, the application will run just as intended, just like your LEGO masterpiece remains intact inside the box.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1685990306532/f4b3635e-3ea4-49df-894c-5f13a36896fa.png" alt="Docker Diagram" class="image--center mx-auto" /></p>
<p>This magical toolbox is what we call a <strong>Docker Container</strong>. In technical terms, a Docker Container is a lightweight, standalone package that includes everything an application needs to run – code, runtime, libraries, and system tools. No matter where you move this container, the application will always run without any issues!</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1685990813473/2ceed61f-e375-47e7-a892-f0ffef7de9a9.png" alt="Docker Container with Docker File and Image" class="image--center mx-auto" /></p>
<p>Now, Docker also has something called <strong>Docker Images</strong>. They're like the instruction manual that comes with your LEGO sets, telling you what pieces you need and how to put them together. In the same way, Docker Images provide the blueprint for creating Docker Containers.</p>
<h2 id="heading-wrapping-up-our-adventure">Wrapping Up Our Adventure</h2>
<p>So there it is, folks! Docker, in simple terms, is a fantastic tool that helps create and deliver software applications conveniently and reliably, just like transporting a LEGO masterpiece safely in a magic toolbox.</p>
<p>Although Docker sounds a bit technical, just remember that even the most complex things can be understood when related to something fun and familiar. Today, it's LEGO blocks and Docker, tomorrow it could be something else!</p>
<p>I am Rishab, feel free to reach out to me <a target="_blank" href="https://twitter.com/rishabk7">@rishabk7</a> on Twitter or <a target="_blank" href="https://instagram.com/rishabincloud">@rishabincloud</a> on IG. Also, I wrote this <a target="_blank" href="https://blog.rishabkumar.com/docker-cheat-sheet">cheat sheet for Docker</a> that you might find helpful.</p>
]]></content:encoded></item><item><title><![CDATA[Ultimate Docker Cheat Sheet]]></title><description><![CDATA[Last year, I started learning more about containerization, which meant gaining some skills with Docker, an open-source project for automating the deployment of applications as portable, self-sufficient containers.If you use Docker, you are well aware...]]></description><link>https://blog.rishabkumar.com/docker-cheat-sheet</link><guid isPermaLink="true">https://blog.rishabkumar.com/docker-cheat-sheet</guid><category><![CDATA[Docker]]></category><category><![CDATA[containers]]></category><category><![CDATA[Devops]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Tue, 30 May 2023 14:45:23 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1685045732096/eb36ac40-44eb-4bef-9476-180c4d5cba10.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Last year, I started learning more about containerization, which meant gaining some skills with Docker, an <a target="_blank" href="https://github.com/docker/docker">open-source project</a> for automating the deployment of applications as portable, self-sufficient containers.<br />If you use Docker, you are well aware of how effective it can be in streamlining and improving development procedures. However, the numerous commands and Dockerfile instructions can sometimes feel overwhelming, especially if you're new to the platform. That's why I've put together this Docker cheat sheet to help you keep track of the most common commands.</p>
<h2 id="heading-docker-command-line-interface-cli-commands"><strong>Docker Command Line Interface (CLI) Commands</strong></h2>
<h3 id="heading-general-commands">General Commands</h3>
<ul>
<li><p><code>docker version</code>: Need to check which Docker version you're running? This command will provide all the information.</p>
</li>
<li><p><code>docker info</code>: If you're looking for system-wide information related to Docker, this command is your go-to.</p>
</li>
<li><p><code>docker help &lt;command&gt;</code>: Are you uncertain about a specific command? Add the command after <code>docker help</code> to get detailed information.</p>
</li>
</ul>
<h3 id="heading-image-commands">Image Commands</h3>
<ul>
<li><p><code>docker images</code>: This command will provide a list of all the images present on your system.</p>
</li>
<li><p><code>docker pull &lt;image&gt;</code>: This command allows you to pull an image from a registry.</p>
</li>
<li><p><code>docker rmi &lt;image&gt;</code>: Use this command to remove one or more images from your system.</p>
</li>
</ul>
<h3 id="heading-container-commands">Container Commands</h3>
<ul>
<li><p><code>docker ps</code>: List all running containers with this command.</p>
</li>
<li><p><code>docker ps -a</code>: List all containers, including stopped ones.</p>
</li>
<li><p><code>docker run &lt;image&gt;</code>: Use this command to run a command in a new container, pulling the image if needed and starting the container.</p>
</li>
<li><p><code>docker stop &lt;container&gt;</code>: Stop a running container.</p>
</li>
<li><p><code>docker rm &lt;container&gt;</code>: Remove one or more containers from your system.</p>
</li>
</ul>
<h3 id="heading-dockerfile-commands">Dockerfile Commands</h3>
<ul>
<li><p><code>docker build -t &lt;tag&gt; .</code>: This command lets you build an image from the Dockerfile in the current directory and tag it with <code>&lt;tag&gt;</code>.</p>
</li>
<li><p><code>docker tag &lt;image&gt; &lt;tag&gt;</code>: You can tag an image to a name (local or registry) with this command.</p>
</li>
</ul>
<h3 id="heading-docker-compose-commands">Docker Compose Commands</h3>
<ul>
<li><p><code>docker-compose up</code>: This command builds, (re)creates, starts, and attaches to containers for a service.</p>
</li>
<li><p><code>docker-compose down</code>: This command stops and removes the containers and networks created by <code>up</code>; add the <code>--volumes</code> or <code>--rmi</code> flags if you also want to remove volumes or images.</p>
</li>
<li><p><code>docker-compose build</code>: This command is used to build or rebuild services.</p>
</li>
</ul>
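<p>As a quick illustration of how these commands fit together, here is a minimal <code>docker-compose.yml</code>; the service name, image, and port mapping below are placeholders for this sketch, not from the original post:</p>

```yaml
# docker-compose.yml - a minimal, illustrative example
# (service name, image, and ports are placeholders)
services:
  web:
    image: nginx:alpine   # any image from a registry
    ports:
      - "8080:80"         # host:container port mapping
```

<p>With this file in the current directory, <code>docker-compose up</code> starts the service and <code>docker-compose down</code> tears it down again.</p>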
<h2 id="heading-dockerfile-instructions">Dockerfile Instructions</h2>
<p>Dockerfile instructions are used to assemble a Docker image. Here are some of the essentials:</p>
<ul>
<li><p><code>FROM</code>: This sets the base image for subsequent instructions.</p>
</li>
<li><p><code>RUN</code>: This allows you to execute commands in a new layer on top of the current image and commit the results.</p>
</li>
<li><p><code>CMD</code>: This specifies the command to run when a container is launched.</p>
</li>
<li><p><code>EXPOSE</code>: You can inform Docker that the container listens on the specified network ports at runtime with this instruction.</p>
</li>
<li><p><code>ENV</code>: Set environment variables using this instruction.</p>
</li>
<li><p><code>ADD/COPY</code>: These instructions let you copy new files, directories, or remote file URLs and add them to the image filesystem.</p>
</li>
<li><p><code>ENTRYPOINT</code>: Configure a container that will run as an executable with this instruction.</p>
</li>
<li><p><code>VOLUME</code>: This creates a mount point and marks it as holding externally mounted volumes.</p>
</li>
<li><p><code>USER</code>: This sets the user name or UID used when running the image and for any following RUN, CMD, and ENTRYPOINT instructions.</p>
</li>
<li><p><code>WORKDIR</code>: This sets the working directory for any following RUN, CMD, ENTRYPOINT, COPY, and ADD instructions.</p>
</li>
</ul>
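<p>To see how these instructions combine in practice, here is a small, illustrative Dockerfile for a Python app; the file names, base image, and port are assumptions for the sketch, not from the original post:</p>

```dockerfile
# Illustrative Dockerfile - file names, base image, and port are placeholders
FROM python:3.11-slim                 # base image for all later instructions
WORKDIR /app                          # working directory for RUN/COPY/CMD below
COPY requirements.txt .               # copy the dependency list into the image
RUN pip install -r requirements.txt   # install dependencies in a new layer
COPY . .                              # copy the rest of the application code
ENV PORT=8000                         # set an environment variable
EXPOSE 8000                           # document the port the app listens on
CMD ["python", "app.py"]              # default command when the container starts
```

<p>You would then build and run it with <code>docker build -t myapp .</code> followed by <code>docker run -p 8000:8000 myapp</code>.</p>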
<p>Lastly, it is always good practice to clean up and remove unused Docker resources. Docker provides a clean-up command for this: <code>docker system prune</code>. However, use this command with caution, as it will remove all unused resources.<br />For more in-depth information about Docker CLI commands and Dockerfile instructions, refer to the <a target="_blank" href="https://docs.docker.com/"><strong>official Docker documentation</strong></a>.</p>
<p>This cheat sheet should serve as a handy reference guide whether you're a Docker newbie or a seasoned professional. I also made a PDF version which you can <a target="_blank" href="https://www.buymeacoffee.com/rishabincloud/e/139731"><strong>download here.</strong></a><br />Feel free to reach out to me if you have any questions, I am <a target="_blank" href="https://twitter.com/rishabk7">@rishabk7</a> on Twitter and you can also find me on <a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn.</a></p>
]]></content:encoded></item><item><title><![CDATA[How I Passed the Google Cloud Associate Cloud Engineer Certification Exam]]></title><description><![CDATA[Hey there, I know, another certification blog post!
Today, I'll be sharing the resources and strategies that helped me pass the Google Cloud Associate Cloud Engineer exam. Whether you're new to the world of cloud and cloud certifications or have some...]]></description><link>https://blog.rishabkumar.com/how-i-passed-google-cloud-associate-cloud-engineer-certification</link><guid isPermaLink="true">https://blog.rishabkumar.com/how-i-passed-google-cloud-associate-cloud-engineer-certification</guid><category><![CDATA[GCP]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[Certification]]></category><category><![CDATA[Cloud Computing]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Wed, 24 May 2023 13:06:48 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1684429561972/8848702d-a538-4fd2-b2a6-97f5438ff7a1.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Hey there, I know, another certification blog post!</p>
<p>Today, I'll be sharing the resources and strategies that helped me pass the Google Cloud Associate Cloud Engineer exam. Whether you're new to the world of cloud and cloud certifications or have some experience, I hope you'll find these tips useful. I also have a video about the same and posts about other cloud certifications, so make sure to check those out too.</p>
<h2 id="heading-my-background-and-motivation">My Background and Motivation</h2>
<p>Before I dive into the specifics, let me give you some context. I joined Google early in January 2022 and as part of my onboarding process within the Google Cloud team, I was required to get this certification within a certain timeframe.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabk7/status/1488148706918649857?s=20">https://twitter.com/rishabk7/status/1488148706918649857?s=20</a></div>
<p> </p>
<p>Having worked in cloud computing for about two and a half years, with prior experience in AWS and Azure, I was not entirely new to cloud computing concepts. However, I did need to transfer my knowledge and experience over to the Google Cloud Platform.</p>
<p>I dedicated around 25-30 days to preparing for the exam and was able to pass it successfully. In this post, I'll share some of the resources that helped me along the way.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1684934434534/a70b98f2-d481-46ca-97fc-721bcf526ce1.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-resources-that-helped-me">Resources That Helped Me</h2>
<h3 id="heading-official-google-resources">Official Google Resources</h3>
<p>To start, I referred to the Associate Cloud Engineering Exam Guide, the official exam guide by Google. It helped me understand the different sections I needed to focus on, such as setting up a cloud solution environment, planning and configuring a cloud solution, deploying and implementing a cloud solution, ensuring the successful operation of a cloud solution, and configuring access and security.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://cloud.google.com/learn/certification/cloud-engineer">https://cloud.google.com/learn/certification/cloud-engineer</a></div>
<p> </p>
<p>Another official resource from Google is the <a target="_blank" href="https://www.cloudskillsboost.google/paths/11">Cloud Engineer Learning Path</a>. This links to Google Cloud Skills Boost (formerly Qwiklabs), which offers six courses and 25 hands-on labs. I highly recommend going through this learning path, as it offers invaluable hands-on experience.</p>
<h3 id="heading-video-based-courses">Video-based Courses</h3>
<p>For video-based courses, I opted for Coursera's "<a target="_blank" href="https://www.coursera.org/learn/preparing-cloud-associate-cloud-engineer-exam">Preparing for Your Associate Cloud Engineer Journey</a>" and their essential courses on cloud fundamentals and infrastructure scaling and automation. Note that these courses might be paid, but Google occasionally offers challenges that allow you to access some of these resources for free.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.coursera.org/learn/preparing-cloud-associate-cloud-engineer-exam">https://www.coursera.org/learn/preparing-cloud-associate-cloud-engineer-exam</a></div>
<p> </p>
<p>Alternatively, freeCodeCamp offers a comprehensive Google Cloud Associate Cloud Engineer course by Anthony on YouTube, which is completely free.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/jpno8FSqpc8">https://youtu.be/jpno8FSqpc8</a></div>
<p> </p>
<h3 id="heading-practice-exams">Practice Exams</h3>
<p>I highly recommend doing practice exams to familiarize yourself with the exam format and the types of questions you might encounter. I found Tutorials Dojo's practice exams to be particularly useful and reliable.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://portal.tutorialsdojo.com/product/google-certified-associate-cloud-engineer-practice-exams/">https://portal.tutorialsdojo.com/product/google-certified-associate-cloud-engineer-practice-exams/</a></div>
<p> </p>
<h3 id="heading-text-based-resources-and-flashcards">Text-based Resources and Flashcards</h3>
<p>For those who prefer text-based resources or books, I recommend the <a target="_blank" href="https://amzn.to/45qbnah">"Google Cloud Certified Associate Cloud Engineer Study Guide" by Dan Sullivan</a>.</p>
<p>I also used <a target="_blank" href="https://quizlet.com/328524759/google-cloud-certified-associate-cloud-engineer-flash-cards/">Quizlet flashcards</a> to help memorize and recall key concepts. Flashcards are an excellent study tool, and the Quizlet set by "christopher_gang" that I used included 213 flashcards covering various services relevant to the exam.</p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>In conclusion, these were the resources that I relied on for my preparation. The entire process took me around 25 days. Remember, each person's journey is unique, and what worked for me might not work for you. However, I hope that sharing my experience will provide you with some guidance and inspiration as you prepare for your Google Cloud Associate Cloud Engineer exam.</p>
<p>Good luck with your exam and your journey in cloud and DevOps! Feel free to reach out to me on <a target="_blank" href="https://twitter.com/rishabk7">Twitter</a> or <a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn</a>.</p>
<p>Also, I made a video about my GCP Exam experience!</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/NavIeYNkZv8">https://youtu.be/NavIeYNkZv8</a></div>
]]></content:encoded></item><item><title><![CDATA[How I passed the AWS DevOps Engineer Professional Exam]]></title><description><![CDATA[Introduction
In this blog post, I'll share my experience taking the AWS DevOps Pro Exam, how I prepared for it, well I guess not prepared for it, and some recommendations for resources to help you pass the exam.
Preparation
Like the Azure DevOps Expe...]]></description><link>https://blog.rishabkumar.com/how-i-passed-aws-devops-engineer-professional-exam</link><guid isPermaLink="true">https://blog.rishabkumar.com/how-i-passed-aws-devops-engineer-professional-exam</guid><category><![CDATA[AWS]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[Devops]]></category><category><![CDATA[Certification]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 15 May 2023 19:14:30 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1684165251438/023f6ee1-9443-4416-8896-f8532a18537a.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-introduction">Introduction</h2>
<p>In this blog post, I'll share my experience taking the AWS DevOps Pro Exam, how I prepared for it (or, more honestly, didn't prepare for it), and some recommendations for resources to help you pass the exam.</p>
<h2 id="heading-preparation">Preparation</h2>
<p>Like the <a target="_blank" href="https://blog.rishabkumar.com/how-i-passed-azure-az-400-devops-engineer-exam">Azure DevOps Expert Exam (AZ-400)</a>, I didn't have any specific preparation for the AWS DevOps PRO Exam either. I had a coupon expiring and the exam version was retiring, so I decided to wing it.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://twitter.com/rishabk7/status/1633101885820268544?s=20">https://twitter.com/rishabk7/status/1633101885820268544?s=20</a></div>
<p> </p>
<p>That being said, I do have one year of DevOps engineering experience using TeamCity, CloudFormation, Terraform, and Azure DevOps, which helped me understand DevOps principles and where AWS tools fit in the process. However, I wasn't familiar with AWS-specific tooling, such as CodeCommit, CodePipeline, and CodeBuild.</p>
<h2 id="heading-my-exam-experience">My Exam Experience</h2>
<p>I took the exam in person at a nearby college, via Pearson VUE, rather than doing it remotely from home. The exam took almost two and a half of the three hours allotted, and I found it hard to focus after the 30th-35th question because I had skipped breakfast and had a bad headache.</p>
<p>The questions were lengthier compared to the Azure DevOps Expert Exam (AZ-400), requiring more reading and remembering the context of the questions. Nevertheless, I passed the exam with a score of 756.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1684177385826/32041d18-4fb4-4293-b117-e2d12bf5dc34.png" alt="AWS DevOps Engineer Professional Exam Report" class="image--center mx-auto" /></p>
<h2 id="heading-recommendations-for-exam-preparation">Recommendations for Exam Preparation</h2>
<p>I suggest familiarizing yourself with all the DevOps tools offered by AWS, such as:</p>
<ul>
<li><p>CodeCommit, CodeBuild, CodeDeploy, CodePipeline</p>
</li>
<li><p>CloudFormation</p>
</li>
<li><p>Elastic Beanstalk</p>
</li>
<li><p>SSM and OpsWorks</p>
</li>
<li><p>CloudTrail, CloudWatch Logs, AWS Config, and AWS Inspector (there were also monitoring and auditing questions)</p>
</li>
</ul>
<p>Understanding the concepts of <strong>fault tolerance</strong>, <strong>disaster recovery</strong>, and <strong>high availability</strong> is essential. If you have a strong understanding of the <strong>Software Development Lifecycle</strong> (SDLC) and have worked within a DevOps or Cloud team, you should be well-prepared for the exam.</p>
<h2 id="heading-resources">Resources</h2>
<p>Though I didn't use any resources for my preparation, I recommend checking out Stephane Maarek's AWS certification courses on Udemy.</p>
<p><a target="_blank" href="https://www.udemy.com/course/aws-certified-devops-engineer-professional-hands-on/">AWS Certified DevOps Engineer Professional 2023 - Hands On!</a></p>
<p>For practice exams, Tutorials Dojo's practice exams by Jon Bonso are a great option.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://portal.tutorialsdojo.com/product/aws-certified-devops-engineer-professional-practice-exams/">https://portal.tutorialsdojo.com/product/aws-certified-devops-engineer-professional-practice-exams/</a></div>
<p> </p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>While I didn't prepare extensively for the AWS DevOps Pro Exam, my experience in the field and understanding of DevOps principles helped me pass. Remember that hands-on experience is <strong>invaluable</strong> when preparing for an exam like this. Good luck with your preparation, and I hope you find these insights helpful!</p>
<p>Bonus: If you are a video person, check out my <a target="_blank" href="https://youtube.com/@rishabkumar7">YouTube Channel</a>, where I talk about Cloud, DevOps and tech.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/xtIZ3AyaRK4">https://youtu.be/xtIZ3AyaRK4</a></div>
<p> </p>
<p>Feel free to reach out to me on <a target="_blank" href="https://twitter.com/rishabk7">Twitter</a> or <a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn</a>, if you have any questions.</p>
]]></content:encoded></item><item><title><![CDATA[How I Passed the Microsoft Azure AZ-400 DevOps Engineer Expert Exam]]></title><description><![CDATA[I recently passed the Microsoft Azure AZ-400 DevOps Engineer Expert exam without any preparation, which is designed to test your knowledge and skills in DevOps practices: continuous integration and deployment, infrastructure as code, and monitoring a...]]></description><link>https://blog.rishabkumar.com/how-i-passed-azure-az-400-devops-engineer-exam</link><guid isPermaLink="true">https://blog.rishabkumar.com/how-i-passed-azure-az-400-devops-engineer-exam</guid><category><![CDATA[Devops]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[Certification]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Mon, 08 May 2023 19:01:14 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1683576736791/4cd68d52-2280-4b52-bc7d-b2587efb200b.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I recently passed the Microsoft Azure AZ-400 DevOps Engineer Expert exam <strong>without any preparation</strong>. The exam is designed to test your knowledge and skills in DevOps practices: continuous integration and deployment, infrastructure as code, and monitoring and logging. In this blog post, I want to share my experience of taking the exam without much preparation and how I managed to pass it.</p>
<h2 id="heading-exam-objective">Exam Objective</h2>
<ul>
<li><p>Configure processes and communications</p>
</li>
<li><p>Design and implement source control</p>
</li>
<li><p>Design and implement build and release pipelines</p>
</li>
<li><p>Develop a security and compliance plan</p>
</li>
<li><p>Implement an instrumentation strategy</p>
</li>
</ul>
<h2 id="heading-prerequisite">Prerequisite</h2>
<p>In order to become a Microsoft Certified: DevOps Engineer Expert, you need either <a target="_blank" href="https://learn.microsoft.com/en-us/certifications/azure-administrator/">Azure Administrator Associate AZ-104</a> or <a target="_blank" href="https://learn.microsoft.com/en-us/certifications/azure-developer/">Azure Developer Associate AZ-204.</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1683563433732/5cb4184d-cf8a-43c9-ad77-e5c2ec294a25.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-preparation">Preparation</h2>
<p>First of all, I have to admit that I got lucky to have a voucher for the exam, which I received after participating in the <strong>Microsoft Ignite Cloud Skills Challenge</strong>. Otherwise, the exam fee is around <strong>$165</strong>, which is not cheap. I booked the exam in November last year for February 15th of this year, the last day the voucher was valid, and then forgot all about it. Two days before the exam, I received a reminder email from Pearson, the exam provider, and my first thought was that I could not learn all the topics in such a short time.</p>
<p>I was surprised and nervous at the same time, as I did not expect the exam to be that close. I looked at the exam outline and realized that it was extensive, covering various aspects of DevOps practices and Azure services. However, I remembered that I had already gone through the Microsoft Learn modules when I participated in the cloud skills challenge back in November. That gave me some confidence as I had some background knowledge of the topics covered in the exam.</p>
<p>I also had <strong>one year of experience</strong> working as a DevOps Engineer with Azure DevOps as the primary tool for pipelines and continuous integration and deployment. That gave me a good understanding of how Azure services work together and how to configure them for different scenarios. Although I had not worked with Azure DevOps for some time, I could still remember the dashboard and some of the workflows I had set up.</p>
<h2 id="heading-exam-experience">Exam Experience</h2>
<p>When I started the exam, I realized that some questions were challenging and required careful analysis and understanding of the scenarios. One area where I struggled was the different version control systems, such as Perforce and Apache Subversion, which I had not worked with before.</p>
<p>However, I managed to pass the exam with a score of <strong>710 out of 1000</strong>, just above the passing score of 700. I did the worst in the <strong>Develop a Security and Compliance Plan</strong> section, which I had not practiced before, and the best in the <strong>Implement an Instrumentation Strategy</strong> section, which I was comfortable with because of my experience.</p>
<p>Although I managed to pass the exam without much preparation, I would not recommend that anyone <strong>take that risk</strong>. It is better to study and revise the topics covered in the exam, especially if you are not familiar with Azure services and DevOps practices. Also, make sure you practice exam questions and understand the scenarios presented. A year of experience working with Azure DevOps and Azure services gave me some advantages, but that might not be the case for everyone.</p>
<h2 id="heading-resources"><strong>Resources</strong></h2>
<p>Microsoft Learn is your <strong>best friend</strong> for any Azure certification.</p>
<p>Go through the <a target="_blank" href="https://learn.microsoft.com/en-us/certifications/devops-engineer/">AZ-400 Microsoft Learn module</a>.</p>
<p>If you are looking for a video-based course, I don't have a specific recommendation, but I highly suggest watching this playlist by John Savill:</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtube.com/playlist?list=PLlVtbbG169nFr8RzQ4GIxUEznpNR53ERq">https://youtube.com/playlist?list=PLlVtbbG169nFr8RzQ4GIxUEznpNR53ERq</a></div>
<p> </p>
<p>If you are a video person, I also have a YouTube video talking about my experience with the exam.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/QCFChF-V24s">https://youtu.be/QCFChF-V24s</a></div>
<p> </p>
<p>In conclusion, passing the Microsoft Azure AZ-400 DevOps Engineer Expert exam requires a good understanding of Azure services and DevOps practices. With proper preparation and practice, anyone can pass the exam and earn the certification. If you have any questions about the exam or the certification, feel free to reach out to me on <a target="_blank" href="http://twitter.com/rishabk7">Twitter</a> or <a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn.</a></p>
]]></content:encoded></item><item><title><![CDATA[Deploying a static website to AWS with Pulumi]]></title><description><![CDATA[Deploying a static website to the cloud has never been easier, thanks to Infrastructure as Code (IaC) tools like Pulumi. If you're like me, a developer who has used Terraform for your IaC needs in the past, Pulumi offers an alternative that allows yo...]]></description><link>https://blog.rishabkumar.com/deploying-a-static-website-to-aws-with-pulumi</link><guid isPermaLink="true">https://blog.rishabkumar.com/deploying-a-static-website-to-aws-with-pulumi</guid><category><![CDATA[Devops]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Pulumi]]></category><category><![CDATA[Cloud]]></category><category><![CDATA[Cloud Computing]]></category><dc:creator><![CDATA[Rishab Kumar]]></dc:creator><pubDate>Sun, 30 Apr 2023 12:00:42 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1682743908910/7fc1f17a-3553-40c7-a81e-838b02dec9d3.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Deploying a static website to the cloud has never been easier, thanks to Infrastructure as Code (IaC) tools like Pulumi. If you're like me, a developer who has used Terraform for your IaC needs in the past, Pulumi offers an alternative that allows you to write code in your preferred programming language (TypeScript/JavaScript, Python, Go, .NET, and Java) to provision and manage cloud infrastructure. In this blog post, we will walk through the steps to deploy a static website to Amazon Web Services (AWS) using Pulumi, ps: this is my first time trying it out.</p>
<h2 id="heading-installing-pulumi">Installing Pulumi</h2>
<p>Since I am doing this demo on macOS, installation is easy with Homebrew:</p>
<pre><code class="lang-bash">brew install pulumi/tap/pulumi
</code></pre>
<p>For Linux, here is the install script:</p>
<pre><code class="lang-bash">curl -fsSL https://get.pulumi.com | sh
</code></pre>
<p>If you are on Windows, you can download the MSI <a target="_blank" href="https://www.pulumi.com/docs/get-started/install/">here.</a></p>
<p>After installation, here is the list of available commands:</p>
<pre><code class="lang-bash">@Rishabs-MacBook-Pro ➜ ~  pulumi
Usage:
  pulumi [command]

Available Commands:
  about          Print information about the Pulumi environment.
  cancel         Cancel a stack's currently running update, if any
  config         Manage configuration
  console        Opens the current stack in the Pulumi Console
  convert        Convert Pulumi programs from a supported source program into other supported languages
  destroy        Destroy all existing resources in the stack
  gen-completion Generate completion scripts for the Pulumi CLI
  help           Help about any command
  import         Import resources into an existing stack
  login          Log in to the Pulumi Cloud
  logout         Log out of the Pulumi Cloud
  logs           Show aggregated resource logs for a stack
  new            Create a new Pulumi project
  org            Manage Organization configuration
  package        Work with Pulumi packages
  plugin         Manage language and resource provider plugins
  policy         Manage resource policies
  preview        Show a preview of updates to a stack's resources
  refresh        Refresh the resources in a stack
  schema         Analyze package schemas
  stack          Manage stacks
  state          Edit the current stack's state
  up             Create or update the resources in a stack
  version        Print Pulumi's version number
  watch          Continuously update the resources in a stack
  whoami         Display the current logged-in user
</code></pre>
<h2 id="heading-other-requirements">Other Requirements</h2>
<p>Make sure you have AWS CLI configured. You can read more on how to download and configure AWS CLI <a target="_blank" href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html">here.</a></p>
<p>And for the static website, I am using my terminal-portfolio as an example.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://github.com/rishabkumar7/terminal-portfolio">https://github.com/rishabkumar7/terminal-portfolio</a></div>
<p> </p>
<h2 id="heading-deploying-the-site">Deploying the site</h2>
<p>Now, let's create a new directory for our project.</p>
<pre><code class="lang-bash">mkdir static-website &amp;&amp; cd static-website
</code></pre>
<p>We'll be using <code>pulumi new</code> to initialize a new Pulumi project in my favorite programming language, Python.</p>
<pre><code class="lang-bash">pulumi new static-website-aws-python
</code></pre>
<p>Go through the prompts to configure the project.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742160004/280b68ee-3a58-46ae-abac-6f77cdbafa1e.png" alt class="image--center mx-auto" /></p>
<p>It will ask you to either paste your access token or log in using your browser.</p>
<p>If you haven't created a Pulumi account, go ahead and hit <code>enter</code>; it will launch the browser and take you to the sign-in page, where you can sign up for a Pulumi account.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742281084/f9db2aad-c0e0-4030-9baa-a825b21417f3.png" alt class="image--center mx-auto" /></p>
<p>After the account creation is complete, go back to your terminal, and you'll see that the authentication was successful.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742403128/a45445ae-914a-4ee6-b0dd-113da6c30329.png" alt class="image--center mx-auto" /></p>
<p>Let's go through the project creation setup. After login is complete, it will prompt you for the project configuration:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742471639/c32200a0-ba54-4216-be8d-7c51063b10ba.png" alt class="image--center mx-auto" /></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742541913/ca50d02c-74b3-4b15-8f9d-17ae80b6dcf2.png" alt class="image--center mx-auto" /></p>
<p>Now you have a finished project that’s ready to be deployed, configured with the most common settings.</p>
<p>Also, let's inspect the code in <code>__main__.py</code></p>
<pre><code class="lang-python"><span class="hljs-keyword">import</span> pulumi
<span class="hljs-keyword">import</span> pulumi_aws <span class="hljs-keyword">as</span> aws
<span class="hljs-keyword">import</span> pulumi_synced_folder <span class="hljs-keyword">as</span> synced_folder

<span class="hljs-comment"># Import the program's configuration settings.</span>
config = pulumi.Config()
path = config.get(<span class="hljs-string">"path"</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">"./www"</span>
index_document = config.get(<span class="hljs-string">"indexDocument"</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">"index.html"</span>
error_document = config.get(<span class="hljs-string">"errorDocument"</span>) <span class="hljs-keyword">or</span> <span class="hljs-string">"error.html"</span>

<span class="hljs-comment"># Create an S3 bucket and configure it as a website.</span>
bucket = aws.s3.Bucket(
    <span class="hljs-string">"bucket"</span>,
    website=aws.s3.BucketWebsiteArgs(
        index_document=index_document,
        error_document=error_document,
    ),
)

<span class="hljs-comment"># Set ownership controls for the new bucket</span>
ownership_controls = aws.s3.BucketOwnershipControls(
    <span class="hljs-string">"ownership-controls"</span>,
    bucket=bucket.bucket,
    rule=aws.s3.BucketOwnershipControlsRuleArgs(
        object_ownership=<span class="hljs-string">"ObjectWriter"</span>,
    )
)

<span class="hljs-comment"># Configure public ACL block on the new bucket</span>
public_access_block = aws.s3.BucketPublicAccessBlock(
    <span class="hljs-string">"public-access-block"</span>,
    bucket=bucket.bucket,
    block_public_acls=<span class="hljs-literal">False</span>,
)

<span class="hljs-comment"># Use a synced folder to manage the files of the website.</span>
bucket_folder = synced_folder.S3BucketFolder(
    <span class="hljs-string">"bucket-folder"</span>,
    acl=<span class="hljs-string">"public-read"</span>,
    bucket_name=bucket.bucket,
    path=path,
    opts=pulumi.ResourceOptions(depends_on=[
        ownership_controls,
        public_access_block
    ])
)

<span class="hljs-comment"># Create a CloudFront CDN to distribute and cache the website.</span>
cdn = aws.cloudfront.Distribution(
    <span class="hljs-string">"cdn"</span>,
    enabled=<span class="hljs-literal">True</span>,
    origins=[
        aws.cloudfront.DistributionOriginArgs(
            origin_id=bucket.arn,
            domain_name=bucket.website_endpoint,
            custom_origin_config=aws.cloudfront.DistributionOriginCustomOriginConfigArgs(
                origin_protocol_policy=<span class="hljs-string">"http-only"</span>,
                http_port=<span class="hljs-number">80</span>,
                https_port=<span class="hljs-number">443</span>,
                origin_ssl_protocols=[<span class="hljs-string">"TLSv1.2"</span>],
            ),
        )
    ],
    default_cache_behavior=aws.cloudfront.DistributionDefaultCacheBehaviorArgs(
        target_origin_id=bucket.arn,
        viewer_protocol_policy=<span class="hljs-string">"redirect-to-https"</span>,
        allowed_methods=[
            <span class="hljs-string">"GET"</span>,
            <span class="hljs-string">"HEAD"</span>,
            <span class="hljs-string">"OPTIONS"</span>,
        ],
        cached_methods=[
            <span class="hljs-string">"GET"</span>,
            <span class="hljs-string">"HEAD"</span>,
            <span class="hljs-string">"OPTIONS"</span>,
        ],
        default_ttl=<span class="hljs-number">600</span>,
        max_ttl=<span class="hljs-number">600</span>,
        min_ttl=<span class="hljs-number">600</span>,
        forwarded_values=aws.cloudfront.DistributionDefaultCacheBehaviorForwardedValuesArgs(
            query_string=<span class="hljs-literal">True</span>,
            cookies=aws.cloudfront.DistributionDefaultCacheBehaviorForwardedValuesCookiesArgs(
                forward=<span class="hljs-string">"all"</span>,
            ),
        ),
    ),
    price_class=<span class="hljs-string">"PriceClass_100"</span>,
    custom_error_responses=[
        aws.cloudfront.DistributionCustomErrorResponseArgs(
            error_code=<span class="hljs-number">404</span>,
            response_code=<span class="hljs-number">404</span>,
            response_page_path=<span class="hljs-string">f"/<span class="hljs-subst">{error_document}</span>"</span>,
        )
    ],
    restrictions=aws.cloudfront.DistributionRestrictionsArgs(
        geo_restriction=aws.cloudfront.DistributionRestrictionsGeoRestrictionArgs(
            restriction_type=<span class="hljs-string">"none"</span>,
        ),
    ),
    viewer_certificate=aws.cloudfront.DistributionViewerCertificateArgs(
        cloudfront_default_certificate=<span class="hljs-literal">True</span>,
    ),
)

<span class="hljs-comment"># Export the URLs and hostnames of the bucket and distribution.</span>
pulumi.export(<span class="hljs-string">"originURL"</span>, pulumi.Output.concat(<span class="hljs-string">"http://"</span>, bucket.website_endpoint))
pulumi.export(<span class="hljs-string">"originHostname"</span>, bucket.website_endpoint)
pulumi.export(<span class="hljs-string">"cdnURL"</span>, pulumi.Output.concat(<span class="hljs-string">"https://"</span>, cdn.domain_name))
pulumi.export(<span class="hljs-string">"cdnHostname"</span>, cdn.domain_name)
</code></pre>
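<p>The three <code>config.get</code> calls at the top use a common fallback idiom: <code>get</code> returns <code>None</code> for an unset key, so <code>or</code> supplies the default. Here is a minimal sketch of the same pattern, with a plain dict standing in for <code>pulumi.Config</code> (the helper name is my own, for illustration only):</p>
<pre><code class="lang-python">def get_setting(stack_config, key, default):
    # Mirrors `config.get(key) or default` in __main__.py: an explicit
    # stack setting wins; an unset (or empty) value falls back to the default.
    return stack_config.get(key) or default

fresh = {}  # right after `pulumi new`, before any `pulumi config set`
print(get_setting(fresh, "path", "./www"))  # ./www

customized = {"path": "terminal-portfolio"}
print(get_setting(customized, "path", "./www"))  # terminal-portfolio
</code></pre>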
<p>The template requires no additional configuration. Once the new project is created, you can deploy it immediately with <a target="_blank" href="https://www.pulumi.com/docs/reference/cli/pulumi_up"><code>pulumi up</code></a>:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742572075/16c83a7b-99af-4388-b981-77a409f3b68a.png" alt class="image--center mx-auto" /></p>
<p>You'll see a prompt asking whether you want to perform this update; type <code>yes</code> and hit <code>enter</code>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742663919/404c69aa-ba00-463d-9045-caa0b14b048c.png" alt class="image--center mx-auto" /></p>
<p>As you can see, it created 8 resources and gave us the following outputs:</p>
<table><tbody><tr><td><p><strong>cdnHostname</strong></p></td><td><p>The provider-assigned hostname of the CloudFront CDN. Useful for creating <code>CNAME</code> records to associate custom domains.</p></td></tr><tr><td><p><strong>cdnURL</strong></p></td><td><p>The fully-qualified HTTPS URL of the CloudFront CDN.</p></td></tr><tr><td><p><strong>originHostname</strong></p></td><td><p>The provider-assigned hostname of the S3 bucket.</p></td></tr><tr><td><p><strong>originURL</strong></p></td><td><p>The fully-qualified HTTP URL of the S3 bucket endpoint.</p></td></tr></tbody></table>
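<p>The four outputs in the table come from the <code>pulumi.export</code> calls at the bottom of <code>__main__.py</code>. Once the bucket endpoint and CDN domain resolve, <code>pulumi.Output.concat</code> behaves like plain string concatenation. A sketch with resolved values substituted in (the bucket endpoint value here is hypothetical):</p>
<pre><code class="lang-python"># Hypothetical resolved values standing in for the real Pulumi outputs.
website_endpoint = "bucket-0123456.s3-website-us-east-1.amazonaws.com"
cdn_domain = "d2384wrx9ddsro.cloudfront.net"

# Mirrors the four pulumi.export calls in __main__.py.
outputs = {
    "originURL": "http://" + website_endpoint,
    "originHostname": website_endpoint,
    "cdnURL": "https://" + cdn_domain,
    "cdnHostname": cdn_domain,
}
print(outputs["cdnURL"])  # https://d2384wrx9ddsro.cloudfront.net
</code></pre>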

<p>Let's check out our static website by visiting the <code>cdnURL</code>, which will look something like this - <a target="_blank" href="https://d2384wrx9ddsro.cloudfront.net/">https://d2384wrx9ddsro.cloudfront.net/</a>.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742932248/f440105d-a7ef-4621-a6ae-7ca1452b7c72.png" alt class="image--center mx-auto" /></p>
<p>Aye! We have a static website running on AWS!</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://media.giphy.com/media/s4VoCsFz8prlhSFCeS/giphy.gif">https://media.giphy.com/media/s4VoCsFz8prlhSFCeS/giphy.gif</a></div>
<h2 id="heading-customizing-the-site">Customizing the site</h2>
<p>To customize the website to be my <code>terminal portfolio</code>, I am going to clone the GitHub repository into our <code>static-website</code> directory.</p>
<p>So, this is what my directory structure looks like now:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682743104712/ca0bf118-dc8e-4bfa-b29b-d5aa8da3833f.png" alt class="image--center mx-auto" /></p>
<p>And then, using <a target="_blank" href="https://www.pulumi.com/docs/reference/cli/pulumi_config_set"><code>pulumi config set</code></a>, I am going to point the <code>path</code> setting at the <code>terminal-portfolio</code> folder instead of the <code>www</code> folder:</p>
<pre><code class="lang-bash">pulumi config <span class="hljs-built_in">set</span> path terminal-portfolio
</code></pre>
<p>And then let's deploy the changes:</p>
<pre><code class="lang-bash">pulumi up
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682743164107/88497ed1-bd99-484e-b6b7-67543985ab9b.png" alt class="image--center mx-auto" /></p>
<p>You can see that the new changes have been deployed, but when you navigate to the CDN URL, it might still show the old "Hello, World!" website. That's due to caching: by default, the generated program configures the CloudFront CDN to cache files for 600 seconds (10 minutes).</p>
<p>But we can change that!</p>
<p>Let's look at <code>__main__.py</code>:</p>
<pre><code class="lang-python">default_cache_behavior=aws.cloudfront.DistributionDefaultCacheBehaviorArgs(
        target_origin_id=bucket.arn,
        viewer_protocol_policy=<span class="hljs-string">"redirect-to-https"</span>,
        allowed_methods=[
            <span class="hljs-string">"GET"</span>,
            <span class="hljs-string">"HEAD"</span>,
            <span class="hljs-string">"OPTIONS"</span>,
        ],
        cached_methods=[
            <span class="hljs-string">"GET"</span>,
            <span class="hljs-string">"HEAD"</span>,
            <span class="hljs-string">"OPTIONS"</span>,
        ],
        default_ttl=<span class="hljs-number">600</span>,
        max_ttl=<span class="hljs-number">600</span>,
        min_ttl=<span class="hljs-number">600</span>,
</code></pre>
<p>You can change those values to your desired settings.</p>
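<p>Based on CloudFront's documented caching behavior, <code>min_ttl</code> and <code>max_ttl</code> clamp whatever <code>max-age</code> the origin's <code>Cache-Control</code> header requests, and <code>default_ttl</code> applies when the origin sends no caching headers at all. A rough sketch of that logic (my own simplification for illustration, not CloudFront's actual implementation):</p>
<pre><code class="lang-python">def effective_ttl(origin_max_age, min_ttl=600, default_ttl=600, max_ttl=600):
    """Approximate the TTL (in seconds) CloudFront uses for a cached object."""
    if origin_max_age is None:
        # No Cache-Control from the origin: the distribution's default applies.
        return default_ttl
    # Otherwise the origin's max-age is clamped between min_ttl and max_ttl.
    return min(max(origin_max_age, min_ttl), max_ttl)

print(effective_ttl(None))  # 600 — the template's 10-minute default
print(effective_ttl(30, min_ttl=0, default_ttl=60, max_ttl=86400))  # 30
</code></pre>
<p>With the template's settings (all three TTLs at 600), every object is cached for 10 minutes regardless of what the origin asks for, which is why the old site lingers after a deploy.</p>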
<p>Let's check my terminal-portfolio again, by visiting the CDN URL - <a target="_blank" href="https://d2384wrx9ddsro.cloudfront.net/">https://d2384wrx9ddsro.cloudfront.net</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1682742091909/37448cd6-c564-4061-a162-c9f49c59e95c.png" alt class="image--center mx-auto" /></p>
<p>Voila!</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://media.giphy.com/media/uyoXx0qpUWfQs/giphy.gif">https://media.giphy.com/media/uyoXx0qpUWfQs/giphy.gif</a></div>
<p>In conclusion, Pulumi is a powerful tool for deploying and managing cloud infrastructure with ease. Whether you're new to Pulumi or have used it before, this blog post has shown you how to use it to deploy a static website to AWS.</p>
<p>Follow me here on Hashnode or <a target="_blank" href="https://twitter.com/rishabk7">Twitter</a>/<a target="_blank" href="https://linkedin.com/in/rishabkumar7">LinkedIn</a> to stay up-to-date with my latest blog posts and tech tutorials.</p>
]]></content:encoded></item></channel></rss>