<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Data-Engineering on Derrek</title><link>https://www.meath.cloud/tags/data-engineering/</link><description>Recent content in Data-Engineering on Derrek</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sat, 14 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://www.meath.cloud/tags/data-engineering/index.xml" rel="self" type="application/rss+xml"/><item><title>Day 6: Building an F1 Race Viewer with Azure and Hugo</title><link>https://www.meath.cloud/posts/f1-race-viewer/</link><pubDate>Sat, 14 Mar 2026 00:00:00 +0000</pubDate><guid>https://www.meath.cloud/posts/f1-race-viewer/</guid><description>&lt;p>With the F1 data pipeline running automatically after every race, the next step
was making the data genuinely useful — turning raw JSON in blob storage into
something you&amp;rsquo;d actually want to look at.&lt;/p>
&lt;h2 id="what-we-built">What We Built&lt;/h2>
&lt;p>The &lt;a href="https://www.meath.cloud/f1/">F1 page&lt;/a> on this site shows:&lt;/p>
&lt;ul>
&lt;li>Full race results with finishing positions&lt;/li>
&lt;li>Fastest lap indicator per race&lt;/li>
&lt;li>Tyre strategy visualization for every driver&lt;/li>
&lt;li>Pit stop times on hover&lt;/li>
&lt;li>A race selector covering the full 2025 season and 2026 onwards&lt;/li>
&lt;/ul>
&lt;p>All of it updates automatically after each race with zero manual intervention.&lt;/p></description></item><item><title>Day 5: Building an F1 Data Pipeline with OpenF1 and Azure</title><link>https://www.meath.cloud/posts/f1-data-pipeline/</link><pubDate>Mon, 09 Mar 2026 00:00:00 +0000</pubDate><guid>https://www.meath.cloud/posts/f1-data-pipeline/</guid><description>&lt;p>I&amp;rsquo;m an F1 fan. So when it came time to build out my data engineering portfolio,
pointing a pipeline at Formula 1 telemetry data was a no-brainer.&lt;/p>
&lt;h2 id="the-goal">The Goal&lt;/h2>
&lt;p>Build an automated pipeline that ingests F1 race data into Azure Blob Storage
after every Grand Prix — without me having to do anything. No manual triggers,
no leaving my laptop on, no babysitting.&lt;/p>
&lt;h2 id="the-data-source">The Data Source&lt;/h2>
&lt;p>&lt;a href="https://openf1.org">OpenF1&lt;/a> is a free, open-source API that provides real-time
and historical F1 data — lap times, pit stops, telemetry, weather, driver info —
all of it. No API key required for historical data. It&amp;rsquo;s genuinely one of the
nicest public APIs I&amp;rsquo;ve worked with.&lt;/p></description></item><item><title>Data Engineering on Azure: Building My First ETL Pipeline</title><link>https://www.meath.cloud/posts/data-engineering-etl-pipeline/</link><pubDate>Thu, 05 Mar 2026 00:00:00 +0000</pubDate><guid>https://www.meath.cloud/posts/data-engineering-etl-pipeline/</guid><description>
&lt;p>Day 4 of the cloud engineering transition. Today I crossed into data engineering territory — spinning up Azure Storage and Data Factory infrastructure with Bicep, then writing a Python ETL pipeline that pulls live weather data and lands it in Azure Blob Storage.&lt;/p>
&lt;p>It&amp;rsquo;s a simple pipeline by design, but it covers the full ETL pattern that every data engineering project is built on.&lt;/p></description></item></channel></rss>