Convert Your Django DRF Project to MCP in 5 Minutes

The Problem: When APIs Meet AI Assistants. A few weeks ago, I was experimenting with the Model Context Protocol (MCP) for a Kafka integration project. The experience was eye-opening: being able to interact with complex systems through natural language felt like a glimpse into the future of development workflows. After successfully getting MCP working with Kafka, I started thinking about other systems I could connect. That’s when I looked at one of my old Django projects. ...

August 3, 2025 · 13 min · 2701 words · Joel Hanson

Turn GitHub into Your Free Data Platform: Building APIs with GitHub Actions

What if I told you that you could build, host, and maintain data APIs completely free using just GitHub? Here’s how I built a production-ready data platform without spending a penny on hosting. The Problem: APIs Are Expensive (Or Are They?) Building data-driven applications usually means expensive cloud hosting, database costs, and server maintenance. Most developers assume they need AWS, Google Cloud, or Azure to run automated data collection and serve APIs. But what if there was a better way? ...
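The core pattern implied here is a scheduled GitHub Actions workflow that runs a small collection script and publishes the result as static JSON served straight from the repository. As a rough sketch of what such a collection step could look like (the source URL, output path, and script itself are placeholders, not taken from the article):

```python
# collect.py: a hypothetical collector that a scheduled GitHub Actions job runs.
# It fetches a public JSON feed and rewrites a static file in the repository,
# which is then served for free (for example via GitHub Pages or raw file URLs).
import json
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

SOURCE_URL = "https://api.example.com/data"   # placeholder data source
OUTPUT_PATH = Path("api/latest.json")         # placeholder "endpoint" path

def collect() -> None:
    with urllib.request.urlopen(SOURCE_URL, timeout=30) as resp:
        payload = json.load(resp)

    document = {
        "updated_at": datetime.now(timezone.utc).isoformat(),
        "data": payload,
    }
    OUTPUT_PATH.parent.mkdir(parents=True, exist_ok=True)
    OUTPUT_PATH.write_text(json.dumps(document, indent=2), encoding="utf-8")

if __name__ == "__main__":
    collect()
```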

July 26, 2025 · 10 min · 2077 words · Joel Hanson

Tracking Antarctic Giants: Building a Real-Time Iceberg Monitor with NASA Data

Ever wondered where those massive icebergs the size of cities are drifting? I built a system to track them in real time using NASA satellite data. The Problem: Lost Giants in the Southern Ocean. Antarctic icebergs are some of the most fascinating phenomena on Earth. These floating ice mountains—some larger than entire countries—break off from ice shelves and drift across the Southern Ocean for years. But tracking their movements has always been challenging. ...

July 26, 2025 · 6 min · 1224 words · Joel Hanson

Mastering Integration Testing for Kafka Connectors: A Complete Guide

Integration testing is crucial for Kafka Connectors to ensure they work correctly with external systems. This guide provides a comprehensive approach to setting up and executing integration tests for Kafka Connectors, covering everything from environment setup to best practices for testing source and sink connectors.
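The guide itself targets Kafka Connect and its own tooling; purely to illustrate the core pattern of integration testing (spin up an ephemeral broker per test, exercise the system, assert on what lands in Kafka), here is a minimal sketch using testcontainers-python and confluent-kafka, both of which are my assumptions rather than the stack used in the article:

```python
# Minimal integration-test sketch: start a throwaway Kafka broker with
# testcontainers, produce a record, and assert it can be read back.
# A real connector test would additionally start Kafka Connect and the
# external system the connector talks to.
from confluent_kafka import Consumer, Producer
from testcontainers.kafka import KafkaContainer

def test_round_trip():
    with KafkaContainer() as kafka:
        bootstrap = kafka.get_bootstrap_server()

        producer = Producer({"bootstrap.servers": bootstrap})
        producer.produce("test-topic", key=b"k1", value=b'{"id": 1}')
        producer.flush()

        consumer = Consumer({
            "bootstrap.servers": bootstrap,
            "group.id": "it-test",
            "auto.offset.reset": "earliest",
        })
        consumer.subscribe(["test-topic"])
        msg = consumer.poll(timeout=30)
        consumer.close()

        assert msg is not None and msg.value() == b'{"id": 1}'
```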

July 13, 2025 · 18 min · 3654 words · Joel Hanson

Building an MCP Server for Your Kafka Cluster

A step‑by‑step guide to using FastMCP and the Model Context Protocol to expose Kafka operations (topic management, produce/consume, troubleshooting) as LLM‑accessible tools.
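For a flavour of what "LLM‑accessible tools" means in practice, a minimal sketch of one such tool is shown below, using the FastMCP decorator API together with confluent-kafka's AdminClient; the tool name and broker address are illustrative choices, not taken from the post:

```python
# Minimal sketch: expose "list topics" as an MCP tool.
# Assumes the fastmcp and confluent-kafka packages; names are illustrative.
from confluent_kafka.admin import AdminClient
from fastmcp import FastMCP

mcp = FastMCP("kafka-tools")
admin = AdminClient({"bootstrap.servers": "localhost:9092"})

@mcp.tool()
def list_topics() -> list[str]:
    """Return the names of all topics in the cluster."""
    metadata = admin.list_topics(timeout=10)
    return sorted(metadata.topics.keys())

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for an MCP client to call
```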

June 7, 2025 · 4 min · 841 words · Joel Hanson

How I Automated My Markdown Publishing on Medium (No More Manual Work!)

Tired of manually reformatting Markdown for Medium? Discover my open-source tool that converts Hugo-friendly Markdown into perfectly formatted Medium posts instantly.

April 26, 2025 · 2 min · 303 words · Joel Hanson

Creating Tombstone Records Using kafka-console-producer.sh: A Quick Guide

A practical guide to creating tombstone records for Kafka compacted topics using the kafka-console-producer.sh command-line tool and its null-marker feature.
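The post itself covers the command-line route via the console producer's null-marker property; for comparison, doing the same thing programmatically just means producing a record whose value is null. A minimal confluent-kafka sketch (topic and key are illustrative, and this is an alternative to, not a restatement of, the CLI approach in the article):

```python
# Produce a tombstone (null-value record) to a compacted topic.
# With log compaction, this marks earlier records with the same key for deletion.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

# The key identifies the record to "delete"; value=None makes it a tombstone.
producer.produce("my-compacted-topic", key=b"user-42", value=None)
producer.flush()
```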

April 22, 2025 · 3 min · 577 words · Joel Hanson

Build Custom Kafka Connectors Fast with This Open-Source Template

Apache Kafka is a powerful distributed event streaming platform, and Kafka Connect makes it easy to integrate Kafka with external systems. While many pre-built connectors exist, real-world applications often need custom connectors tailored for proprietary systems, custom logic, or advanced error handling. That’s where this production-ready template comes in—it removes the boilerplate and gives you everything you need to build, test, and deploy connectors with ease.

April 14, 2025 · 4 min · 842 words · Joel Hanson

Filtering Tombstone Records in Kafka Connect

Kafka Connect provides a flexible way to process streaming data using Single Message Transforms (SMTs). If you need to filter out tombstone records (records with null values), you should use the generic Filter transformation along with the RecordIsTombstone predicate. Here’s the correct configuration:

# Define the predicate to detect tombstone records (i.e., records with null values)
predicates=dropTombstone
predicates.dropTombstone.type=org.apache.kafka.connect.transforms.predicates.RecordIsTombstone

# Configure the Filter transformation to drop records that match the predicate
transforms=dropTombstone
transforms.dropTombstone.type=org.apache.kafka.connect.transforms.Filter
transforms.dropTombstone.predicate=dropTombstone

Explanation: What is a Predicate? A predicate in Kafka Connect is a condition that evaluates whether a given record meets certain criteria. It returns either true or false. If true, the transformation (such as filtering) is applied. In this case, the predicate named dropTombstone uses the built-in class RecordIsTombstone, which evaluates to true when a record’s value is null. ...

March 11, 2025 · 2 min · 307 words · Joel Hanson

Server-Sent Events: Build Real-Time Web Apps with Minimal Code

In this technical deep-dive, we unravel the power of Server-Sent Events (SSE), a game-changing web technology that simplifies real-time communication. Developers often struggle with complex, resource-intensive methods for creating live updates, but SSE offers an elegant solution that’s both lightweight and powerful.
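To make the "minimal code" claim concrete: an SSE endpoint is an ordinary HTTP response with Content-Type text/event-stream that stays open and writes messages of the form data: <payload> followed by a blank line. A small sketch using Flask, which is my choice for brevity here rather than anything prescribed by the article:

```python
# Minimal SSE endpoint: the server pushes a timestamp to the browser every second.
# Client side: new EventSource("/events").onmessage = (e) => console.log(e.data);
import time
from flask import Flask, Response

app = Flask(__name__)

@app.route("/events")
def events():
    def stream():
        while True:
            # Each SSE message is "data: <payload>" followed by a blank line.
            yield f"data: {time.strftime('%H:%M:%S')}\n\n"
            time.sleep(1)
    return Response(stream(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(threaded=True)
```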

December 7, 2024 · 2 min · 344 words · Joel Hanson