Convert Your Django DRF Project to MCP in 5 Minutes
The Problem: When APIs Meet AI Assistants

A few weeks ago, I was experimenting with the Model Context Protocol (MCP) for a Kafka integration project. The experience was eye-opening - being able to interact with complex systems through natural language felt like a glimpse into the future of development workflows. After successfully getting MCP working with Kafka, I started thinking about other systems I could connect. That's when I looked at one of my old Django projects. ...
Turn GitHub into Your Free Data Platform: Building APIs with GitHub Actions
What if I told you that you could build, host, and maintain data APIs completely free using just GitHub? Here's how I built a production-ready data platform without spending a penny on hosting.

The Problem: APIs Are Expensive (Or Are They?)

Building data-driven applications usually means expensive cloud hosting, database costs, and server maintenance. Most developers assume they need AWS, Google Cloud, or Azure to run automated data collection and serve APIs. But what if there was a better way? ...
Tracking Antarctic Giants: Building a Real-Time Iceberg Monitor with NASA Data
Ever wondered where those massive icebergs the size of cities are drifting? I built a system to track them in real time using NASA satellite data.

The Problem: Lost Giants in the Southern Ocean

Antarctic icebergs are some of the most fascinating phenomena on Earth. These floating ice mountains - some larger than entire countries - break off from ice shelves and drift across the Southern Ocean for years. But tracking their movements has always been challenging. ...
Mastering Integration Testing for Kafka Connectors: A Complete Guide
Integration testing is crucial for Kafka Connectors to ensure they work correctly with external systems. This guide provides a comprehensive approach to setting up and executing integration tests for Kafka Connectors, covering everything from environment setup to best practices for testing source and sink connectors.
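As a flavor of what that setup can look like, here is a minimal, illustrative sketch (not taken from the guide) that spins up a throwaway Kafka broker for a test. It assumes the Python testcontainers and kafka-python packages; the test name and topic are hypothetical, and the connector-specific assertions the guide covers are elided.

# Illustrative only: a disposable Kafka broker for an integration test,
# assuming the `testcontainers` and `kafka-python` packages are installed.
from testcontainers.kafka import KafkaContainer
from kafka import KafkaProducer, KafkaConsumer

def test_messages_round_trip_through_kafka():
    with KafkaContainer() as kafka:  # starts a real broker in Docker, stops it on exit
        bootstrap = kafka.get_bootstrap_server()

        producer = KafkaProducer(bootstrap_servers=bootstrap)
        producer.send("connector-test-topic", b"hello")
        producer.flush()

        consumer = KafkaConsumer(
            "connector-test-topic",
            bootstrap_servers=bootstrap,
            auto_offset_reset="earliest",
            consumer_timeout_ms=5000,
        )
        assert [record.value for record in consumer] == [b"hello"]

A real connector test would additionally deploy the connector against a Kafka Connect worker and assert on the external system's state - the kind of steps the guide walks through for source and sink connectors.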
Building an MCP Server for Your Kafka Cluster
A step-by-step guide to using FastMCP and the MCP protocol to expose Kafka operations (topic management, produce/consume, troubleshooting) as LLM-accessible tools.
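To make the idea concrete, a single Kafka operation exposed as an MCP tool might look roughly like the sketch below. This is an illustration under assumptions, not the article's code: it presumes the fastmcp and kafka-python packages and a broker on localhost:9092.

# Hypothetical sketch: one Kafka operation exposed as an MCP tool via FastMCP.
# Assumes `fastmcp` and `kafka-python` are installed and a broker runs locally.
from fastmcp import FastMCP
from kafka import KafkaAdminClient

mcp = FastMCP("kafka-tools")

@mcp.tool()
def list_topics() -> list[str]:
    """List the topics on the local Kafka cluster."""
    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
    try:
        return admin.list_topics()
    finally:
        admin.close()

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio so an LLM client can call it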
How I Automated My Markdown Publishing on Medium (No More Manual Work!)
Tired of manually reformatting Markdown for Medium? Discover my open-source tool that converts Hugo-friendly Markdown into perfectly formatted Medium posts instantly.
Creating Tombstone Records Using kafka-console-producer.sh: A Quick Guide
A practical guide to creating tombstone records for Kafka compacted topics using the kafka-console-producer.sh command-line tool with the null marker feature.
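For context, a tombstone is simply a record with a key and a null value. The snippet below shows the same idea in Python with kafka-python, purely for illustration - the guide itself does it from the command line; the broker address and topic name here are placeholders.

# Illustrative only: producing a tombstone programmatically with kafka-python.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
# A null value for an existing key marks that key for deletion once the
# compacted topic is cleaned.
producer.send("my-compacted-topic", key=b"user-42", value=None)
producer.flush()
producer.close()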
Build Custom Kafka Connectors Fast with This Open-Source Template
Apache Kafka is a powerful distributed event streaming platform, and Kafka Connect makes it easy to integrate Kafka with external systems. While many pre-built connectors exist, real-world applications often need custom connectors tailored for proprietary systems, custom logic, or advanced error handling. That's where this production-ready template comes in: it removes the boilerplate and gives you everything you need to build, test, and deploy connectors with ease.
Filtering Tombstone Records in Kafka Connect
Kafka Connect provides a flexible way to process streaming data using Single Message Transforms (SMTs). If you need to filter out tombstone records (records with null values), you should use the generic Filter transformation along with the RecordIsTombstone predicate. Here's the correct configuration:

# Define the predicate to detect tombstone records (i.e., records with null values)
predicates=dropTombstone
predicates.dropTombstone.type=org.apache.kafka.connect.transforms.predicates.RecordIsTombstone

# Configure the Filter transformation to drop records that match the predicate
transforms=dropTombstone
transforms.dropTombstone.type=org.apache.kafka.connect.transforms.Filter
transforms.dropTombstone.predicate=dropTombstone

Explanation

What is a Predicate? A predicate in Kafka Connect is a condition that evaluates whether a given record meets certain criteria. It returns either true or false. If true, the transformation (such as filtering) is applied. In this case, the predicate named dropTombstone uses the built-in class RecordIsTombstone, which evaluates to true when a record's value is null. ...
Server-Sent Events: Build Real-Time Web Apps with Minimal Code
In this technical deep dive, we unpack the power of Server-Sent Events (SSE), a game-changing web technology that simplifies real-time communication. Developers often struggle with complex, resource-intensive methods for creating live updates, but SSE offers an elegant solution that's both lightweight and powerful.
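To show just how little code that means in practice, here is a minimal, illustrative server-side sketch (assuming Flask; it is not code from the article): a single streaming endpoint that pushes "data:" frames over one open HTTP response.

# Minimal illustrative SSE endpoint, assuming Flask is installed.
import time
from flask import Flask, Response

app = Flask(__name__)

@app.route("/events")
def events():
    def stream():
        for i in range(10):
            yield f"data: tick {i}\n\n"  # each SSE frame ends with a blank line
            time.sleep(1)
    return Response(stream(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run()

On the browser side, new EventSource('/events') is all it takes to start receiving those updates.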