Senior Data Pipeline Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote, 6-month contract-to-hire role for a Senior Data Pipeline Engineer. It requires 4–6 years of experience with Native Kafka, Kafka Streams, and Snowflake in AWS. Strong event-driven architecture knowledge is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 12, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Contract-to-hire
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Kafka (Apache Kafka) #Programming #"ETL (Extract, Transform, Load)" #Data Pipeline #Project Management #Documentation #AWS (Amazon Web Services) #Snowflake #Cloud #Scala
Role description
Job Title: Senior Software Engineer – Real-Time Data Pipelines (6-Month Contract-to-Hire, Remote)

Job Description:
We are seeking a highly skilled Senior Software Engineer with 4–6 years of experience designing and building complex enterprise solutions. This role focuses on real-time data pipeline development using Native Kafka and Kafka Streams (KStream), and on loading and managing data in Snowflake within the AWS cloud ecosystem. The ideal candidate has a strong understanding of event-driven architectures and queuing mechanisms and excels at delivering scalable, high-quality solutions in a collaborative, delivery-driven environment. This position supports the Event Insights project, enabling end-to-end claim workflow lifecycle tracking, and offers the chance to work on a cohesive team where knowledge sharing and collaboration are highly valued.

Responsibilities:
• Build and manage real-time data pipelines using CentEvent, Native Kafka, and Kafka Streams (KStream) — see the pipeline sketch after this listing.
• Load and maintain data in Snowflake within the AWS cloud (see the load sketch after this listing).
• Work with Snowflake tables, procedures, and event-driven architectures.
• Collaborate across cross-functional teams to deliver scalable solutions.
• Maintain high standards of code quality and documentation.
• Support the Event Insights project, focusing on end-to-end claim workflow lifecycle tracking.

Must-Haves:
• 4–6 years of experience building and managing real-time data pipelines.
• Deep expertise in Native Kafka, Kafka Streams (KStream), and Snowflake.
• Strong understanding of event-driven architectures and queuing mechanisms.
• Ability to work independently, with strong analytical, problem-solving, and project management skills.
• Excellent judgment, decision-making, and accuracy under pressure.

Nice-to-Haves:
• Experience evaluating and improving code quality and standards.
• Experience working with or directing third-party application developers.
• Knowledge of software design principles and programming concepts.

Why You’ll Love This Role:
• 100% remote, flexible work environment.
• Opportunity to work on high-impact, end-to-end claim workflow tracking.
• Cohesive, collaborative, and professional team culture.
• Hands-on, delivery-driven work with opportunities to learn and grow.
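For candidates gauging the day-to-day work, here is a minimal Kafka Streams topology sketch in Scala (a language from the skills tags above). It is illustrative only: the topic names (claim-events, claim-status-counts), the application ID, and the plain-string values are assumptions, not details from the posting; a real Event Insights topology would use the project's actual topics, serdes, and schemas.

```scala
import java.util.Properties

import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.serialization.Serdes._

object ClaimLifecycleSketch extends App {
  val props = new Properties()
  props.put(StreamsConfig.APPLICATION_ID_CONFIG, "claim-lifecycle-sketch") // hypothetical app ID
  props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")      // placeholder broker

  val builder = new StreamsBuilder()

  // Assumed input: claim events keyed by claim ID, with the status as a plain-string value.
  val claimEvents = builder.stream[String, String]("claim-events")

  // Count events per claim as a stand-in for lifecycle tracking, and emit
  // the running counts to a downstream topic (e.g. for a Snowflake sink).
  claimEvents
    .groupByKey
    .count()
    .toStream
    .to("claim-status-counts")

  val streams = new KafkaStreams(builder.build(), props)
  streams.start()
  sys.addShutdownHook(streams.close())
}
```

In a typical AWS setup, the output topic would then be drained into Snowflake, for example via the Snowflake Kafka connector or a stage-plus-load process like the one sketched next.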
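On the Snowflake side, bulk loading is commonly done with a COPY INTO statement from a stage. The sketch below issues such a statement over Snowflake's JDBC driver from Scala; the account URL, credentials, warehouse, database, schema, stage, and table names are all placeholders, not details from the posting.

```scala
import java.sql.DriverManager
import java.util.Properties

object SnowflakeLoadSketch extends App {
  val props = new Properties()
  props.put("user", "PIPELINE_USER")                   // placeholder user
  props.put("password", sys.env("SNOWFLAKE_PASSWORD")) // supply via environment
  props.put("warehouse", "LOAD_WH")                    // hypothetical warehouse
  props.put("db", "EVENT_INSIGHTS")                    // hypothetical database
  props.put("schema", "CLAIMS")                        // hypothetical schema

  // The account identifier is a placeholder; Snowflake JDBC URLs take the form
  // jdbc:snowflake://<account>.snowflakecomputing.com/
  val conn = DriverManager.getConnection(
    "jdbc:snowflake://myaccount.snowflakecomputing.com/", props)

  val stmt = conn.createStatement()
  try {
    // Bulk-load staged JSON files into an assumed single-VARIANT landing table.
    stmt.execute(
      """COPY INTO claim_events_raw
        |FROM @claim_events_stage
        |FILE_FORMAT = (TYPE = JSON)""".stripMargin)
  } finally {
    stmt.close()
    conn.close()
  }
}
```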