OpenKyber

Kafka Real Time Analytics Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Kafka Real Time Analytics Engineer in West Des Moines, IA, offering $41.00 - $44.00 hourly for a contingent assignment. Requires 2+ years of data engineering experience and proficiency in PySpark, Apache Airflow, and Google Cloud Platform.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
352
-
🗓️ - Date
March 17, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
West Des Moines, Iowa
-
🧠 - Skills detailed
#HDFS (Hadoop Distributed File System) #Scala #Data Modeling #Consulting #SQL (Structured Query Language) #Datasets #Database Systems #Airflow #Cloud #ETL (Extract, Transform, Load) #Batch #GCP (Google Cloud Platform) #Apache Airflow #PySpark #Data Management #Data Lifecycle #BigQuery #Compliance #Data Quality #Data Pipeline #Hadoop #Data Governance #Metadata #Dataflow #Automation #Kafka (Apache Kafka) #Data Processing #Spark (Apache Spark) #Delta Lake #Python #Data Engineering #Programming
Role description
Location: West Des Moines, IA
Salary: $41.00 USD Hourly - $44.00 USD Hourly
Description: Software Engineer - Data Engineering (Contingent Worker)

About the Role
In this contingent assignment, you'll contribute to data engineering initiatives that involve designing, building, and optimizing data pipelines and workflows. You'll support low- to moderately complex engineering efforts, perform analysis, identify opportunities for process improvement, and provide technical guidance to partner teams. This role requires exercising independent judgment while developing a strong understanding of organizational policies, procedures, and compliance expectations.

Key Responsibilities:
- Design, develop, and optimize ETL/ELT workflows and data pipelines for batch and real-time data processing.
- Build and maintain scalable pipelines supporting reporting, analytics, and downstream applications using open-source and cloud-native technologies.
- Implement and manage operational and analytical data stores using Delta Lake and modern data management concepts.
- Optimize data structures for performance, reliability, and scalability across large, distributed datasets.
- Partner with architects and engineering teams to ensure solutions align with target-state architecture and best practices.
- Apply data governance, lineage, and metadata best practices, including integration with Google Dataplex for centralized governance and data quality management.
- Develop, schedule, and orchestrate complex workflows using Apache Airflow; design, implement, and maintain robust Airflow DAGs.
- Troubleshoot and resolve pipeline issues to ensure high availability and operational excellence.

Required Qualifications:
- 2+ years of software engineering or data engineering experience, demonstrated through professional work, consulting, training, military service, or educational background.
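To give a feel for the batch ETL work described above, here is a purely illustrative sketch using only the Python standard library. Real pipelines in this role would use PySpark or Dataflow; the field names (`id`, `amount`) and the data-quality rule are hypothetical, not taken from the posting.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and drop rows that fail a quality check."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # basic data-quality gate: skip malformed rows
        out.append({"id": row["id"], "amount": round(amount, 2)})
    return out

def load(rows: list[dict]) -> str:
    """Load: serialize back to CSV (a stand-in for a warehouse write)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

raw = "id,amount\n1,10.5\n2,not-a-number\n3,7\n"
print(load(transform(extract(raw))))
```

The malformed second row is filtered out by the transform step, mirroring the data-quality responsibilities listed above at toy scale.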
Required Technical Skills:
- Data Foundations: Strong understanding of data structures, data modeling, and data lifecycle management.
- ETL/ELT Development: Hands-on experience designing, building, and maintaining data pipelines.
- PySpark: Advanced experience with distributed data processing and transformations.
- Open Table Formats: Experience implementing and managing Apache Iceberg or similar technologies.
- Hadoop Ecosystem: Knowledge of HDFS, Hive, and related big-data components.
- Cloud Technologies: Google Cloud Platform (BigQuery, Dataflow), Delta Lake, and Dataplex for governance and metadata management.
- Programming & Orchestration: Proficiency in Python, Spark, and SQL.
- Workflow Automation: Strong experience with Apache Airflow, including authoring and maintaining complex DAGs.
- Database Concepts: Strong understanding of relational, distributed, and analytical database systems.

By providing your phone number, you consent to receive automated text messages and calls from OpenKyber, Inc. and its affiliates at that number regarding job opportunities, your job application, and other related purposes. Message and data rates apply, and message frequency may vary. Reply STOP to opt out of calls and text messages from OpenKyber, or HELP for help.

Contact: This job and many more are available through OpenKyber. Please apply with us today! For applications and inquiries, contact: hirings@openkyber.com
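The workflow-automation skill above centers on Airflow DAGs, which at heart are dependency graphs of tasks. Without assuming Airflow is installed, the ordering idea can be shown with Python's stdlib `graphlib`; the task names below are hypothetical and stand in for Airflow's `upstream >> downstream` arrows.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks: each key runs after the tasks in its value set,
# mirroring Airflow dependency arrows like extract >> transform >> load.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "notify": {"load"},
}

# A topological sort yields a valid execution order, which is exactly
# what a scheduler needs before dispatching tasks.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Because this toy graph is a linear chain, the order is fully determined; in a real Airflow DAG, independent branches could run in parallel.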