Mondo

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer IV position for a 2-month remote contract starting November 3, 2025, with a pay rate of $50–$65/hr. Required skills include 5+ years of data engineering experience and expertise in Kafka, Java microservices, and MongoDB.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
520
🗓️ - Date
October 28, 2025
🕒 - Duration
1 to 3 months
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Cloud #Docker #Kubernetes #Java #Data Governance #Metadata #Scala #GCP (Google Cloud Platform) #Requirements Gathering #Data Management #Data Architecture #Agile #Data Design #Data Pipeline #Azure #MongoDB #Data Catalog #Documentation #Data Engineering #API (Application Programming Interface) #Python #Data Modeling #AWS (Amazon Web Services) #Microservices #Kafka (Apache Kafka)
Role description
Apply Now: Data Engineer IV – Remote (Salisbury, NC) for this 2-month contract position.

Job Title: Data Engineer IV
Location-Type: Remote (Salisbury, NC)
Start Date: November 3, 2025 (or ASAP)
Duration: 2-Month Contract (11/3/2025 – 12/27/2025)
Compensation Range: $50–$65/hr W2

Job Description:
We are seeking an experienced Data Engineer IV to design and develop scalable, high-performance data streaming applications. This short-term contract role focuses on Kafka, Java microservices, and MongoDB within a complex enterprise data ecosystem. The ideal candidate will act as a hands-on technical expert, designing data pipelines, resolving complex data architecture challenges, and supporting multi-team initiatives to enhance data flow and consumption.

Day-to-Day Responsibilities:
• Design and develop scalable streaming data applications using Kafka, Java microservices, and MongoDB.
• Solve complex application errors and ensure reliable performance across systems.
• Create and review requirement and functional design documents as part of the software development lifecycle (SDLC).
• Document all project phases, including requirements gathering, entity relationship diagrams, and technical specifications.
• Analyze information from multiple data sources to draw accurate, actionable conclusions.
• Serve as a subject matter expert (SME) on data engineering solutions, supporting both architecture and implementation.
• Collaborate cross-functionally with IT and business teams to ensure data pipelines meet operational and analytical needs.
• Drive innovation and optimization across data flow, integration, and data consumption processes.

Must Haves (Required Qualifications):
• 5+ years of hands-on experience as a Data Engineer or in a similar technical role.
• Expertise in Kafka, Java microservices, and MongoDB.
• Understanding of data pipeline architecture, ETL frameworks, and streaming data design.
• Proficiency in data modeling, API integration, and system documentation.
• Experience creating functional and technical design documents.
• Proven ability to analyze, troubleshoot, and optimize large-scale data systems.
• Excellent problem-solving and communication skills.

Nice to Haves (Preferred Skills):
• Experience with cloud-based data infrastructure (AWS, GCP, or Azure).
• Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
• Knowledge of Python or Scala for data transformation and analytics.
• Exposure to data governance, metadata management, or data catalog tools.
• Ability to mentor junior engineers and collaborate in agile teams.

Benefits:
• This role is eligible to enroll in both Mondo's health insurance plan and retirement plan. Mondo defers to the applicable State or local law for paid sick leave eligibility.
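For candidates gauging fit, the core pattern behind the role's stack (Kafka → Java microservice → MongoDB) is a consume-transform-upsert loop. The sketch below is purely illustrative and uses in-memory stand-ins for the Kafka topic and MongoDB collection so it runs anywhere; a real service would use `org.apache.kafka.clients.consumer.KafkaConsumer` and the MongoDB Java driver instead, and all names here (`StreamSketch`, `transform`, `upsert`) are hypothetical, not part of any actual codebase.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative consume-transform-upsert loop: the core shape of a
// Kafka -> Java microservice -> MongoDB streaming pipeline.
// In-memory stand-ins replace the real Kafka consumer and Mongo client.
public class StreamSketch {
    // Stand-in for a MongoDB collection keyed by _id (upsert semantics).
    static final Map<String, Map<String, Object>> collection = new LinkedHashMap<>();

    // Transform one raw event into a document (e.g., normalize the payload).
    static Map<String, Object> transform(String key, String rawValue) {
        Map<String, Object> doc = new HashMap<>();
        doc.put("_id", key);
        doc.put("payload", rawValue.trim().toUpperCase());
        return doc;
    }

    // Upsert: replace the document with the same _id, insert if absent --
    // mirrors MongoDB's replaceOne(filter, doc, upsert=true).
    static void upsert(Map<String, Object> doc) {
        collection.put((String) doc.get("_id"), doc);
    }

    public static void main(String[] args) {
        // Stand-in for a batch of records polled from a Kafka topic (key, value).
        List<String[]> polled = List.of(
            new String[]{"order-1", " created "},
            new String[]{"order-2", " created "},
            new String[]{"order-1", " shipped "}   // later event for the same key
        );
        for (String[] rec : polled) {
            upsert(transform(rec[0], rec[1]));
        }
        // order-1 was updated in place, so two documents remain.
        System.out.println(collection.size() + " docs; order-1="
            + collection.get("order-1").get("payload"));
    }
}
```

Keying the upsert on the Kafka record key is what makes reprocessing idempotent, which is why the pattern tolerates consumer restarts and replays.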