FUSTIS LLC

Senior Data Engineer (Onsite Interview)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 8+ years of experience, focusing on Python, Apache Airflow, dbt, Kubernetes, and OpenShift. It pays $70-$75/hr on a C2C/1099 basis and requires onsite work (3 days/week) in Jersey City, NJ.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
600
🗓️ - Date
February 3, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
1099 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Jersey City, NJ
🧠 - Skills detailed
#Security #AutoScaling #Documentation #Deployment #Oracle #Data Warehouse #Apache Airflow #Data Architecture #ETL (Extract, Transform, Load) #dbt (data build tool) #Data Pipeline #Data Processing #Scala #Data Engineering #Batch #Data Modeling #Datasets #GIT #Migration #SQL (Structured Query Language) #Automation #Data Quality #Observability #Kubernetes #Python #Airflow #Monitoring #Macros #Cloud
Role description
Job Role: Senior Data Engineer
Location: Onsite 3 days/week in Jersey City, NJ (185 Hudson St #1150, Jersey City, NJ 07311)
Mode of Interview: Onsite
Pay Rate: $70-$75/hr on C2C/1099

Must have:
• Python
• Apache Airflow / dbt
• Strong written and verbal communication
• Kubernetes
• OpenShift
• 8+ years of experience

Job Description
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift). This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads. The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.

Key Responsibilities:

Data Pipeline & Orchestration
• Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines (see the DAG sketch below)
• Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
• Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads

dbt Core & Data Modeling
• Lead dbt Core implementation, including project structure, environments, and CI/CD integration
• Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
• Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
• Optimize dbt query performance for large-scale datasets and downstream reporting needs

Cloud, Kubernetes & OpenShift
• Deploy and manage data workloads on Kubernetes / OpenShift platforms (see the job-submission sketch at the end)
• Design strategies for workload distribution, horizontal scaling, and resource optimization
• Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
• Troubleshoot container-level performance issues and resource contention

Performance & Reliability
• Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
• Identify bottlenecks in query execution, orchestration, and infrastructure
• Implement observability solutions (logs, metrics, alerts) for proactive issue detection
• Ensure high availability, fault tolerance, and resiliency of data pipelines

Collaboration & Governance
• Work closely with data architects, platform engineers, and business stakeholders
• Support financial reporting, accounting, and regulatory data use cases
• Enforce data engineering standards, security best practices, and governance policies

Required Skills & Qualifications:

Experience
• 8+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
• Proven experience designing and supporting enterprise-scale data platforms in production environments

Must-Have Technical Skills
• Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
• Expert-level dbt Core (data modeling, testing, macros, implementation)
• Strong proficiency in Python for data engineering and automation
• Deep understanding of Kubernetes and/or OpenShift in production environments
• Extensive experience with distributed workload management and performance optimization
• Strong SQL skills for complex transformations and analytics
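For candidates gauging the orchestration expectations above, here is a minimal, illustrative Airflow 2.x DAG skeleton showing the retry, SLA, and concurrency settings the responsibilities call out. The DAG name, schedule, callables, and thresholds are all hypothetical, not taken from the employer's codebase:

```python
# Illustrative Airflow 2.x DAG skeleton: retries, SLA, and run-concurrency
# controls of the kind this role describes. All names/values are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-engineering",
    "retries": 3,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "retry_exponential_backoff": True,      # back off between attempts
    "sla": timedelta(hours=2),              # flag runs that exceed the SLA
}

def extract_batch(**context):
    """Placeholder: real logic would pull the day's batch from a source system."""

def transform_batch(**context):
    """Placeholder: real logic would hand off to dbt or in-pipeline transforms."""

with DAG(
    dag_id="finance_batch_pipeline",        # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
    max_active_runs=1,                      # avoid overlapping batch runs
) as dag:
    extract = PythonOperator(task_id="extract_batch", python_callable=extract_batch)
    transform = PythonOperator(task_id="transform_batch", python_callable=transform_batch)
    extract >> transform                    # explicit dependency management
```

In a production setup, an on_failure_callback or SLA-miss callback would typically route alerts to the team's paging or chat tooling; that wiring is omitted here.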
Cloud & Platform Experience
• Experience running data platforms on cloud environments
• Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows

Preferred Qualifications
• Experience supporting financial services or accounting platforms
• Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
• Experience with data warehouses (Oracle)
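As a companion to the Kubernetes/OpenShift responsibilities above, below is a hedged sketch using the official kubernetes Python client to submit a one-off data job with explicit CPU/memory requests and limits. The image, namespace, command, and resource values are placeholders for illustration only:

```python
# Illustrative submission of a containerized data job with explicit resource
# requests/limits, via the official kubernetes Python client. Values are
# placeholders, not the employer's actual configuration.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

resources = client.V1ResourceRequirements(
    requests={"cpu": "500m", "memory": "1Gi"},   # what the scheduler reserves
    limits={"cpu": "2", "memory": "4Gi"},        # hard caps to contain the pod
)

container = client.V1Container(
    name="dbt-runner",                            # hypothetical worker container
    image="registry.example.com/dbt-runner:latest",
    command=["dbt", "run", "--select", "marts"],
    resources=resources,
)

job = client.V1Job(
    metadata=client.V1ObjectMeta(name="dbt-nightly-run"),
    spec=client.V1JobSpec(
        backoff_limit=2,                          # pod-level retries
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(containers=[container], restart_policy="Never")
        ),
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="data-platform", body=job)
```

Setting requests close to observed usage and limits with modest headroom is what lets the scheduler pack data workloads onto nodes without throttling or OOM kills, which is the kind of tuning the "resource contention" responsibility refers to.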