Tech Interacts Inc

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focusing on Apache Airflow, dbt Core, and Kubernetes. Contract length is unspecified; the pay rate is $60.00 per hour, the work is on-site in Jersey City, NJ, and experience with financial services or accounting platforms is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
March 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Jersey City, NJ 07302
-
🧠 - Skills detailed
#Deployment #Oracle #Documentation #Git #Migration #Data Pipeline #Data Processing #Cloud #Scala #Data Warehouse #ETL (Extract, Transform, Load) #Data Modeling #dbt (data build tool) #Macros #Monitoring #SQL (Structured Query Language) #Datasets #Automation #Data Architecture #Airflow #Batch #Apache Airflow #Data Quality #Kubernetes #AutoScaling #Security #Data Engineering #Observability #Python
Role description
Job Summary:
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift). This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads. The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.

Key Responsibilities:

Data Pipeline & Orchestration
• Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines (see the illustrative sketch after this description)
• Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
• Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads

dbt Core & Data Modeling
• Lead dbt Core implementation, including project structure, environments, and CI/CD integration
• Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
• Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
• Optimize dbt query performance for large-scale datasets and downstream reporting needs

Cloud, Kubernetes & OpenShift
• Deploy and manage data workloads on Kubernetes / OpenShift platforms
• Design strategies for workload distribution, horizontal scaling, and resource optimization
• Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
• Troubleshoot container-level performance issues and resource contention

Performance & Reliability
• Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
• Identify bottlenecks in query execution, orchestration, and infrastructure
• Implement observability solutions (logs, metrics, alerts) for proactive issue detection
• Ensure high availability, fault tolerance, and resiliency of data pipelines

Collaboration & Governance
• Work closely with data architects, platform engineers, and business stakeholders
• Support financial reporting, accounting, and regulatory data use cases
• Enforce data engineering standards, security best practices, and governance policies

Required Skills & Qualifications:

Experience
• 10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
• Proven experience designing and supporting enterprise-scale data platforms in production environments

Must-Have Technical Skills
• Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
• Expert-level dbt Core (data modeling, testing, macros, implementation)
• Strong proficiency in Python for data engineering and automation
• Deep understanding of Kubernetes and/or OpenShift in production environments
• Extensive experience with distributed workload management and performance optimization
• Strong SQL skills for complex transformations and analytics

Cloud & Platform Experience
• Experience running data platforms on cloud environments
• Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows

Preferred Qualifications
• Experience supporting financial services or accounting platforms
• Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
• Experience with data warehouses (Oracle)
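For illustration only, a minimal sketch of the kind of batch Airflow DAG this role would own, covering the retry and SLA settings mentioned above. The dag_id, schedule, callable, and retry/SLA values are placeholder assumptions, not part of this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task-level defaults illustrating the retry and SLA practices listed above
default_args = {
    "owner": "data-engineering",
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),
}


def extract_and_load(**context):
    """Placeholder for the actual extract/transform/load logic."""
    pass


with DAG(
    dag_id="example_batch_pipeline",   # hypothetical name
    schedule="0 2 * * *",              # nightly batch run (Airflow 2.4+ syntax)
    start_date=datetime(2026, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["example"],
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```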
Pay: $60.00 per hour
Work Location: In person