

IT America Inc
Sr. Data Engineer - W2 Contract
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer on a W2 contract in Jersey City, NJ (hybrid, 3 days in-office). Requires 10+ years in data engineering and expertise in Kubernetes, Airflow, and DBT in on-prem environments. Pay rate: Unknown.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#GIT #Migration #Python #Deployment #Data Modeling #Data Pipeline #Scala #Oracle #ETL (Extract, Transform, Load) #Kubernetes #dbt (data build tool) #Data Processing #Cloud #Macros #Automation #Apache Airflow #Data Warehouse #SQL (Structured Query Language) #Airflow #Data Engineering
Role description
Role: Senior Data Engineer - Airflow, DBT Core, Kubernetes/OpenShift
Location: Jersey City, NJ, Hybrid 3 days a week in the office
Open to permanent hires / visa-independent consultants
Must-have skills: Kubernetes operations, Airflow design and implementation, and hands-on DBT model development in an on-prem OpenShift environment.
MUST HAVE SKILLS
• 10+ years of professional experience in data engineering, designing and supporting enterprise-scale data platforms in production environments
• Hands-on experience with DBT and Apache Airflow deployed on Kubernetes, specifically within an on-prem OpenShift environment
• Close interaction with infrastructure, including Kubernetes operations, Airflow design and implementation, and hands-on DBT model development in an on-prem setup
• Deep, practical experience with dbt and Airflow in Kubernetes-based, on-prem environments
Job Summary
We are seeking a highly skilled Senior Data Engineer with 8-10+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift). This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads. The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Required Skills & Qualifications:
• 10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
• Proven experience designing and supporting enterprise-scale data platforms in production environments
• Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
• Expert-level DBT Core (data modeling, testing, macros, implementation)
• Strong proficiency in Python for data engineering and automation
• Deep understanding of Kubernetes and/or OpenShift in production environments
• Extensive experience with distributed workload management and performance optimization
• Strong SQL skills for complex transformations and analytics
• Experience running data platforms on cloud environments
• Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
• Experience supporting financial services or accounting platforms is a plus
• Exposure to enterprise system migrations (e.g., legacy platform to modern data stack) is a plus
• Experience with data warehouses (Oracle) is a plus