

Cipher7
Senior Data Engineer - Databricks
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience in Python, Databricks, and AWS, offering a W2 contract for 5 days/week onsite in New Jersey. Local candidates only; expertise in ETL processes and data architecture is required.
Country
United States
-
Currency
$ USD
-
Day rate
440
-
Date
January 9, 2026
-
Duration
Unknown
-
Location
On-site
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
New Jersey, United States
-
Skills detailed
#Data Warehouse #AWS CloudWatch #AWS Glue #Security #Data Architecture #Data Modeling #Monitoring #Version Control #Databricks #Data Science #Data Lake #AWS (Amazon Web Services) #Git #Spark (Apache Spark) #Datasets #Data Governance #Data Quality #PySpark #Compliance #Cloud #Python #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Scala #Data Ingestion #Data Pipeline #Data Analysis #ETL (Extract, Transform, Load) #Data Engineering
Role description
Job Title: Senior Data Engineer - Python / Databricks / AWS
Location: New Jersey, USA / Boston
Work Type: W2 Contract, 5 days/week, Onsite
Experience: 10+ Years
Candidate Requirements: Local candidates only (no relocation assistance)
Job Description:
We are seeking a highly experienced Senior Data Engineer to design, develop, and maintain scalable data pipelines and ETL processes. The ideal candidate will have 10+ years of hands-on experience with Python, Databricks, and AWS Glue, and a strong background in data architecture, data lakes, and data warehouses. This is an onsite W2 contract role based in New Jersey, USA.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and ETL workflows using Python, Databricks, and AWS Glue (a minimal illustrative sketch follows this list).
• Build and optimize data warehouses and data lakes leveraging PySpark and AWS cloud platforms.
• Implement data ingestion, transformation, and orchestration workflows using AWS Step Functions, Lambda, and Glue Jobs.
• Monitor, troubleshoot, and optimize pipeline performance using AWS CloudWatch and other monitoring tools.
• Collaborate with data analysts, data scientists, and business stakeholders to define data requirements and deliver reliable datasets.
• Ensure data quality, governance, and security across all systems and workflows.
• Automate repetitive data engineering tasks and contribute to building reusable frameworks and templates.
• Support continuous improvement by adopting best practices for CI/CD, version control, and infrastructure-as-code (IaC).
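For context only, here is a minimal sketch of the kind of PySpark ETL job these responsibilities describe, assuming a Databricks (or other Spark-with-Delta) runtime; the bucket path and table name are hypothetical placeholders, not details from this posting:

# Minimal ETL sketch, assuming a Databricks/Spark runtime with Delta Lake available.
# The S3 path and table name below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw JSON files landed in the data lake (hypothetical path).
raw_df = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: basic de-duplication, null filtering, and typing as stand-ins
# for real business rules.
clean_df = (
    raw_df
    .dropDuplicates(["order_id"])
    .filter(F.col("order_total").isNotNull())
    .withColumn("order_date", F.to_date("order_date"))
)

# Load: write a Delta table that downstream analysts and data scientists can query.
clean_df.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")

In practice this logic would typically be packaged as a Databricks or AWS Glue job, orchestrated with Step Functions and Lambda, and monitored through CloudWatch, as the responsibilities above describe.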
Qualifications:
• 10+ years of experience in Data Engineering or related roles.
• Strong expertise in Python, PySpark, Databricks, and AWS services (Glue, Lambda, Step Functions, CloudWatch).
• Proven experience building and managing data warehouses and data lakes.
• Solid understanding of ETL pipelines, data modeling, and workflow orchestration.
• Familiarity with CI/CD, version control (Git), and infrastructure-as-code tools.
• Excellent problem-solving skills and ability to collaborate with both technical and non-technical stakeholders.
Preferred Skills:
• Knowledge of data governance, security best practices, and compliance standards.
• Experience in monitoring and optimizing large-scale data pipelines.
• Strong communication skills for interacting with business and technical teams.
Candidate Notes:
• Local profiles only: candidates must already reside in New Jersey or nearby; no relocation assistance will be provided.
• Work schedule: 5 days/week onsite at the NJ location.






