Jobs via Dice

Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer in Roseland, NJ, offering a 12-month contract on a W2 basis. It requires 8+ years of experience; expertise in AWS, Databricks, Python, and PySpark; and strong leadership and data integration skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 20, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Roseland, NJ
-
🧠 - Skills detailed
#Databricks #Scala #ETL (Extract, Transform, Load) #PySpark #Data Analysis #Computer Science #Leadership #Python #Terraform #Data Science #GitHub #Jenkins #Compliance #Monitoring #Security #Data Engineering #Data Quality #CRM (Customer Relationship Management) #SQL (Structured Query Language) #Splunk #RDBMS (Relational Database Management System) #Metadata #Replication #Datasets #MongoDB #Spark (Apache Spark) #Data Integration #SQL Server #Data Lake #Informatica #Dynatrace #Data Catalog #Oracle #Batch #NoSQL #Agile #Cloud #Data Pipeline #Jira #AWS (Amazon Web Services) #Informatica PowerCenter
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Black Rock Group, is seeking the following. Apply via Dice today!

Title: Lead Data Engineer
Location: Roseland, NJ (Hybrid, 3 days onsite)
Duration: 12-Month Contract
Employment Type: W2
Relocation: Accepted

Job Description
We are seeking an experienced Lead Data Engineer specializing in AWS and Databricks to drive the design, development, and delivery of a scalable data hub/marketplace supporting internal analytics, data science, and downstream applications. This role focuses on building robust data integration workflows, enterprise data models, and curated datasets to support complex business needs. You will collaborate across engineering, analytics, and business teams to define integration rules, data acquisition strategies, data quality standards, and metadata best practices. This position requires strong leadership, hands-on technical depth, and the ability to communicate effectively with technical and non-technical stakeholders.

Must-Have Skills
• AWS
• Databricks
• Python
• PySpark
• Lead/Staff-level engineering experience
• Contact Center experience (nice to have)

Key Responsibilities
• Lead the architecture, design, and delivery of an enterprise data hub/marketplace.
• Build and optimize data integration workflows, ingestion pipelines, and subscription-based services.
• Develop and maintain enterprise data models for data lakes, warehouses, and analytics environments.
• Define integration rules and data acquisition methods (batch, streaming, replication).
• Conduct detailed data analysis to validate source systems and support use-case requirements.
• Establish data quality standards, monitoring practices, and governance alignment.
• Maintain enterprise data taxonomy, lineage, and catalog metadata.
• Mentor junior developers and collaborate closely with architects and peer engineering teams.
• Communicate clearly with business and technical stakeholders.
Required Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 8+ years of experience integrating and transforming data into standardized, consumption-ready datasets.
• Strong expertise with AWS, Databricks, Python, PySpark, and SQL.
• Advanced knowledge of cloud-based data platforms and warehouse technologies.
• Strong experience with RDBMS (Oracle, SQL Server).
• Familiarity with NoSQL (MongoDB).
• Experience designing scalable data pipelines for structured and unstructured data.
• Strong understanding of data quality, governance, compliance, and security.
• Experience building ingestion pipelines and lakehouse-style architectures.
• Ability to define and design complex data engineering solutions with minimal guidance.
• Excellent communication, analytical, and problem-solving skills.

Nice-to-Have Skills
• Knowledge of contact center technologies: Salesforce, ServiceNow, Oracle CRM, Genesys Cloud/InfoMart, Calabrio, Nuance, IBM Chatbot, etc.
• Experience with GitHub, JIRA, Confluence.
• CI/CD experience with Jenkins, Terraform, Splunk, Dynatrace.
• Knowledge of Informatica PowerCenter, Data Quality, Data Catalog.
• Experience with Agile methodology.
• Databricks Data Engineer Associate certification.