

Jobs via Dice
Lead Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer in Atlanta, GA (Hybrid) on a 3 to 6-month contract. Requires 11+ years of experience, expertise in Databricks, Azure Data Lake Storage, data pipeline development, and strong analytical skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 29, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Databricks #Data Lakehouse #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Framework #Azure Databricks #Data Pipeline #Scala #Data Engineering #Data Science #ML (Machine Learning) #Azure ADLS (Azure Data Lake Storage) #Storage #SSRS (SQL Server Reporting Services) #Visualization #Data Lake #SQL Server #Data Ingestion #Azure #Data Mart #Monitoring #Debugging #ADLS (Azure Data Lake Storage) #Data Quality #Tableau #Data Mining #Cloud #Data Integrity
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Tharu Technologies, is seeking the following. Apply via Dice today!
Role: Lead Data Engineer
Location: Atlanta, GA (Hybrid)
Job Type: 3 to 6 months, contract-to-hire
Tax Term: W2
Exp: 11+ Years
Job Responsibilities
Design and implement scalable data frameworks to manage end-to-end data pipelines for workforce data analytics.
Develop secure, high-quality production code and data pipelines on Databricks, reviewing and debugging processes implemented by others.
Identify opportunities to eliminate or automate remediation of recurring issues to improve operational stability of software applications and systems.
Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
Work with business stakeholders to understand requirements, design appropriate solutions, produce architecture and design artifacts for complex applications, and implement a suitable ER model with the right fact and dimension tables.
Implement robust monitoring and alerting systems to proactively identify and address data ingestion issues, optimizing performance and throughput.
Implement data quality checks and validation processes to ensure accuracy and reliability of data.
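As a rough illustration of the data quality checks and validation described above, the sketch below applies row-level rules and routes failing records to a rejection path. This is a simplified, hypothetical example in plain Python; in a Databricks pipeline these rules would more typically be expressed as PySpark column expressions or Delta Live Tables expectations, and all field names (employee_id, salary, hire_date) are assumptions, not from the role description.

```python
# Minimal sketch of row-level data quality checks; field names are
# illustrative. In practice these would run as PySpark expressions or
# Delta Live Tables expectations rather than plain-Python functions.

from datetime import date


def validate_record(record: dict) -> list[str]:
    """Return a list of data quality violations for one record."""
    errors = []
    if not record.get("employee_id"):
        errors.append("missing employee_id")
    salary = record.get("salary")
    if salary is not None and salary < 0:
        errors.append("negative salary")
    hire_date = record.get("hire_date")
    if hire_date is not None and hire_date > date.today():
        errors.append("hire_date in the future")
    return errors


def partition_by_quality(records):
    """Split records into (clean, rejected) for loading vs. manual review."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            clean.append(rec)
    return clean, rejected


records = [
    {"employee_id": "E1", "salary": 90000, "hire_date": date(2020, 1, 6)},
    {"employee_id": "", "salary": -5, "hire_date": date(2021, 3, 1)},
]
clean, rejected = partition_by_quality(records)
```

Keeping rejected rows together with the list of rules they violated makes the "alerting" half of the responsibility straightforward: the rejection path can feed a monitoring table or notification job.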
Cloud Functions
Design and implement end-to-end data solutions using Databricks pipelines, Azure Databricks, and Azure Data Lake Storage (ADLS).
Develop, monitor, and maintain scalable data pipelines to support analytics and data science projects.
Understand and perform data integrity and data quality checks. Research Databricks test suites and integrate them into the data ecosystem.
Utilize Azure Databricks to perform complex data transformation, aggregation, and machine learning tasks.
Assist with designing the architecture and workflow for an Enterprise Data Lakehouse in Databricks for teams that wish to build their own Data Marts.
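The lakehouse-to-data-mart flow the items above describe can be sketched as a bronze → silver → gold progression. The example below is a deliberately simplified, in-memory stand-in: in an actual Azure Databricks lakehouse each stage would be a PySpark transformation over Delta tables on ADLS, and every table and column name here is hypothetical.

```python
# Simplified sketch of a bronze -> silver -> gold (data mart) flow,
# using in-memory lists in place of Delta tables on ADLS. Column names
# are illustrative, not taken from any real schema.

from collections import defaultdict

# "Bronze": raw ingested rows, possibly dirty or inconsistently cased.
bronze = [
    {"dept": "Retail", "hours": "38"},
    {"dept": "retail", "hours": "40"},
    {"dept": "Commercial", "hours": None},  # bad row, dropped in silver
]


def to_silver(rows):
    """Cleanse and standardize: drop bad rows, normalize types and casing."""
    out = []
    for r in rows:
        if r["hours"] is None:
            continue
        out.append({"dept": r["dept"].title(), "hours": int(r["hours"])})
    return out


def to_gold(rows):
    """Aggregate the silver table into a per-department data mart."""
    totals = defaultdict(int)
    for r in rows:
        totals[r["dept"]] += r["hours"]
    return dict(totals)


mart = to_gold(to_silver(bronze))
```

Separating cleansing (silver) from aggregation (gold) is what lets downstream teams build their own data marts: they consume the curated silver layer and apply their own aggregations without re-implementing ingestion or validation.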
Required Knowledge, Skills, & Abilities:
Understanding of reporting and/or visualization tools (e.g., Tableau, SSRS, SQL Server)
Knowledge of a variety of technologies, data models, and insights across all relevant data sources
Understanding of Governance principles
Understands concepts such as data mining, extraction, and analysis as they pertain to a specific bank pillar (e.g., Commercial, Retail, Wealth)
Analytical and critical thinking skills
Ability to quickly shift between technology stacks
Ability to mentor and train team members
Strong verbal and written communication skills






