Enterprise Data Lake Tech Lead - Dallas, TX (Hybrid) - 12 Months Contract - Direct Client
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Enterprise Data Lake Tech Lead in Dallas, TX (Hybrid) on a 12-month contract. Key requirements include 5+ years with Apache Kafka, experience with Databricks, enterprise RDBMS platforms, and cloud platforms (Azure/AWS), and technical leadership. Strong SQL and programming skills are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 22, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#IAM (Identity and Access Management) #Metadata #Storage #Indexing #Alation #AWS S3 (Amazon Simple Storage Service) #MySQL #Data Management #Monitoring #ETL (Extract, Transform, Load) #Leadership #SQL (Structured Query Language) #SSIS (SQL Server Integration Services) #Delta Lake #Cloud #S3 (Amazon Simple Storage Service) #Complex Queries #Database Migration #Azure #Vault #Data Engineering #Kafka (Apache Kafka) #Azure SQL #Data Governance #Apache Spark #Data Migration #Database Systems #ADF (Azure Data Factory) #AWS DMS (AWS Database Migration Service) #ADLS (Azure Data Lake Storage) #SQL Server #Databricks #AWS (Amazon Web Services) #Shell Scripting #Schema Design #Terraform #Collibra #Security #Scripting #Kubernetes #SSRS (SQL Server Reporting Services) #RDS (Amazon Relational Database Service) #Apache Kafka #Bash #Programming #Replication #Data Catalog #Aurora #DMS (Data Migration Service) #Python #Automation #DMP (Data Management Platform) #Compliance #C# #Data Lake #Migration #Azure ADLS (Azure Data Lake Storage) #RDBMS (Relational Database Management System) #Database Design #Azure Data Factory #Classification #Deployment #Spark (Apache Spark) #Clustering #Data Quality #PostgreSQL #Normalization #Data Extraction
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Accion Labs, is seeking the following. Apply via Dice today!
Required Qualifications
Technical Expertise
Data Streaming & Processing
• 5+ years with Apache Kafka (streaming architecture, Kafka Connect, Schema Registry, stream processing)
• 3+ years with Databricks (Delta Lake, Apache Spark optimization, Unity Catalog, cluster management)
• Deep understanding of data lake architecture patterns (Bronze/Silver/Gold, medallion architecture)
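The Bronze/Silver/Gold (medallion) pattern mentioned above can be sketched in plain Python; this is an illustration only, using lists of dicts in place of Delta Lake tables, and all record and layer names are made up:

```python
# Illustrative medallion-architecture sketch: raw (Bronze) records are
# validated into Silver, then aggregated into a business-level Gold view.

# Bronze: raw events as ingested, including a malformed row.
bronze = [
    {"order_id": "1", "amount": "19.99", "region": "US"},
    {"order_id": "2", "amount": "oops", "region": "US"},   # bad record
    {"order_id": "3", "amount": "5.00", "region": "EU"},
]

def to_silver(rows):
    """Silver layer: validated, typed records; bad rows are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "region": r["region"]})
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a quarantine table
    return out

def to_gold(rows):
    """Gold layer: business aggregate (revenue per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'US': 19.99, 'EU': 5.0}
```

In Databricks the same progression would typically be three Delta tables with the Silver step enforcing schema and data-quality constraints.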
Data Governance & Cataloging
• 3+ years with DataHub or similar metadata management platforms (Alation, Collibra, Apache Atlas)
• Deep experience building and operating enterprise data catalog systems
• Expertise in automated metadata extraction, lineage tracking, and impact analysis
• Experience with data quality frameworks and metadata-driven data operations
• Knowledge of data governance policies, data classification, and compliance automation
• Understanding of data discovery, access control, and self-service analytics enablement
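Lineage tracking and impact analysis, as listed above, reduce at their core to a graph traversal: given a dataset, find everything downstream of it. A minimal sketch follows; dataset names are invented, and a real catalog (DataHub, Collibra, Apache Atlas) models this with far richer metadata:

```python
# Illustrative impact analysis as a breadth-first walk over a lineage graph.
from collections import deque

# Edges: dataset -> datasets derived from it (names are hypothetical).
lineage = {
    "raw.orders":    ["silver.orders"],
    "silver.orders": ["gold.revenue", "gold.churn"],
    "gold.revenue":  ["dashboard.exec"],
}

def impacted(dataset):
    """Return every downstream dataset affected by a change to `dataset`."""
    seen, queue = set(), deque(lineage.get(dataset, []))
    while queue:
        d = queue.popleft()
        if d not in seen:
            seen.add(d)
            queue.extend(lineage.get(d, []))
    return seen

print(sorted(impacted("raw.orders")))
# ['dashboard.exec', 'gold.churn', 'gold.revenue', 'silver.orders']
```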
Relational Database Systems
• 5+ years with enterprise RDBMS platforms including:
• SQL Server (T-SQL, SSIS, SSRS, replication, Always On Availability Groups)
• PostgreSQL (advanced query optimization, partitioning, extensions, streaming replication)
• MySQL (replication, clustering, performance tuning)
• Strong SQL skills (complex queries, stored procedures, window functions, query optimization)
• Database design principles (normalization, indexing strategies, schema design, partitioning)
• Change Data Capture (CDC) patterns and implementation (Debezium, Azure Data Factory, AWS DMS, custom solutions)
• Database migration experience (schema migration, data migration, zero-downtime migrations)
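Window functions, one of the SQL skills above, can be demonstrated with the stdlib `sqlite3` driver (requires SQLite 3.25+ for window-function support); the table and column names here are invented for illustration:

```python
# Hedged sketch: ranking rows within a partition and computing a running
# partition total, something GROUP BY alone cannot express without
# collapsing the detail rows.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("US", 10.0), ("US", 30.0), ("EU", 20.0)])

rows = con.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
# ('EU', 20.0, 1, 20.0)
# ('US', 30.0, 1, 40.0)
# ('US', 10.0, 2, 40.0)
```

The same `RANK() OVER (PARTITION BY ...)` syntax carries over to SQL Server, PostgreSQL, and MySQL 8+.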
Cloud & Infrastructure
• 5+ years with Azure or AWS Cloud including:
• Azure: Data Lake Storage (Gen2), Event Hubs, AKS, Azure SQL, Key Vault, Azure AD, Monitor
• AWS: S3, MSK/Kinesis, EKS, RDS/Aurora, Secrets Manager, IAM, CloudWatch
• Cloud-native data services, networking, security, and IAM
• 3+ years with Kubernetes (deployment strategies, scaling, monitoring, service mesh, Helm)
• 3+ years with Terraform (modules, state management, multi-environment deployments, multi-cloud)
Programming & Development
• Strong proficiency in at least one programming language (e.g., C#, Go)
• Expert-level SQL across multiple database platforms
• Experience with Python for data engineering tasks (preferred)
• Familiarity with Shell scripting (Bash)
Leadership & Experience
• 7+ years in data engineering, platform engineering, or database engineering roles
• 3+ years in technical leadership capacity (Tech Lead, Principal Engineer)
• Proven track record of delivering large-scale data infrastructure projects
• Experience leading teams of 5-10+ engineers
• Strong architectural design and system thinking capabilities
• Experience migrating legacy RDBMS workloads to modern data lake architectures
• Demonstrated ability to balance technical excellence with business needs