

Realign LLC
Lead Data Engineer (Locals Only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer in Dallas, TX, on a W2 contract basis. It requires 10+ years of data engineering experience, 3+ years with Snowflake and Databricks, strong Python skills, and expertise in data pipelines and automation.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Data Science #Azure #Data Warehouse #Jenkins #BI (Business Intelligence) #Spark (Apache Spark) #Scala #Agile #Python #Scripting #Semantic Models #Automation #Bash #GitLab #Snowflake #Cloud #BitBucket #Azure DevOps #ETL (Extract, Transform, Load) #DevOps #Libraries #Databricks #Deployment #Documentation #Data Quality #GitHub #PySpark #Monitoring #Data Ingestion #Logging #Data Engineering #Data Pipeline #Security #Data Integration #SQL (Structured Query Language) #Computer Science #Data Modeling
Role description
Job Type: Contract
Job Category: IT
Role: Lead Data Engineer
Location: Dallas, TX (Locals Only)
W2 Only
Job Description:
This role focuses on building and maintaining robust, scalable, and automated data pipelines, and plays a key part in optimizing our data infrastructure and enabling efficient data delivery across the organization. As the organization consolidates its data in Snowflake, this role will be instrumental in building data pipelines and ensuring data quality.
Job Duties:
Design, build, and maintain scalable and resilient pipelines for data applications and infrastructure, with a focus on Snowflake.
Implement and manage Snowflake projects for data transformation, including developing tests and documentation and integrating them into CI/CD workflows.
Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.
Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.
Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.
Develop and maintain robust automation scripts and tools, primarily using Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required.
Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.
Document system architectures, configurations, and operational procedures.
Optimize data pipelines for performance, scalability, and cost.
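As a hypothetical sketch of the automation the duties describe, the snippet below shows a Python data-quality gate with logging for a pipeline stage. All names are illustrative; a production version would load the validated rows into Snowflake via its connector rather than return them.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def validate_rows(rows, required_fields):
    """Partition rows into (good, bad): a row is bad if any
    required field is missing or None. Bad rows are quarantined
    and logged rather than loaded downstream."""
    good, bad = [], []
    for row in rows:
        if all(row.get(f) is not None for f in required_fields):
            good.append(row)
        else:
            bad.append(row)
    if bad:
        log.warning("quarantined %d of %d rows", len(bad), len(rows))
    return good, bad
```

A check like this typically runs as a step in the CI/CD-managed pipeline, failing the run or routing rejects to a quarantine table when the bad-row count crosses a threshold.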
Minimum Qualifications / Other Expectations:
Education & Experience:
Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent experience.
10+ years of hands-on experience in Data Engineering
3+ years of experience specifically focused on Snowflake and Databricks
Other:
Highly experienced in automation, with strong problem-solving skills and proficiency in cloud technologies.
Ability to collaborate effectively with data engineers, analysts, and other stakeholders to ensure the reliability and performance of our data ecosystem.
Hands-on experience managing and optimizing Snowflake data warehouse environments.
Demonstrable experience with data modeling techniques for ODS, dimensional modeling (Facts, Dimensions), and semantic models for analytics and BI.
Familiarity with Spark, PySpark, and Databricks.
Strong proficiency in Python for automation, scripting, and data-related tasks. Experience with relevant Python libraries is a plus.
Solid understanding of CI/CD principles and tools (e.g., Bitbucket Runners, Jenkins, GitLab CI, GitHub Actions, Azure DevOps).
Knowledge of data integration tools and ETL/ELT concepts.
Familiarity with monitoring and logging tools.
Strong SQL skills.
Ability to work independently and as part of a collaborative team in an agile environment.
Strong communication skills, with the ability to explain complex technical concepts clearly.
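To illustrate the dimensional-modeling expectation above, here is a minimal, hypothetical Python sketch of deriving a dimension with surrogate keys and fact rows that reference it (all names and fields are illustrative, not from the posting):

```python
def build_dimension(records, natural_key):
    """Assign an integer surrogate key to each distinct
    natural-key value seen in the source records."""
    dim = {}
    for rec in records:
        nk = rec[natural_key]
        if nk not in dim:
            dim[nk] = len(dim) + 1  # next surrogate key
    return dim

def build_facts(records, natural_key, measure, dim):
    """Emit fact rows that carry the measure and reference the
    dimension by surrogate key instead of the natural key."""
    return [{"dim_sk": dim[r[natural_key]], measure: r[measure]}
            for r in records]
```

In a Snowflake warehouse the same pattern would typically be expressed in SQL or a transformation framework, with the dimension persisted as its own table and the fact table joining to it on the surrogate key.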