

Lead Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer in Dallas, TX; the contract length and pay rate are unspecified. Key skills include Snowflake, dbt, SQL, and ETL processes, and the role requires 2+ years of relevant experience and a Bachelor's degree in a related field.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 25, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Dallas, TX
Skills detailed: #Azure #Data Governance #Version Control #AWS (Amazon Web Services) #dbt (data build tool) #Datasets #Data Quality #Airflow #SQL Queries #Data Pipeline #Scala #Deployment #Data Analysis #Microsoft SQL Server #Luigi #Statistics #Microsoft SQL #SQL (Structured Query Language) #GIT #Data Orchestration #Data Warehouse #ETL (Extract, Transform, Load) #Security #Cloud #MS SQL (Microsoft SQL Server) #Programming #Data Science #SQL Server #Data Engineering #Computer Science #Snowflake
Role description
Title: Lead Data Engineer
Location: Dallas, TX
About the Role:
We are seeking a Data Engineer to join our team and leverage their expertise in Snowflake, dbt, and data transformation tools to build and maintain scalable data pipelines. You will be responsible for extracting, transforming, and loading (ETL) data from Microsoft SQL Server (MSSQL) into our Snowflake data warehouse, ensuring clean, reliable data for analysis and reporting.
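As a rough illustration of the kind of dbt work involved, a staging model in this stack might look like the following minimal sketch. It assumes the MSSQL data has already been landed in Snowflake and registered as a dbt source; the source name (mssql_raw), table (orders), and column names are hypothetical placeholders, not details from this role.

-- models/staging/stg_mssql__orders.sql
-- Minimal sketch: standardize a hypothetical MSSQL-sourced table that has
-- already been landed in Snowflake and declared as a dbt source.
with source as (

    select * from {{ source('mssql_raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_date as date)           as order_date,
        cast(order_total as number(12, 2)) as order_total_usd,
        upper(trim(order_status))          as order_status
    from source

)

select * from renamed

Downstream dbt models would then build analysis-ready datasets on top of staging models like this one.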
Responsibilities:
• Design, develop, and maintain data pipelines using Snowflake, dbt, or Coalesce.
• Write and optimize SQL queries within dbt for efficient data transformation in Snowflake.
• Utilize Coalesce or similar data transformation tools to migrate data from MSSQL to Snowflake.
• Cleanse, validate, and transform data to ensure accuracy and consistency.
• Design and implement data models in dbt to transform raw data into analysis-ready datasets.
• Automate data pipelines and ensure their reliability and scalability.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and translate them into technical solutions.
• Develop and implement unit tests for data pipelines and data models (see the test sketch after this list).
• Monitor and troubleshoot data pipelines for any issues.
• Document data pipelines and models for future reference.
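As an illustration of the testing responsibility above, a dbt singular test (a SELECT that must return zero rows to pass) might look like this minimal sketch. It assumes the hypothetical stg_mssql__orders model from the earlier example and a simple data quality rule that order totals must be positive.

-- tests/assert_orders_have_positive_totals.sql
-- Singular dbt test: any rows returned are reported as failures by `dbt test`.
select
    order_id,
    order_total_usd
from {{ ref('stg_mssql__orders') }}
where order_total_usd <= 0

In practice, built-in generic tests such as not_null and unique declared in a schema .yml file cover most column-level checks; singular tests like this handle bespoke rules.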
Qualifications:
• Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
• 2+ years of experience as a Data Engineer or in a similar role.
• Proven experience with Snowflake, including writing SQL queries and leveraging its functionality.
• Expertise in dbt, including model development, testing, and deployment.
• Experience with data transformation tools such as dbt, Coalesce, or Hevo Data.
• Strong understanding of data warehousing concepts and best practices.
• Experience with ETL/ELT processes.
• Excellent SQL programming skills (T-SQL and Snowflake SQL).
• Experience with version control systems (e.g., Git).
• Familiarity with data quality concepts and techniques.
• Excellent communication and collaboration skills.
Bonus Points:
• Experience with cloud platforms (e.g., AWS, Azure).
• Experience with data orchestration tools (e.g., Airflow, Luigi).
• Experience with data governance and security practices.