Headway Tek Inc

Sr. Snowflake Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a "Sr. Snowflake Data Engineer" in Dallas, TX, for 1 year at a competitive pay rate. Requires 12+ years in data engineering, 3+ years with Snowflake, and expertise in ETL tools, cloud platforms, and data migration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 25, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Java #Data Integration #Data Integrity #Data Modeling #Computer Science #Documentation #Spark (Apache Spark) #Data Security #DevOps #Data Warehouse #SQL (Structured Query Language) #Compliance #Data Engineering #AWS (Amazon Web Services) #Airflow #GCP (Google Cloud Platform) #Informatica #ML (Machine Learning) #Apache Airflow #Microsoft Power BI #Programming #Cloud #Snowflake #Data Science #dbt (data build tool) #Looker #Data Pipeline #ETL (Extract, Transform, Load) #Tableau #Kafka (Apache Kafka) #Scala #Data Processing #Migration #BI (Business Intelligence) #Microservices #Azure #Python #Data Migration #Security #Talend
Role description
Job Title: Sr. Snowflake Data Engineer
Location: Dallas, TX (Locals Only)
Duration: 1 Year

This role is for an expert Snowflake Data Engineer with 12+ years in software and data engineering, including 3-5 years focused on building and optimizing data microservices and delivering end-to-end solutions. You will design, implement, and deploy scalable Snowflake architectures, collaborating with business and technology teams to ensure efficient, high-quality data delivery from start to finish.

Job Requirements:
• Design, develop, and optimize complex data pipelines and ETL processes using Snowflake and complementary cloud technologies (see the sketches after this description for illustrative examples).
• Architect and implement scalable data warehouse solutions to support business requirements and analytics needs.
• Lead data migration and modernization projects from legacy platforms to Snowflake.
• Develop and enforce best practices for Snowflake architecture, security, performance tuning, and cost optimization.
• Mentor and guide junior and mid-level engineers, fostering a culture of technical excellence and continuous learning.
• Collaborate closely with data scientists, analysts, and business stakeholders to understand requirements and deliver robust solutions.
• Develop and maintain documentation, data models, and data dictionaries for Snowflake environments.
• Monitor, troubleshoot, and resolve issues related to data integrity, performance, and reliability.
• Evaluate and integrate new data tools and technologies to enhance the data engineering ecosystem.
• Ensure compliance with company policies and industry standards regarding data security and governance.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
• 12+ years of experience in data engineering or a related field, with demonstrable expertise in building and maintaining large-scale data systems.
• 3+ years of hands-on experience with Snowflake, including data modeling, performance tuning, and advanced SQL.
• Strong experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Apache Airflow) and data integration techniques.
• Proficiency in programming languages such as Python, Scala, or Java for data processing.
• Deep understanding of cloud data platforms (AWS, Azure, or GCP) and their integration with Snowflake.
• Proven track record of leading complex data migration and modernization projects.
• Excellent analytical, problem-solving, and communication skills.
• Experience with data security best practices, access controls, and compliance requirements.
• Familiarity with CI/CD pipelines and DevOps practices in the context of data engineering.

Preferred Skills:
• Snowflake certification(s) strongly preferred.
• Experience with real-time data processing frameworks (e.g., Kafka, Spark Streaming).
• Knowledge of BI/reporting tools (e.g., Tableau, Power BI, Looker).
• Exposure to machine learning workflows and data science collaboration.
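To make the pipeline and ELT responsibilities above concrete, here is a minimal sketch of an incremental Snowflake load using the snowflake-connector-python package. The warehouse, database, table, and column names are hypothetical placeholders (not details from this posting), and credentials are assumed to come from the environment.

```python
# Minimal ELT sketch: merge staged rows into a target table in Snowflake.
# All identifiers and credentials below are illustrative placeholders.
import os

import snowflake.connector

# Connection parameters are read from the environment; the warehouse,
# database, and schema names are assumptions for this sketch.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="CORE",
)

# An upsert (MERGE) is the usual pattern for incremental loads:
# new keys are inserted, existing keys are updated in place.
MERGE_SQL = """
MERGE INTO dim_orders AS tgt
USING stg_orders AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.status     = src.status,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (src.order_id, src.status, src.updated_at)
"""

try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```

Keeping the MERGE inside Snowflake (ELT rather than client-side ETL) minimizes data movement, which is the usual lever for the performance-tuning and cost-optimization goals listed above.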
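The qualifications also name Apache Airflow among the orchestration tools. A hedged sketch of a daily DAG wrapping the load step above might look like the following; the DAG id, schedule, and task names are assumptions rather than details from the posting.

```python
# Illustrative Airflow 2.x DAG that schedules a daily Snowflake load.
# DAG id, schedule, and task names are placeholders for this sketch.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_orders() -> None:
    """Placeholder for the MERGE-based load shown in the previous sketch."""
    ...  # e.g., open a Snowflake connection and execute the MERGE


with DAG(
    dag_id="daily_snowflake_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # the `schedule` argument is Airflow 2.4+ syntax
    catchup=False,      # skip backfilling past runs
) as dag:
    load_task = PythonOperator(
        task_id="merge_stg_orders_into_dim_orders",
        python_callable=load_orders,
    )
```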
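For the preferred real-time skills (Kafka, Spark Streaming), a minimal PySpark Structured Streaming sketch is shown below. The broker address, topic name, and output paths are placeholders, and the job assumes the spark-sql-kafka connector package is available on the Spark classpath.

```python
# Illustrative streaming ingest: read events from a Kafka topic and
# land them as Parquet files for downstream loading into Snowflake.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Subscribe to a hypothetical "orders" topic; broker address is a placeholder.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

# Kafka delivers values as bytes; cast to string before writing out.
query = (
    events.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream.format("parquet")
    .option("path", "/tmp/orders")                 # landing location (placeholder)
    .option("checkpointLocation", "/tmp/orders_ckpt")  # required for exactly-once progress tracking
    .start()
)
query.awaitTermination()
```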