

CloudHive
Snowflake Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Data Engineer on a contract basis in Austin, TX, requiring 5+ years of data engineering experience, strong Snowflake and AWS skills, and proficiency in SQL. Hybrid work involves 2 days on-site weekly.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
October 7, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, Texas Metropolitan Area
-
🧠 - Skills detailed
#Data Governance #AWS (Amazon Web Services) #Data Modeling #SQL (Structured Query Language) #dbt (data build tool) #Lambda (AWS Lambda) #DevOps #Security #Data Pipeline #Data Science #Cloud #Airflow #Snowflake #PostgreSQL #Compliance #ETL (Extract, Transform, Load) #Data Warehouse #Redshift #S3 (Amazon Simple Storage Service) #Data Engineering #Scala #Agile #Data Orchestration #Data Quality
Role description
Job Title: Snowflake Data Engineer
Location: Austin, TX (Hybrid – 2 Days On-site per Week)
Type: Contract
About the Role
We are seeking several highly skilled Snowflake Data Engineers with extensive experience in Snowflake, AWS, and SQL to join our data engineering team. This role will play a key part in designing, developing, and optimizing our cloud-based data infrastructure to support analytics, reporting, and data-driven decision-making across the organization.
Responsibilities
• Design, build, and maintain scalable data pipelines and ETL processes using Snowflake and AWS services (a minimal load sketch follows this list).
• Optimize performance and ensure data quality across our Snowflake data warehouse.
• Work with stakeholders to understand data requirements and translate them into technical solutions.
• Collaborate with cross-functional teams including Data Science, Analytics, and DevOps.
• Ensure data governance, security, and compliance standards are upheld.
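As a rough illustration of the pipeline work described above, the sketch below loads raw JSON files from an S3 bucket into Snowflake through an external stage. Every object name here (stage, bucket, storage integration, table) is a hypothetical placeholder rather than a detail of this engagement.

```sql
-- Minimal S3-to-Snowflake load sketch; all names are hypothetical.
-- Assumes a storage integration to the bucket has already been configured.
CREATE STAGE IF NOT EXISTS raw_events_stage
  URL = 's3://example-bucket/events/'
  STORAGE_INTEGRATION = example_s3_integration
  FILE_FORMAT = (TYPE = JSON);

CREATE TABLE IF NOT EXISTS raw_events (
  event_payload VARIANT
);

-- Snowflake tracks files it has already loaded, so rerunning this COPY
-- only picks up new files in the stage.
COPY INTO raw_events
FROM @raw_events_stage
ON_ERROR = 'SKIP_FILE';
```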
Required Qualifications
• 5+ years of professional experience in Data Engineering.
• Strong hands-on experience with Snowflake – architecture, performance tuning, data modeling (see the tuning sketch after this list).
• Deep understanding of AWS services related to data engineering (e.g., S3, Lambda, Glue, Redshift, etc.).
• Proficient in SQL and working knowledge of PostgreSQL.
• Experience designing and optimizing complex data pipelines.
• Strong problem-solving and communication skills.
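As a concrete, hedged example of the Snowflake performance-tuning experience listed above, the statements below add a clustering key to a large fact table and check how well that key prunes micro-partitions. The table and column names are illustrative only.

```sql
-- Hypothetical fact table; clustering on the date column helps Snowflake
-- prune micro-partitions for common date-range queries.
ALTER TABLE fact_orders CLUSTER BY (order_date);

-- Inspect clustering health; a high average depth suggests the chosen key
-- is not pruning effectively and may need revisiting.
SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders', '(order_date)');
```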
Nice to Have
• Experience with CI/CD for data pipelines.
• Familiarity with data orchestration tools (e.g., Airflow, dbt); a small dbt-style model sketch follows this list.
• Background in data warehousing and analytics.
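Because dbt models are plain SQL with Jinja references, a small, hypothetical staging model gives a feel for the orchestration tooling mentioned above. The model, source, and column names are placeholders and not taken from this posting.

```sql
-- models/stg_orders.sql (hypothetical dbt staging model)
-- Keeps only the most recently loaded record per order_id.
SELECT
    order_id,
    customer_id,
    order_date,
    total_amount
FROM {{ ref('raw_orders') }}
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY order_id
    ORDER BY _loaded_at DESC
) = 1
```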
Work Environment
• Hybrid role: 2 days per week on-site in Austin, TX, remainder remote.
• Collaborative and agile team environment.
• Opportunity to work with modern data stack technologies.