

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract-to-hire, offering $50-$60 per hour. Requires 7+ years of experience, 3+ years with Snowflake, strong SQL and Python skills, and Snowflake SnowPro Core Certification. Remote work available.
Country: United States
Currency: $ USD
Day rate: 480
Date discovered: September 4, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Austin, Texas Metropolitan Area
Skills detailed: #Streamlit #Data Quality #Python #Snowpark #SQL (Structured Query Language) #Azure DevOps #Alation #Fivetran #GitHub #Data Management #Informatica #ML (Machine Learning) #Scala #DevOps #Monitoring #Collibra #Boomi #SnowPipe #Azure #Metadata #Snowflake #Talend #Data Engineering #Data Lake #dbt (data build tool) #Leadership #Cloud #ETL (Extract, Transform, Load) #Data Governance #Data Pipeline #Terraform
Role description
Data Engineer | 6-Month Contract-to-Hire | $50-$60 p/h
• No C2C enquiries
• We're looking for an experienced Snowflake Data Engineer to design, build, and manage enterprise-scale data solutions in a modern cloud environment. You'll play a key role in integrating Azure Data Lake, Snowflake Data Cloud, the Boomi Integration Platform, and data governance tools into a scalable, well-structured ecosystem that delivers high-quality, governed, and accessible data across the organization.
What You'll Do:
• Architect and optimize Snowflake data pipelines, models, and transformations.
• Implement Snowpipe, Snowpark, and Streamlit for real-time ingestion, advanced analytics, and data applications.
• Design data structures following bronze, silver, and gold layer principles (see the Snowpark sketch after this list).
• Lead integration and orchestration of data flows using Boomi (or similar tools such as Informatica, MuleSoft, Talend, or Fivetran).
• Partner with governance teams to operationalize cataloging, lineage, and stewardship within platforms like OvalEdge, Collibra, or Alation.
• Define and enforce standards for metadata management, access controls, and data quality monitoring.
• Mentor a team of data engineers, drive best practices, and collaborate with analysts, data scientists, and business stakeholders.
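For illustration only, here is a minimal Snowpark sketch of the bronze-to-silver transformation pattern described above. All names (connection settings, the ANALYTICS database, the RAW_ORDERS and ORDERS tables, and column names) are hypothetical placeholders, not details from this role.

```python
from snowflake.snowpark import Session
import snowflake.snowpark.functions as F

# Connection parameters are placeholders; in practice they would come from a
# secrets manager or environment variables rather than being hard-coded.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "role": "DATA_ENGINEER",
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS",
    "schema": "SILVER",
}).create()

# Read raw events landed in the bronze layer (table and columns are hypothetical).
bronze = session.table("ANALYTICS.BRONZE.RAW_ORDERS")

# Silver layer: deduplicate, standardize types, and drop obviously bad rows.
silver = (
    bronze
    .drop_duplicates(["ORDER_ID"])
    .filter(F.col("ORDER_TOTAL") >= 0)
    .with_column("ORDER_DATE", F.to_date(F.col("ORDER_TS")))
    .select("ORDER_ID", "CUSTOMER_ID", "ORDER_DATE", "ORDER_TOTAL")
)

# Materialize the cleaned table; overwrite keeps this example idempotent.
silver.write.mode("overwrite").save_as_table("ANALYTICS.SILVER.ORDERS")
```

The same pattern extends to gold-layer aggregates, which would typically be built from the silver tables in a separate schema.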
What We're Looking For:
• 7+ years of data engineering experience, including 3+ years leading Snowflake-focused projects.
• Deep expertise in Snowflake (Snowpipe, Snowpark, Streamlit); a minimal Streamlit example follows this list.
• Strong knowledge of Azure Data Lake and cloud-native architectures.
• Hands-on experience with Boomi or similar integration/orchestration platforms.
• Familiarity with data governance platforms (OvalEdge, Collibra, Alation, Informatica).
• Strong SQL and Python skills; proven track record of building scalable pipelines.
• Leadership experience with the ability to set technical standards and mentor teams.
• Snowflake SnowPro Core Certification required.
• Boomi certification (or the ability to obtain it within 90 days, with training support provided).
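As a rough sketch of the Streamlit-on-Snowflake pattern referenced above, a small app can read a gold-layer aggregate through the active Snowpark session and chart it. The ANALYTICS.GOLD.ORDERS table and its columns are hypothetical, used only to illustrate the shape of such an app.

```python
import streamlit as st
from snowflake.snowpark.context import get_active_session

# Inside Streamlit in Snowflake, the Snowpark session is provided by the platform.
session = get_active_session()

st.title("Daily Order Volume")

# Query a gold-layer aggregate (table and column names are hypothetical).
daily = session.sql(
    """
    SELECT ORDER_DATE, COUNT(*) AS ORDER_COUNT
    FROM ANALYTICS.GOLD.ORDERS
    GROUP BY ORDER_DATE
    ORDER BY ORDER_DATE
    """
).to_pandas()

# Streamlit renders the chart directly from the pandas DataFrame.
st.line_chart(daily, x="ORDER_DATE", y="ORDER_COUNT")
```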
Nice to Have:
• Experience with CI/CD and DevOps practices (GitHub Actions, Azure DevOps, Terraform, dbt, Coalesce, Rivery).
• Knowledge of machine learning workflows and enabling analytics with Snowpark.
• Background in healthcare, financial services, or other regulated industries.
• Snowflake SnowPro Advanced: Data Engineer certification.