

ideaHelix
Senior Data Engineer (Snowflake, ETL), Local to Bay Area
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Snowflake, ETL) with a contract length of "unknown" and a pay rate of "$XX/hour". It requires 5+ years of data engineering experience and expertise in Snowflake, ETL tools, SQL, and cloud platforms.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
December 5, 2025
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
San Francisco Bay Area
Skills detailed
#Snowflake #Microsoft Power BI #Visualization #GIT #Hadoop #Data Storage #Data Governance #Metadata #Big Data #Kafka (Apache Kafka) #Spark (Apache Spark) #Cloud #Azure #SQL (Structured Query Language) #Storage #Data Integrity #Scripting #Data Management #AWS (Amazon Web Services) #Talend #Security #Tableau #BI (Business Intelligence) #Data Quality #Data Orchestration #Data Pipeline #Data Processing #dbt (data build tool) #Python #Version Control #ETL (Extract, Transform, Load) #Database Management #Computer Science #Informatica #Data Engineering
Role description
Key Responsibilities
• Develop and optimize cloud-based data storage and processing solutions using Snowflake.
• Design, implement, and maintain robust data pipelines and ETL processes.
• Collaborate with federated teams and other data engineers to understand data requirements and deliver high-quality data solutions.
• Ensure data integrity and security across all data workflows and storage solutions.
• Monitor and troubleshoot data pipelines, addressing issues promptly to ensure reliable, accurate data flow.
• Develop reusable and modular stored procedures and scripts for data processing.
• Contribute to the development and implementation of data governance best practices, including data quality and metadata management.
Minimum Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
β’ Minimum of 5 years of experience in data engineering or a related role.
β’ Proven experience with Snowflake is required.
β’ Knowledge of data warehousing concepts, dimensional modeling, and performance tuning.
β’ Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, or custom ETL frameworks).
β’ Strong proficiency in SQL and database management.
β’ Experience with cloud platforms such as AWS, Azure, or Google Cloud.
β’ Familiarity with version control (Git) and CI/CD for data pipelines.
β’ Familiarity with Big Data technologies such as Hadoop, Spark, and Kafka is a plus.
β’ Excellent problem-solving and analytical skills.
β’ Strong communication and collaboration abilities.
Preferred Skills
β’ Experience with Python or other scripting languages.
• Experience with Informatica is a plus but not required.
β’ Knowledge of data visualization tools such as Tableau or Power BI.
• Exposure to data orchestration tools.






