

Snowflake Developer with Strong ETL/SQL
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Snowflake Developer with strong ETL/SQL skills, located onsite in Lake Mary, FL. Contract length and pay rate are unspecified. Requires 8+ years in SQL, 5+ years with Snowflake, and expertise in data warehousing and performance tuning.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 14, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Lake Mary, FL
🧠 - Skills detailed
#Scala #SQL Queries #Data Architecture #Data Integrity #Cloud #Deployment #Agile #Data Quality #Data Pipeline #Data Warehouse #SnowSQL #Migration #Snowflake #Data Extraction #Documentation #GIT #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Python #Spark (Apache Spark) #Business Analysis #Data Engineering #Data Migration #Version Control #Security
Role description
Snowflake Developer with strong ETL/SQL
Location: Lake Mary, FL (Onsite)
Key Responsibilities:
• Design and develop scalable data pipelines and ETL processes using Snowflake.
• Write optimized, advanced SQL queries for data extraction, transformation, and loading (ETL).
• Develop, implement, and maintain Snowflake database objects including schemas, tables, views, and stored procedures.
• Work closely with business analysts, data architects, and data engineers to gather requirements and translate them into technical solutions.
• Ensure performance tuning, query optimization, and best practices in Snowflake development.
• Support data migration from legacy platforms to Snowflake.
• Create and maintain technical documentation related to data pipelines, ETL workflows, and Snowflake models.
• Perform data quality checks and ensure data integrity across systems.
• Collaborate with cross-functional teams for project delivery in an agile environment.
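The data quality checks mentioned above can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the role: it uses the standard-library sqlite3 module as a stand-in for a warehouse connection (the Snowflake Python connector exposes a similar DB-API cursor interface), and the table and column names (`stg_customers`, `customer_id`) are hypothetical.

```python
import sqlite3

def run_quality_checks(conn, table, key_col):
    """Compute basic post-load integrity metrics for `table`.

    `table` and `key_col` are illustrative placeholders; against
    Snowflake the same queries would run through the connector's cursor.
    """
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_col} IS NULL")
    null_keys = cur.fetchone()[0]
    # Duplicates among non-NULL keys: total non-NULL minus distinct values.
    cur.execute(f"SELECT COUNT({key_col}) - COUNT(DISTINCT {key_col}) FROM {table}")
    dup_keys = cur.fetchone()[0]
    return {"row_count": row_count, "null_keys": null_keys, "duplicate_keys": dup_keys}

# In-memory stand-in for a staging table loaded by an ETL job.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (customer_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO stg_customers VALUES (?, ?)",
    [(1, "Ada"), (2, "Grace"), (2, "Grace"), (None, "Unknown")],
)

metrics = run_quality_checks(conn, "stg_customers", "customer_id")
print(metrics)  # row_count=4, null_keys=1, duplicate_keys=1
```

In practice such checks would run as a gate after each load, failing the pipeline when null or duplicate key counts exceed a threshold.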
Required Skills and Qualifications:
• 8+ years of experience in SQL development, ETL design, and implementation.
• 5+ years of hands-on experience working with Snowflake Data Warehouse.
• Strong knowledge of Snowflake architecture, SnowSQL, and Snowflake security frameworks.
• Proficient in developing ETL pipelines using tools or custom Python/Spark scripts.
• Strong understanding of data warehousing concepts and dimensional modeling.
• Experience with at least one major cloud platform (e.g., AWS, Azure, or GCP).
• Expertise in performance tuning and query optimization.
• Knowledge of Git, CI/CD pipelines, and version control for data pipeline deployments.