

Data Snowflake Developer/Engineer - W2 Role
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Snowflake Developer/Engineer in Tallahassee, FL, for 12+ months at a competitive pay rate. Requires a Bachelor's degree, 8+ years in data engineering, 2+ years with Snowflake, and expert SQL skills.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 12, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Tallahassee, FL
Skills detailed
#Fivetran #AWS S3 (Amazon Simple Storage Service) #Storage #Data Governance #Talend #Snowflake #Programming #Data Warehouse #Computer Science #Oracle #Azure #Data Migration #Agile #Data Architecture #GitHub #Informatica #Migration #SQL (Structured Query Language) #Git #Looker #Airflow #Tableau #AWS (Amazon Web Services) #BI (Business Intelligence) #MS SQL (Microsoft SQL Server) #SSIS (SQL Server Integration Services) #Scrum #Data Engineering #Semantic Models #Visualization #ETL (Extract, Transform, Load) #Microsoft Power BI #dbt (data build tool) #Scala #Data Pipeline #Security #Snowpipe #DevOps #SQL Server #Cloud
Role description
Position: Data Snowflake Developer/Engineer
Location: Tallahassee, FL (on-site); local or nearby candidates only.
Duration: 12+ Months
Education
Bachelor's degree in a field of study related to Information Technology, Computer Science, Data Analytics, or Finance. Work experience may substitute for the degree on a year-for-year basis.
Experience
Candidate must have a minimum of 8 years of experience in data engineering, analytics, or cloud data warehousing, with at least 2 years of hands-on experience designing and implementing solutions using the Snowflake Data Cloud platform.
Primary Job Duties/Tasks
The candidate must be able to perform the following duties and tasks, which will include, but are not limited to:
• Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine the required transformations and the optimal migration strategy into Snowflake.
• Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture in Snowflake.
• Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by translating it into Snowflake SQL and optimizing its performance.
• Develop and automate ELT/ETL data pipelines using Snowflake's native features (Snowpipe, Streams, Tasks), tools such as Informatica, and external orchestration tools (e.g., dbt, Airflow).
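To give a concrete flavor of the last duty, the following Snowflake SQL sketches a Snowpipe + Stream + Task ingestion pipeline. All object, stage, integration, and warehouse names here are hypothetical, chosen only for illustration.

```sql
-- Hypothetical external stage; assumes a storage integration named s3_int exists.
CREATE STAGE raw_stage
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int;

CREATE TABLE raw_orders (payload VARIANT, loaded_at TIMESTAMP_NTZ);

-- Snowpipe continuously loads files as they arrive in the stage.
CREATE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders (payload, loaded_at)
  FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @raw_stage)
  FILE_FORMAT = (TYPE = JSON);

-- A stream tracks newly loaded rows; a scheduled task moves them downstream.
CREATE STREAM raw_orders_stream ON TABLE raw_orders;

CREATE TASK load_orders_clean
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO orders_clean
  SELECT payload:order_id::NUMBER,
         payload:amount::NUMBER(12,2),
         loaded_at
  FROM raw_orders_stream;

ALTER TASK load_orders_clean RESUME;  -- tasks are created suspended
```

In practice the task body is often a MERGE rather than a plain INSERT, so reprocessed files do not create duplicates.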
Job Specific Knowledge, Skills, and Abilities (KSAs)
Expert-level SQL programming is REQUIRED for this position.
• Proven experience with Snowflake platform architecture and data warehousing concepts.
• Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
• Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
• Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
• Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
• Familiarity with dimensional modeling (star/snowflake schema), OLAP concepts, and reporting layers for BI tools.
• Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
• Strong understanding of current data governance concepts and best practices.
• Knowledge of data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
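As a minimal sketch of the governance items above, a Snowflake masking policy combined with role-based grants might look like the following. Role, table, and column names are hypothetical.

```sql
-- Column-level masking: only a privileged role sees the full value.
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE hr.employees MODIFY COLUMN ssn
  SET MASKING POLICY ssn_mask;

-- RBAC: grant the analyst role only what it needs to query reporting views.
GRANT USAGE ON DATABASE analytics TO ROLE analyst;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst;
GRANT SELECT ON ALL VIEWS IN SCHEMA analytics.reporting TO ROLE analyst;
```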
Preferred KSAs:
• Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend.
• Experience working with financial, ERP, or general ledger data in a reporting or analytics capacity.
• Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
• Familiarity with Agile/Scrum frameworks and experience working in iterative development cycles.
• Experience with Oracle Data Warehouse.
• Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions).