

Sr Lead Snowflake Data Engineer (In-Person Interview in Tallahassee, FL)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr Lead Snowflake Data Engineer in Tallahassee, FL, lasting 2 years at a competitive pay rate. It requires a Bachelor’s degree, 3 years of data engineering experience, and 2 years of hands-on Snowflake experience. Expertise in SQL, ELT/ETL, and data migration is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 12, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Tallahassee, FL
🧠 - Skills detailed
#Fivetran #SnowPipe #AWS S3 (Amazon Simple Storage Service) #Leadership #Storage #Talend #Snowflake #Programming #Data Warehouse #Computer Science #Oracle #Azure #Data Migration #Agile #Data Architecture #GitHub #Informatica #Migration #SQL (Structured Query Language) #GIT #Looker #Documentation #Airflow #Tableau #AWS (Amazon Web Services) #BI (Business Intelligence) #MS SQL (Microsoft SQL Server) #Datasets #Scrum #Data Engineering #SSIS (SQL Server Integration Services) #Clustering #ETL (Extract, Transform, Load) #Microsoft Power BI #dbt (data build tool) #Scala #Security #Monitoring #DevOps #SQL Server #Cloud #S3 (Amazon Simple Storage Service)
Role description
Candidates must attend an in-person interview in Tallahassee, FL.
Job Title: Snowflake Data Engineer
Location: Tallahassee, FL
Duration: 2 Years
The selected candidate will:
• Analyze current data environments, pipelines, and legacy structures to determine transformation needs and optimal migration strategies to Snowflake.
• Collaborate with stakeholders and data architects to design scalable, secure, and cost-effective Snowflake data architectures.
• Re-engineer legacy reporting logic (WebFOCUS, Mainframe FOCUS, T-SQL) into optimized Snowflake SQL.
• Develop and automate ELT/ETL pipelines using Snowflake native features (Snowpipe, Streams, Tasks) and tools such as Informatica, dbt, and Airflow (see the sketch after this list).
• Build reusable, secure data models and views in Snowflake for BI tools (Power BI, Tableau, Looker).
• Implement Snowflake performance optimization and governance best practices (security roles, RBAC, masking policies, cost monitoring, clustering).
• Support knowledge transfer, documentation, and training for internal teams.
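For orientation, here is a minimal sketch of the Streams-and-Tasks pattern referenced above. All object names (raw_orders, orders_clean, wh_etl) are hypothetical, and the real pipeline design would depend on the source systems involved.

```sql
-- Hypothetical objects: raw_orders (landing table), orders_clean (curated table), wh_etl (warehouse).
-- A stream captures row-level changes on the landing table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- A scheduled task runs only when the stream has data, merging changes downstream.
CREATE OR REPLACE TASK load_orders_clean
  WAREHOUSE = wh_etl
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  MERGE INTO orders_clean AS tgt
  USING raw_orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount, tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
    VALUES (src.order_id, src.amount, src.updated_at);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_orders_clean RESUME;
```

In practice, Snowpipe would typically handle the continuous load from cloud storage into the landing table, with Streams and Tasks handling incremental transformation from there.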
Minimum Qualifications
• Bachelor’s degree in Information Technology, Computer Science, Data Analytics, Finance, or related field (or equivalent work experience).
• Minimum 3 years in data engineering, analytics, or cloud data warehousing.
• Minimum 2 years hands-on experience designing and implementing solutions using the Snowflake Data Cloud platform.
Required Technical Skills
• Expert-level SQL programming.
• Proven experience with Snowflake platform architecture and cloud data warehousing concepts.
• Expertise in building efficient, secure, and scalable Snowflake data models (views, materialized views, secure shares).
• Strong knowledge of ELT/ETL patterns and tools (dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
• Solid understanding of Snowflake governance: security roles, masking policies, RBAC (see the example after this list).
• Experience with cloud storage integration (AWS S3, Azure Blob) and Snowflake external tables.
• Dimensional modeling expertise (Star/Snowflake Schema), OLAP concepts, BI reporting layer design.
• Data migration from legacy systems (mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
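As a concrete reference for the governance items above, a minimal sketch combining RBAC grants with a column masking policy. The role, database, schema, table, and column names (analyst_ro, edw.finance.employees, ssn) are all hypothetical.

```sql
-- Hypothetical read-only role with the usual usage/select grant chain.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE ON DATABASE edw TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA edw.finance TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA edw.finance TO ROLE analyst_ro;

-- Column masking: a privileged role sees the raw value, everyone else a redacted form.
CREATE OR REPLACE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('FINANCE_ADMIN') THEN val
    ELSE '***-**-' || RIGHT(val, 4)
  END;

ALTER TABLE edw.finance.employees
  MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;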
Preferred Skills
• Experience with BI tools (Power BI, Tableau, Looker) and semantic model building using Snowflake.
• Experience with financial, ERP, or general ledger datasets.
• Mainframe systems and flat file integration with cloud platforms (see the loading sketch after this list).
• Familiarity with Agile/Scrum methodologies.
• Oracle Data Warehouse experience.
• DevOps/CI/CD practices in data engineering (Git, dbt Cloud, GitHub Actions).
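For the flat-file integration item above, a minimal sketch of staging and bulk-loading pipe-delimited mainframe extracts from cloud storage. The bucket, file format, stage, and table names are hypothetical, and a storage integration (s3_int) is assumed to exist already.

```sql
-- Hypothetical file format for pipe-delimited mainframe extracts.
CREATE OR REPLACE FILE FORMAT ff_pipe_flat
  TYPE = CSV
  FIELD_DELIMITER = '|'
  SKIP_HEADER = 1
  NULL_IF = ('', 'NULL');

-- External stage over the extract landing area (s3_int assumed to exist).
CREATE OR REPLACE STAGE stg_mainframe_extracts
  URL = 's3://example-bucket/mainframe/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = ff_pipe_flat;

-- Bulk-load matching files into a landing table; fail fast on bad records.
COPY INTO raw.gl_transactions
  FROM @stg_mainframe_extracts
  PATTERN = '.*gl_.*[.]dat'
  ON_ERROR = 'ABORT_STATEMENT';
```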
General Competencies
• Communication: Clear written and verbal communication for technical and non-technical audiences.
• Customer Service: Strong client focus and problem-resolution skills.
• Decision Making: Objective, well-informed, and timely decision-making.
• Flexibility: Adaptable to changing priorities and feedback.
• Interpersonal Skills: Professional, respectful, and collaborative.
• Leadership: Motivates and guides teams effectively.
• Problem Solving: Analytical and creative in developing solutions.
• Team Building: Fosters teamwork and shared goals.