Eliassen Group

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "Unknown" and a pay rate of $70.00 to $90.00/hr. (W2). It requires strong Snowflake, Python, and SQL skills, with experience in Azure Data Factory migration and Agile environments. Work is hybrid, based in Burbank, CA or Orlando, FL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
April 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Burbank, CA or Orlando, FL
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #Snowflake #REST (Representational State Transfer) #Debugging #REST API #Azure #Leadership #Agile #Data Pipeline #ETL (Extract, Transform, Load) #Azure Data Factory #Data Engineering #Scala #Security #Data Governance #Data Security #Storage #Data Storage #Migration #Snowpark #AI (Artificial Intelligence) #Documentation #Scrum #SQL (Structured Query Language) #Python
Role description
Hybrid: 3-4 days onsite in either Burbank, CA or Orlando, FL

Our client seeks senior data engineers to build and refactor data pipelines supporting enterprise data collection, transformation, and delivery. The role centers on Snowflake and Snowpark with advanced Python and SQL to implement performant, secure solutions from defined requirements. The position also advances AI-assisted development practices using tools such as Cursor and Microsoft Copilot to accelerate delivery and improve code quality. Work occurs in an Agile environment with a focus on clear execution, migration from Azure Data Factory, and alignment to established standards.

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, 401(k) with company matching, and life insurance.

Rate: $70.00 to $90.00/hr. (W2)

Responsibilities
• Build and refactor data pipelines for enterprise data collection, transformation, and delivery.
• Implement and support Snowflake infrastructure for data storage, processing, and retrieval.
• Execute migration tasks converting Azure Data Factory pipelines to Snowflake Snowpark solutions.
• Join and transform data from multiple source systems for reporting, dashboards, KPIs, and analytics.
• Interpret clearly defined requirements and implementation instructions from leads and architects.
• Focus on execution and delivery, identifying issues or risks and escalating them to leadership as needed.
• Manage assigned tasks and deliverables aligned to timelines and priorities.
• Use advanced tools and coding techniques to implement Snowflake Snowpark pipelines per provided designs and standards.
• Develop reliable, performant Snowflake solutions that meet project requirements.
• Leverage AI-assisted development tools for code generation, refactoring, debugging, and documentation within standards.
• Validate AI-assisted outputs to meet security, performance, and data governance requirements.
• Share AI usage patterns, efficiencies, and lessons learned to drive consistent and responsible adoption.

Experience Requirements
• Strong hands-on Snowflake experience, including Snowpark and SQL-based transformations.
• Advanced Python experience for data engineering workloads.
• Advanced SQL skills with complex transformations and performance tuning.
• Proven experience migrating ETL solutions from Azure to Snowflake.
• Working knowledge of Azure Data Factory with the ability to understand and replicate existing pipelines.
• Experience with REST APIs in Python.
• Experience in Agile/Scrum environments with daily standups and two-week sprints.
• Understanding of data security requirements and adherence to policies and standards.
• Use of AI tools such as Cursor and Microsoft Copilot to support development and migrations.
• Solid coding background with a data engineering focus.
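For context, a minimal sketch of the kind of Snowpark pipeline step this role describes might look like the following. The connection settings, table names, and the build_daily_revenue helper are illustrative placeholders, not details from the posting.

```python
# Minimal Snowpark pipeline sketch (illustrative only; names and
# connection settings are placeholders, not from this job posting).
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_


def build_daily_revenue(session: Session) -> None:
    # Read a source table, filter, aggregate, and persist the result.
    orders = session.table("RAW.ORDERS")
    daily = (
        orders
        .filter(col("STATUS") == "COMPLETE")
        .group_by(col("ORDER_DATE"))
        .agg(sum_(col("AMOUNT")).alias("TOTAL_REVENUE"))
    )
    daily.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")


if __name__ == "__main__":
    # In practice, credentials would come from a secure config or key vault.
    session = Session.builder.configs({
        "account": "<account>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }).create()
    build_daily_revenue(session)
    session.close()
```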