

aKUBE
Senior Data Engineer - Snowflake Migration & Python
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer focusing on Snowflake migration and Python, with a 12-month contract in Burbank, CA. Pay is up to $96/hr. Key skills required include Snowflake, Snowpark, Python, SQL, and Azure Data Factory.
Country
United States
Currency
$ USD
-
Day rate
768
-
Date
April 2, 2026
Duration
More than 6 months
-
Location
Hybrid
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Burbank, CA
-
Skills detailed
#Documentation #Scala #Security #Migration #Data Migration #Cloud #Snowpark #Agile #ETL (Extract, Transform, Load) #API (Application Programming Interface) #AWS (Amazon Web Services) #Data Engineering #REST (Representational State Transfer) #Data Pipeline #Data Security #Storage #ADF (Azure Data Factory) #REST API #Snowflake #Data Integration #Scrum #Azure #Python #Azure Data Factory #SQL (Structured Query Language) #Debugging #Data Storage #AI (Artificial Intelligence)
Role description
City: Burbank, CA
Onsite/Hybrid/Remote: Hybrid (4 days onsite per week, no flexibility)
Duration: 12 Months
Rate Range: Up to $96/hr on W2
Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B
Must Have:
• Snowflake
• Snowpark
• Python
• SQL
• Azure Data Factory
• ETL / data pipeline migration
• REST API integration
• Agile / Scrum
• AI-assisted development tools such as Cursor or Microsoft Copilot
Responsibilities:
• Build, refactor, and support enterprise data pipelines for data collection, transformation, and delivery
• Develop and maintain Snowflake-based data solutions using Snowpark, Python, and SQL
• Migrate existing Azure Data Factory pipelines into Snowflake Snowpark solutions
• Join and transform data from multiple source systems for reporting, dashboards, KPIs, and analytics use cases
• Implement infrastructure that supports secure data storage, processing, and retrieval in Snowflake
• Execute work from defined requirements, technical designs, and priorities set by team leads and architects
• Identify delivery risks, technical issues, or blockers and escalate as needed
• Manage assigned tasks and deliverables against project timelines and sprint commitments
• Apply performance tuning and optimization across Python and SQL workflows
• Use AI-assisted development tools to support coding, refactoring, debugging, and documentation while following engineering standards
• Validate AI-generated output to ensure security, quality, performance, and governance requirements are met
• Share AI tool usage patterns and best practices with the broader engineering team
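To give a flavor of the "join and transform data from multiple source systems" responsibility: in this role the logic would typically live in Snowpark DataFrame operations or SQL, but the shape of the work can be sketched in plain Python. The table names and keys below (orders, customers, region) are hypothetical, not from the posting.

```python
from collections import defaultdict

def build_revenue_by_region(orders, customers):
    """Join order rows to customer rows and aggregate a simple KPI:
    total revenue per customer region.

    orders: iterable of dicts with keys "customer_id" and "amount"
    customers: iterable of dicts with keys "customer_id" and "region"
    Returns a dict mapping region -> total revenue.
    """
    # Build a lookup from the customer source system (the join side).
    region_by_customer = {c["customer_id"]: c["region"] for c in customers}

    revenue = defaultdict(float)
    for order in orders:
        # Orders with no matching customer fall into an "unknown" bucket
        # rather than being silently dropped.
        region = region_by_customer.get(order["customer_id"], "unknown")
        revenue[region] += order["amount"]
    return dict(revenue)
```

In Snowpark, the equivalent would be a `DataFrame.join` on `customer_id` followed by `group_by("region").agg(sum_("amount"))`; keeping the transformation as a pure function like this makes it easy to unit-test before pushing the logic into the warehouse.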
Qualifications:
• 3 to 5+ years of experience in Data Engineering or Data Integration roles
• Strong hands-on experience with Snowflake in a production environment
• Strong hands-on experience with Snowpark pipeline development
• Senior-level Python skills for data engineering and integration workloads
• Advanced SQL skills, including complex transformations and query tuning
• Experience working with Azure Data Factory and translating pipeline logic into Python-based implementations
• Experience migrating ETL or data pipelines across cloud platforms, especially from Azure to Snowflake
• Experience working with REST APIs using Python
• Experience in Agile/Scrum teams with sprint-based delivery
• Understanding of data security, governance, and enterprise engineering standards
• Bachelor's degree or equivalent practical experience
Nice to Have:
• Snowflake Tasks and Streams
• Snowflake warehouse configuration and optimization
• AWS experience
• Azure experience
• CI/CD for data engineering workflows
• Experience with large-scale cloud data migration projects
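The REST API integration requirement usually means paginating a source system's endpoint until the data is exhausted. A minimal, test-friendly sketch: the `fetch_page` callable is a hypothetical stand-in for whatever HTTP client call (e.g. a `requests.get` wrapper) the real source system would need.

```python
def fetch_all_pages(fetch_page, page_size=100):
    """Collect every record from a paginated REST endpoint.

    fetch_page(offset, limit) must return a list of records for that
    window; a page shorter than `limit` signals the end of the data.
    Injecting the fetcher keeps the pagination logic unit-testable
    without network access.
    """
    records = []
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:  # short page -> last page reached
            break
        offset += page_size
    return records
```

In a Snowflake-bound pipeline, each collected batch would then be staged and loaded (e.g. via `session.write_pandas` or a Snowpark `save_as_table`), with retries and rate-limit handling added around the fetcher for production use.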
