

CloudIngest
Data Engineer – Iceberg Migration / USC and GC Candidates
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer focused on migrating Snowflake datasets to Databricks Iceberg tables. Contract length is unspecified, with a pay rate of "TBD". Requires expertise in Snowflake, Databricks, SQL, and data migration. USC and GC candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Snowflake #IAM (Identity and Access Management) #REST (Representational State Transfer) #Cloud #Spark (Apache Spark) #AWS S3 (Amazon Simple Storage Service) #Data Pipeline #ETL (Extract, Transform, Load) #Data Migration #Data Engineering #Data Modeling #Apache Iceberg #Automation #Data Governance #Spark SQL #Migration #Data Architecture #BI (Business Intelligence) #Data Access #Databricks #AI (Artificial Intelligence) #Datasets #Delta Lake #SQL (Structured Query Language) #AWS (Amazon Web Services)
Role description
Data Engineer – Iceberg Migration
Overview
We are looking for a Data Engineer / Data Architect to drive the migration of Snowflake datasets to Databricks Unity Catalog-managed Iceberg tables. The role is focused on ensuring a smooth transition, maintaining cross-platform data accessibility, and supporting advanced analytics and AI-driven workloads.
Key Responsibilities
Lead and execute the migration of Snowflake tables to Databricks Unity Catalog (Iceberg format)
Assess existing Snowflake data models, pipelines, and dependencies
Establish dual-access capabilities so migrated data remains queryable from both Snowflake (via external/Iceberg access) and Databricks
Identify Snowflake-specific SQL and convert it to Spark SQL-compatible syntax
Work closely with data platform and datahub teams to ensure seamless onboarding
Perform data validation, reconciliation, and consistency checks post-migration
Enhance and optimize data pipelines and query performance in Databricks
Ensure adherence to data governance, access control, and Unity Catalog best practices
Troubleshoot and resolve migration-related issues efficiently
Develop and contribute to automation frameworks and reusable migration components
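The validation and reconciliation responsibility above can be sketched as a simple metric comparison. This is an illustrative, platform-agnostic sketch: the table names and metric values are hypothetical, and in practice each side's metrics would come from queries against Snowflake and Databricks (e.g. `COUNT(*)` and a hash or sum aggregate per table).

```python
# Hypothetical post-migration reconciliation check. Metrics are assumed
# to have been collected separately from source (Snowflake) and target
# (Databricks) via per-table aggregate queries.

def reconcile(source_metrics: dict, target_metrics: dict) -> list:
    """Compare per-table metrics captured before and after migration.

    Returns a list of human-readable mismatch descriptions; an empty
    list means the two sides agree.
    """
    issues = []
    for table, src in source_metrics.items():
        tgt = target_metrics.get(table)
        if tgt is None:
            issues.append(f"{table}: missing on target")
            continue
        for metric, src_val in src.items():
            if tgt.get(metric) != src_val:
                issues.append(
                    f"{table}.{metric}: source={src_val} target={tgt.get(metric)}"
                )
    # Tables that appeared on the target but were never in the source.
    for table in target_metrics.keys() - source_metrics.keys():
        issues.append(f"{table}: unexpected extra table on target")
    return issues


# Hypothetical example values for a single migrated table.
snowflake_side = {"sales.orders": {"row_count": 1_204_332, "sum_amount": 987654.25}}
databricks_side = {"sales.orders": {"row_count": 1_204_332, "sum_amount": 987654.25}}
print(reconcile(snowflake_side, databricks_side))  # → [] when both sides agree
```

A real pipeline would typically run such checks per table after each migration batch and fail the batch on any non-empty result.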
Required Skills
Strong expertise in Snowflake and Databricks (Spark, Unity Catalog)
Hands-on experience with Apache Iceberg, Delta Lake, or other open table formats
Proficiency in SQL and Spark SQL
Experience in data migration, ETL/ELT processes, and data modeling
Familiarity with AWS (S3, IAM, networking) or equivalent cloud environments
Solid understanding of data governance and access management
Ability to troubleshoot and optimize performance across distributed systems
Preferred Qualifications
Experience with cross-platform data sharing (Snowflake + Databricks)
Knowledge of REST catalog integration and Iceberg external tables
Exposure to AI/BI workloads and analytics ecosystems
Understanding of enterprise data platforms and data mesh architecture
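The query-conversion work mentioned above usually relies on a SQL transpiler (such as sqlglot) or manual review; the following minimal sketch only illustrates the kind of syntax rewrites involved. The two rules shown are real differences, but the rewriter itself is a toy, not a production converter.

```python
import re

# Illustrative Snowflake -> Spark SQL rewrites. IFF(cond, a, b) maps to
# Spark SQL's IF(cond, a, b), and Snowflake's expr::TYPE cast shorthand
# maps to the ANSI CAST(expr AS TYPE) form that all Spark versions accept.
REWRITES = [
    (re.compile(r"\bIFF\s*\(", re.IGNORECASE), "IF("),
    (re.compile(r"(\w+)::(\w+)"), r"CAST(\1 AS \2)"),
]

def to_spark_sql(query: str) -> str:
    """Apply each rewrite rule in order to a Snowflake query string."""
    for pattern, repl in REWRITES:
        query = pattern.sub(repl, query)
    return query


print(to_spark_sql("SELECT IFF(amount::FLOAT > 0, 'credit', 'debit') FROM t"))
# → SELECT IF(CAST(amount AS FLOAT) > 0, 'credit', 'debit') FROM t
```

Regex-based rewriting breaks down quickly on nested expressions, which is why migrations at scale lean on an AST-based transpiler rather than string substitution.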
