Optomi

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 5-7 years of experience, focused on migrating AWS-based data pipelines to GCP. Key skills include Python, PySpark, SQL, Terraform, and familiarity with Databricks and MLflow. The contract length is unspecified; the listed day rate is $480 USD.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
February 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Spark SQL #Data Pipeline #ETL (Extract, Transform, Load) #UAT (User Acceptance Testing) #AI (Artificial Intelligence) #GCP (Google Cloud Platform) #ML (Machine Learning) #Databricks #SQL (Structured Query Language) #Migration #AWS (Amazon Web Services) #Data Integrity #Spark (Apache Spark) #MLflow #Datasets #Data Engineering #PySpark #Terraform #Storage #Python #Cloud #S3 (Amazon Simple Storage Service) #Integration Testing
Role description
Job Description:
• Convert and migrate existing data pipelines and ETL code from AWS-based Databricks to GCP-based Databricks.
• Refactor scripts to use new data locations and GCP services as needed (see the path-rewrite sketch after this section).
• Move relevant datasets from AWS storage (e.g., S3) to GCP storage (e.g., GCS).
• Validate data integrity and ensure no loss or corruption during transfer (see the fingerprint sketch after this section).
• Update references in notebooks and jobs to point to GCP endpoints.
• Recreate and validate Databricks jobs, asset bundles, and scheduled tasks in the new GCP environment (see the job-recreation sketch after this section).
• Monitor job execution and remediate any compatibility or performance issues.
• Support integration testing and user acceptance testing (UAT) cycles by validating the end-to-end functionality of applications.
Required Skills & Tools:
• Python, PySpark, SQL.
• Terraform for infrastructure.
• Familiarity with MLflow, Databricks, and machine learning workflows (see the model re-registration sketch after this section).
• Migration experience (integration, testing).
• Cloud platforms: AWS, GCP.
• 5-7 years of experience plus a bachelor's degree.
Project Scope:
• Migrating AWS-hosted jobs to GCP.
• Includes data engineering and platform engineering responsibilities.
• Existing workflows: Databricks machine learning workflows registered in MLflow.
• Migration involves re-platforming these assets (no net-new AI/ML development).
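As a rough illustration of the refactoring step above, the sketch below rewrites hard-coded S3 URIs to their GCS equivalents before a PySpark read. The bucket names and prefix map are hypothetical; in a real migration the mapping would come from the project's data inventory.

```python
from pyspark.sql import SparkSession

# Hypothetical prefix map; the real one would come from the migration inventory.
PATH_MAP = {
    "s3://legacy-raw/": "gs://new-raw/",
    "s3://legacy-curated/": "gs://new-curated/",
}

def to_gcs(path: str) -> str:
    """Rewrite a known S3 prefix to its GCS equivalent; fail loudly otherwise."""
    for old, new in PATH_MAP.items():
        if path.startswith(old):
            return new + path[len(old):]
    raise ValueError(f"No GCS mapping for {path}")

spark = SparkSession.builder.getOrCreate()
# The old job read directly from S3; after the rewrite, the same job reads from GCS.
events = spark.read.parquet(to_gcs("s3://legacy-raw/events/2026/02/"))
```

Failing loudly on unmapped prefixes is deliberate: during a migration, a silent fallback to the old S3 location is exactly the kind of stale reference the "update references" bullet is meant to catch.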
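For the data-integrity bullet, one common approach (not mandated by the posting) is to compare a cheap, order-independent fingerprint of each dataset on both sides of the copy: a row count plus a sum of per-row hashes. The source and target paths here are hypothetical.

```python
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

def fingerprint(df: DataFrame) -> tuple:
    """Row count plus an order-independent sum of per-row xxhash64 values."""
    row = (
        df.select(F.xxhash64(*df.columns).alias("h"))
          .agg(F.count("*").alias("rows"), F.sum("h").alias("hash_sum"))
          .first()
    )
    return (row["rows"], row["hash_sum"])

# Hypothetical source (AWS) and target (GCP) locations for one migrated dataset.
src = spark.read.parquet("s3://legacy-curated/sales/")
dst = spark.read.parquet("gs://new-curated/sales/")
assert fingerprint(src) == fingerprint(dst), "sales dataset changed during transfer"
```

Summing hashes makes the check insensitive to row ordering, which typically differs after a cross-cloud copy even when the data is intact.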
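The posting names Terraform and Databricks asset bundles for recreating jobs; purely as a Python-side illustration of the same idea, recreating a scheduled job in the new workspace via the Databricks SDK might look like the sketch below. The profile, job name, notebook path, cluster spec, and schedule are all hypothetical.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs, compute

# Hypothetical CLI profile pointing at the new GCP workspace.
w = WorkspaceClient(profile="gcp-workspace")

w.jobs.create(
    name="daily-sales-etl",  # hypothetical job name carried over from AWS
    tasks=[
        jobs.Task(
            task_key="run_etl",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/etl/daily_sales"),
            new_cluster=compute.ClusterSpec(
                spark_version="14.3.x-scala2.12",
                node_type_id="n2-standard-4",  # a GCP node type replaces the EC2 one
                num_workers=2,
            ),
        )
    ],
    # Same cadence the AWS job had, expressed as a Quartz cron.
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 6 * * ?", timezone_id="UTC"
    ),
)
```

In practice the Terraform or asset-bundle definition would be the source of truth; a script like this is more useful for spot-checking or bulk validation of what got recreated.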
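For the MLflow-registered workflows in the project scope, re-platforming typically means re-registering existing model versions in the new workspace rather than retraining. A minimal sketch, assuming hypothetical workspace profiles and model name; real cross-workspace copies often go through a dedicated export/import tool, since registered artifacts must land somewhere the target workspace can read.

```python
import mlflow
from mlflow.tracking import MlflowClient

# Hypothetical Databricks CLI profiles for the source and target workspaces.
SRC = "databricks://aws-workspace"
DST = "databricks://gcp-workspace"

src_client = MlflowClient(tracking_uri=SRC, registry_uri=SRC)

# Hypothetical model name; iterate over every registered version.
for mv in src_client.search_model_versions("name = 'demand_forecaster'"):
    # Download the version's artifacts from the source workspace...
    local_path = mlflow.artifacts.download_artifacts(
        artifact_uri=mv.source, tracking_uri=SRC
    )
    # ...then register the same artifacts against the target registry.
    # In practice they would first be uploaded to target-accessible storage.
    mlflow.set_registry_uri(DST)
    mlflow.register_model(f"file://{local_path}", "demand_forecaster")
```

This matches the scope note above: the assets are moved and re-registered as-is, with no net-new AI/ML development.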