

Fusion Solutions, LLC
Data Scientist (Expert – Data Migration)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Scientist (Expert – Data Migration) on a 12-month remote contract, paying $87-$100 per hour. Requires 6+ years in data migration, advanced Python and SQL skills, and experience with AWS or GCP cloud services.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
800
🗓️ - Date
October 9, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
W-2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#Collibra #Lambda (AWS Lambda) #Dataflow #Informatica #S3 (Amazon Simple Storage Service) #Redshift #Data Engineering #Documentation #Computer Science #Spark (Apache Spark) #Data Governance #BigQuery #CRM (Customer Relationship Management) #Snowflake #GCP (Google Cloud Platform) #Airflow #Compliance #Data Science #Python #Metadata #PySpark #Cloud #Databricks #Data Modeling #SAP #SQL (Structured Query Language) #Data Management #MDM (Master Data Management) #SQL Queries #Data Lineage #Data Migration #Data Pipeline #Data Ingestion #AWS Glue #ETL (Extract, Transform, Load) #Migration #AWS (Amazon Web Services) #dbt (data build tool) #Storage #Oracle #Data Quality #Physical Data Model
Role description
Location: Remote (U.S. Based)
Type: 12-Month Contract | W-2 Only | No Sponsorship or Visa Transfers
Pay Range: $87 – $100 per hour (W-2)
Extension: Possible based on performance and project scope
Overview
We are seeking a highly technical Data Scientist (Expert) with deep experience in data migration, ETL/ELT development, data modeling, and cloud data engineering. This role supports large-scale modernization projects that move enterprise data from legacy environments into modern cloud-based platforms.
Key Responsibilities
Lead and execute end-to-end data migration initiatives (ERP/CRM modernization, legacy-to-cloud transitions).
Design and develop ETL/ELT pipelines for data ingestion, transformation, and validation using Python and SQL.
Build and optimize conceptual, logical, and physical data models to support warehouse and analytics layers.
Implement Master Data Management (MDM) and data governance processes to ensure accuracy and consistency across systems.
Collaborate with technical program and business stakeholders to define mapping rules, data lineage, and transformation logic.
Utilize AWS or GCP cloud data services (e.g., Redshift, Glue, S3, BigQuery, Dataflow) for migration and storage solutions.
Write and optimize SQL queries for profiling, validation, and performance tuning.
Apply data quality checks and develop automated audits to verify migration completeness and integrity (a minimal audit sketch follows this list).
Document all data flows, mapping logic, and technical specifications to maintain traceability and compliance.
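To ground the validation and audit work described above, here is a minimal sketch of an automated post-migration audit in Python. It compares row counts and cheap per-column aggregates (non-null count, distinct count, sum) between a source and a target table. The SQLite connections, the orders table, and its columns are illustrative placeholders rather than project specifics; an audit against Redshift or BigQuery would swap in the appropriate driver and likely add hash-based checksums and record-level sampling.

```python
# Minimal post-migration audit sketch. All connection objects, table and
# column names are illustrative placeholders (hypothetical), not the
# project's actual schema. Identifiers are interpolated directly into SQL
# here for brevity; a real audit should whitelist or quote them.
import sqlite3
from typing import Iterable

def table_profile(conn: sqlite3.Connection, table: str,
                  columns: Iterable[str]) -> dict:
    """Collect a lightweight profile: row count plus per-column aggregates."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    profile = {"row_count": cur.fetchone()[0]}
    for col in columns:
        # Non-null count, distinct count, and sum act as cheap checksums;
        # hash-based column checksums are a stronger production choice.
        cur.execute(f"SELECT COUNT({col}), COUNT(DISTINCT {col}), "
                    f"SUM({col}) FROM {table}")
        non_null, distinct, total = cur.fetchone()
        profile[col] = {"non_null": non_null, "distinct": distinct, "sum": total}
    return profile

def audit(source: sqlite3.Connection, target: sqlite3.Connection,
          table: str, columns: Iterable[str]) -> list[str]:
    """Return a list of discrepancies; an empty list means the audit passed."""
    src = table_profile(source, table, columns)
    tgt = table_profile(target, table, columns)
    return [f"{key}: source={src[key]} target={tgt[key]}"
            for key in src if src[key] != tgt[key]]

if __name__ == "__main__":
    # Two in-memory databases stand in for the legacy source and cloud target.
    legacy, cloud = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (legacy, cloud):
        db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    legacy.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
    cloud.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5)])  # dropped row
    for issue in audit(legacy, cloud, "orders", ["id", "amount"]):
        print("MISMATCH ->", issue)
```

Because the aggregates are computed independently on each side, only the small profile dictionaries cross the audit boundary, so this pattern scales to large tables without moving row-level data between environments.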
Required Skills & Experience
6+ years of hands-on experience in data engineering or data science roles, with a focus on data migration and integration.
Advanced proficiency in Python and SQL for ETL and data pipeline development.
Proven expertise in ETL/ELT tools (Airflow, AWS Glue, Informatica, dbt, or similar).
Strong knowledge of data modeling (star/snowflake schemas) and data warehousing concepts (see the star-schema sketch after this list).
Hands-on experience with AWS (Redshift, S3, Glue, Lambda) or GCP (BigQuery, Dataflow).
Solid understanding of Master Data Management (MDM), data governance, and metadata management.
Demonstrated experience in ERP or CRM system data migration projects (SAP, Salesforce, Oracle, etc.).
Familiarity with data validation, profiling, and quality frameworks.
Excellent documentation and stakeholder communication skills.
Bachelor’s or Master’s degree in Computer Science, Data Science, Information Systems, or a related field.
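As a hedged illustration of the star-schema modeling noted above, the sketch below builds a tiny fact table with two dimension tables in SQLite and runs the kind of join-and-aggregate query an analytics layer serves. All table and column names here are hypothetical examples, not a prescribed model.

```python
# Tiny star schema in SQLite: one fact table with foreign keys into two
# dimension tables. Names (dim_customer, fact_sales, ...) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20251009
        full_date      TEXT,
        fiscal_quarter TEXT
    );
    -- The fact table holds additive measures plus one key per dimension.
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        units_sold   INTEGER,
        revenue      REAL
    );
""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'US-East')")
conn.execute("INSERT INTO dim_date VALUES (20251009, '2025-10-09', 'FY25-Q4')")
conn.execute("INSERT INTO fact_sales VALUES (1, 20251009, 3, 150.0)")

# A typical analytics query joins the fact table out to its dimensions.
print(conn.execute("""
    SELECT c.region, d.fiscal_quarter, SUM(f.revenue) AS revenue
    FROM fact_sales AS f
    JOIN dim_customer AS c ON f.customer_key = c.customer_key
    JOIN dim_date     AS d ON f.date_key     = d.date_key
    GROUP BY c.region, d.fiscal_quarter
""").fetchone())   # -> ('US-East', 'FY25-Q4', 150.0)
```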
Preferred Qualifications
Experience with Snowflake, PySpark, or Databricks.
Exposure to Airflow or AWS Step Functions for workflow orchestration.
Knowledge of Collibra, Informatica MDM, or Reltio for data governance.
Familiarity with regulated environments (Life Sciences, Healthcare, AgTech, Finance).
Employment Terms
W-2 Only – No C2C, 1099, or third-party engagements.
We are unable to provide visa sponsorship or visa transfers.
Applicants must be authorized to work in the United States and available for full-time engagement (40 hours per week).
Job Types: Full-time, Contract
Pay: $90.00 - $100.00 per hour
Work Location: Remote