

Insight Global
Data Migration Consultant
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Migration Consultant with a contract length of "X months" and a pay rate of "$X per hour." It requires strong ETL experience, expertise in SQL, PostgreSQL, and Cosmos DB, and proficiency in Azure Data Factory for optimizing large-scale data migrations.
Country: United States
Currency: $ USD
Day rate: $560
Date: April 28, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Databases #Azure #ADF (Azure Data Factory) #Data Pipeline #Database Architecture #Data Transformations #SQL Server #Datasets #PostgreSQL #Batch #Migration #Database Performance #SQL (Structured Query Language) #Azure Data Factory #NoSQL #ETL (Extract, Transform, Load) #Data Migration #JSON (JavaScript Object Notation) #Scala
Role description
Role Overview
We are seeking a Senior ETL / Data Migration Engineer to support a large-scale data migration and optimization initiative within the platform. This role will focus on moving and transforming massive volumes of historical and active customer data across multiple databases and environments while significantly improving migration performance and reliability.
The engineer will be responsible for analyzing existing data pipelines, identifying bottlenecks, and optimizing SQL-to-Cosmos DB (JSON) and SQL-to-PostgreSQL data transfers using Azure Data Factory. The current migration processes are functional but slow, and the goal is to improve efficiency, scalability, and overall throughput.
This is a hands-on role that requires deep expertise in databases, ETL optimization, and large-scale data movement across heterogeneous environments.
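For orientation, the SQL-to-Cosmos transfer described above might look like the following minimal Python sketch. It is an illustration under assumptions, not the client's actual pipeline: the table, database, container, column, and credential names are all hypothetical, and it uses pyodbc and the azure-cosmos SDK to show the batched-read, idempotent-upsert pattern.

```python
# Minimal illustrative sketch: batched SQL Server -> Cosmos DB (JSON) copy.
# All names here (connection strings, table, database, container, columns,
# partition key) are hypothetical, not taken from this engagement.
import pyodbc
from azure.cosmos import CosmosClient

BATCH_SIZE = 5_000  # tune against source read throughput and the Cosmos RU budget

sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;UID=<user>;PWD=<pwd>"
)
cosmos = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = cosmos.get_database_client("migrations").get_container_client("service_requests")

cursor = sql_conn.cursor()
cursor.execute("SELECT request_id, customer_id, payload, created_at FROM dbo.ServiceRequests")
columns = [col[0] for col in cursor.description]

while True:
    rows = cursor.fetchmany(BATCH_SIZE)  # stream in fixed-size batches, never fetchall()
    if not rows:
        break
    for row in rows:
        doc = dict(zip(columns, row))
        doc["id"] = str(doc["request_id"])                 # Cosmos requires a string 'id'
        doc["created_at"] = doc["created_at"].isoformat()  # make timestamps JSON-serializable
        # Assumes the container is partitioned on /customer_id, which the
        # document already carries. Upsert keeps reruns idempotent.
        container.upsert_item(doc)
```

The per-item upsert above is the simple, correct baseline; most of the throughput gains this posting describes would come from parallelizing writers across partition-key ranges and right-sizing request units, which is exactly the kind of tuning the role owns.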
Key Responsibilities
• Optimize large-scale data migrations involving billions of records and thousands of terabytes of data
• Migrate data:
  • From SQL Server to Cosmos DB (JSON format)
  • From SQL Server to PostgreSQL with differing schemas
• Improve and refactor ETL pipelines built in Azure Data Factory
• Analyze and optimize:
  • Database reads and writes
  • Insert strategies and batch processing (see the batching sketch after this list)
  • Query performance across systems
• Handle schema mismatches and complex data transformations
• Ensure historical and new service request data is migrated accurately and efficiently
• Evaluate current migration scripts and processes, and recommend performance improvements
• Ensure migrations are stable, repeatable, and performant across environments
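The insert-strategy and schema-mismatch work might look like the following minimal Python sketch, assuming pyodbc on the source side and psycopg2 on the target. Again, every table, column, and connection name is hypothetical; the point is the pattern of reshaping rows in the SELECT and batching inserts.

```python
# Illustrative sketch only: batched SQL Server -> PostgreSQL copy with a
# simple schema mapping. Table, column, and connection names are hypothetical.
import pyodbc
import psycopg2
from psycopg2.extras import execute_values

BATCH_SIZE = 10_000

src = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;UID=<user>;PWD=<pwd>"
)
dst = psycopg2.connect(host="<host>", dbname="<db>", user="<user>", password="<pwd>")

src_cur = src.cursor()
# The source schema differs from the target: rename and reorder columns in
# the SELECT so each fetched row already matches the target column order.
src_cur.execute("""
    SELECT CustomerId  AS customer_id,
           RequestType AS request_type,
           CreatedDate AS created_at
    FROM dbo.ServiceRequests
""")

with dst, dst.cursor() as dst_cur:  # connection context manager commits on success
    while True:
        rows = src_cur.fetchmany(BATCH_SIZE)
        if not rows:
            break
        # execute_values sends one multi-row INSERT per batch instead of one
        # round trip per row, usually the single biggest insert-side win.
        execute_values(
            dst_cur,
            "INSERT INTO service_requests (customer_id, request_type, created_at) VALUES %s",
            [tuple(r) for r in rows],
            page_size=BATCH_SIZE,
        )
```

In practice one would likely commit per batch and parallelize across key ranges; execute_values is shown because it tolerates per-row transformation easily, while PostgreSQL's COPY (sketched after the qualifications list below) is usually faster still for straight bulk loads.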
β’ Environment & Scale
β’ ~800,000 customers
β’ Multi-year historical dataset (3+ years)
β’ Extremely large datasets where previous migrations of small customer subsets were time-consuming
β’ Data being migrated to support a new platform that must remain accessible and performant
Must Have Qualifications
• Strong ETL and data migration experience across large-scale systems
• Proven experience migrating data between:
  • SQL databases
  • PostgreSQL
  • NoSQL databases (Cosmos DB preferred)
• Hands-on experience with Azure Data Factory, including complex transformations
• Deep understanding of:
  • Database performance tuning
  • Optimizing reads, writes, inserts, and bulk loads (a COPY-based sketch follows this list)
  • Query optimization across different database engines
• Experience handling schema mismatches and complex transformations
• Ability to assess existing pipelines and significantly improve their efficiency
• Strong understanding of how data moves across different environments and systems
• Ability to work independently and deliver tangible performance improvements quickly
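As a concrete example of the bulk-load tuning listed above, PostgreSQL's COPY typically outperforms even multi-row INSERTs when no per-row transformation is needed. The sketch below is hypothetical (stand-in data, table, and connection names) and uses psycopg2's copy_expert.

```python
# Illustrative sketch only: bulk-loading rows into PostgreSQL with COPY.
# Table, column, connection names, and the stand-in rows are hypothetical.
import csv
import io
import psycopg2

conn = psycopg2.connect(host="<host>", dbname="<db>", user="<user>", password="<pwd>")
rows = [(1, "install", "2024-01-05"), (2, "repair", "2024-01-06")]  # stand-in data

buf = io.StringIO()
csv.writer(buf).writerows(rows)
buf.seek(0)

with conn, conn.cursor() as cur:
    # COPY streams the whole batch through one command rather than parsing
    # individual INSERT statements, minimizing per-row overhead.
    cur.copy_expert(
        "COPY service_requests (customer_id, request_type, created_at) FROM STDIN WITH (FORMAT csv)",
        buf,
    )
```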
Plusses
• Experience working with extremely large datasets (billions of records / TB-scale data)
• Cosmos DB optimization experience (especially JSON-based ingestion patterns)
• Experience migrating from legacy platforms (e.g., Salesforce or similar systems)
• Background in performance engineering or database architecture
• Experience optimizing custom migration scripts in addition to managed ETL tools
• Familiarity with networking considerations that impact large data transfers
• Prior experience parachuting into short-term engagements to "make it faster and better"






