

Azure Data Factory Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Factory Engineer with an Active Secret clearance, offering a remote contract. Key skills include SQL, ADF, Dataverse, and Oracle experience. Requirements include data mapping, data profiling, and familiarity with Git and CI/CD processes.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 30, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Yes
Location detailed: Midlothian, VA
Skills detailed
#ADF (Azure Data Factory) #Dataverse #GIT #Data Integrity #Data Mapping #Data Architecture #Data Pipeline #Data Lake #Data Migration #Azure Data Factory #Data Profiling #Business Analysis #Azure SQL #Shell Scripting #Python #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Logging #SQL Queries #Scripting #Logic Apps #Azure #Oracle #Migration #Databases #Database Schema #Requirements Gathering
Role description
We are looking for an Azure Data Factory Engineer with an Active Secret clearance for a remote role.
Requirements Gathering
• Collaborate with business analysts, data architects, and stakeholders to gather requirements and ensure data integrity throughout the migration lifecycle.
• Ability to create data mapping documents (source to target) based on an understanding of business rules.
Data Understanding/Profiling
• Perform data profiling, cleansing, and quality checks to ensure accuracy and completeness of migrated data.
• Analyze and map source Oracle database schemas to target Dataverse tables/entities, defining transformation and cleansing logic as required.
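The profiling step described above can be sketched in Python (which the posting later names as a plus). This is a minimal illustration, not a prescribed tool: the column values are made up, and in practice the input would come from an Oracle extract.

```python
from collections import Counter

def profile_column(values):
    """Basic profiling stats for one source column: null rate,
    distinct count, and most common values."""
    total = len(values)
    nulls = sum(1 for v in values if v is None or v == "")
    non_null = [v for v in values if v is not None and v != ""]
    return {
        "total": total,
        "null_rate": nulls / total if total else 0.0,
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

# Illustrative: profiling a STATUS column from a source extract
stats = profile_column(["ACTIVE", "ACTIVE", None, "CLOSED", ""])
print(stats["null_rate"])  # 0.4
print(stats["distinct"])   # 2
```

Running this per column gives a quick picture of completeness before mapping source schemas to Dataverse targets.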
Technical Requirements for Development
• Write and optimize SQL queries, stored procedures, and scripts for Oracle databases to support data migration and validation.
• Implement data transformation processes to ensure data compatibility with Dataverse data types, relationships, and constraints.
• Experience with Dataverse data types and CRUD operations.
• Experience configuring data pipelines in ADF (extract, transform, load), including using .csv files as data sources for ADF pipelines.
• Experience using Oracle CDC feeds as data sources for ADF pipelines, and logging errors (audit framework) from an ADF pipeline to a Dataverse table.
• Experience pushing data to Dataverse tables, including the transformations needed to match target field types (e.g., Dataverse lookups, choices, date/time).
• Experience configuring pipeline settings to improve concurrency/performance (e.g., ADF cores, memory).
• Experience with additional Azure services (e.g., Logic Apps, Data Lake, Azure Functions, Azure SQL).
• Experience with or exposure to Python/shell scripting is a plus.
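The "match target field types" requirement above can be illustrated with a small Python sketch. The payload shapes follow published Dataverse Web API conventions (lookups bound via `@odata.bind`, choice columns stored as option-set integers, date/times as ISO 8601 strings), but the table name, column names (`new_name`, `new_status`, etc.), GUID, and option-set values here are all hypothetical placeholders.

```python
from datetime import datetime, timezone

# Assumed option-set values for a hypothetical Dataverse choice column
STATUS_CHOICES = {"ACTIVE": 1, "CLOSED": 2}

def to_dataverse_record(row):
    """Map one source row (a dict from an Oracle extract) to a
    Dataverse Web API payload for a hypothetical custom table."""
    return {
        "new_name": row["CUSTOMER_NAME"],
        # Choice column: Dataverse stores the integer option value
        "new_status": STATUS_CHOICES[row["STATUS"]],
        # Lookup column: bound by entity-set name plus record GUID
        "new_AccountId@odata.bind": f"/accounts({row['ACCOUNT_GUID']})",
        # Date/time column: ISO 8601 in UTC
        "new_createdon": row["CREATED_AT"].astimezone(timezone.utc).isoformat(),
    }

row = {
    "CUSTOMER_NAME": "Contoso",
    "STATUS": "ACTIVE",
    "ACCOUNT_GUID": "00000000-0000-0000-0000-000000000001",
    "CREATED_AT": datetime(2025, 9, 30, 12, 0, tzinfo=timezone.utc),
}
payload = to_dataverse_record(row)
```

In an actual migration, a transformation like this would typically live inside an ADF data flow or a pre-ingest script rather than standalone code.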
Data Validation Post-Migration
• Ability to generate a reconciliation report of the migrated data using SQL or scripts.
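A minimal reconciliation sketch in Python, assuming both sides can be keyed on a shared primary key; the `id` field and sample rows are illustrative:

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target datasets by primary key and report
    counts plus missing/unexpected keys for a post-migration check."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "source_count": len(src),
        "target_count": len(tgt),
        "missing_in_target": sorted(src - tgt),
        "unexpected_in_target": sorted(tgt - src),
    }

# Illustrative: row 2 never arrived, row 4 should not exist in the target
report = reconcile(
    [{"id": 1}, {"id": 2}, {"id": 3}],
    [{"id": 1}, {"id": 3}, {"id": 4}],
    key="id",
)
```

The same comparison is commonly expressed as SQL set operations (e.g., `MINUS` in Oracle) when both datasets are queryable from one place.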
Code Migration
• Must be familiar with Git (or another source-control repository) and CI/CD processes.