

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Senior Platform Developer) on a W2 contract, working remotely. It requires 8+ years in data warehousing; expertise in Azure Data Factory, SQL, Power BI, and legacy data migration; and strong leadership and data governance skills.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 12, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Illinois, United States
Skills detailed: #Data Lineage #Azure Data Factory #Data Conversion #Data Catalog #Azure SQL #Leadership #Compliance #Data Governance #Data Modeling #GCP (Google Cloud Platform) #Snowflake #Data Management #Data Warehouse #Strategy #Azure #Data Migration #Data Architecture #Data Integration #Migration #SQL (Structured Query Language) #DAX #Metadata #Data Mart #Consulting #Data Quality #Databricks #Debugging #AWS (Amazon Web Services) #BI (Business Intelligence) #Data Engineering #Data Profiling #ADF (Azure Data Factory) #Semantic Models #Data Science #ETL (Extract, Transform, Load) #Microsoft Power BI #Scala #Big Data #Data Pipeline #Security #SQL Server #Cloud
Role description
Company Description
Krasan Consulting Services is a Chicago-based boutique management consulting and technology solutions integrator. As a certified BEP WBE with the State of Illinois and a certified WBE, MBE, and DBE recognized by the City of Chicago, Krasan provides complex technology solutions and services for clients in the Public and Commercial sectors. With decades of experience, Krasan specializes in end-to-end technology solutions, integration, and implementation services and is proud to enable the success of its customers.
Role Title: Data Engineer (Senior Platform Developer)
Type of contract: W2
Location: Remote
Note: This is a W-2 employee position on our payroll. Unfortunately, 1099/C2C/C2H are not allowed at this time.
Job Description:
Project Overview:
This position requires strong leadership in the analysis, conversion, and integration of application, system, and operational data. You will be responsible for defining and implementing strategies for data migration and synchronization that ensure the integrity and continuity of mission-critical services during system transformation.
Key Responsibilities:
Data Engineering & Analytics Platform Development
• Design, build, and maintain enterprise-scale data pipelines using Azure Data Factory (ADF); see the sketch after this list.
• Develop and optimize Azure SQL-based data warehouses, data marts, and complex transformation logic.
• Architect and implement Power BI semantic models with DAX, Power Query (M), and dimensional modeling (star schema) best practices.
• Ensure data pipelines are efficient, reliable, and meet performance and security standards.
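
As a rough illustration of the ADF work described above, here is a minimal sketch of triggering and monitoring a pipeline run from Python with the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, and pipeline names are hypothetical placeholders, not details of this project.

import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# All names below are placeholders used only for illustration.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-analytics"
FACTORY_NAME = "adf-enterprise-pipelines"
PIPELINE_NAME = "LoadWarehouseDaily"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Start a run, passing runtime parameters defined on the pipeline.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2025-08-12"},
)

# Poll until the run reaches a terminal state (Succeeded, Failed, Cancelled).
status = "InProgress"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status

print(f"Pipeline {PIPELINE_NAME} finished with status: {status}")
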
Legacy System Modernization & Data Transformation
• Lead legacy system analysis, data profiling, cleansing, and transformation activities (a profiling sketch follows this list).
• Collaborate with stakeholders to define and document legacy and future-state data requirements.
• Develop and execute comprehensive data migration strategies and processes for operational and case management systems.
• Coordinate data synchronization and bridging efforts to support phased implementation across new platforms.
• Provide hands-on leadership to ensure data quality, integrity, and alignment with modernized systems.
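
To make the profiling step concrete, the sketch below assumes pandas and a hypothetical legacy extract file; it summarizes per-column null rates, distinct counts, and duplicate business keys, the kind of output that typically drives cleansing and mapping decisions.

import pandas as pd

# Hypothetical legacy extract; the file name and column names are placeholders.
legacy = pd.read_csv("legacy_cases_extract.csv")

# Per-column completeness and cardinality: the starting point for cleansing rules.
profile = pd.DataFrame({
    "dtype": legacy.dtypes.astype(str),
    "null_pct": (legacy.isna().mean() * 100).round(2),
    "distinct": legacy.nunique(),
})
print(profile.sort_values("null_pct", ascending=False))

# Duplicate business keys must be resolved before migration ("case_id" is a placeholder key).
dupes = legacy[legacy.duplicated(subset=["case_id"], keep=False)]
print(f"{len(dupes)} rows share a duplicate case_id")
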
Leadership, Collaboration & Governance
• Lead and mentor a team of developers and analysts in best practices for data integration and transformation.
• Oversee resource planning, task allocation, and status reporting for data conversion activities.
• Collaborate cross-functionally with business users, system integrators, architects, and data scientists.
• Evaluate and enhance the existing data architecture, driving innovation and scalability.
Required Skills and Experience:
• 8+ years of progressive experience in Data Warehouse/Data Mart environments, with a focus on cloud-based solutions.
• Advanced expertise with Azure Data Factory (ADF) and Azure SQL Server.
• Proven ability to deliver enterprise-grade Power BI semantic models, including advanced DAX and Power Query transformations.
• Demonstrated experience in legacy data migration, including strategy definition, mapping, cleansing, and execution.
• Proficient in SQL and experienced with data modeling methodologies (e.g., Kimball, Inmon, Boyce-Codd); see the star-schema sketch after this list.
• Familiarity with other cloud and big data platforms such as Databricks, Snowflake, AWS, or GCP.
• Strong understanding of data governance, security, and compliance principles.
• Excellent problem-solving, debugging, and analytical skills, particularly within large and complex data ecosystems.
• Proven ability to lead cross-functional teams and communicate effectively with both technical and non-technical stakeholders.
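
For illustration of the Kimball-style modeling expectation, a minimal star-schema sketch against Azure SQL using pyodbc is shown below; the connection string, table names, and columns are assumptions made up for the example, not the project's actual model.

import pyodbc

# Hypothetical Azure SQL connection; server, database, and auth mode are placeholders.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:example.database.windows.net,1433;"
    "Database=dw;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# One dimension plus one fact table: surrogate key on the dimension,
# the fact table carrying foreign keys and additive measures.
cur.execute("""
CREATE TABLE dim_date (
    date_key      INT PRIMARY KEY,   -- e.g. 20250812
    calendar_date DATE NOT NULL,
    fiscal_year   SMALLINT NOT NULL
)
""")
cur.execute("""
CREATE TABLE fact_case_activity (
    date_key     INT NOT NULL REFERENCES dim_date(date_key),
    case_key     INT NOT NULL,       -- would reference a dim_case dimension (not shown)
    activity_cnt INT NOT NULL,
    hours_logged DECIMAL(9,2) NOT NULL
)
""")
conn.commit()
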
Preferred Qualifications:
• Experience implementing CI/CD pipelines and infrastructure-as-code in a data engineering context (see the sketch below).
• Knowledge of data cataloging, metadata management, and data lineage tools.
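
As one possible shape for the CI/CD qualification, the sketch below assumes a pytest stage in the deployment pipeline and a hypothetical fetch_row_counts helper; real checks would query the warehouse (for example via pyodbc) rather than return fixed numbers.

# Hypothetical helper: in practice this would query the warehouse and return
# a table -> row count mapping; fixed values are used here only for illustration.
def fetch_row_counts() -> dict[str, int]:
    return {"dim_date": 3650, "fact_case_activity": 1_200_000}

def test_no_empty_core_tables():
    counts = fetch_row_counts()
    for table in ("dim_date", "fact_case_activity"):
        assert counts.get(table, 0) > 0, f"{table} is empty after load"

def test_fact_not_smaller_than_date_dimension():
    counts = fetch_row_counts()
    # A fact table smaller than its date dimension usually signals a broken load.
    assert counts["fact_case_activity"] >= counts["dim_date"]
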