

Principal Data Engineer – Azure & BI Modernization
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Principal Data Engineer – Azure & BI Modernization, offering a 12-month W2 contract at a competitive pay rate. It requires 8+ years of experience in data warehousing, expertise in Azure Data Factory, SQL, and Power BI, and leadership experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
July 30, 2025
🕒 - Project duration
12 months
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#SQL (Structured Query Language) #Azure Data Factory #Databricks #Big Data #Data Profiling #Cloud #Debugging #Snowflake #Data Conversion #Security #Migration #Semantic Models #ADF (Azure Data Factory) #SQL Server #Metadata #Microsoft Power BI #Leadership #Data Science #Data Governance #Azure #Data Mart #DAX #Data Management #ETL (Extract, Transform, Load) #Data Integration #Data Engineering #GCP (Google Cloud Platform) #Data Modeling #Data Quality #Scala #Data Warehouse #Data Lineage #Azure SQL #Data Architecture #Data Pipeline #Data Catalog #Strategy #AWS (Amazon Web Services) #DevOps #Data Migration #Compliance #BI (Business Intelligence)
Role description
Job Title: Principal Data Engineer – Azure & BI Modernization
• W2 Requirement
Key Responsibilities:
Data Engineering & Analytics Platform Development
• Design, build, and maintain enterprise-scale data pipelines using Azure Data Factory (ADF).
• Develop and optimize Azure SQL-based data warehouses, data marts, and complex transformation logic.
• Architect and implement Power BI semantic models with DAX, Power Query (M), and dimensional modeling (star schema) best practices (see the illustrative sketch after this list).
• Ensure data pipelines are efficient, reliable, and meet performance and security standards.
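To make the dimensional-modeling expectation above concrete, here is a minimal, purely illustrative sketch in Python/pandas of splitting a flat extract into a star-schema dimension and fact table. The table and column names are hypothetical and are not taken from the role description.

```python
import pandas as pd

# Hypothetical flat extract from a source system (all names are illustrative).
orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "order_date": ["2025-07-01", "2025-07-02", "2025-07-02"],
    "customer_name": ["Acme Corp", "Globex", "Acme Corp"],
    "customer_region": ["East", "West", "East"],
    "amount": [250.0, 125.5, 980.0],
})

# Customer dimension: de-duplicated attributes with a surrogate key.
dim_customer = (
    orders[["customer_name", "customer_region"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: resolve the surrogate key, keep the grain key, date, and measure.
fact_orders = orders.merge(
    dim_customer, on=["customer_name", "customer_region"], how="left"
)[["order_id", "order_date", "customer_key", "amount"]]

print(dim_customer)
print(fact_orders)
```

In practice the same pattern would be expressed in Azure SQL transformation logic or in the Power BI semantic model itself; the sketch only shows the star-schema shape (a conformed dimension plus a fact table at a single grain).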
Legacy System Modernization & Data Transformation
• Lead legacy system analysis, data profiling, cleansing, and transformation activities (see the profiling sketch after this list).
• Collaborate with stakeholders to define and document legacy and future-state data requirements.
• Develop and execute comprehensive data migration strategies and processes for operational and case management systems.
• Coordinate data synchronization and bridging efforts to support phased implementation across new platforms.
• Provide hands-on leadership to ensure data quality, integrity, and alignment with the modernized systems.
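As a rough illustration of the profiling and cleansing work described above, the following Python/pandas sketch computes simple column-level profiles and applies basic cleansing rules. The legacy columns, values, and rules are hypothetical, not taken from the posting.

```python
import pandas as pd

# Hypothetical legacy case-management extract (all names and values are illustrative).
legacy = pd.DataFrame({
    "case_id": ["C-1", "C-2", "C-2", None],
    "status": ["open", "OPEN", " Closed", "closed "],
    "opened_on": ["2024-01-05", "2024-01-05", None, "2024-02-10"],
})

# Column-level profile: null counts and distinct counts to size the cleansing effort.
profile = pd.DataFrame({
    "nulls": legacy.isna().sum(),
    "distinct": legacy.nunique(dropna=True),
})
print(profile)

# Basic cleansing: drop rows without a key, de-duplicate on the key,
# normalize categorical text, and coerce dates (bad values become NaT).
cleaned = legacy.dropna(subset=["case_id"]).drop_duplicates(subset=["case_id"]).copy()
cleaned["status"] = cleaned["status"].str.strip().str.lower()
cleaned["opened_on"] = pd.to_datetime(cleaned["opened_on"], errors="coerce")
print(cleaned)
```

At scale this kind of profiling would typically be pushed down into SQL or ADF data flows; the point here is only the shape of the checks and rules.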
Leadership, Collaboration & Governance
• Lead and mentor a team of developers and analysts in best practices for data integration and transformation.
• Oversee resource planning, task allocation, and status reporting for data conversion activities.
• Collaborate cross-functionally with business users, system integrators, architects, and data scientists.
• Evaluate and enhance the existing data architecture, driving innovation and scalability.
Required Skills and Experience:
• 8+ years of progressive experience in Data Warehouse/Data Mart environments, with a focus on cloud-based solutions.
• Advanced expertise with Azure Data Factory (ADF) and Azure SQL Server.
• Proven ability to deliver enterprise-grade Power BI semantic models, including advanced DAX and Power Query transformations.
• Demonstrated experience in legacy data migration, including strategy definition, mapping, cleansing, and execution.
• Proficient in SQL and experienced with data modeling methodologies (e.g., Kimball, Inmon) and normalization concepts (e.g., Boyce-Codd Normal Form).
• Familiarity with other cloud and big data platforms such as Databricks, Snowflake, AWS, or GCP.
• Strong understanding of data governance, security, and compliance principles.
• Excellent problem-solving, debugging, and analytical skills, particularly within large and complex data ecosystems.
• Proven ability to lead cross-functional teams and communicate effectively with both technical and non-technical stakeholders.
Preferred Qualifications:
• Experience implementing CI/CD pipelines and infrastructure-as-code in a data engineering context (an illustrative automated check follows this list).
• Exposure to DevOps practices for data operations and delivery.
• Experience supporting phased data bridging and live system cutovers during system modernization efforts.
• Knowledge of data cataloging, metadata management, and data lineage tools.
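For the CI/CD preference above, here is a minimal, hypothetical example of the kind of automated check that might gate a data pipeline change in CI; the function names and logic are illustrative and not part of the role description.

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Toy transformation: standardize column names and drop rows missing the key."""
    out = raw.rename(columns={"OrderID": "order_id", "Amt": "amount"})
    return out.dropna(subset=["order_id"])

def test_transform_orders_drops_rows_without_key():
    # A check like this could run under pytest in a CI pipeline before deployment.
    raw = pd.DataFrame({"OrderID": [1, None], "Amt": [10.0, 20.0]})
    result = transform_orders(raw)
    assert list(result.columns) == ["order_id", "amount"]
    assert len(result) == 1

if __name__ == "__main__":
    test_transform_orders_drops_rows_without_key()
    print("transform checks passed")
```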