

Data Engineer – Microsoft Fabric, Azure Databricks, Mid-Level
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Mid-Level Data Engineer position focused on Microsoft Fabric and Azure Databricks; the contract length and pay rate are not specified. Candidates should have 5+ years of experience and proficiency in SQL, Python, and Spark; relevant certifications are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
July 31, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Atlanta, GA
🧠 - Skills detailed
#Compliance #Automation #Dataflow #Classification #Azure #Python #Azure Databricks #Spark (Apache Spark) #Scala #Data Modeling #Data Security #Data Quality #Computer Science #Datasets #Data Pipeline #Databricks #Schema Design #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Engineering #Microsoft Power BI #DevOps #Monitoring #BI (Business Intelligence) #Security #Data Governance #Logging #SSRS (SQL Server Reporting Services) #Migration #Data Analysis #Data Lifecycle #Azure DevOps #Data Migration
Role description
Verified Job On Employer Career Site
Job Summary:
Resolution Technologies is seeking a highly skilled and proactive Data Engineer to support the modernization of its public sector client's data estate. This role involves designing and maintaining scalable data pipelines, implementing data governance, and collaborating with stakeholders to enable data-driven decision-making.
Responsibilities:
• Design, build, and maintain scalable ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
• Implement medallion architecture (Bronze, Silver, Gold) to support the data lifecycle and data quality (a minimal pipeline sketch follows this list).
• Support the sunsetting of legacy SQL-based infrastructure and SSRS, ensuring data continuity and stakeholder readiness.
• Create and manage notebooks (e.g., Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark.
• Build and deliver curated datasets and analytics models to support Power BI dashboards and reports.
• Develop dimensional and real-time data models for analytics use cases.
• Collaborate with data analysts, stewards, and business stakeholders to deliver fit-for-purpose data assets.
• Apply data governance policies including row-level security, data masking, and classification in line with Microsoft Purview or Unity Catalog.
• Ensure monitoring, logging, and CI/CD automation using Azure DevOps for data workflows.
• Provide support during data migration and cutover events, ensuring minimal disruption.
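For illustration only, here is a minimal Bronze-to-Silver notebook cell of the kind described above, assuming a Fabric or Databricks notebook where a `spark` session is predefined; the table and column names (bronze.service_requests, request_id, and so on) are hypothetical placeholders, not details of the client environment.
```python
from pyspark.sql import functions as F

# Bronze layer: raw data landed as-is from the source system.
bronze_df = spark.read.table("bronze.service_requests")

# Silver layer: typed, standardized, de-duplicated records with basic quality checks.
silver_df = (
    bronze_df
    .withColumn("request_date", F.to_date("request_date", "yyyy-MM-dd"))
    .withColumn("status", F.upper(F.trim(F.col("status"))))
    .dropDuplicates(["request_id"])
    .filter(F.col("request_id").isNotNull())
)

# Persist as a Delta table that Gold-layer models and Power BI datasets can build on.
(
    silver_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.service_requests")
)
```
A Gold layer would then aggregate or dimensionally model the Silver table to feed the Power BI dashboards and reports mentioned above.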
Qualifications:
Required:
• Bachelor's degree in Computer Science, Information Systems, or related field
• 5+ years of experience in data engineering roles, preferably in government or regulated environments
• Proficiency in SQL, Python, Spark
• Hands-on experience with Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
• Experience with Power BI data modeling and dashboard development
• Familiarity with data governance tools (Microsoft Purview, Unity Catalog); an illustrative row-level security sketch follows this list
• Solid understanding of ETL/ELT pipelines, data warehousing concepts, and schema design
• Strong communication and collaboration skills.
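As an illustration of the row-level security and masking work referenced above, the sketch below uses a Unity Catalog-style dynamic view with the Databricks is_account_group_member() function; the view, table, column, and group names are hypothetical, and equivalent Microsoft Purview or Fabric warehouse policies would be configured through different tooling.
```python
# Hypothetical dynamic view combining a column mask and a row filter.
# Table, column, and group names are placeholders, not client details.
spark.sql("""
    CREATE OR REPLACE VIEW gold.service_requests_secured AS
    SELECT
        request_id,
        department,
        -- Column mask: only data stewards see the requester's email.
        CASE
            WHEN is_account_group_member('data_stewards') THEN requester_email
            ELSE '***REDACTED***'
        END AS requester_email,
        status,
        request_date
    FROM silver.service_requests
    -- Row filter: non-stewards only see unrestricted departments.
    WHERE
        CASE
            WHEN is_account_group_member('data_stewards') THEN TRUE
            ELSE department <> 'RESTRICTED'
        END
""")
```
Downstream consumers that query the secured view, rather than the underlying table, inherit these restrictions.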
Preferred:
• Certifications such as Microsoft Certified: Fabric Analytics Engineer or Azure Data Engineer Associate
• Knowledge of CI/CD automation with Azure DevOps
• Familiarity with data security and compliance (e.g., FIPS 199, NIST)
• Experience managing sunset and modernization of legacy reporting systems like SSRS