

Analytics Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for an Analytics Engineer on a hybrid contract in Atlanta, GA, lasting more than 6 months, with a competitive pay rate. Key skills include Azure Databricks, Microsoft Fabric, SQL, and Python. Experience with data governance and data modeling is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 17, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Atlanta, GA
Skills detailed: #Data Processing #Data Quality #Dataflow #Monitoring #Python #Data Pipeline #Classification #Delta Lake #Data Engineering #SQL Server #Computer Science #Compliance #Schema Design #Batch #Azure #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Databricks #Data Management #Metadata #Pandas #SQL (Structured Query Language) #Data Lifecycle #Azure ADLS (Azure Data Lake Storage) #Data Governance #DevOps #Visualization #Azure DevOps #Scala #Microsoft Power BI #Logging #PySpark #Terraform #Datasets #Security #Strategy #Automation #ACID (Atomicity, Consistency, Isolation, Durability) #BI (Business Intelligence) #Azure Databricks #Data Modeling #Data Security #ADLS (Azure Data Lake Storage)
Role description
DataStaff, Inc. is currently seeking an Analytics Engineer for a long-term contract opportunity with one of our direct clients in Atlanta, GA.
• This position is hybrid; on-site Tuesday through Thursday.
Job Description:
The Analytics Engineer will contribute to our modern data estate strategy by developing scalable data solutions using Microsoft Fabric and Azure Databricks. This role will be instrumental in building resilient data pipelines, transforming raw data into curated datasets, and delivering analytics-ready models that support enterprise-level reporting and decision-making.
Key Responsibilities:
Data Engineering & Pipeline Development:
• Build and maintain ETL/ELT pipelines using Azure Databricks and Microsoft Fabric.
• Implement medallion architecture (Bronze, Silver, Gold layers) to support the data lifecycle and data quality.
• Develop real-time and batch ingestion processes from IES Gateway and other source systems.
• Ensure data quality, validation, and transformation logic is consistently applied.
• Use Python, Spark, and SQL in Databricks and Fabric notebooks for data transformation.
• Implement Delta Lake for data versioning, ACID transactions, and schema enforcement (a short sketch follows this list).
• Integrate Databricks with other Azure services, including OneLake, Azure ADLS Gen2, and Microsoft Fabric.
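For illustration only, here is a minimal PySpark sketch of a Bronze-to-Silver promotion using Delta Lake on Databricks; the table and column names (bronze.ies_cases, case_id, status) are placeholders, not objects from the actual project.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Read raw ingested records from the Bronze layer (illustrative table name)
bronze = spark.read.table("bronze.ies_cases")

# Apply basic validation and standardization rules for the Silver layer
silver = (
    bronze
    .filter(F.col("case_id").isNotNull())              # drop records missing the key
    .dropDuplicates(["case_id"])                        # basic data quality rule
    .withColumn("status", F.upper(F.trim("status")))    # standardize categorical values
    .withColumn("ingested_at", F.current_timestamp())   # lineage metadata
)

# Delta Lake write: ACID transaction, schema enforced against the existing table
(
    silver.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "false")                      # reject unexpected schema drift
    .saveAsTable("silver.ies_cases")
)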
Data Modeling & Curation:
• Collaborate with Domain Owners to design dimensional and real-time data models.
• Create analytics-ready datasets for Power BI and other reporting tools (see the dimension sketch after this list).
• Standardize field naming conventions and schema definitions across datasets.
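As a hedged example of a curated, analytics-ready output, the sketch below builds a Gold-layer dimension table with a surrogate key and standardized field names; all object names (silver.offices, gold.dim_office, OfficeKey) are assumptions for illustration.

from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.getOrCreate()

# Deduplicate the Silver source on its business key
offices = spark.read.table("silver.offices").dropDuplicates(["office_id"])

# Shape a dimension table with a surrogate key and reporting-friendly names
dim_office = offices.select(
    F.row_number().over(Window.orderBy("office_id")).alias("OfficeKey"),  # surrogate key
    F.col("office_id").alias("OfficeID"),        # natural/business key
    F.col("office_name").alias("OfficeName"),    # standardized field naming
    F.col("region").alias("Region"),
)

dim_office.write.format("delta").mode("overwrite").saveAsTable("gold.dim_office")

Power BI can then import gold.dim_office and relate it to fact tables on OfficeKey.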
Data Governance & Security:
• Apply data classification and tagging based on the data governance framework.
• Implement row-level security, data masking, and audit logging per compliance requirements (see the sketch after this list).
• Support integration with Microsoft Purview for lineage and metadata management.
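One common pattern for row-level security and column masking on Databricks is a dynamic view; the sketch below uses the Databricks SQL is_member() function, and the group, table, and column names (pii_readers, case_workers_<region>, silver.ies_cases, ssn) are illustrative placeholders. The actual rules would come from the client's governance framework.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
CREATE OR REPLACE VIEW curated.v_cases_secure AS
SELECT
  case_id,
  -- column masking: only members of the privileged group see the raw value
  CASE WHEN is_member('pii_readers') THEN ssn ELSE '***-**-****' END AS ssn,
  office_region,
  status
FROM silver.ies_cases
-- row-level security: callers only see rows for regions whose group they belong to
WHERE is_member(CONCAT('case_workers_', LOWER(office_region)))
""")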
Data Modeling:
• Dimensional modeling
• Real-time data modeling patterns
Reporting & Visualization Support:
• Partner with BI developers to ensure data models are optimized for Power BI.
• Provide curated datasets that align with reporting requirements and business logic.
• Create BI dashboards and train users.
DevOps & Automation:
• Support CI/CD pipelines for data workflows using Azure DevOps (a sample quality-gate script is sketched below).
• Assist in monitoring, logging, and performance tuning of data jobs and clusters.
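As one hedged example, an Azure DevOps pipeline stage can run a small validation script and fail the build on a non-zero exit code; the table name and checks below are illustrative placeholders.

import sys
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Basic post-deployment data-quality gate for a Silver table (illustrative name)
df = spark.read.table("silver.ies_cases")
row_count = df.count()
null_keys = df.filter(F.col("case_id").isNull()).count()

if row_count == 0 or null_keys > 0:
    print(f"Data quality gate FAILED: rows={row_count}, null case_id={null_keys}")
    sys.exit(1)  # non-zero exit fails the pipeline stage

print(f"Data quality gate passed: rows={row_count}")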
Knowledge and Experience:
• Bachelor's degree in Computer Science, Data Engineering, or a related field.
• Experience with SQL Server 2019 or later.
• Familiarity with data modeling, data governance, and data security best practices.
• Strong understanding of ETL/ELT processes, data quality, and schema design.
• Experience with Power BI datasets and semantic modeling.
• Knowledge of Microsoft Purview, Unity Catalog, or similar governance tools.
• Exposure to real-time data processing and streaming architectures.
• Knowledge of federal/state compliance requirements for data handling.
• Familiarity with Azure DevOps, Terraform, or CI/CD for data pipelines.
• Microsoft Fabric Analytics Engineer certification (preferred).
• Strong analytical and problem-solving abilities.
• Excellent communication skills for technical and non-technical audiences.
• Experience working with government stakeholders.
Required Skills:
• 3 Years - Experience in data engineering or analytics engineering roles.
• 3 Years - Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
• 3 Years - Azure Databricks (Spark, Delta Lake)
• 3 Years - Microsoft Fabric (Dataflows, Pipelines, OneLake)
• 3 Years - SQL and Python (Pandas, PySpark)
This position is available as a W2 position with a competitive benefits package. DataStaff offers medical, dental, and vision coverage options as well as paid vacation, sick, and holiday leave. As many of our opportunities are long-term, we also have a 401k program available for employees after 6 months.