Charter Global

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Atlanta, Georgia, on an 8-month+ contract with a pay rate of $520 per day. Key skills include Microsoft Fabric, Azure Databricks, SQL, and Python. Requires 3-5 years of data engineering experience and a relevant bachelor's degree.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
October 15, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Scala #SQL Server #Datasets #DevOps #SQL (Structured Query Language) #SSIS (SQL Server Integration Services) #Azure Databricks #Data Governance #Monitoring #Data Analysis #Data Engineering #Migration #Spark (Apache Spark) #Microsoft Power BI #ETL (Extract, Transform, Load) #Dataflow #Automation #Azure DevOps #Data Pipeline #Computer Science #BI (Business Intelligence) #Azure #Data Lifecycle #Logging #Python #Databricks
Role description
Title/Role: Sr. Data Engineer
Worksite Address: Atlanta, Georgia 30334 (Hybrid)
Duration: 8-month+ contract

We are seeking a motivated Data Engineer to join our team and support the modernization of our data estate. This role focuses on assisting with data pipeline development, migration of legacy systems, and maintaining scalable, secure, and efficient data solutions using modern technologies, particularly Microsoft Fabric and Azure-based platforms.

Work Location & Attendance:
• Must be physically located in Georgia
• On-site: Tuesday to Thursday (at the manager's discretion)
• Mandatory in-person meetings: All Hands, Enterprise Applications, All Staff

Skills (Required / Desired):
• Bachelor's degree in Computer Science, Information Systems, or a related field (Required)
• 3-5 years of experience in data engineering or related roles (Required)
• Familiarity with LangGraph and RAG DB concepts (Required)
• Understanding of ETL/ELT pipelines and data warehousing concepts (Required)
• Knowledge of CI/CD automation with Azure DevOps (Nice to have)
• Familiarity with data governance tools (Microsoft Purview, Unity Catalog) (Nice to have)
• Experience with SSIS package migration and maintenance (Nice to have)

Key Responsibilities:
• Assist in designing, building, and maintaining ETL/ELT data pipelines using Microsoft Fabric and Azure Databricks.
• Support migration and maintenance of SSIS packages from legacy systems.
• Implement the medallion architecture (Bronze, Silver, Gold) for data lifecycle and quality.
• Create and manage notebooks (Fabric Notebooks, Databricks) for data transformation using Python, SQL, and Spark.
• Build curated datasets to support Power BI dashboards.
• Collaborate with data analysts and business stakeholders to deliver fit-for-purpose data assets.
• Apply data governance policies in line with Microsoft Purview or Unity Catalog.
• Support monitoring, logging, and CI/CD automation using Azure DevOps.

Technical Stack:
• Microsoft Fabric (Dataflows, Pipelines, Notebooks, OneLake)
• Azure Databricks
• SQL Server / SQL Managed Instances
• Power BI
• SSIS (migration and maintenance)
• LangGraph and RAG DB (for advanced data workflows)

Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 2-3 years of experience in data engineering or related roles.
• Proficiency in SQL, Python, and Spark.
• Familiarity with LangGraph and RAG DB concepts.
• Hands-on experience with Microsoft Fabric and Power BI.
• Understanding of ETL/ELT pipelines and data warehousing concepts.

Preferred:
• Knowledge of CI/CD automation with Azure DevOps.
• Familiarity with data governance tools (Microsoft Purview, Unity Catalog).
• Experience with SSIS package migration and maintenance.

Best Regards,
David Roy | Accounts Manager - US Staffing | Charter Global Inc. | https://www.charterglobal.com | LinkedIn
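For candidates unfamiliar with the medallion architecture named in the responsibilities, here is a minimal sketch of the Bronze (raw) → Silver (cleaned) → Gold (curated) pattern. This is an illustration only, not part of the posting: it uses plain Python lists in place of the Spark DataFrames or Delta tables you would use in Fabric or Databricks notebooks, and all field names are hypothetical.

```python
# Medallion pattern sketch: Bronze = raw ingested data, Silver = cleaned and
# deduplicated, Gold = curated aggregates ready for a Power BI dashboard.
# Plain Python stands in for Spark; all record fields are hypothetical.

bronze = [  # raw ingested records, possibly dirty
    {"order_id": "1", "region": "GA ", "amount": "120.50"},
    {"order_id": "2", "region": "ga",  "amount": "80.00"},
    {"order_id": "2", "region": "ga",  "amount": "80.00"},  # duplicate row
    {"order_id": "3", "region": None,  "amount": "45.25"},  # missing region
]

def to_silver(rows):
    """Clean: drop rows with missing keys, normalize types, deduplicate."""
    seen, silver = set(), []
    for r in rows:
        if r["region"] is None or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({
            "order_id": r["order_id"],
            "region": r["region"].strip().upper(),  # normalize casing/spaces
            "amount": float(r["amount"]),           # cast string to number
        })
    return silver

def to_gold(rows):
    """Curate: aggregate revenue by region for reporting."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'GA': 200.5}
```

In a real Fabric or Databricks pipeline each layer would be a persisted table and the transforms would be Spark jobs, but the layering idea is the same: each stage adds quality guarantees the stage below does not have.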