Azure Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Engineer with a 10-month contract in Harrisburg, PA. Requires 5 years' experience in Azure technologies, data warehousing, SQL Server, and CI/CD pipelines. A degree in Computer Science or related field is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 28, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Harrisburg, PA
-
🧠 - Skills detailed
#DevOps #Automation #Scala #Apache Spark #Data Warehouse #Compliance #Azure DevOps #Delta Lake #Data Analysis #Azure Databricks #Azure Data Factory #ADF (Azure Data Factory) #Data Architecture #Azure #Data Mining #Data Lake #Business Analysis #SQL (Structured Query Language) #Synapse #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Azure cloud #Quality Assurance #SSIS (SQL Server Integration Services) #Databricks #Documentation #Cloud #Data Engineering #EDW (Enterprise Data Warehouse) #Computer Science #Microsoft Azure #SQL Server #BI (Business Intelligence) #Code Reviews #Databases #Data Processing #Classification #Deployment #Python
Role description
Location: Harrisburg, PA
Position Type: Hybrid
Hybrid Schedule: Onsite as needed, at least one day a month
Contract Length: 10 months

This role supports the modernization of the Enterprise Data Warehouse (EDW) by designing, developing, and implementing advanced cloud-based data solutions in Azure. The Architect / Azure DW Developer will modernize data architecture, support reporting and analytics needs, and deliver scalable solutions that improve data-driven decision-making across the organization.

Required Skills:
• 5 years of technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
• 5 years of experience with design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in SQL Server and Azure Synapse.
• 5 years of experience producing ETL/ELT using SQL Server Integration Services (SSIS) and other tools.
• 5 years of experience with SQL Server, T-SQL, scripts, and queries.
• 5 years of experience as an Azure DevOps CI/CD Pipeline Release Manager, including design, implementation, and maintenance of robust, scalable pipelines.
• 5 years of experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering.
• 5 years of experience in data engineering, database file system optimization, APIs, and analytics as a service.
• 5 years of experience with data mining architecture, modeling standards, reporting, and data analysis methodologies.
• 4-year college degree in Computer Science or related field (advanced study preferred).

Duties:
• Design, develop, test, and implement data lakes, databases, ELT programs, applications, and reports in Azure.
• Modernize the EDW into Microsoft Azure Cloud using Databricks, Delta Lake, Synapse, Data Factory, Pipelines, Apache Spark, and Python.
• Plan, organize, prioritize, and manage data warehouse development and modernization efforts.
• Collaborate with business analysts, application developers, DBAs, and system staff to meet project objectives.
• Gather and analyze business and technical requirements to design optimized data solutions.
• Perform research on potential technologies and recommend solutions to support data modernization.
• Support large-scale data processing, statistical analysis, and reporting for enterprise-wide initiatives.
• Develop centralized data models and ensure compliance with federal and organizational data standards.
• Build and maintain relationships with stakeholders, presenting technical solutions clearly to varied audiences.
• Create, review, and maintain technical documentation, flowcharts, diagrams, test plans, and code reviews.
• Implement and maintain CI/CD pipelines using Azure DevOps, including automation of builds, tests, and deployments.
• Conduct testing, quality assurance reviews, and performance optimization of implemented solutions.
• Provide knowledge transfer, training, and procedural documentation for ongoing system maintenance.
• Track progress, submit status reports, and ensure timely project deliverables.
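For candidates gauging fit, the ETL/ELT and data-cleansing work described above boils down to the extract-transform-load pattern. A minimal sketch in plain Python follows; the feed, table name, and cleansing rule are hypothetical, and a production version would run in Azure Databricks or Data Factory against Delta Lake rather than SQLite:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this would arrive from a source
# system feeding the Enterprise Data Warehouse (EDW).
RAW = """order_id,amount,region
1001,250.00,PA
1002,,PA
1003,99.50,NY
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV feed into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cleanse and type-cast: drop records with a missing amount."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["region"])
        for r in rows
        if r["amount"]  # quality-control filter for incomplete records
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load cleansed rows into a staging table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # two of the three records pass the cleansing rule
```

The same extract/transform/load split maps directly onto the tools the posting names: ADF or SSIS orchestrating the extract, Spark on Databricks doing the transform, and Delta Lake or Synapse as the load target.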