

Azure Data Warehouse Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Warehouse Developer on a hybrid contract in Harrisburg, PA, with an undisclosed pay rate. It requires 5+ years of data warehousing experience, expertise in Azure technologies, and a degree in Computer Science or a related field.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
August 29, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Harrisburg, PA
Skills detailed
#ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Data Governance #Synapse #Data Quality #BI (Business Intelligence) #Documentation #Azure Data Factory #Computer Science #SSIS (SQL Server Integration Services) #Delta Lake #Classification #Compliance #Data Processing #Databricks #DevOps #Azure DevOps #Azure Databricks #Data Mining #Datasets #Spark (Apache Spark) #Deployment #Python #Apache Spark #Data Warehouse #Microsoft Azure #Scala #EDW (Enterprise Data Warehouse) #Schema Design #SQL Server #Cloud #Azure #SQL (Structured Query Language) #Business Analysis
Role description
e&e is seeking an Azure Data Warehouse Developer for a hybrid contract opportunity in Harrisburg, PA!
The Azure Data Warehouse Developer will play a critical role in supporting a large-scale Data Modernization Initiative aimed at transforming public health operations through data-driven policies and interventions. This role will focus on designing, developing, and implementing enterprise data warehouse solutions in Microsoft Azure. The developer will support both current reporting and analytical needs while helping build a modernized Azure-based data environment utilizing Databricks, Delta Lake, Synapse, and other advanced cloud technologies. This position requires hands-on technical expertise, strong collaboration skills, and the ability to deliver secure, scalable, and high-performance data solutions.
Responsibilities:
β’ Design, develop, and maintain enterprise data warehouse (EDW) solutions in Microsoft Azure.
β’ Build and optimize ELT/ETL pipelines using Azure Data Factory, Synapse, SQL Server Integration Services (SSIS), and other tools.
β’ Implement and manage Azure Databricks, Delta Lake, and Spark-based data processing for large datasets.
β’ Create centralized data models to support advanced analytics and reporting requirements.
β’ Translate business requirements into optimized technical designs and implement robust data solutions.
β’ Manage and enhance CI/CD pipelines using Azure DevOps for efficient build, test, and deployment processes.
β’ Ensure data quality, integrity, and compliance with federal and state standards.
β’ Conduct performance tuning and optimization of database and warehouse systems.
β’ Prepare and maintain documentation, including system diagrams, test plans, procedures, and technical standards.
β’ Collaborate with business analysts, application developers, DBAs, and infrastructure teams to ensure seamless solution delivery.
β’ Participate in technical reviews, testing, and user acceptance sessions.
β’ Provide knowledge transfer, training, and ongoing support for implemented systems.
β’ Contribute to strategic initiatives such as the Data Modernization Initiative, LIMS replacement, NEDSS NextGen, and other statewide projects.
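The pipeline responsibilities above center on insert-or-update ("upsert") loads of the kind Delta Lake's MERGE operation performs. A minimal sketch of that semantics in plain Python, with hypothetical record names (`id`, `status`) chosen only for illustration:

```python
# Sketch of the MERGE/upsert pattern used in incremental warehouse loads:
# matched keys are updated, unmatched keys are inserted, and re-running
# the same batch leaves the result unchanged (idempotent loads).

def merge_upsert(target, updates, key="id"):
    """Apply insert-or-update semantics keyed on `key`."""
    merged = dict(target)          # copy: the input table is left untouched
    for row in updates:
        merged[row[key]] = row     # matched -> update, not matched -> insert
    return merged

# Hypothetical example: one update (id=1) and one new row (id=3)
target = {1: {"id": 1, "status": "old"}, 2: {"id": 2, "status": "ok"}}
updates = [{"id": 1, "status": "new"}, {"id": 3, "status": "new"}]
result = merge_upsert(target, updates)
```

In a real Azure Databricks pipeline the same step would be a Delta Lake `MERGE INTO` over Spark DataFrames; the dict version only shows the matched/not-matched logic.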
Requirements:
β’ 4-year degree in Computer Science, Information Systems, or related field (advanced study preferred).
β’ 5+ years of experience in data warehousing and business intelligence solution design, implementation, and maintenance.
β’ Strong hands-on expertise with Azure Synapse, Azure Databricks, Azure Delta Lake, Azure Data Factory, Apache Spark, and Python.
β’ Proficiency in SQL Server, T-SQL, and complex query development.
β’ Proven experience producing ETL/ELT pipelines with SSIS or equivalent tools.
• Experience in CI/CD pipeline design and maintenance using Azure DevOps, including monorepo-based pipelines.
β’ Advanced knowledge of relational and dimensional modeling, star schema design, and data warehousing best practices.
β’ Experience with data mining, data quality, cleansing, classification, and data governance processes.
β’ Strong analytical and problem-solving skills, with the ability to translate business needs into scalable technical solutions.
β’ Excellent communication skills, with the ability to present technical concepts to both technical and non-technical stakeholders.
β’ Ability to manage multiple projects simultaneously with minimal supervision.
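The dimensional-modeling requirement above boils down to fact tables joined to dimension tables on surrogate keys. A small self-contained sketch using the standard-library sqlite3 module; table and column names (`fact_visits`, `dim_clinic`) are invented for illustration, not taken from the project:

```python
import sqlite3

# Star-schema sketch: one fact table (measures + foreign keys) joined to
# a dimension table (descriptive attributes) on a surrogate key.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_clinic (
    clinic_key  INTEGER PRIMARY KEY,   -- surrogate key
    clinic_name TEXT,
    county      TEXT                   -- slicing attribute
);
CREATE TABLE fact_visits (
    visit_id    INTEGER PRIMARY KEY,
    clinic_key  INTEGER REFERENCES dim_clinic(clinic_key),
    visit_count INTEGER                -- additive measure
);
INSERT INTO dim_clinic VALUES (1, 'Harrisburg Main', 'Dauphin'),
                              (2, 'West Shore', 'Cumberland');
INSERT INTO fact_visits VALUES (10, 1, 120), (11, 1, 80), (12, 2, 50);
""")

# Typical reporting query: aggregate the fact, slice by a dimension attribute
rows = con.execute("""
    SELECT d.county, SUM(f.visit_count) AS total_visits
    FROM fact_visits f JOIN dim_clinic d USING (clinic_key)
    GROUP BY d.county ORDER BY d.county
""").fetchall()
```

The same shape carries over to T-SQL on SQL Server or Synapse; only the dialect and scale differ.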
Preferred Experience:
β’ Background working with healthcare or public health data sets.
β’ Experience with analytics as a service, APIs, and database file system optimization.
β’ Familiarity with state and federal data compliance frameworks.