Azure Data Warehouse Developer - Contract - 95% Remote (Client in Harrisburg, PA) - B4000B

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Warehouse Developer on a 10-month contract, 95% remote, requiring PA residency. Key skills include Azure, SQL Server, ETL/ELT, and CI/CD pipeline management. A minimum of 5 years' experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 29, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Harrisburg, PA
-
🧠 - Skills detailed
#Code Reviews #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Synapse #Storage #BI (Business Intelligence) #Programming #Documentation #Azure Data Factory #Computer Science #Databases #SSIS (SQL Server Integration Services) #Delta Lake #Classification #Data Processing #Databricks #DevOps #Azure DevOps #Azure Databricks #Data Mining #Datasets #SaaS (Software as a Service) #Spark (Apache Spark) #Deployment #Python #Apache Spark #Data Warehouse #Data Lake #Scala #EDW (Enterprise Data Warehouse) #Azure cloud #Cloud #SQL Server #Azure #SQL (Structured Query Language) #Data Engineering #Business Analysis #Data Analysis
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Technovision, Inc., is seeking the following. Apply via Dice today! Our direct client is looking for an Azure Data Warehouse Developer for a 95% remote contract (client in Harrisburg, PA).
NOTE:
• This will be a 10-month engagement beginning in September 2025.
• The contractor must reside in PA and will be permitted to work from home.
• The contractor is expected to be in the office at least 1 day per month, subject to additional days in office at Manager discretion.
• In addition, DOH will supply all hardware and software needed to complete assigned work items.
JOB DESCRIPTION:
• Support of a Data Modernization Initiative, with the vision that all public health policies and interventions are driven by data, and the mission to provide all internal and external public health decision makers with accessible, timely, reliable, and meaningful data to drive policies and interventions. The Enterprise Data Warehouse (EDW) is responding to DOH's need for centralized data and state-of-the-art data analysis services by modernizing its data portfolio, architecture, and statistical analysis capabilities, aimed at improving public health surveillance, interventions, future outbreak prevention, outcomes, and research.
• The Azure DW Developer position will support both the existing business and reporting requirements of individual DOH/DDAP systems and program areas, and the construction of a modern data warehouse that will serve DOH/DDAP from an enterprise perspective.
• The primary objective of this engagement is for the selected candidate to serve as the data warehouse developer supporting the analysis and reporting needs of DOH/DDAP, and the design and construction of a modern EDW in Azure.
• This position's scope includes the modernization of DOH operations: planning, coordinating, and responding to data reporting needs; setting standards and defining the framework; assisting with large-volume data processing and statistical analysis of large datasets; revamping the EDW in Microsoft's Azure cloud using Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as services for infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies; creating a centralized data model; and supporting DOH projects such as ELC Enhanced Detection Expansion, Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, COVID-19 response, and onboarding additional DOH systems into the EDW.
Requirements
• This is a senior-level resource with advanced, specialized knowledge and experience in data warehousing, database, and programming concepts and technology. The selected contractor must have proven experience in the development, maintenance, and testing of Azure production systems and projects. This position designs, develops, tests, and implements data lakes, databases, extract-load-transform programs, applications, and reports. This position will work with business analysts, application developers, DBAs, and network and system staff to achieve project objectives: delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.
• Can design, develop, and implement data and ELT application infrastructure in Azure to provide reliable and scalable applications and systems that meet the organization's objectives and requirements.
The candidate is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.
• The candidate must have significant, hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
• Significant, hands-on technical experience and expertise with the design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in SQL Server and Azure Synapse.
• Experience producing ETL/ELT using SQL Server Integration Services and other tools.
• Experience with SQL Server, T-SQL, scripts, and queries.
• Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines; automate the build, test, and deployment processes for various applications and services; troubleshoot and resolve pipeline issues and bottlenecks; and has experience with monorepo-based CI/CD pipelines.
• Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
• Experience with data mining architecture, modeling standards, reporting, and data analysis methodologies.
• Experience with data engineering, database file systems optimization, APIs, and analytics as a service.
• Analyzing and translating business requirements and use cases into optimized designs and developing sound solutions.
• Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology.
• Creates and maintains technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follows established SDLC best practices, documents code, and participates in peer code reviews.
• Ability to balance work between multiple projects and possess good organizational skills, with minimal or no direct supervision.
• Demonstrated ability to communicate and document clearly and concisely.
• Ability to work collaboratively and effectively with colleagues as a member of a team.
• Ability to present complex technical concepts and data to a varied audience effectively.
• More than 5 years of relevant experience.
SKILL MATRIX:
• Technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python. - Required
• Design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in SQL Server and Azure Synapse. - Required
• Experience producing ETL/ELT using SQL Server Integration Services and other tools. - Required
• Experience with SQL Server, T-SQL, scripts, and queries. - Required
• Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines. - Required
• Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering. - Required
• Experience with data engineering, database file systems optimization, APIs, and analytics as a service. - Required
• Experience with data mining architecture, modeling standards, reporting, and data analysis methodologies. - Required
• 4-year college degree in computer science or a related field; advanced study preferred. - Required
Question 1: The candidate must reside in PA. Where does your candidate currently reside?
Question 2: Does the candidate possess more than 5 years of experience as an Azure Data Warehouse Developer?
Question 3: The candidate must be able to work 1 day a month on-site (subject to additional days at Manager discretion). Are you fine with this?
Location: 95% Remote (Client in Harrisburg, PA)
Type: Contract
Please send resume to "jobs at etechnovision dot com" with B4000B in the subject line for immediate consideration.