

Data Modeler - EDW, Azure - 95% Remote - Contract (Client in Harrisburg, PA) - B4001B
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler - EDW, Azure, on a 10-month contract, 95% remote, requiring PA residency. Key skills include data modeling, SQL Server, Azure Databricks, and over 10 years of relevant experience, preferably in healthcare.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 29, 2025
Project duration
More than 6 months
Location type
Remote
Contract type
Fixed Term
Security clearance
Unknown
Location detailed
Harrisburg, PA
Skills detailed
#Data Modeling #Data Profiling #Synapse #ADaM (Analysis Data Model) #Storage #Documentation #Agile #Computer Science #Databases #SSIS (SQL Server Integration Services) #Delta Lake #Complex Queries #Scrum #Data Processing #Databricks #DevOps #Azure DevOps #Azure Databricks #Datasets #Physical Data Model #SaaS (Software as a Service) #Python #Metadata #Data Warehouse #Data Lake #Scala #EDW (Enterprise Data Warehouse) #Azure cloud #Cloud #Data Architecture #ERWin #Database Design #SQL Server #Azure #Data Design #SQL (Structured Query Language) #Business Analysis #Data Analysis
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Technovision, Inc., is seeking the following. Apply via Dice today!
Our direct client is looking for a Data Modeler - EDW, Azure for a 95% Remote Contract (Client in Harrisburg, PA).
NOTE:
β’ This will be a 10-month engagement beginning in September 2025
• The contractor must reside in PA and will be permitted to work from home. Office space for the contractor will be provided as needed.
• The contractor is expected to be in the office at least 1 day per month, subject to additional days in office at Manager discretion.
• In addition, DOH will supply all hardware and software needed for daily use to complete assigned work items.
OBJECTIVES OF ENGAGEMENT
• The primary objective of this engagement is for the selected candidate to serve as the Data Modeler / Data Architect, providing technical services to DOH and the EDW team for data warehouse modernization. The candidate will act as a data architect, modeler, and administrator for the EDW, supporting the analysis and reporting needs of DOH and DDAP as well as the design and construction of the new EDW in Azure.
• This position's scope includes the modernization of DOH operations: planning, coordinating, and responding to data reporting needs; setting standards and defining the framework; assisting with large-volume data processing and statistical analysis of large datasets; migrating the EDW to Microsoft's Azure cloud using Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies; creating a centralized data model; and supporting DOH projects such as ELC Enhanced Detection Expansion, Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, COVID-19 response, and the onboarding of additional DOH systems into the EDW.
Operational Requirements
β’ The Data Modeler / Data Architect is a senior level resource with specialized knowledge and experience in data analysis, data modeling and database design. The selected contractor must have proven and demonstrated experience in the development, validation, publishing, and maintenance of data designs, logical and physical data models, data dictionaries, and metadata repositories.
β’ The selected contractor will collaborate with EDW team members and DOH program area stakeholders to develop and implement solutions that meet the requirements for using the data. This position will be expected to develop and implement data analysis methodologies, validate business use cases for accuracy and completeness of proposed data models, and work with business analysts, application developers, and DBAs to achieve project objectives - delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.
REQUIRED SKILLS:
• The Data Modeler can design, develop, and implement data models and data lake architecture to provide reliable and scalable applications and systems that meet the organization's objectives and requirements. The Data Modeler is familiar with a variety of database technologies, environments, concepts, methodologies, practices, and procedures.
• Demonstrable, advanced experience with data profiling, analysis, and design, including developing and documenting information domain models, data structures, objects, attributes, relationships, and integrity rules
• Experience with logical and physical data modeling, including use of tools such as erwin or similar
β’ Experience managing metadata, data dictionaries and technical documentation
• Experience analyzing and translating business requirements and use cases into optimized models and data flows, and developing database solutions
β’ Experience with and knowledge of relational databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology
β’ Strong proficiency with SQL Server, T-SQL, SSIS, stored procedures, ELT processes, scripts, and complex queries
β’ Working knowledge of Azure Databricks, Delta Lake, Synapse and Python
β’ Experience with evaluating implemented data systems for variances, discrepancies, and efficiency
β’ Experience with Azure DevOps and Agile / Scrum development methods
β’ Auditing databases to maintain quality and creating systems to keep data secure
β’ Demonstrable ability to communicate and document clearly and concisely
β’ Ability to balance work between multiple projects and possess good organizational skills, with minimal or no direct supervision
β’ Ability to work collaboratively and effectively with colleagues, and as a member of a team
β’ Ability to present complex technical concepts and data to a varied audience effectively.
β’ More than 10 years of relevant experience
SKILL MATRIX:
• Experience with logical and physical data modeling, including use of tools such as erwin or similar - Required
β’ Experience managing metadata, data dictionaries and technical documentation - Required
β’ Strong proficiency with SQL Server, T-SQL, SSIS, stored procedures, ELT processes, scripts, and complex queries - Required
β’ Working knowledge of Azure Databricks, Delta Lake, Synapse and Python - Required
β’ Experience with Azure DevOps and Agile / Scrum development methods - Required
β’ Auditing databases to maintain quality and creating systems to keep data secure - Required
β’ Experience working in the public health or healthcare industry with various health data sets. - Nice to have
β’ 4-year college degree in computer science or related field with advanced study preferred. - Nice to have
Question 1: The candidate must reside in PA. Where does your candidate currently reside?
Question 2: Does the candidate possess more than 10 years of relevant Data Modeler experience?
Question 3: The candidate must be able to work 1 day per month on-site (subject to additional days at Manager discretion). Are you fine with this?
Location: 95% Remote (Client in Harrisburg, PA)
Type: Contract
Please send resume to "jobs at etechnovision dot com" with B4001B in Subject for immediate consideration.