

Data Modeler/Data Architect (On W2)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler/Data Architect (On W2) in PA (Remote) for 10+ months; the pay rate is not listed. Candidates need 10+ years of experience and expertise in Azure Databricks and data modeling; public health data experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 5, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Pennsylvania, United States
-
🧠 - Skills detailed
#Computer Science #Datasets #Physical Data Model #Python #SSIS (SQL Server Integration Services) #Data Architecture #Azure #Metadata #Quality Assurance #Requirements Gathering #Agile #DevOps #Compliance #Consulting #Storage #Synapse #Databases #SQL (Structured Query Language) #Data Lake #Cloud #Data Design #ADaM (Analysis Data Model) #Scala #Azure Databricks #Database Design #Documentation #Business Analysis #Data Warehouse #Delta Lake #SaaS (Software as a Service) #EDW (Enterprise Data Warehouse) #Data Management #Azure cloud #Data Profiling #SharePoint #Data Modeling #Data Processing #Azure DevOps #ERWin #Databricks #Scrum #Data Analysis #SQL Server #Complex Queries
Role description
Contact Details:
1. Poonam Khandelwal
Email: poonam.khandelwal@peer-consulting.com
Cell: (732) 797-9766
Job Title: Data Modeler/Data Architect (On W2)
Location: PA (Remote)
Duration: 10 Months+
Years of Experience: 10 Yrs.
Required Hours/Week: 40 hrs./week
Notes:
• Candidates must be local to PA.
Objectives of Engagement:
• The primary objective of this engagement is for the selected candidate to serve as the Data Modeler / Data Architect, providing technical services to the end-client and the EDW team for data warehouse modernization.
• The selected candidate will serve as a data architect, modeler, and administrator for the EDW, supporting the analysis and reporting needs of the end-client as well as the design and construction of the new EDW in Azure.
• This position’s scope includes modernizing the end-client's operations; planning, coordinating, and responding to data reporting needs; setting standards and defining the framework; assisting with large-volume data processing and statistical analysis of large datasets; revamping the EDW in Microsoft's Azure Cloud using Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric as well as infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies; creating a centralized data model; and supporting end-client projects such as ELC Enhanced Detection Expansion, Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, COVID-19 response, and the onboarding of additional end-client systems into the EDW. A brief illustrative sketch of the Databricks/Delta Lake pattern follows.
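For illustration only, here is a minimal PySpark sketch of the Databricks/Delta Lake pattern referenced above: landing a source extract as a Delta table so that downstream Databricks or Synapse workloads can consume it. The storage path and table name (edw_raw.case_reports) are hypothetical placeholders, not details of the actual engagement.

# Hypothetical sketch: land a source extract as a Delta Lake table in Azure Databricks.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied automatically in a Databricks notebook

# Read a source extract (e.g., a nightly CSV drop) from the data lake; the path is a placeholder.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://landing@exampleaccount.dfs.core.windows.net/case_reports/")
)

# Write it as a managed Delta table so downstream consumers get ACID, versioned reads.
raw.write.format("delta").mode("overwrite").saveAsTable("edw_raw.case_reports")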
Operational Requirements:
• The Data Modeler / Data Architect is a senior level resource with specialized knowledge and experience in data analysis, data modeling and database design.
• The selected contractor must have proven experience in the development, validation, publishing, and maintenance of data designs, logical and physical data models, data dictionaries, and metadata repositories (a brief data-dictionary sketch follows this list).
• The selected contractor will collaborate with EDW team members and end-client's program area stakeholders to develop and implement solutions that meet the requirements for using the data.
• This position is expected to develop and implement data analysis methodologies, validate business use cases for accuracy and completeness of proposed data models, and work with business analysts, application developers, and DBAs to achieve project objectives: delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.
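As a small, hypothetical illustration of the data-dictionary and metadata work noted above, the sketch below derives a column-level dictionary from a table's physical schema using PySpark. The table and output names (edw_raw.case_reports, edw_meta.data_dictionary) are placeholders, not the engagement's real objects.

# Hypothetical sketch: build a simple data dictionary from a Delta table's schema.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

table_name = "edw_raw.case_reports"  # placeholder table

# Collect column name, data type, and nullability from the physical schema.
rows = [
    (table_name, f.name, f.dataType.simpleString(), f.nullable)
    for f in spark.table(table_name).schema.fields
]

dictionary = spark.createDataFrame(
    rows,
    schema="table_name string, column_name string, data_type string, is_nullable boolean",
)

# Persist the dictionary so it can be reviewed alongside the logical and physical models.
dictionary.write.format("delta").mode("overwrite").saveAsTable("edw_meta.data_dictionary")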
Duties and Responsibilities:
• Manage assignments and track progress against agreed upon timelines.
• Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams.
• Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders.
• Participate in business and technical requirements gathering.
• Perform research on potential solutions and provide recommendations to the EDW team and the end-client.
• Develop and implement solutions that meet business and technical requirements.
• Participate in testing of implemented solution(s).
• Build and maintain relationships with key stakeholders and customer representatives.
• Give presentations to the EDW team, other end-client offices, and agencies involved with this project.
• Develop and maintain process and procedural documentation.
• Ensure project compliance with relevant federal and commonwealth standards and procedures.
• Understand and convey best practices and standards for data management and analysis.
• Conduct training and transfer of knowledge sessions for maintaining and updating data models, data dictionaries, and metadata.
• Complete weekly timesheet reporting in People Fluent/VectorVMS by COB each Friday.
• Complete weekly project status updates in Daptiv when applicable; this depends on the project being entered into Daptiv.
• Provide weekly personal status reporting by COB Friday submitted on SharePoint.
• Utilize a SharePoint site for project and operational documentation, and review existing documentation.
Required Skills and Experience:
The Data Modeler / Architect must be able to design, develop, and implement data models and data lake architectures that provide reliable, scalable applications and systems meeting the organization's objectives and requirements. The Data Modeler must be familiar with a variety of database technologies, environments, concepts, methodologies, practices, and procedures.
• Demonstrable, advanced experience with data profiling, analysis and design, developing and documenting information domain models, data structures, objects, attributes, relationships, and integrity rules
• Experience with logical and physical data modeling, including use of tools like Erwin or similar.
• Experience managing metadata, data dictionaries and technical documentation
• Experience with analyzing and translating business requirements and use cases into optimized models, data flows, and developing database solutions
• Experience with and knowledge of relational databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology
• Strong proficiency with SQL Server, T-SQL, SSIS, stored procedures, ELT processes, scripts, and complex queries
• Working knowledge of Azure Databricks, Delta Lake, Synapse, and Python (see the star-schema sketch after this list)
• Experience with evaluating implemented data systems for variances, discrepancies, and efficiency
• Experience with Azure DevOps and Agile / Scrum development methods
• Experience auditing databases to maintain quality and creating systems to keep data secure
• Demonstrable ability to communicate and document clearly and concisely
• Ability to balance work between multiple projects and possess good organizational skills, with minimal or no direct supervision
• Ability to work collaboratively and effectively with colleagues, and as a member of a team
• Ability to present complex technical concepts and data to a varied audience effectively.
• More than 10 years of relevant experience
• 4-year college degree in computer science or related field with advanced study preferred.
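To illustrate the star-schema and Databricks/Python items above, here is a minimal sketch of a fact-to-dimension roll-up expressed with Spark SQL over Delta tables. All table and column names (edw.fact_lab_result, edw.dim_date, edw.dim_facility, and their keys) are hypothetical placeholders.

# Hypothetical sketch: roll up a fact table by month and county across two dimensions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

monthly_counts = spark.sql("""
    SELECT d.calendar_month,
           fac.county,
           COUNT(*) AS lab_result_count
    FROM edw.fact_lab_result AS f
    JOIN edw.dim_date     AS d   ON f.date_key = d.date_key
    JOIN edw.dim_facility AS fac ON f.facility_key = fac.facility_key
    GROUP BY d.calendar_month, fac.county
    ORDER BY d.calendar_month, fac.county
""")

monthly_counts.show(truncate=False)

Queries of this shape are a quick way to check that the declared keys and grain in a dimensional model actually join and aggregate as intended.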
Preferred Experience:
• Experience working in the public health or healthcare industry with various health data sets.
Work Location:
• The contractor must reside in PA and will be permitted to work from home. Office space will be provided as needed. The contractor is expected to be in the office at least one day per month, with additional in-office days at the manager's discretion.
• In addition, the end-client will supply all hardware and software needed for daily use to complete assigned work items.