

Senior Data Modeler
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Modeler on a 12-month contract in Philadelphia, PA, offering a competitive pay rate. It requires 10+ years of enterprise data modeling experience, proficiency in cloud platforms, and strong SQL skills.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
June 27, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Philadelphia, PA
Skills detailed
#Data Quality #Data Modeling #SQL (Structured Query Language) #XML (eXtensible Markup Language) #Security #Computer Science #Migration #Business Analysis #AWS (Amazon Web Services) #Data Engineering #AI (Artificial Intelligence) #Data Catalog #Data Lineage #Scrum #Synapse #ML (Machine Learning) #ERWin #JSON (JavaScript Object Notation) #Scala #Databricks #Redshift #ETL (Extract, Transform, Load) #Data Governance #Metadata #BI (Business Intelligence) #Normalization #Physical Data Model #GDPR (General Data Protection Regulation) #Data Architecture #Snowflake #Data Security #Cloud #Azure #Data Mapping #Data Privacy #Data Profiling #Agile
Role description
e&e is seeking a Data Modeler for an onsite contract opportunity in Philadelphia, PA!
We are seeking a highly skilled Senior Data Modeler to design and implement robust, scalable data architectures that support both operational and analytical workloads. This role will collaborate with cross-functional teams, including business analysts, data engineers, product teams, and governance stakeholders, to define data modeling strategies, develop metadata standards, and ensure data quality and integrity across the enterprise. The ideal candidate will have significant experience with modern cloud data platforms, data governance frameworks, and advanced modeling tools to support business intelligence and AI/ML use cases.
Responsibilities:
• Collaborate with business and technical teams to gather data requirements and translate them into conceptual, logical, and physical data models (a brief sketch of this translation follows this list).
• Optimize data models for performance, scalability, and usability in analytics and reporting environments.
• Define and maintain data modeling standards, naming conventions, and metadata repositories.
• Conduct gap analyses of current data environments and identify opportunities for architectural improvement.
• Support data lineage and data mapping activities to ensure traceability and data quality throughout the migration lifecycle.
• Integrate data models with BI tools, semantic layers, and data catalogs to enable self-service analytics.
• Work with AI/ML teams to model data sets required for predictive models and advanced analytics.
• Lead model review sessions with engineering and analytics teams to validate data structures against business needs.
• Stay updated with emerging data modeling technologies, including Lakehouse, Data Mesh, and Medallion architectures.
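To illustrate the kind of deliverable this work produces, here is a minimal sketch of a logical "customer sales" subject area translated into a physical star schema. All table and column names are hypothetical and chosen only for illustration; the actual models for this engagement would follow the requirements gathered above.

-- Hypothetical physical model: one dimension and one fact table for a
-- simple sales star schema (names and types are illustrative assumptions).
CREATE TABLE dim_customer (
    customer_key   BIGINT        PRIMARY KEY,   -- surrogate key
    customer_id    VARCHAR(50)   NOT NULL,      -- natural key from the source system
    customer_name  VARCHAR(200),
    region         VARCHAR(100)
);

CREATE TABLE fact_sales (
    sale_key       BIGINT        PRIMARY KEY,
    customer_key   BIGINT        NOT NULL REFERENCES dim_customer (customer_key),
    sale_date      DATE          NOT NULL,
    quantity       INTEGER       NOT NULL,
    net_amount     DECIMAL(18,2) NOT NULL
);

Keeping descriptive attributes in dim_customer and measures in fact_sales reflects the dimensional-modeling trade-offs referenced in the requirements below.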
Requirements:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 10+ years of experience designing enterprise data models for both transactional and analytical systems.
• 5+ years of experience using data modeling tools such as Erwin or ER/Studio; experience with multi-user environments is preferred.
• 3+ years of experience with cloud data platforms (AWS, Azure, Snowflake, Databricks, Redshift, or Synapse).
• Expertise in normalization, denormalization, and dimensional modeling techniques.
• Strong SQL skills for data profiling, analysis, validation, and reconciliation (see the example query after this list).
• Experience with structured and semi-structured data (e.g., XML, JSON, AVRO).
• Familiarity with ETL/ELT pipelines, BI tools, and data catalog solutions.
• Understanding of data privacy regulations (GDPR, CCPA) and best practices in data security.
• Exposure to modern data architecture concepts such as Lakehouse, Data Mesh, and Data Fabric.
• Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
• Experience working in Agile/Scrum environments.
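As a small illustration of the SQL profiling work called out above, the query below computes row counts, null counts, and distinct counts for a key column before migration. The object names (staging.customers, customer_id, created_at) are assumptions made for the example, not part of the actual environment.

-- Hypothetical profiling query: completeness and uniqueness checks on a
-- source table prior to migration (object names are illustrative only).
SELECT
    COUNT(*)                    AS row_count,
    COUNT(customer_id)          AS non_null_customer_ids,
    COUNT(DISTINCT customer_id) AS distinct_customer_ids,
    MIN(created_at)             AS earliest_record,
    MAX(created_at)             AS latest_record
FROM staging.customers;

Comparing row_count, non_null_customer_ids, and distinct_customer_ids in a single pass is a quick way to spot missing or duplicated natural keys before they reach the target model.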