

Senior Data Modeler
Featured Role | Apply direct with Data Freelance Hub
This is a contract Senior Data Modeler position in Philadelphia, PA, with an undisclosed pay rate. Candidates should have a Bachelor's degree, 10+ years in data modeling, and 3+ years with cloud platforms. Key skills include SQL, data governance, and Agile experience.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
August 7, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Philadelphia, PA
Skills detailed
#Data Profiling #Data Lineage #SQL (Structured Query Language) #ERWin #Data Modeling #Data Privacy #Synapse #Data Mapping #Scrum #Cloud #Business Analysis #JSON (JavaScript Object Notation) #ML (Machine Learning) #Databricks #Data Quality #XML (eXtensible Markup Language) #Metadata #Normalization #AWS (Amazon Web Services) #Physical Data Model #BI (Business Intelligence) #Security #Data Architecture #Data Catalog #AI (Artificial Intelligence) #Data Engineering #Migration #Azure #GDPR (General Data Protection Regulation) #Data Security #ETL (Extract, Transform, Load) #Agile #Scala #Snowflake #Computer Science #Redshift #Data Governance
Role description
e&e is seeking a Data Modeler for an onsite contract opportunity in Philadelphia, PA!
We are seeking a highly skilled Senior Data Modeler to design and implement robust, scalable data architectures that support both operational and analytical workloads. This role will collaborate with cross-functional teams, including business analysts, data engineers, product teams, and governance stakeholders, to define data modeling strategies, develop metadata standards, and ensure data quality and integrity across the enterprise. The ideal candidate will have significant experience with modern cloud data platforms, data governance frameworks, and advanced modeling tools to support business intelligence and AI/ML use cases.
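For illustration only, here is a minimal sketch of the kind of physical, dimensional model this role works toward for analytical workloads, written in Python with SQLAlchemy. The star schema, table names, and columns are hypothetical and are not part of this posting.

```python
# Minimal sketch of a dimensional (star-schema) physical model using SQLAlchemy.
# Table and column names are hypothetical; a real model would follow the
# client's naming conventions and metadata standards.
from sqlalchemy import (
    MetaData, Table, Column, Integer, Numeric, String, Date, ForeignKey, create_engine
)

metadata = MetaData()

# Dimension: customer attributes, denormalized for analytical queries.
dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_key", Integer, primary_key=True),
    Column("customer_name", String(200), nullable=False),
    Column("segment", String(50)),
)

# Dimension: calendar date.
dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),
    Column("calendar_date", Date, nullable=False),
    Column("fiscal_quarter", String(10)),
)

# Fact: one row per order line, foreign-keyed to the dimensions.
fact_sales = Table(
    "fact_sales", metadata,
    Column("sales_key", Integer, primary_key=True),
    Column("customer_key", Integer, ForeignKey("dim_customer.customer_key"), nullable=False),
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("quantity", Integer, nullable=False),
    Column("net_amount", Numeric(12, 2), nullable=False),
)

if __name__ == "__main__":
    # Emit the DDL against an in-memory SQLite engine just to show the model compiles.
    engine = create_engine("sqlite:///:memory:")
    metadata.create_all(engine)
    print(sorted(metadata.tables))
```

In practice the conceptual and logical models would come first, and the physical DDL would target the client's cloud platform (Snowflake, Redshift, Synapse, etc.) rather than SQLite.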
Responsibilities:
• Collaborate with business and technical teams to gather data requirements and translate them into conceptual, logical, and physical data models.
• Optimize data models for performance, scalability, and usability in analytics and reporting environments.
• Define and maintain data modeling standards, naming conventions, and metadata repositories.
• Conduct gap analyses of current data environments and identify opportunities for architectural improvement.
• Support data lineage and data mapping activities to ensure traceability and data quality throughout the migration lifecycle.
• Integrate data models with BI tools, semantic layers, and data catalogs to enable self-service analytics.
• Work with AI/ML teams to model data sets required for predictive models and advanced analytics.
• Lead model review sessions with engineering and analytics teams to validate data structures against business needs.
• Stay updated with emerging data modeling technologies, including Lakehouse, Data Mesh, and Medallion architectures.
Requirements:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 10+ years of experience designing enterprise data models for both transactional and analytical systems.
• 5+ years of experience using data modeling tools such as Erwin or ER/Studio; experience with multi-user environments is preferred.
• 3+ years of experience with cloud data platforms (AWS, Azure, Snowflake, Databricks, Redshift, or Synapse).
• Expertise in normalization, de-normalization, and dimensional modeling techniques.
• Strong SQL skills for data profiling, analysis, validation, and reconciliation (illustrated in the sketch after this list).
• Experience with structured and semi-structured data (e.g., XML, JSON, AVRO).
• Familiarity with ETL/ELT pipelines, BI tools, and data catalog solutions.
• Understanding of data privacy regulations (GDPR, CCPA) and best practices in data security.
• Exposure to modern data architecture concepts such as Lakehouse, Data Mesh, and Data Fabric.
• Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.
• Experience working in Agile/Scrum environments.
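For illustration only, the following sketch shows the kind of data profiling and reconciliation check implied by the SQL requirement above, written here in Python with pandas; the file names and column names are hypothetical.

```python
# Minimal data-profiling sketch with pandas: null counts, duplicate keys, and a
# source-to-target reconciliation. File and column names are hypothetical.
import pandas as pd

source = pd.read_csv("source_extract.csv")   # e.g., legacy system extract
target = pd.read_csv("target_load.csv")      # e.g., rows loaded into the warehouse

# Profile: nulls per column and duplicate business keys in the source.
null_counts = source.isna().sum()
duplicate_keys = source["customer_id"].duplicated().sum()

# Reconcile: row counts and a simple aggregate should match after the load.
row_count_match = len(source) == len(target)
amount_diff = source["net_amount"].sum() - target["net_amount"].sum()

print("Nulls per column:\n", null_counts)
print("Duplicate customer_id values:", duplicate_keys)
print("Row counts match:", row_count_match)
print("Net amount difference (source - target):", amount_diff)
```

The same checks are commonly expressed directly in SQL (COUNT, GROUP BY ... HAVING, and aggregate comparisons) against the source and target platforms.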