MethodHub

Data Modeler

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler in Chicago, IL, on a 12-month contract at a day rate of $400 USD. It requires 8+ years of experience, expertise in data modeling tools, and knowledge of Finance & Capital Markets data structures.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
May 2, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Storage #Scala #Metadata #Snowflake #Data Modeling #Data Integrity #dbt (data build tool) #Spark (Apache Spark) #Data Management #Data Storage #Compliance #Scrum #Data Lineage #Data Governance #AWS (Amazon Web Services) #ERWin #Databricks #Physical Data Model #Cloud #Data Architecture #Agile
Role description
Data Modeler
Chicago, IL (2-3 days onsite per week)
12-month contract

Key Tech Stack: data modeling tools, Medallion Architecture, Cloud (AWS), experience with logical & physical data models

Position Overview:
We are looking for a Data Modeler with strong expertise in designing, optimizing, and maintaining conceptual, logical, and physical data models for financial data. The candidate will work closely with data architects, engineers, and business stakeholders to ensure data integrity, performance, and alignment with domain-specific needs in Finance or Capital Markets.

Key Responsibilities:
• Design and maintain conceptual, logical, and physical data models supporting trading, risk, and compliance functions.
• Work with Medallion Architecture (Bronze/Silver/Gold layers) to align data models with cloud lakehouse design.
• Collaborate with Data Architects and Engineers to translate models into efficient schemas for AWS, Spark, Parquet, and Iceberg.
• Model time-series, reference data, market data, and transactional flows specific to Finance & Capital Markets.
• Define data standards, naming conventions, and metadata management practices.
• Optimize data models for performance, scalability, and regulatory reporting needs.
• Partner with business stakeholders to capture requirements and ensure semantic consistency across domains.

Required Skills & Experience:
• 8+ years of relevant data modeling experience.
• Hands-on experience with data modeling tools (Erwin, ER/Studio, PowerDesigner, or similar).
• Strong knowledge of relational, dimensional, and lakehouse modeling techniques.
• Experience with Parquet, Iceberg, and cloud-native data storage formats.
• Strong understanding of Finance & Capital Markets data structures (trades, positions, risk, reference/master data).

Preferred:
• Exposure to AWS data services, Databricks, Snowflake, dbt.
• Knowledge of data governance, data lineage, and regulatory compliance.
• Familiarity with the Agile delivery model and working with Scrum teams.
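For candidates unfamiliar with the Medallion (Bronze/Silver/Gold) pattern named in the tech stack, the flow can be sketched in plain Python. This is a simplified, hypothetical illustration only: real lakehouse pipelines in this role would use Spark/Databricks over Parquet or Iceberg tables, and all record fields below are invented for the example.

```python
# Hypothetical sketch of Medallion layering (Bronze -> Silver -> Gold)
# using plain Python; production pipelines would use Spark + Parquet/Iceberg.

# Bronze layer: raw trade events as ingested (duplicates and bad records allowed).
bronze = [
    {"trade_id": "T1", "symbol": "AAPL", "qty": 100, "price": 190.0},
    {"trade_id": "T1", "symbol": "AAPL", "qty": 100, "price": 190.0},  # duplicate
    {"trade_id": "T2", "symbol": "MSFT", "qty": 50,  "price": None},   # invalid price
    {"trade_id": "T3", "symbol": "AAPL", "qty": -40, "price": 191.0},  # a sell
]

# Silver layer: deduplicated, validated records.
seen, silver = set(), []
for rec in bronze:
    if rec["trade_id"] in seen or rec["price"] is None:
        continue  # drop duplicates and records failing validation
    seen.add(rec["trade_id"])
    silver.append(rec)

# Gold layer: business-level aggregate (net position per symbol).
gold = {}
for rec in silver:
    gold[rec["symbol"]] = gold.get(rec["symbol"], 0) + rec["qty"]

print(gold)  # -> {'AAPL': 60}
```

Each layer refines the one before it: Bronze preserves raw history, Silver enforces data quality, and Gold exposes consumption-ready views for trading, risk, or regulatory reporting.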