TEK NINJAS

Data Modeler

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler in Chicago, IL, with a contract length of 6-12 months and a pay rate of $58-$63/hr C2C. Key skills include data modeling tools, expertise in Finance & Capital Markets, and experience with AWS and cloud-native data formats.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
$504
-
🗓️ - Date
October 8, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #ERWin #Data Management #dbt (data build tool) #Snowflake #Data Storage #Cloud #Spark (Apache Spark) #Data Governance #Data Integrity #Data Modeling #Data Lineage #Scrum #Data Architecture #Physical Data Model #Storage #Compliance #Scala #Metadata #Agile #Databricks
Role description
Job Title: Data Modeler (Finance & Capital Markets Project)
Location: Chicago, IL
Duration: 6-12 Months
Rate/hour Range: $58/hr-$63/hr C2C

Position Overview:
We are looking for a Data Modeler with strong expertise in designing, optimizing, and maintaining conceptual, logical, and physical data models for financial data. The candidate will work closely with data architects, engineers, and business stakeholders to ensure data integrity, performance, and alignment with domain-specific needs in Capital Markets.

Key Responsibilities
• Design and maintain conceptual, logical, and physical data models supporting trading, risk, and compliance functions.
• Work with Medallion Architecture (Bronze/Silver/Gold layers) to align data models with cloud lakehouse design.
• Collaborate with Data Architects and Engineers to translate models into efficient schemas for AWS, Spark, Parquet, and Iceberg.
• Model time-series, reference data, market data, and transactional flows specific to Finance & Capital Markets.
• Define data standards, naming conventions, and metadata management practices.
• Optimize data models for performance, scalability, and regulatory reporting needs.
• Partner with business stakeholders to capture requirements and ensure semantic consistency across domains.

Required Skills & Experience
• Hands-on experience with data modeling tools (Erwin, ER/Studio, PowerDesigner, or similar).
• Strong knowledge of relational, dimensional, and lakehouse modeling techniques.
• Experience with Parquet, Iceberg, and cloud-native data storage formats.
• Strong understanding of Finance & Capital Markets data structures (trades, positions, risk, reference/master data).
• 5–8 years of relevant data modeling experience.

Preferred
• Exposure to AWS data services, Databricks, Snowflake, and dbt.
• Knowledge of data governance, data lineage, and regulatory compliance.
• Familiarity with Agile delivery and working with Scrum teams.