

Tential Solutions
Data Modeler
Featured Role | Apply directly with Data Freelance Hub
This role is a remote Senior Data Modeler position for an initial 6 months with potential for a 12+ month extension. It requires 5+ years in data modeling, advanced SQL skills, and financial services experience. Familiarity with regulatory standards and merger experience are preferred.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
March 1, 2026
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Erwin #Oracle #Business Analysis #Data Quality #Data Vault #Snowflake #Redshift #BigQuery #ETL (Extract, Transform, Load) #PostgreSQL #Amazon Redshift #Data Modeling #Teradata #Compliance #Data Integration #Vault #Metadata #Migration #Data Catalog #Data Warehouse #Data Governance #Data Engineering #Cloud #Datasets #Alation #Collibra #Scala #dbt (data build tool) #Oracle RAC (Real Application Clusters) #Informatica #SQL (Structured Query Language)
Role description
Senior Data Modeler
Location: Remote
Duration: 6 months (initial) with 12+ month extension
Role Overview
We are seeking Senior Data Modelers to support a large-scale data integration initiative following a major banking merger. This is a pure design and architecture role, not a development or engineering position. You will be responsible for defining the data schemas, structures, and quality standards required to unify disparate banking systems into a cohesive, scalable environment.
Key Responsibilities
• Logical & Physical Modeling: Design complex schemas that bridge legacy systems from both organizations.
• Integration Mapping: Identify overlaps and gaps in data structures between the two banks (e.g., reconciling different "Customer" or "Transaction" definitions).
• Data Governance: Define data quality rules, access controls, and metadata standards to ensure regulatory compliance.
• Schema Evolution: Develop future-state models that support high-volume banking transactions and analytical reporting.
• Collaboration: Act as the bridge between Business Analysts (requirements) and Data Engineers (implementation).
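The "Integration Mapping" responsibility above can be sketched in a few lines. This is a hypothetical illustration only: the table and column names (`bank_a_customer`, `bank_b_customer`, etc.) are invented for the example, and sqlite3 stands in for the actual engines so the snippet is self-contained.

```python
import sqlite3

# Hypothetical sketch: compare the "Customer" definitions from two merged
# banks' systems and report shared vs. bank-specific attributes. All names
# here are illustrative, not from the actual engagement.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bank_a_customer (cust_id INTEGER, full_name TEXT,
                                  ssn TEXT, branch_code TEXT);
    CREATE TABLE bank_b_customer (customer_id INTEGER, full_name TEXT,
                                  ssn TEXT, risk_rating TEXT);
""")

def columns(table):
    # PRAGMA table_info returns one row per column; index 1 is the column name.
    return {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}

a, b = columns("bank_a_customer"), columns("bank_b_customer")
print("Shared attributes:", sorted(a & b))
print("Only in Bank A:   ", sorted(a - b))
print("Only in Bank B:   ", sorted(b - a))
```

The gap report (e.g., `cust_id` vs. `customer_id`) is exactly the kind of overlap/gap inventory a modeler would hand to engineering as a mapping specification.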
Technical Profile & Tooling Options
The ideal candidate should have 5+ years of experience and mastery in one or more of the following categories:
Category: Representative Technologies (Examples)
Data Modeling Tools: Erwin Data Modeler, ER/Studio, or PowerDesigner
Cloud Data Warehouses: Snowflake, Google BigQuery, or Amazon Redshift
Transformation/Logic: dbt (Data Build Tool), SQL-based views, or Data Vault 2.0
Data Quality/Catalog: Collibra, Alation, or Informatica Enterprise Data Catalog
Database Engines: Oracle RAC, Teradata (Legacy), or PostgreSQL
Required Qualifications
• 5+ Years in Data Modeling: Proven track record in large-scale enterprise environments (Financial Services/Banking preferred).
• Advanced SQL: Deep ability to query complex datasets to reverse-engineer existing schemas.
Preferred Qualifications:
• Regulatory Knowledge: Familiarity with banking-specific data standards (e.g., BCBS 239, AML/KYC data requirements).
• Merger Experience: Previous experience in "Mapping & Gap Analysis" during corporate acquisitions or system migrations.
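Reverse-engineering an existing schema, as the Advanced SQL qualification describes, usually starts from the metadata the database engine itself stores rather than from documentation. A minimal sketch, using sqlite3 only so the example runs anywhere (on Snowflake, BigQuery, or Redshift the equivalent would be an INFORMATION_SCHEMA query; the `txn` table is invented for illustration):

```python
import sqlite3

# Assumed scenario: an undocumented legacy table whose structure we need to
# recover. The engine keeps the authoritative DDL in its own catalog
# (sqlite_master here; INFORMATION_SCHEMA in the warehouses named above).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txn (txn_id INTEGER PRIMARY KEY, amount REAL, posted_at TEXT)"
)

# Pull the stored CREATE TABLE statement straight from the catalog.
ddl = conn.execute(
    "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = 'txn'"
).fetchone()[0]
print(ddl)
```

Querying the catalog this way yields the schema as the engine actually enforces it, which is the starting point for the logical models described above.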






