RICEFW Technologies Inc

Senior Data Modeler (In-Person Interview)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Modeler with a contract length of "unknown," offering a pay rate of "unknown," and requiring in-person work. Candidates should have 8+ years of data modeling experience, expertise in Big Data technologies, and familiarity with Snowflake and AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
January 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Trenton, NJ
-
🧠 - Skills detailed
#Data Management #Database Tuning #Snowflake #Data Modeling #Physical Data Model #Data Lake #Cloud #Redshift #Pega #Conceptual Data Model #AWS (Amazon Web Services) #SQL (Structured Query Language) #Data Profiling #Logical Data Model #BI (Business Intelligence) #SQL Server #Data Quality #Databases #Data Warehouse
Role description
Tasks:
• Understand and translate business needs into data models that support long-term solutions across different systems.
• Work with Application Development teams (e.g., Pega applications) to implement data strategies, build data flows, and develop conceptual data models.
• Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
• Optimize and update logical and physical data models to support the Data Initiative project across its phases of implementation.
• Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
• Recommend opportunities for reuse of data models in new environments.
• Perform reverse engineering of physical data models from databases such as DB2 and from SQL scripts.
• Evaluate data models and physical databases for variances and discrepancies across the different source systems.
• Analyze data-related system integration challenges, propose appropriate solutions, and develop models according to standards.
• Work with all IT teams (system analysts, engineers, programmers, BI, and others) on project limitations and capabilities, performance requirements, and interfaces.
• Review modifications to existing software to improve efficiency and performance.

Qualifications:
• 8+ years of overall data modeling experience.
• Minimum 4 years of overall experience, including 3 years in Big Data technologies.
• Strong work experience in data modeling and with data modeling tools.
• Exposure to Snowflake's Cloud Data Warehouse is preferred.
• Well versed in relational data, dimensional data, unstructured data, and master data.
• Current with recent data management trends, such as data lake modeling, alongside traditional modeling practices like snowflake and star schemas.
• Knowledgeable about the latest modeling techniques adapted for data warehousing and analytical structures.
• Able to build a logical data model and translate it into a physical database.
• Able to compare actual project procedures against the specified standards and procedures.
• Able to demonstrate expertise with the most recent and relevant technologies in data modeling implementations.
• Excellent communication skills and the ability to collaborate with internal and external groups, including vendors.
• Able to work independently and with minimal supervision.

Skills:
• 8+ years of experience in data modeling for databases such as DB2, AWS Redshift, DynamoDB, Postgres, SQL Server, and Pega applications.
• Data modeling software (6 years).
• AWS and other cloud systems experience is a must.
• Strong SQL and data profiling skills across various relational databases.
• Analyzing source databases, source data, and source database referential integrity, and profiling source data (8 years).
• Database tuning techniques (minimum 4 years).
• Experience building data lakes for both structured and unstructured data (2 years).