iSoftTek Solutions Inc

Senior Data Modeler/Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Modeler/Data Architect in Mount Laurel, NJ, for over 6 months at a competitive pay rate. Requires 14-15+ years of experience, proficiency in Data Vault and Azure cloud services, and expertise in data modeling tools like ERWin.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 7, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Mount Laurel, NJ
-
🧠 - Skills detailed
#Data Modeling #SQL (Structured Query Language) #Azure Synapse Analytics #Business Analysis #Security #Data Integration #Data Science #ERWin #Data Lake #Cloud #Data Vault #Synapse #Azure SQL Database #Data Architecture #Compliance #ETL (Extract, Transform, Load) #Normalization #Physical Data Model #Data Security #Databricks #Vault #Azure SQL #Azure #Data Engineering #Scala #Computer Science #Data Analysis
Role description
Location: Mount Laurel, NJ
Duration: Full-time / Contract
Experience: 14-15+ years
Primary Skills: Data Vault; data modeling with data analysis and modeling tools (e.g., Power Designer, ERWin, ER/Studio)

Job Description:
We are seeking a skilled Sr. Data Modeler to join our team and contribute to the design and implementation of scalable, efficient data models. The ideal candidate will have experience with Data Vault techniques and proficiency in Azure cloud services; working knowledge of Databricks is a plus.

Responsibilities:

Data Modeling:
- Design and develop data models using Data Vault methodologies to ensure robust, scalable, and flexible data architectures.
- Create and maintain conceptual, logical, and physical data models to support business requirements.
- Collaborate with stakeholders to understand data requirements and translate them into effective data models.
- Implement data modeling best practices, including normalization, denormalization, and data integration.

Data Vault Implementation:
- Use Data Vault techniques to build an enterprise data lake and lakehouse and to integrate disparate data sources.
- Design and implement Hubs, Links, and Satellites to ensure comprehensive data capture and historical tracking.
- Optimize Data Vault models for performance and scalability.

Azure Cloud:
- Design and deploy data models on Azure data services, including Azure SQL Database, Azure Data Lake, and Azure Synapse Analytics.
- Ensure data security, compliance, and best practices in Azure environments.

Collaboration and Communication:
- Work closely with data engineers, data scientists, and business analysts to understand data needs and ensure data model alignment.
- Provide technical guidance and support on data modeling best practices and Data Vault principles.
- Document data models, data flows, and integration processes clearly and comprehensively.

Data Modeling Tools:
- Data analysis and modeling tools (e.g., Power Designer, ERWin, ER/Studio).

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field (Master's degree preferred).
- Proven experience in data modeling, with a strong focus on Data Vault techniques.
- Proficiency in Azure cloud services, including Azure Data Lake, Azure SQL Database, and Azure Synapse Analytics.
- Working knowledge of Databricks and its ecosystem.
- Strong understanding of data warehousing concepts, data lakehouse architecture, ETL processes, and data integration techniques.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills for working effectively with cross-functional teams.
- Relevant certifications in Azure and Databricks are a plus.
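For candidates less familiar with the Hub/Link/Satellite structures mentioned above, the sketch below illustrates the core Data Vault table shapes. It is a minimal, hypothetical example (the "customer"/"order" entity names and column sizes are illustrative assumptions, not part of this role's actual models) that generates generic SQL DDL:

```python
# Minimal sketch of Data Vault core structures. Entity names, column
# names, and types here are hypothetical illustrations only.

def hub_ddl(entity: str, business_key: str) -> str:
    """Hub: one row per unique business key, plus load metadata."""
    return (
        f"CREATE TABLE hub_{entity} (\n"
        f"  {entity}_hk CHAR(32) PRIMARY KEY,  -- hash key of the business key\n"
        f"  {business_key} VARCHAR(100) NOT NULL,\n"
        f"  load_date TIMESTAMP NOT NULL,\n"
        f"  record_source VARCHAR(50) NOT NULL\n"
        f");"
    )

def link_ddl(name: str, hubs: list[str]) -> str:
    """Link: models a relationship between two or more Hubs."""
    fks = ",\n".join(f"  {h}_hk CHAR(32) NOT NULL" for h in hubs)
    return (
        f"CREATE TABLE link_{name} (\n"
        f"  {name}_hk CHAR(32) PRIMARY KEY,\n"
        f"{fks},\n"
        f"  load_date TIMESTAMP NOT NULL,\n"
        f"  record_source VARCHAR(50) NOT NULL\n"
        f");"
    )

def satellite_ddl(entity: str, attrs: list[str]) -> str:
    """Satellite: descriptive attributes with history tracking
    (load_date is part of the key, so changes append new rows)."""
    cols = ",\n".join(f"  {a} VARCHAR(200)" for a in attrs)
    return (
        f"CREATE TABLE sat_{entity} (\n"
        f"  {entity}_hk CHAR(32) NOT NULL,\n"
        f"  load_date TIMESTAMP NOT NULL,\n"
        f"  record_source VARCHAR(50) NOT NULL,\n"
        f"{cols},\n"
        f"  PRIMARY KEY ({entity}_hk, load_date)\n"
        f");"
    )

print(hub_ddl("customer", "customer_id"))
print(link_ddl("customer_order", ["customer", "order"]))
print(satellite_ddl("customer", ["name", "email"]))
```

Hubs isolate business keys, Links isolate relationships, and Satellites isolate change-tracked attributes, which is what allows new sources to be integrated without reworking existing tables.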