

Commergence
Data Modeler/Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler/Architect on a long-term contract, offering a pay rate of "X" and remote work from Orlando, FL. Key skills required include Erwin, Data Warehouse, Azure Cloud, SQL, Oracle, and Data Engineering experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Conceptual Data Model #Data Warehouse #Big Data #Azure #SQL (Structured Query Language) #Spark (Apache Spark) #Cloud #Oracle #Metadata #ERWin #Physical Data Model #PySpark #Azure cloud #Data Engineering
Role description
Job Title: Data Modeler/Architect
Location: Orlando, FL (Remote, EST hours)
Duration: Long-term contract
Role Overview
• Required: Erwin, Data Warehouse, Azure Cloud, SQL, and Oracle; any Data Engineering experience (Big Data, PySpark/Spark) is a plus.
• Analyzing and translating business needs into long-term solution data models.
• Evaluating existing data systems.
• Working with the development team to create conceptual data models and data flows.
• Developing best practices for data coding to ensure consistency within the system.
• Reviewing modifications of existing systems for cross-compatibility.
• Implementing data strategies and developing physical data models.
• Updating and optimizing logical and metadata models.
• Evaluating implemented data systems for variances, discrepancies, and efficiency.
• Troubleshooting and optimizing data systems.
Required skills include Erwin, Data Warehouse, Azure Cloud, SQL, and Oracle, along with Data Engineering experience. Experience with Big Data, PySpark, and Spark is a plus.
Skills: data warehouse, azure cloud, data modeler, erwin
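The core responsibility above is translating a conceptual data model into a physical one. As a minimal sketch only (the Customer/Order entities, table names, and columns are hypothetical illustrations, not from this posting), here is how a one-to-many conceptual relationship might be realized as a physical model in SQL, run here through Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical conceptual model: two entities, Customer and Order,
# with a one-to-many relationship, translated into a physical data
# model (DDL with primary keys, a foreign key, and constraints).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT UNIQUE
    );
    CREATE TABLE customer_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        order_date  TEXT NOT NULL,
        total_usd   REAL NOT NULL CHECK (total_usd >= 0)
    );
    -- A supporting index: a physical-design decision for the FK lookup path.
    CREATE INDEX idx_order_customer ON customer_order(customer_id);
""")

# The resulting physical model is visible in the catalog (metadata).
tables = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['customer', 'customer_order']
```

In practice a tool like Erwin would generate this DDL from the model for the target platform (e.g. Oracle or Azure SQL) rather than it being written by hand; the sketch just shows the conceptual-to-physical step itself.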






