

Insight Global
Data Modeler
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Modeler in Cleveland, Ohio, with a contract of unspecified duration and a day rate of $640 USD. Candidates must have a Bachelor’s degree in Computer Science (or a related field) or 12+ years of IT experience, strong skills in Hackolade or Erwin, and expertise in canonical data models and domain-driven design.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
January 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Data Quality #Data Governance #Data Strategy #Database Design #ERWin #Strategy #Data Modeling #API (Application Programming Interface) #Computer Science
Role description
Must Haves:
• Bachelor’s degree in Computer Science or a related field, or 12+ years of IT experience in lieu of a degree.
• Strong experience with data modeling tools, ideally Hackolade or Erwin.
• Hands-on experience designing and developing canonical data models.
• Experience with domain-driven design (software development and API design concepts).
Day-To-Day:
Insight Global is seeking a Data Modeler for one of our largest clients in Cleveland, Ohio. This is an exciting opportunity to join a forward-thinking team to help shape enterprise data strategy and drive business transformation. We are looking for an experienced Data Modeler with a strong foundation in domain-driven data modeling to help design and implement canonical data models. This role is critical in shaping how data is structured, shared, and understood across business domains. The ideal candidate is familiar with the SDLC and API/microservice development and has experience with Hackolade or Erwin modeling tools.
Success in this role means contributing to the larger organization by discovering, analyzing, and understanding different perspectives on data. The Data Modeler will work closely with the data governance team and the business, refining data definitions and ensuring those definitions remain system agnostic. The team will create and maintain artifacts that communicate concise, consistent data requirements from the business to IT, and within IT from analysts, modelers, and architects to database designers and API developers. The team will be measured on its ability to design enterprise data assets that optimize business processes while ensuring data quality and consistency.
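For candidates unfamiliar with the terminology, here is a minimal sketch of what a "canonical, system-agnostic" data model means in practice. All entity names, field names, and source-system fields below are invented for illustration and are not taken from the client's actual models:

```python
from dataclasses import dataclass

# Hypothetical canonical "Customer" entity: the field names and types are
# agreed with the business and are independent of any one source system's
# schema, so every consuming system shares the same definition.
@dataclass(frozen=True)
class CanonicalCustomer:
    customer_id: str   # enterprise-wide identifier, not a source-system primary key
    legal_name: str
    country_code: str  # ISO 3166-1 alpha-2

def from_crm_record(rec: dict) -> CanonicalCustomer:
    """Map one source system's record onto the canonical model.
    The CRM field names here ('CustNo', 'Name', 'Ctry') are invented."""
    return CanonicalCustomer(
        customer_id=f"CUST-{rec['CustNo']}",
        legal_name=rec["Name"].strip(),
        country_code=rec["Ctry"].upper(),
    )

print(from_crm_record({"CustNo": 1042, "Name": " Acme Corp ", "Ctry": "us"}))
```

Each additional source system gets its own small mapping function, while downstream APIs and databases depend only on the canonical shape.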






