

Centraprise
Data Modeler
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler on a contract of "length" with a pay rate of "rate" located in "location." Key skills include AWS data services, PySpark, data modeling techniques, and strong SQL proficiency. Experience with data lakes and warehouses is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Reston, VA
-
🧠 - Skills detailed
#Data Integration #S3 (Amazon Simple Storage Service) #ETL (Extract Transform Load) #Tableau #AWS (Amazon Web Services) #Data Engineering #PySpark #Physical Data Model #Spark (Apache Spark) #Data Modeling #AWS Glue #ERWin #SQL (Structured Query Language) #Data Lake #BI (Business Intelligence) #Big Data #Python #Snowflake #Microsoft Power BI #Scala #Data Analysis
Role description
We are seeking a detail-oriented Data Modeler to design and maintain logical and physical data models that support analytics and business intelligence solutions. The role requires strong expertise in AWS data services and data transformation using PySpark.
Key Responsibilities
• Design conceptual, logical, and physical data models
• Develop and maintain data models for data lakes and warehouses
• Work closely with data engineers to implement models using PySpark, Glue, and EMR
• Ensure data consistency, integrity, and performance optimization
• Translate business requirements into scalable data structures
• Support ETL development and data integration efforts
Required Skills
• Strong experience in data modeling techniques (ER, dimensional modeling)
• Proficiency in Python and PySpark
• Hands-on experience with AWS Glue, EMR, and S3
• Knowledge of data warehousing concepts (Star/Snowflake schemas)
• Strong SQL and data analysis skills
Preferred Skills
• Experience with data modeling tools (Erwin, ER/Studio, etc.)
• Understanding of big data ecosystems
• Exposure to BI tools (Tableau, Power BI)
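As a rough illustration of the Star schema concept listed under the required data warehousing skills, the sketch below builds a minimal fact table with two dimension tables and runs a typical BI aggregation. It uses SQLite for portability; all table and column names are illustrative and not taken from the posting (production work would use PySpark/Glue as described above):

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    amount      REAL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20260101, "2026-01-01", 2026)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 20260101, 3, 30.0), (2, 20260101, 1, 15.0)])

# Typical BI query: aggregate facts by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchall()
print(rows)
```

The key design point is that measures (quantity, amount) live only in the fact table, while descriptive attributes live in the dimensions, keeping joins shallow and queries fast.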