

Data Modeler
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Modeler to design and maintain data models on the Databricks platform, offering a contract length of "X months" at a pay rate of "$X per hour." Requires expertise in Databricks, SQL, Python, and data mesh architecture.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 1, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: San Francisco, CA
Skills detailed: #Data Orchestration #Data Pipeline #Data Manipulation #SQL (Structured Query Language) #Data Architecture #Data Integrity #Data Governance #Data Processing #Spark (Apache Spark) #Data Quality #Python #Data Engineering #Databricks #Scala #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Computer Science #Data Ingestion #Delta Lake
Role description
Looking for a Data Modeler to design, build, and maintain data models on the Databricks platform to support data analytics and business intelligence applications.
Responsibilities:
• Design and implement scalable and efficient data models within the data mesh architecture, considering factors such as domain-driven design, data as a product, and data governance.
• Work closely with data architects, data engineers, and business users to translate business needs into technical solutions and communicate data model designs effectively.
• Leverage Databricks for data engineering tasks such as data processing, data validation, and data orchestration.
• Optimize data pipelines to ensure reliable and efficient data processing, high performance, and scalability.
• Implement data validation rules and data quality checks to ensure data integrity and consistency (see the sketch after this list).
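As a purely illustrative aid for the validation and quality-check responsibilities above, here is a minimal PySpark sketch of how such a gate might look on Databricks. The table names, columns, and rules (raw.orders, curated.orders, quarantine.orders, order_id, amount, order_date) are assumptions for the example, not details from this posting.

```python
# Minimal data-quality gate sketch for Databricks (all table/column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # a Databricks notebook already provides `spark`

# Read a raw Delta table produced by an upstream ingestion job.
orders = spark.read.table("raw.orders")

# Validation rules: required keys present, non-negative amounts, parseable dates.
rules = (
    F.col("order_id").isNotNull()
    & F.col("customer_id").isNotNull()
    & F.col("amount").isNotNull() & (F.col("amount") >= 0)
    & F.to_date("order_date", "yyyy-MM-dd").isNotNull()
)

valid = orders.filter(rules)
invalid = orders.filter(~rules)

# Publish clean rows to a curated Delta table; quarantine the rest for review.
valid.write.format("delta").mode("append").saveAsTable("curated.orders")
invalid.write.format("delta").mode("append").saveAsTable("quarantine.orders")

# Simple quality metric for monitoring: share of rows rejected in this run.
total, bad = orders.count(), invalid.count()
print(f"rejected {bad}/{total} rows ({bad / max(total, 1):.2%})")
```

On Databricks, similar rules could instead be expressed declaratively as Delta Live Tables expectations; the filter-and-quarantine pattern above is just the simplest self-contained version.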
General Requirements:
• Bachelor's degree in Computer Science or equivalent experience.
• Skilled data mesh data modeler with data engineering expertise in Databricks.
• Familiarity with the Databricks platform, including Spark, Delta Lake, and Unity Catalog.
• Ability to lead the design and implementation of data models and data products within the data mesh architecture.
• Experience designing, implementing, and optimizing data pipelines.
• Experience designing, implementing, and managing the lifecycle of data products.
• Previous experience in data product modeling within a data mesh architecture.
• Strong hands-on expertise in Databricks and Spark.
• Proficiency in Python and in SQL for data manipulation and querying.
• Experience with ETL tools and frameworks, and with designing data pipelines for data ingestion and transformation (see the sketch after this list).
• Problem-solving and troubleshooting skills.
• Strong communication skills.
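For the pipeline-design requirement above, the sketch below shows one possible ingestion-and-transformation step in PySpark on Databricks. The landing path, table name, and columns (/mnt/landing/customers/, bronze.customers, customer_name) are assumptions for illustration only.

```python
# Minimal ingest-and-transform sketch (paths, tables, and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest raw CSV files landed by an upstream process.
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/customers/")
)

# Light transformation: normalize column names, trim a text field, stamp the load time.
clean = (
    raw.select([F.col(c).alias(c.strip().lower().replace(" ", "_")) for c in raw.columns])
    .withColumn("customer_name", F.trim("customer_name"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Write a Delta table that downstream data products and BI workloads can build on.
(
    clean.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("bronze.customers")
)
```

In a data mesh setting, a table like this would typically be owned by the producing domain and exposed as a governed data product, with access managed through Unity Catalog.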