Data Engineer/Architect (Databricks)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer/Architect (Databricks) contract position lasting over 6 months, offering $60.00 - $80.00 per hour. Key requirements include Databricks certification, 10+ years in big data systems, and expertise in PySpark, dbt, and dimensional modeling.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
640
πŸ—“οΈ - Date discovered
August 15, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Remote
🧠 - Skills detailed
#Big Data #Databricks #PySpark #Databases #dbt (data build tool) #Data Ingestion #Data Modeling #Spark (Apache Spark) #Data Engineering
Role description
Job Title: Data Engineer/Architect (Databricks)
Location: Remote
Duration: Contract (6+ months, possible extension)
Schedule: Mon - Fri

Job Description:
DATA ENGINEER/ARCHITECT (Databricks)

We are currently targeting experienced engineers and architects with the following qualifications:

- Databricks Certification: Candidates must be Databricks-certified engineers or architects.
- Extensive Big Data Experience: At least 10 years of hands-on experience designing, implementing, and configuring distributed big data systems, including a minimum of 6 years specifically using Databricks.
- Proven Medallion Architecture Implementation: Demonstrated experience deploying Databricks' Medallion Architecture in SMB or enterprise environments (a minimal sketch follows this description).
- Strong Conceptual Understanding: Ability to clearly articulate core Databricks concepts and explain their practical applications.
- Dimensional Data Modeling Expertise: 10+ years of experience implementing Kimball-style dimensional models.
- dbt on Databricks: A minimum of 3 years of experience using dbt within the Databricks ecosystem.
- Semantic Layer Implementation: At least 6 years of experience building and managing semantic layers.
- PySpark and Data Ingestion: 10+ years of experience with PySpark, including data ingestion from a variety of sources such as APIs, relational databases, and SFTP.

Job Type: Contract
Pay: $60.00 - $80.00 per hour
Work Location: Remote
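To make the Medallion Architecture and Kimball-modeling requirements above concrete, here is a minimal PySpark sketch of a bronze-to-silver-to-gold pipeline. It is an illustration only, not the employer's actual pipeline: it assumes a Databricks runtime (or an environment with Delta Lake installed), and every path, table, and column name (sales.bronze_orders, dim_customer, order_id, and so on) is a hypothetical placeholder.

```python
# Sketch of a Medallion (bronze/silver/gold) pipeline in PySpark.
# Assumes Delta Lake is available, as on a Databricks runtime.
# All table, path, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw data as-is, adding only ingestion metadata.
bronze = (
    spark.read.json("/mnt/raw/orders/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("sales.bronze_orders")

# Silver: deduplicate, conform types, and filter out bad records.
silver = (
    spark.read.table("sales.bronze_orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("sales.silver_orders")

# Gold: a Kimball-style fact table joined to a conformed dimension.
fact_orders = (
    spark.read.table("sales.silver_orders")
    .join(spark.read.table("sales.dim_customer"), "customer_id")  # hypothetical dimension
    .groupBy("customer_key", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("order_id").alias("order_count"),
    )
)
fact_orders.write.format("delta").mode("overwrite").saveAsTable("sales.fact_orders")
```

One design note on the sketch: the bronze layer appends so raw history is never lost, while the silver and gold layers overwrite because they can always be rebuilt from bronze; in production these full rewrites would typically be replaced with incremental MERGE logic.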