

W3Global
Databricks Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Engineer with 8+ years of data engineering experience, including 3+ years on Databricks. Key skills include PySpark, SQL, Delta Lake, and GCP Dataproc. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Irving, TX
-
🧠 - Skills detailed
#Documentation #Logging #PySpark #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Spark (Apache Spark) #Scala #Cloud #SQL (Structured Query Language) #Databricks #Spark SQL #Data Engineering #Delta Lake
Role description
Mandatory Skills: Databricks Development, Databricks Administration, and Databricks Support.
• 8+ years of experience in data engineering, with at least 3 years on Databricks.
• Strong proficiency in PySpark, SQL, and Delta Lake.
• Hands-on experience with GCP Dataproc.
Responsibilities:
Administration:
• Lead the installation and configuration of Databricks on GCP cloud platforms.
• Monitor platform health, performance, and cost optimization.
• Implement governance, logging, and auditing mechanisms.
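As an illustration of the governance and auditing duties above, table-level access in Databricks is managed with standard SQL grants; a minimal sketch, using hypothetical schema, table, and group names, might look like:

```sql
-- Hypothetical grant for a curated table (all names are illustrative).
GRANT SELECT ON TABLE curated.events TO `data_analysts`;

-- Inspect existing grants on the table when auditing access.
SHOW GRANTS ON TABLE curated.events;
```

Grants issued this way are recorded by the platform, so periodic `SHOW GRANTS` reviews pair naturally with the logging and auditing mechanisms the role calls for.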
Development / Enhancements:
• Design and develop scalable ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
• Collaborate with data engineers and analysts to enhance data workflows and models.
• Optimize existing notebooks and jobs for performance and reliability.
Operations, Support & Troubleshooting:
• Provide L2/L3 support for Databricks-related issues and incidents.
• Troubleshoot cluster failures, job errors, and performance bottlenecks.
• Maintain technical documentation for platform setup, operations, and development standards.
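The ETL/ELT pipeline work described above typically centers on Delta Lake tables. As a minimal sketch of one incremental-load step — with hypothetical table and column names — a daily upsert from a staging table into a curated Delta table could be expressed in Spark SQL as:

```sql
-- Hypothetical target table for a curated ELT layer (names are illustrative).
CREATE TABLE IF NOT EXISTS curated.events (
  event_id STRING,
  event_ts TIMESTAMP,
  payload  STRING
) USING DELTA;

-- Upsert the latest staged batch: update rows that already exist,
-- insert rows that are new.
MERGE INTO curated.events AS t
USING staging.events_daily AS s
  ON t.event_id = s.event_id
WHEN MATCHED THEN
  UPDATE SET t.event_ts = s.event_ts, t.payload = s.payload
WHEN NOT MATCHED THEN
  INSERT (event_id, event_ts, payload)
  VALUES (s.event_id, s.event_ts, s.payload);
```

`MERGE INTO` is Delta Lake's idempotent upsert primitive, which is why it shows up so often in the kind of pipeline optimization and reliability work this role describes.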