

Databricks Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Engineer on a 6–12 month W2 contract offering a competitive hourly rate. Key skills include Databricks expertise, Python, SQL, and cloud experience (Azure/AWS). Databricks certifications are required, and candidates must be US Citizens or Green Card holders.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 8, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Databricks #ML (Machine Learning) #Python #dbt (data build tool) #Automation #Data Engineering #AWS (Amazon Web Services) #MLflow #Airflow #Data Pipeline #Security #Spark (Apache Spark) #Data Governance #Cloud #Delta Lake #DevOps #Azure
Role description
🚀 Contract Opportunity: Databricks Engineer (US Citizen / GC Holder Only)
📍 100% Remote | 💼 W2 Contract | 🔧 Data Engineering / ML / Platform Focused
We’re currently supporting a leading tech-driven organization looking to bring on a Databricks Engineer to join its high-impact data platform team.
What we’re looking for:
✅ Proven expertise in Databricks (including Delta Lake, Spark, and MLflow)
✅ Databricks certifications (e.g., Databricks Certified Data Engineer Associate/Professional or Machine Learning Associate/Professional)
✅ Hands-on experience building and optimizing large-scale data pipelines and workflows
✅ Strong Python and SQL skills
✅ Experience working in a cloud environment (Azure or AWS preferred)
✅ Solid grasp of CI/CD, automation, and DevOps practices for data pipelines
✅ Must be a US Citizen or Green Card holder due to client requirements
Nice to have:
• Exposure to ML workloads on Databricks
• Experience with orchestration tools (Airflow, dbt, etc.)
• Understanding of data governance and security best practices
📅 Duration: 6–12 months (with potential for extension)
💲 Competitive W2 hourly rate
🌐 Remote across the US
If you’re certified in Databricks and excited about scaling modern data platforms, we’d love to connect.
📩 Drop me a message or apply directly to discuss the role in more detail!