Third Republic

Databricks Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Databricks Engineer on a freelance contract for a major AI transformation project in New York, starting April/May, offering top-tier pay. Key skills include Databricks, Python, SQL, and cloud expertise (AWS, Azure, GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #dbt (data build tool) #Snowflake #Azure #GCP (Google Cloud Platform) #Data Engineering #AWS (Amazon Web Services) #Delta Lake #PySpark #Cloud #Spark (Apache Spark) #AI (Artificial Intelligence) #Redshift #Databricks #Python #ETL (Extract, Transform, Load) #ML (Machine Learning) #MLflow #Compliance #SQL (Structured Query Language) #BigQuery #Data Pipeline
Role description
🚀 2 Senior Contract Databricks Data Engineers Needed for Massive Global AI Transformation Project (New York, Start April/May) – FREELANCE/CONTRACT ONLY

THE OPPORTUNITY
We have just secured a project to staff a groundbreaking new data program for a major global AI transformation consultancy. This is a large greenfield initiative in collaboration with a leader in AI transformation, with all operations based in New York.

WHY THIS ROLE IS GREAT
The Scale: You will be building the data backbone for one of the most significant AI transformation projects in the US right now.
The Rates: We work direct with the customer, so we can offer excellent, top-tier rates for the right talent.
The Impact: You aren't just moving tickets; you are engineering the data pipelines and architecture that will power next-generation enterprise AI models.

WHO WE NEED (The "Must-Haves")
We are looking for heavy hitters: contractors who can drop into a complex environment and deliver value on day one.
Professional Contractors: 1099 or Corp-to-Corp (C2C).
Senior Databricks Data Engineering Expertise: Deep proficiency in Databricks (Unity Catalog, Delta Lake, Workflows), Python, SQL, and Spark (PySpark).
Cloud Native: Mastery of Databricks on AWS, Azure, or GCP ecosystems, integrated with Snowflake, BigQuery, or Redshift where needed.
Modern Stack: Experience with Databricks Workflows (formerly Jobs), dbt on Databricks, Kafka integration, and CI/CD pipelines.
AI Readiness: Experience preparing data at scale for ML/AI models using Databricks MLflow or Mosaic AI is a massive plus.

CONTRACT TERMS – PLEASE READ CAREFULLY
To ensure speed and compliance for this New York-based program, we are strictly accepting applications from:
✅ 1099 Freelancers
✅ Corp-to-Corp (C2C) Contractors
❌ No W2 applications will be considered.
❌ No third-party agencies or pass-throughs.
TIMELINE
Interviews: Happening now.
Start Dates: Rolling starts throughout April/May.

HOW TO APPLY
If you are a 1099/C2C Databricks data expert ready to lock in a high-paying contract with a top-tier global firm, drop your resume immediately or DM me. These 2 spots will fill quickly.

#Databricks #DataEngineering #Contract #C2C #1099 #Freelance #Globant #AI #BigData #Hiring #TechJobs #FreelanceLife #NYCJobs