Raas Infotek

Databricks Lead (W2 Only)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Lead based in Bolingbrook, IL, on a 12-month W2 contract; the pay rate is TBD. Candidates must have 10+ years in data engineering, 3+ years with Databricks, and strong skills in Apache Spark, Delta Lake, Python, SQL, and GCP.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Bolingbrook, IL
-
🧠 - Skills detailed
#Observability #Scala #Migration #SQL (Structured Query Language) #Data Architecture #Cloud #ML (Machine Learning) #Apache Spark #Batch #GCP (Google Cloud Platform) #Data Quality #Data Pipeline #Data Governance #Leadership #Spark (Apache Spark) #Delta Lake #Big Data #Python #Data Engineering #Databricks
Role description
Job Title: Databricks Lead
Location: Bolingbrook, IL (onsite, 3 days biweekly)
Duration: 12 months (W2 candidates only)
Mandatory Skills: Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Python, SQL, and GCP platform services.
Job Description:
• 10+ years of experience in data engineering or data architecture on Big Data platforms.
• 3+ years of hands-on experience with Databricks platform architecture.
• Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Python, SQL, and GCP platform services.
Responsibilities:
• Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
• Design robust batch and streaming data pipelines leveraging Apache Spark, Structured Streaming, and modern ELT patterns.
• Lead the migration of jobs from other cloud data platforms to Databricks.
• Implement secure data governance, access control, and lineage using Unity Catalog.
• Architect integrations with cloud platforms such as Google Cloud Platform.
• Optimize performance and manage compute costs through efficient cluster configuration and Spark workload tuning.
• Collaborate with data engineers, analytics teams, and ML engineers to enable scalable data products and analytics.
• Define best practices for data quality, reliability, and observability across the data platform.
• Provide technical leadership, architecture guidance, and mentorship to data engineering teams.