

Deloitte
Azure Databricks Cloud Contractor
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Databricks Cloud Contractor on a contract of unspecified duration, offering a pay rate of $87-$93 per hour. Key skills required include 10+ years of Databricks experience, proficiency in Azure Data Lake Storage, and advanced coding in Python, SQL, and PySpark. Candidates must be U.S.-based and authorized to work without sponsorship.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
744
-
🗓️ - Date
February 12, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Texas, United States
-
🧠 - Skills detailed
#Azure #Azure Synapse Analytics #Python #Data Lake #Azure Data Factory #PySpark #SQL (Structured Query Language) #Data Governance #AWS (Amazon Web Services) #ADLS (Azure Data Lake Storage) #Spark (Apache Spark) #Cloud #Automation #Project Management #ADF (Azure Data Factory) #Synapse #Delta Lake #Batch #Storage #Databricks #GCP (Google Cloud Platform) #Azure ADLS (Azure Data Lake Storage) #Deployment #Scala #Azure Databricks #Data Pipeline
Role description
Position Summary
Contractor
Hybrid
Work You’ll Do
The services you will provide to the Deloitte Project Team as an Azure Databricks Contractor include:
• Working with a team of developers and supporting testing team members to design, develop, validate, and maintain applications
• Collaborating with relevant stakeholders and SMEs to understand requirements and translate them into solutions that meet the acceptance criteria
• Applying strong analytical skills to reverse-engineer existing systems and map current-state processes
• Bringing perspective to technical design meetings and offering design and implementation suggestions
• Applying working experience with project management tools and techniques
• Demonstrating outstanding analytical and problem-solving skills
Qualifications
10+ years of Databricks experience
• Proven hands-on experience with Azure Databricks, including designing, building, and optimizing complex data pipelines and Delta Lake solutions.
• Strong background in Azure Data Lake Storage (ADLS)—architecture, best practices, and efficient handling of high-volume and diverse data sets.
• Previous experience designing and developing data models and integration layers that seamlessly pull and harmonize data from multiple ODS and other source systems.
• Demonstrated ability to deliver scalable, performant solutions for both batch and streaming workloads on Azure.
• Track record of driving process efficiency—targeting substantial improvements in workflow automation within the cloud ecosystem.
• Familiarity with Azure suite: ADLS, Azure Data Factory, Azure Synapse Analytics, etc.
• Commitment to data governance, quality management, and cost optimization in cloud environments.
• Advanced coding skills in Python, SQL, and PySpark (Databricks context); a minimal illustrative sketch follows this list.
• Nice to have: Experience developing CI/CD pipelines and leveraging Databricks Asset Bundles for deployment and versioning.
• Ability to prepare well-documented, reusable solutions and foster knowledge transfer within the team.
• Experience with additional cloud platforms (AWS, GCP) is a plus, though Azure expertise is paramount.
The expected pay range for this contract assignment is $87.00-$93.00 per hour. The exact pay rate will vary based on skills, experience, and location and will be determined by the third party whose employees provide services to Deloitte.
Candidates interested in applying for this opportunity must be geographically based in the United States and must be legally authorized to work in the United States without the need for employer sponsorship.
We do not accept agency resumes and are not responsible for any fees related to unsolicited resumes.
Deloitte is not the employer for this role.
This work is contracted through a third party whose employees provide services to Deloitte.
#contract
Expected Work Schedule
During team’s core business hours
Approximate hours per week
About Deloitte
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It makes Deloitte one of the most rewarding places to work.
As used in this posting, “Deloitte” means , a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries.
Requisition code: 323973





