

Neotech Global
Databricks Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Architect with 12+ years of data engineering experience, 3+ years of hands-on Databricks work, and expertise in Spark, PySpark, and SQL. Contract length is unspecified, and the pay rate is competitive. Experience in regulated industries is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Los Angeles, CA
-
🧠 - Skills detailed
#Spark SQL #Vault #Dataflow #Azure #Databricks #AWS S3 (Amazon Simple Storage Service) #GIT #ADLS (Azure Data Lake Storage) #Data Processing #PySpark #DevOps #Lambda (AWS Lambda) #Spark (Apache Spark) #Data Engineering #S3 (Amazon Simple Storage Service) #Documentation #REST (Representational State Transfer) #Terraform #ADF (Azure Data Factory) #GCP (Google Cloud Platform) #ML (Machine Learning) #SQL (Structured Query Language) #AWS (Amazon Web Services) #ACID (Atomicity, Consistency, Isolation, Durability) #Azure DevOps #MLflow #Cloud #Delta Lake #REST API
Role description
Required Skills & Experience
Technical Skills
• 12+ years of experience in data engineering/architecture.
• 3+ years of hands-on experience with Databricks.
• Strong expertise in Spark, PySpark, SQL, and distributed data processing.
• Deep understanding of Delta Lake features: ACID transactions, OPTIMIZE, ZORDER, Auto Loader (see the first sketch after this list).
• Experience with workflow orchestration, jobs, and the Databricks REST APIs (see the second sketch after this list).
• Hands-on expertise with at least one cloud platform:
  • Azure (preferred): ADF, ADLS, Key Vault, Event Hub, Azure DevOps
  • AWS: S3, Glue, Lambda, Kinesis
  • GCP: GCS, Dataflow, Pub/Sub
• Familiarity with CI/CD, Git, DevOps, and Infrastructure-as-Code (Terraform preferred).
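To illustrate the Delta Lake bullet above, here is a minimal PySpark sketch of an ACID MERGE, an OPTIMIZE/ZORDER maintenance pass, and an Auto Loader ingestion stream. It assumes a Databricks runtime with Delta Lake; the table names, storage path, and checkpoint locations are all hypothetical.

```python
from pyspark.sql import SparkSession

# On Databricks a `spark` session already exists; this is for completeness.
spark = SparkSession.builder.appName("delta-maintenance-sketch").getOrCreate()

# ACID upsert into a Delta table via MERGE (tables `events`/`updates` are hypothetical).
spark.sql("""
    MERGE INTO events AS t
    USING updates AS s
    ON t.event_id = s.event_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

# Compact small files and co-locate rows by a frequently filtered column.
spark.sql("OPTIMIZE events ZORDER BY (user_id)")

# Auto Loader: incrementally ingest new files landing in cloud storage.
stream = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "json")
          .option("cloudFiles.schemaLocation", "/tmp/schemas/events")   # hypothetical path
          .load("abfss://landing@myaccount.dfs.core.windows.net/events/"))  # hypothetical path

(stream.writeStream
       .option("checkpointLocation", "/tmp/checkpoints/events")  # hypothetical path
       .toTable("events_bronze"))
```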
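And for the orchestration bullet, a hedged sketch of triggering an existing job through the Databricks Jobs REST API (2.1). The workspace host, access token, and job ID below are placeholders.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-123.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

# POST /api/2.1/jobs/run-now kicks off a run of an already-defined job.
resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123},  # placeholder job ID
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])
```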
Soft Skills
• Strong analytical and problem-solving skills.
• Excellent communication and stakeholder management.
• Ability to lead design discussions and guide technical teams.
• Strong documentation and architectural blueprinting skills.
Preferred Qualifications
• Databricks certifications, such as:
  • Databricks Certified Data Engineer Professional
  • Databricks Certified Machine Learning Professional
  • Databricks Lakehouse Fundamentals
• Experience with MLflow, Feature Store, or MLOps workflows (see the sketch after this list).
• Experience working in regulated industries (BFSI, healthcare, etc.).
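As a quick illustration of the MLflow experience noted above, a minimal tracking sketch; the experiment path, parameter, and metric are illustrative only.

```python
import mlflow

# Log a run to an experiment; on Databricks this path lives in the workspace.
mlflow.set_experiment("/Shared/demo-experiment")  # hypothetical experiment path
with mlflow.start_run():
    mlflow.log_param("max_depth", 8)   # illustrative hyperparameter
    mlflow.log_metric("rmse", 0.42)    # illustrative evaluation metric
```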