

Databricks Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Architect, a remote contract position lasting more than 6 months and paying $80.00 per hour. It requires 7+ years of experience with cloud data platforms, 3+ years with Databricks, strong Apache Spark skills, and relevant certifications.
Country: United States
Currency: $ USD
Day rate: $640
Date discovered: August 13, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed:
#MLflow #MDM (Master Data Management) #Storage #Cloud #Azure #ML (Machine Learning) #Documentation #Consulting #Infrastructure as Code (IaC) #Programming #Apache Spark #ADLS (Azure Data Lake Storage) #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Security #Data Science #Delta Lake #Python #Azure Data Factory #PySpark #Automation #Logging #Data Engineering #AWS (Amazon Web Services) #Databricks #Terraform #ADF (Azure Data Factory) #Datasets #Spark (Apache Spark) #Airflow #Scala #SQL (Structured Query Language) #Batch #GCP (Google Cloud Platform) #Compliance #Leadership #Data Governance #Agile #Synapse #Deployment #AutoScaling
Role description
Location: Remote
Employment Type: Full Time / Contract
Must-have requirements:
Solution Architecture & Design: Lead end-to-end development of Databricks-based platforms, including lakehouse architecture, ETL/ELT pipelines, streaming and batch processing, and integration with enterprise systems.
Platform Optimization: Tune performance and cost-efficiency by optimizing cluster configurations, partitioning strategies, caching, autoscaling, and job orchestration.
Data Governance & Security: Implement enterprise-grade governance, role-based access controls (RBAC), Unity Catalog, data masking/encryption, audit logging, and compliance frameworks.
Infrastructure & IaC: Leverage Infrastructure-as-Code tools like Terraform, Bicep, ARM, or CloudFormation for reproducible, automated deployments.
Cloud Platform Integration: Integrate Databricks with cloud services across Azure (Data Factory, Synapse, ADLS), AWS, or GCP.
Advanced Analytics & ML Support: Enable data science workflows with tools like MLflow, Databricks SQL, feature store, and AutoML, often supporting generative AI use cases.
Consulting & Stakeholder Collaboration: Act as a trusted advisor, engaging in pre-sales solutioning, workshops, architectural roadmaps, documentation, and client presentations.
Team Leadership & Mentorship: Mentor junior engineers, lead architectural discussions, guide Agile delivery, and foster best practices adoption.
Required Skills & Experience
7+ years of experience in enterprise cloud data platforms, including 3+ years of hands-on Databricks architecture work.
Strong expertise in Apache Spark (PySpark), Delta Lake, Lakehouse architecture, and Databricks platform features like Unity Catalog and Workflows.
Proficient in programming languages such as Python, SQL, and Scala, with the ability to model and transform large datasets effectively.
Deep knowledge of cloud architecture on Azure, AWS, or GCP, including storage, networking, and security principles.
Experience with CI/CD pipelines, orchestration tools (Airflow, ADF, Databricks Workflows), and infrastructure automation.
Strong communication and stakeholder management skills with proven ability to translate technical requirements into strategic solutions.
Preferred Qualifications
Certifications such as Databricks Certified Data Engineer Professional, Azure Solutions Architect, or AWS/GCP equivalents.
Familiarity with modern data paradigms like data mesh, data product thinking, and large-scale multi-geo deployments.
Experience in regulated industries (e.g., government, federal, healthcare) where compliance and governance are critical.
Butterfly Technology ( ButterflyTeck.com ) is a US-based specialized staffing company focused on providing end-to-end staffing and talent-management solutions for our customers. Our expertise includes digital transformation, MDM, program management, business consulting, and staffing solutions.
Job Types: Full-time, Part-time, Contract
Pay: From $80.00 per hour
Work Location: Remote