

AMER Technology, Inc.
Databricks Administrator
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Administrator in Austin, TX (hybrid). Contract length and pay rate are unspecified. Requires 8+ years of experience in Databricks administration on AWS, IAM, Apache Spark, and DevOps tools. Local candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#Programming #Compliance #Databricks #DevOps #IAM (Identity and Access Management) #Scripting #Apache Spark #Data Lake #Security #ML (Machine Learning) #Data Security #Spark (Apache Spark) #Monitoring #Storage #Scala #AI (Artificial Intelligence) #Automation #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Terraform #MLflow #Cloud #Python #SQL (Structured Query Language)
Role description
Systems Analyst 3 (Databricks Administrator) – Austin, TX (Hybrid/Onsite)
Note: Only LOCAL candidates within the Austin area will be considered
Work Type: Hybrid / Onsite (In-Person Interview Required)
Role Overview
We are seeking an experienced Databricks Administrator to support enterprise data, analytics, and AI/ML workloads. This role will focus on platform administration, governance, performance optimization, and cloud integrations within AWS.
Key Responsibilities
• Administer and manage Databricks workspaces in AWS
• Configure clusters, job scheduling, and workspace environments
• Manage user access (IAM, SCIM, RBAC)
• Optimize Spark workloads and troubleshoot performance issues
• Integrate Databricks with cloud storage (S3)
• Implement cluster policies and governance standards
• Monitor platform health, availability, and performance
• Support DevOps automation (Terraform, CI/CD pipelines)
• Ensure data security, compliance, and cost optimization
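As a rough illustration of the cluster-policy and automation duties above (a hedged sketch, not part of the posting): a Databricks cluster policy is a JSON document mapping cluster attributes to rules. The `cost-guard` name and the specific limits below are illustrative assumptions, and the SDK usage shown in comments assumes the `databricks-sdk` Python package.

```python
import json

# A cluster policy maps cluster attributes to rules. These attribute names
# follow the Databricks cluster-policy JSON schema; the specific limits are
# illustrative assumptions, not requirements from this posting.
policy_definition = {
    "autoscale.max_workers": {"type": "range", "maxValue": 10},
    "autotermination_minutes": {"type": "range", "minValue": 10, "defaultValue": 30},
    "spark_version": {"type": "fixed", "value": "15.4.x-scala2.12"},
}

def make_policy_payload(name: str, definition: dict) -> dict:
    """Build a create-policy payload; the definition is serialized to JSON."""
    return {"name": name, "definition": json.dumps(definition)}

payload = make_policy_payload("cost-guard", policy_definition)

# With workspace credentials configured, a payload like this could be
# submitted via the Databricks SDK for Python (hypothetical usage sketch):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   w.cluster_policies.create(name=payload["name"],
#                             definition=payload["definition"])
```

In practice the same policy definition is often managed declaratively instead, e.g. through the Databricks Terraform provider, which fits the Terraform/CI-CD responsibilities listed above.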
Required Qualifications (8+ Years)
• Databricks administration in AWS cloud
• Cluster configuration, job orchestration, and workspace management
• IAM, RBAC, and SCIM access control
• Apache Spark performance tuning and troubleshooting
• Databricks SQL, notebooks, and workflows
• Cloud storage integrations (S3)
• Platform monitoring and governance
• Security, encryption, and compliance
• DevOps tools (Terraform, scripting, CI/CD)
Preferred Qualifications
• Unity Catalog experience
• AI/ML workloads (MLflow, Databricks ML)
• Data Lake / Lakehouse architecture knowledge
• Cost optimization strategies
• Experience in government or enterprise environments
• Programming: Python, SQL, Scala
