InterSources Inc

Senior Databricks Administrator

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Databricks Administrator with a long-term contract in Washington, DC. Candidates must have 10+ years of experience, a Bachelor's degree, and 3+ years administering Databricks. Only USC/GC holders are eligible.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 12, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington, DC
-
🧠 - Skills detailed
#Documentation #Compliance #S3 (Amazon Simple Storage Service) #IAM (Identity and Access Management) #SQL (Structured Query Language) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Security #VPC (Virtual Private Cloud) #Terraform #Cloud #CLI (Command-Line Interface) #Automation #Data Engineering #Logging #Delta Lake #Storage #Databricks #BI (Business Intelligence) #AWS IAM (AWS Identity and Access Management) #Deployment #AutoScaling #Monitoring
Role description
Position – Senior Databricks Administrator / Senior Databricks Engineer
Duration – Long-term contract
Location – Washington, DC (on-site)
Eligibility – USC/GC holders only. Please do not submit Data Engineer profiles.

Essential Skills:
• 10+ years of overall experience.
• Bachelor's degree with 7+ years of cloud/data platform experience, including 3+ years administering Databricks.
• Serve as the technical owner of the Databricks platform, ensuring secure, reliable, compliant, and cost-efficient operations across SDLC environments.
• Administer workspaces, clusters, SQL warehouses, jobs, repos, runtimes, and serverless compute using the Databricks CLI, APIs, and Terraform.
• Lead Unity Catalog governance, including metastores, catalogs/schemas, grants, service principals, and governed storage access (see the grants sketch after this list).
• Enforce IAM, RBAC, SSO/SCIM provisioning, secrets management, audit logging, and least-privilege security controls.
• Implement monitoring, alerting, CI/CD pipelines with Databricks Asset Bundles (DABs), and automation for configuration and deployment management.
• Optimize performance and cost through cluster policies, auto-scaling, Delta Lake tuning, and FinOps best practices (see the cluster-policy and idle-cluster sketches below).
• Support data engineering workloads (DLT pipelines, SQL tuning, ETL troubleshooting) and manage BI/tool integrations.
• Coordinate with cloud/network teams (AWS IAM, VPC, PrivateLink, S3, KMS) to ensure secure connectivity and compliance.
• Maintain documentation, runbooks, change management processes, and incident/SLA reporting.
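To make the Unity Catalog grants responsibility concrete, here is a minimal sketch using the databricks-sdk Python package. It assumes a workspace already configured for Databricks unified authentication; the catalog name `analytics` and the group `data-analysts` are hypothetical placeholders, not names from this posting.

```python
from databricks.sdk import WorkspaceClient  # pip install databricks-sdk
from databricks.sdk.service import catalog

# Authenticates via the standard Databricks auth chain
# (environment variables or a ~/.databrickscfg profile).
w = WorkspaceClient()

# Grant read-style access on a catalog to an account-level group rather
# than to individual users, keeping privileges auditable and least-privilege.
w.grants.update(
    securable_type=catalog.SecurableType.CATALOG,
    full_name="analytics",  # hypothetical catalog name
    changes=[
        catalog.PermissionsChange(
            principal="data-analysts",  # hypothetical account group
            add=[
                catalog.Privilege.USE_CATALOG,
                catalog.Privilege.USE_SCHEMA,
                catalog.Privilege.SELECT,
            ],
        )
    ],
)
```

Granting to groups instead of users is what makes the periodic access reviews implied by the audit-logging bullet tractable.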
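For the cluster-policy and cost-control bullet, the sketch below creates a simple cluster policy through the same SDK. The policy name, node types, and limits are illustrative assumptions; real values would come from the organization's FinOps standards.

```python
import json

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Policy definition: pin an auto-termination window and restrict node
# types and autoscaling so ad-hoc clusters cannot run up unbounded cost.
policy_definition = {
    "autotermination_minutes": {"type": "fixed", "value": 30, "hidden": True},
    "node_type_id": {
        "type": "allowlist",
        "values": ["m5.xlarge", "m5.2xlarge"],  # example AWS node types
        "defaultValue": "m5.xlarge",
    },
    "autoscale.max_workers": {"type": "range", "maxValue": 8, "defaultValue": 4},
}

policy = w.cluster_policies.create(
    name="team-standard-job-clusters",  # hypothetical policy name
    definition=json.dumps(policy_definition),
)
print(f"Created policy {policy.policy_id}")
```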
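Finally, a small monitoring-style sketch in the same vein: it lists clusters with auto-termination disabled, a common source of idle spend that the policy above would prevent for newly created clusters.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Flag clusters that never auto-terminate so they can be reviewed
# or brought under a cluster policy.
for c in w.clusters.list():
    if (c.autotermination_minutes or 0) == 0:
        print(f"{c.cluster_name} ({c.cluster_id}): "
              f"no auto-termination, state={c.state}")
```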