

Data Engineer - Databricks
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer - Databricks, offering a contract length of "unknown" with a pay rate of "unknown." Key skills include Databricks administration, GCP/AWS experience, and 3+ years in production support. Preferred skills involve automation frameworks like Terraform.
Country
United Kingdom
Currency
£ GBP
-
Day rate
-
Date discovered
September 13, 2025
Project duration
Unknown
-
Location type
Unknown
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
Leeds, England, United Kingdom
-
Skills detailed
#Cloud #DataEngineering #Azure #Monitoring #Terraform #Automation #Kafka (Apache Kafka) #DataArchitecture #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Compliance #Databricks #ML (Machine Learning) #DataScience #Security #Storage
Role description
Job Description:
Primary skills:
Databricks Admin with GCP/AWS, Databricks, Kafka, Data Architect
Responsibilities include designing, implementing, and maintaining the Databricks platform, and providing operational support. Operational support covers platform set-up and configuration, workspace administration, resource monitoring, technical support for the data engineering, data science/ML, and application/integration teams, restores and recoveries, and troubleshooting service issues through to root-cause resolution.
- The position also involves security and change management.
- The position works closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts.
Responsibilities:
- Administer, configure, and optimize the Databricks platform to enable data analytics, machine learning, and data engineering activities within the organization.
- Collaborate with the data engineering team to ingest, transform, and orchestrate data.
- Manage privileges over the entire Databricks account, as well as at the workspace, Unity Catalog, and SQL warehouse levels.
- Create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions.
- Install, configure, and maintain Databricks clusters and workspaces.
- Keep the platform current with security, compliance, and patching best practices.
- Monitor and manage cluster performance, resource utilization, and platform costs, and troubleshoot issues to ensure optimal performance.
- Implement and manage access controls and security policies to protect sensitive data.
- Manage schema data with Unity Catalog: create and configure catalogs, external storage, and access permissions.
- Administer interfaces with Google Cloud Platform.
Required Skills:
- 3+ years of production support of the Databricks platform
Preferred:
- 2+ years of experience administering AWS/Azure/GCP PaaS
- 2+ years of experience with automation frameworks such as Terraform
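As an illustration of the kind of Terraform automation the preferred-skills list refers to, here is a minimal sketch using the Databricks Terraform provider to declare a cluster and a group permission. The workspace host, node type, cluster name, and group name are assumptions for illustration, not details from this posting.

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Hypothetical workspace host; real deployments would use
# account-level authentication and per-environment variables.
provider "databricks" {
  host = "https://example-workspace.gcp.databricks.com"
}

# A small all-purpose cluster with auto-termination to control cost.
resource "databricks_cluster" "shared_etl" {
  cluster_name            = "shared-etl"
  spark_version           = "14.3.x-scala2.12"
  node_type_id            = "n2-standard-4" # example GCP node type
  num_workers             = 2
  autotermination_minutes = 30
}

# Example access control: let a group attach notebooks to the cluster.
resource "databricks_permissions" "etl_usage" {
  cluster_id = databricks_cluster.shared_etl.id
  access_control {
    group_name       = "data-engineers"
    permission_level = "CAN_ATTACH_TO"
  }
}
```

Managing clusters and permissions as code like this keeps workspace configuration reviewable and reproducible, which is the usual motivation for pairing Databricks administration with Terraform.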