

aKUBE
Senior Data Engineer - Databricks Apps
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer - Databricks Apps in Calabasas, CA, offering a 6-month contract at a competitive pay rate. Key skills include Databricks Apps, Python, PySpark, Azure, and API integration. Requires 7+ years of relevant experience.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
January 10, 2026
Duration
More than 6 months
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
Calabasas, CA
Skills detailed
#ETL (Extract, Transform, Load) #Data Quality #Batch #Azure #DevOps #Agile #Data Engineering #Azure Cloud #Data Pipeline #Deployment #SQL (Structured Query Language) #Spark (Apache Spark) #Cloud #API (Application Programming Interface) #Databricks #PySpark #Python
Role description
City: Calabasas, CA
Onsite/Hybrid/Remote: Onsite 4 days a week
Duration: 6 months contract to start
Work Authorization: GC, USC Only
Must Have:
Databricks Apps
Databricks Workflows and Jobs
Python
PySpark
Azure
API integration
ETL pipelines
CI/CD and DevOps
Responsibilities:
Design and build Databricks Apps to support data and analytics workflows.
Develop Python-based applications on the Databricks platform.
Integrate APIs, Databricks Jobs, and data pipelines into application flows.
Build and maintain batch data pipelines using Databricks and Azure services.
Ensure performance, reliability, and data quality across applications.
Collaborate with cross-functional teams to translate requirements into solutions.
Support CI/CD pipelines and deployment best practices.
Qualifications:
Bachelor's degree in a related technical field required.
7+ years of data engineering or platform development experience.
Hands-on experience building Databricks Apps in production.
Strong Python, PySpark, and SQL skills.
Experience with Azure cloud data services.
Familiarity with Agile development practices.






