

Smart IT Frame LLC
Databricks Developer/Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a remote contract position for a "Databricks Developer/Data Engineer" focusing on Azure Databricks and PySpark, requiring 12-14 years of experience, expertise in Azure DevOps, and familiarity with airline industry data. Pay rate is unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Analysis #Deployment #Logging #Documentation #Data Privacy #Data Pipeline #Data Architecture #Storage #Forecasting #Spark (Apache Spark) #Scala #Azure Databricks #ETL (Extract, Transform, Load) #Observability #DevOps #Azure DevOps #Databricks #Azure #SQL (Structured Query Language) #SQL Queries #Datasets #Data Transformations #PySpark #Data Engineering
Role description
Title: Azure Databricks Architect
Location: Remote
Type: Contract
About Smart IT Frame:
At Smart IT Frame, we connect top talent with leading organizations across the USA. With over a decade of staffing excellence, we specialize in IT, healthcare, and professional roles, empowering both clients and candidates to grow together.
Required Skills: Azure DevOps, PySpark, Azure Monitor, Databricks Workflows, Databricks SQL
Job Description:
• The Senior Architect will design and optimize data solutions using Databricks and Azure services to support complex analytical workloads in a hybrid work model.
• The role focuses on building scalable data platforms, enabling reliable pipelines, and ensuring high-quality insights for business stakeholders, with a preference for experience in airline domain environments.
Responsibilities:
• Design robust end-to-end data architectures using Databricks SQL and PySpark to support scalable analytics and reporting solutions across enterprise data platforms.
• Develop optimized data models and transformation pipelines on Databricks Workflows that ensure reliable, performant, and consistent delivery of curated datasets for downstream consumers.
• Implement secure and compliant data solutions on Azure that align with enterprise standards for access control, data privacy, and regulatory requirements across business domains.
• Coordinate with product owners and data analysts to translate analytical requirements into technical designs that maximize reuse and maintainability of shared data assets.
• Configure Azure DevOps project structures, repositories, and branching strategies to support collaborative development, continuous integration, and controlled deployments of data platform components.
• Create automated build and release pipelines in Azure DevOps that enable repeatable deployment of Databricks notebooks, jobs, and related infrastructure across multiple environments.
• Monitor platform health and performance using Azure Monitor to proactively detect issues, analyze trends, and recommend improvements to capacity, reliability, and cost efficiency.
• Optimize PySpark jobs and Databricks SQL queries by tuning partitioning, caching, and resource configurations to reduce processing times and improve overall platform throughput.
• Partner with data engineers and operations teams to define logging, alerting, and incident response practices that enhance observability and reduce recovery times for critical data services.
• Guide stakeholders on best practices for hybrid work collaboration by standardizing documentation, code review processes, and knowledge sharing routines that keep distributed teams aligned.
• Document solution designs, technical decisions, and operational runbooks in clear and reusable formats that support future enhancements and onboarding of new team members.
• Engage with business partners to demonstrate how data solutions built on Databricks and Azure deliver measurable value, such as improved decision accuracy and faster time to insight.
• Apply domain understanding from airline operations when available to design data solutions that support use cases such as demand forecasting, route optimization, and customer experience analytics.
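As a minimal illustration of the partition-tuning responsibility above (a hypothetical sketch for candidates, not part of the client's codebase), a Spark shuffle partition count is often sized from input volume using a rough 128 MB-per-partition rule of thumb; the helper name and the floor-at-core-count choice here are assumptions:

```python
import math

# Rough sizing target: ~128 MB of shuffle data per partition (a common
# rule of thumb, not an official Databricks formula).
DEFAULT_TARGET_PARTITION_BYTES = 128 * 1024 * 1024


def suggest_shuffle_partitions(total_input_bytes: int,
                               total_cores: int,
                               target_partition_bytes: int = DEFAULT_TARGET_PARTITION_BYTES) -> int:
    """Suggest a shuffle partition count: ~128 MB per partition,
    but never fewer partitions than the cluster has cores, so that
    every core receives work."""
    by_size = math.ceil(total_input_bytes / target_partition_bytes)
    return max(by_size, total_cores)


# Example: a 10 GB shuffle on a 32-core cluster -> 80 partitions of ~128 MB.
print(suggest_shuffle_partitions(10 * 1024**3, 32))  # -> 80
```

In a Databricks notebook the result would typically be applied via `spark.conf.set("spark.sql.shuffle.partitions", n)` before the wide transformation runs.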
Qualifications:
• Possess extensive experience working with Databricks SQL and PySpark to design and implement complex data transformations and analytical models for large-scale datasets.
• Demonstrate strong hands-on expertise with Databricks Workflows, including job orchestration, dependency management, and scheduling of production data pipelines.
• Exhibit practical proficiency in Azure DevOps by managing repositories, configuring pipelines, and enforcing quality gates that support disciplined software delivery practices.
• Show solid capability in using Azure Monitor for setting up metrics, dashboards, and alerts that provide clear visibility into data platform health and performance.
• Display familiarity with Azure based data services such as storage, compute, and networking components that are commonly integrated with Databricks solutions.
• Bring twelve to fourteen years of proven experience in data engineering or architecture roles with progressive responsibility for end-to-end solution ownership.
• Prefer prior exposure to airline industry data and processes where understanding of flight operations, revenue management, or customer engagement can enhance solution relevance.
• Demonstrate strong communication and collaboration skills that enable effective interaction with cross-functional stakeholders in a hybrid work arrangement during standard day shifts.
• Exhibit commitment to quality and continuous improvement by adopting coding standards, testing practices, and documentation habits that enhance reliability and maintainability.
Apply today or share profiles at mario.i@smartitframe.com






