Realign LLC

Databricks Data Engineer (Azure + Python)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Data Engineer (Azure + Python) on a remote contract basis, requiring expertise in Apache Spark, Kubernetes, CI/CD, and Python. Key skills include SQL, Docker, and infrastructure management. The day rate is $480; the contract length is unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
March 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Apache Spark #AWS (Amazon Web Services) #Dynatrace #Alation #Kubernetes #Data Processing #Scala #SQL (Structured Query Language) #SQL Queries #Databricks #Data Engineering #Capacity Management #DevOps #Spark (Apache Spark) #Deployment #Data Lake #Python #Data Lakehouse #Observability #Ansible #Leadership #Azure #Docker #Infrastructure as Code (IaC) #Monitoring #Jenkins #Dremio #Data Pipeline #Security
Role description
Remote, New York 10048
Posted March 12th, 2026
Job Type: Contract
Job Category: IT
Role: Databricks Data Engineer (Azure + Python)
Location: Remote
Contract Job Description

Must-Have Technical/Functional Skills:
- Experience in Apache Spark development, Kubernetes, CI/CD pipelines, Jenkins, Docker, PL/SQL, and Python
- Writing SQL queries and procedures
- Writing Python code to automate tasks and develop small functionalities
- Creating CI/CD pipelines
- Writing Jenkins jobs
- Managing applications in Kubernetes environments, including deployment, configuration, and triaging
- Hands-on experience with Apache Spark

Roles & Responsibilities:
- Provide technical leadership for the team(s) you are associated with and participate in key technical decisions.
- Engage with customers on escalations and ensure continuous improvement in all areas.
- Participate in technical discussions within the team and with other groups in the Business Units associated with specified projects.
- Design, develop, and maintain the real-time data processing and data Lakehouse infrastructure.
- Write data pipelines and data processing layers in Python.
- Develop and maintain Ansible playbooks for infrastructure configuration and management.
- Develop and maintain Kubernetes manifests, Helm charts, and other deployment artifacts.
- Apply hands-on experience with Docker and containerization, including managing and pruning images in private registries.
- Manage access control in Kubernetes clusters.
- Maintain Spark clusters hands-on.
- Monitor and troubleshoot issues related to Kubernetes clusters and containerized applications.
- Drive initiatives to containerize standalone apps in Kubernetes.
- Develop and maintain infrastructure as code (IaC) and collaborate with other teams to ensure consistent infrastructure management across the organization.
- Use observability tools for capacity management of services and infrastructure resources.
- Guide the development and testing activities of other engineers where the work involves several interdependencies.
- Experience in AWS ECS and EKS is an added advantage.
- Experience in Dremio is an added advantage.
- Experience in Dynatrace or any tracing, infrastructure, or real-time monitoring tool is an added advantage.

Required Skills: DEVOPS ENGINEER, SENIOR EMAIL SECURITY ENGINEER
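The posting lists "writing SQL queries" and "writing Python code to automate and develop small functionalities" among its must-have skills. As a rough, hypothetical illustration of that kind of SQL-from-Python pipeline step (it is not part of the actual role; the table, columns, and data are invented, and the stdlib `sqlite3` module stands in for a real warehouse connection), a minimal sketch might look like:

```python
import sqlite3


def daily_totals(conn: sqlite3.Connection) -> list[tuple[str, float]]:
    """Aggregate raw events into per-day totals: a toy stand-in for the
    SQL-query-from-Python work the posting describes."""
    return conn.execute(
        """
        SELECT event_date, SUM(amount) AS total
        FROM events
        GROUP BY event_date
        ORDER BY event_date
        """
    ).fetchall()


if __name__ == "__main__":
    # In-memory database with invented sample data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (event_date TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO events VALUES (?, ?)",
        [("2026-03-12", 10.0), ("2026-03-12", 5.0), ("2026-03-13", 7.5)],
    )
    print(daily_totals(conn))  # [('2026-03-12', 15.0), ('2026-03-13', 7.5)]
```

In the actual role the same pattern would presumably run against Databricks SQL or Spark rather than SQLite, but the shape of the work is the same: a parameterized query wrapped in a small, testable Python function.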