

SwankTek Inc.
Databricks Developer with Azure
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Developer with Azure, offering a contract of over 6 months at $75.00 per hour. Required skills include 5 years of Databricks and Azure experience, proficiency in PySpark, and 2 years in the banking domain. Hybrid remote in New York, NY.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
January 8, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Azure #Spark (Apache Spark) #Data Modeling #PySpark #ETL (Extract, Transform, Load) #Apache Spark #Data Processing #Cloud #Data Transformations #Data Lake #Data Quality #Delta Lake #SQL (Structured Query Language) #ACID (Atomicity, Consistency, Isolation, Durability) #Code Reviews #Data Engineering #GIT #Data Pipeline #Data Security #Data Science #Databricks #ADLS (Azure Data Lake Storage) #Batch #Security #Logging #Azure ADLS (Azure Data Lake Storage) #Synapse #Compliance #Scala #RDBMS (Relational Database Management System) #ML (Machine Learning) #Datasets #ADF (Azure Data Factory) #Spark SQL
Role description
We are seeking a skilled Databricks Developer to design, develop, and optimize large-scale data processing solutions using Databricks, Apache Spark, and cloud platforms. The ideal candidate will have strong experience in building data pipelines, transforming large datasets, and supporting analytics and machine learning workloads in a modern data platform.
Key Responsibilities
Design, develop, and maintain ETL/ELT pipelines using Databricks (Apache Spark)
Implement data transformations using PySpark / Spark SQL / Scala
Integrate data from multiple sources (RDBMS, APIs, streaming, files) into Databricks
Optimize Spark jobs for performance, scalability, and cost efficiency
Work with Delta Lake for ACID transactions, data versioning, and time travel; a minimal sketch follows this list
Collaborate with data engineers, data scientists, and business teams
Implement data quality checks, logging, and error handling
Support real-time and batch processing workloads
Participate in code reviews and follow best engineering practices
Ensure data security, governance, and compliance standards
Required Skills & Qualifications
Strong experience with Databricks and Apache Spark
Proficiency in PySpark, Spark SQL, or Scala
Experience with Delta Lake
Hands-on experience with cloud platforms:
Azure (ADLS Gen2, Synapse, ADF); see the ADLS read sketch after this list
Experience building data pipelines and data lakes
Solid understanding of data modeling and data warehousing concepts
Experience with Git, CI/CD pipelines
Job Types: Full-time, Contract
Pay: $75.00 per hour
Expected hours: 40 per week
Experience:
Databricks: 5 years (Required)
Azure: 5 years (Required)
Banking Domain: 2 years (Required)
Work Location: Hybrid remote in New York, NY





