Azure Databricks Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Azure Databricks Engineer on a long-term contract in New York, NY, offering $120.00 - $135.00 per hour. Requires 7+ years in data engineering, 3+ in Azure Databricks, and expertise in financial services environments.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
1080
-
🗓️ - Date discovered
September 22, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
New York, NY 10022
-
🧠 - Skills detailed
#Security #Azure #Data Engineering #Databricks #Azure Databricks #PySpark #Spark SQL #SQL (Structured Query Language) #Spark (Apache Spark) #Scala #Python #Data Ingestion #DevOps #BI (Business Intelligence) #Batch #Delta Lake
Role description
Overview: Overview A top-tier independent investment bank seeks a Senior Azure Databricks Engineer to join its internal Azure/Data Engineering team on a long-term contract basis. Prefer candidates who are able to come to NY office 1-2 days a week. Senior candidate must be polished in presentation style and able to do 50/50 split between managing people/projects as well as fingers on keyboard coding. Job Summary: We are seeking an experienced Azure Databricks Engineer to design, implement, and optimize modern Lakehouse architectures in a regulated, financial services environment. The ideal candidate blends deep expertise in Azure and Databricks, with proven experience enforcing governance and security at scale. Responsibilities: Architect, build, and maintain Azure-based Lakehouse solutions leveraging Databricks, Delta Lake, and Azure-native services. Implement medallion (bronze/silver/gold) architecture to ensure scalability, governance, and analytics agility. Develop and optimize data ingestion frameworks for both batch and streaming pipelines (PySpark, Spark SQL, etc). Configure and manage Unity Catalog, enforcing row- and column-level security, role-based access, and lineage tracking. Collaborate with cross-functional teams to integrate siloed systems into unified data platforms. Ensure solutions meet financial services regulatory requirements and deliver secure, compliant data environments. Optimize Spark workloads for performance and cost efficiency in large-scale environments. Support BI and analytics teams by delivering performant, governed semantic layers. Champion best practices for SDLC, CI/CD, testing, and DevOps in Azure Databricks ecosystems. Requirements: 7+ years in data engineering, with at least 3+ focused on Azure Databricks. Strong expertise in Azure services (Data Factory, SQL, Fabric). Hands-on experience implementing row- and column-level security via Unity Catalog. Deep understanding of investment banking or similarly regulated industries. 
Proficiency in Python, PySpark, Spark SQL, T-SQL. Job Type: Contract Pay: $120.00 - $135.00 per hour Work Location: Hybrid remote in New York, NY 10022