

Azure Data Engineer with Databricks
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer with Databricks expertise, contract length unspecified, offering $60.00 - $65.00 per hour. Located in Iselin, NJ (Hybrid), it requires strong skills in Databricks, PySpark, SQL, and Capital Markets experience.
Country: United States
Currency: $ USD
Day rate: 520
Date discovered: August 20, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Iselin, NJ 08830
Skills detailed: #PySpark #Spark SQL #Spark (Apache Spark) #Delta Lake #Databricks #Deployment #Python #Data Engineering #Azure #Cloud #Security #Compliance #SQL (Structured Query Language) #ADLS (Azure Data Lake Storage) #Data Security #BI (Business Intelligence) #Triggers #Datasets #Azure cloud #ETL (Extract, Transform, Load) #Version Control #Azure Blob Storage #Data Pipeline #Storage #Scala
Role description
Role : Azure Data Engineer with Databricks Expertise
Location : Iselin, NJ (Hybrid)
Must have: Strong Databricks, PySpark, SQL, and data warehouse skills, with Capital Markets experience
Job Summary:
We are seeking a highly skilled Azure Data Engineer with strong expertise in SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting-edge data solutions.
Key Responsibilities:
1. Data Pipeline Development:
· Build and maintain scalable ETL/ELT pipelines using Databricks.
· Leverage PySpark/Spark and SQL to transform and process large datasets.
· Integrate data from multiple sources, including Azure Blob Storage, ADLS, and other relational/non-relational systems. (An illustrative PySpark sketch covering this item and the optimization and governance items below follows this list.)
2. Collaboration & Analysis:
· Work closely with multiple teams to prepare data for dashboards and BI tools.
· Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
3. Performance & Optimization:
· Optimize Databricks workloads for cost efficiency and performance.
· Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
4. Governance & Security:
· Implement and manage data security, access controls, and governance standards using Unity Catalog.
· Ensure compliance with organizational and regulatory data policies.
5. Deployment:
· Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks, and configurations across environments.
· Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
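For illustration only (not part of the posted requirements): a minimal PySpark sketch, assuming a hypothetical ADLS container, Unity Catalog table name, and analyst group, of the kind of pipeline, optimization, and governance work described in items 1, 3, and 4 above.

# Hypothetical sketch only: storage account, paths, table, and group names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# 1. Ingest raw CSV files from ADLS (the abfss path is illustrative).
raw = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("abfss://raw@examplestorageacct.dfs.core.windows.net/trades/")
)

# 2. Transform: type the trade date and aggregate notional per symbol and day.
daily = (
    raw.withColumn("trade_date", F.to_date("trade_timestamp"))
    .groupBy("symbol", "trade_date")
    .agg(F.sum("notional").alias("total_notional"), F.count("*").alias("trade_count"))
)

# 3. Load into a Unity Catalog managed Delta table (three-level name is assumed).
daily.write.format("delta").mode("overwrite").saveAsTable("main.capital_markets.daily_trade_summary")

# 4. Optimize the table layout for common query patterns (Delta OPTIMIZE / ZORDER).
spark.sql("OPTIMIZE main.capital_markets.daily_trade_summary ZORDER BY (symbol)")

# 5. Governance: grant read access to an assumed analyst group via Unity Catalog.
spark.sql("GRANT SELECT ON TABLE main.capital_markets.daily_trade_summary TO `bi-analysts`")

The three-level table name and the GRANT statement assume a Unity Catalog-enabled workspace, which the role description implies; outside Unity Catalog the table naming and permission model would differ.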
Technical Skills:
· Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Table Triggers, Delta Live Tables (DLT) pipelines, Databricks Runtime, etc.; a short DLT sketch follows this list).
· Proficiency in Azure Cloud Services.
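Likewise illustrative only: a minimal Delta Live Tables (DLT) sketch in Python, assuming a hypothetical raw trades table, showing the declarative pipeline style referenced above (the `spark` session is provided by the Databricks runtime).

# Hypothetical DLT sketch: the source table name and data-quality rule are assumptions.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Cleaned trades ingested from an assumed raw Delta table.")
@dlt.expect_or_drop("valid_notional", "notional > 0")  # drop rows failing the quality rule
def trades_clean():
    return (
        spark.read.table("main.capital_markets.trades_raw")
        .withColumn("trade_date", F.to_date("trade_timestamp"))
    )

@dlt.table(comment="Daily notional per symbol, built from the cleaned table.")
def daily_trade_summary():
    return (
        dlt.read("trades_clean")
        .groupBy("symbol", "trade_date")
        .agg(F.sum("notional").alias("total_notional"))
    )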
Regards,
Mohit
LinkedIn: https://www.linkedin.com/in/mohit-saini-b21b73230/
Job Type: Contract
Pay: $60.00 - $65.00 per hour
Work Location: In person