

Synergyassure Inc
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Azure Data Engineer in New Jersey, offering a long-term contract at a competitive pay rate. Requires 14+ years of experience, expertise in P&C Insurance, Azure Data Factory, Databricks, and DevOps tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Data Warehouse #Databricks #Cloud #Microsoft Power BI #Data Mart #SQL (Structured Query Language) #Azure #ADLS (Azure Data Lake Storage) #Azure Event Hubs #Data Engineering #Azure Data Factory #ETL (Extract, Transform, Load) #Data Modeling #Data Pipeline #Azure DevOps #Azure SQL #Data Ingestion #ADF (Azure Data Factory) #Deployment #BI (Business Intelligence) #Version Control #Azure Cloud #Git #Data Lake #DevOps #Data Quality #Agile #Data Analysis #Automation #Synapse #Python #Visualization
Role description
Job Title: Lead – Azure Data Engineer
Location: New Jersey
Long Term
Experience Level: 14+ Years
Must have: P&C Insurance, Azure DevOps, Azure Data Factory, Databricks
Job Summary:
We are seeking an experienced Data Engineer with strong expertise in Azure cloud services, Data Engineering practices, and DevOps automation, combined with a solid understanding of the Property & Casualty (P&C) Insurance domain. The ideal candidate will play a key role in designing, building, and optimizing data solutions to support underwriting, claims, and policy analytics initiatives.
Key Responsibilities:
• Demonstrate a solid understanding of, and preferably practical experience with, event-driven architectures, especially those involving event streaming; hands-on experience with Azure Event Hubs is a strong plus.
• Design and develop data pipelines and ETL processes using Azure Data Factory, Databricks, and related Azure services.
• Apply extensive professional experience with serverless computing architectures, specifically Azure Functions.
• Implement and maintain data models and data warehouses supporting P&C Insurance operations such as claims, policy, underwriting, and billing.
• Collaborate with business stakeholders and product teams to translate insurance data requirements into technical solutions.
• Utilize Azure DevOps for CI/CD automation, version control, and deployment of data solutions.
• Apply DevOps methodologies with practical experience in Azure DevOps (ADO), including configuring CI/CD pipelines, managing build releases, and automating deployments across development and production environments.
• A solid understanding of the medallion architecture and of templatized pipelines for data estates is desirable.
• Ensure data quality, governance, and lineage across multiple data systems and reporting environments.
• Work closely with data analysts, actuaries, and business teams to deliver insights and analytical solutions.
• Develop scripts and automation to optimize data ingestion, transformation, and validation workflows.
Required Skills & Experience:
• 12+ years of experience in data engineering, data analytics, or related fields.
• Strong knowledge of P&C Insurance processes (Claims, Policy Administration, Billing, Underwriting, etc.).
• Hands-on experience with Azure Data Services: Azure Data Factory, Databricks, Synapse, ADLS, and Azure SQL DB.
• Proficiency in DevOps tools – Azure DevOps, Git, CI/CD pipelines.
• Strong skills in SQL, Python, and data modeling (dimensional, relational, or lakehouse).
• Experience in building and maintaining data pipelines, data marts, and data lake architectures.
• Familiarity with reporting tools such as Power BI or other visualization platforms is an advantage.
• Excellent communication and problem-solving skills with the ability to work in an agile team environment.
