

Wall Street Consulting Services LLC
Azure Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer on a contract basis in Warren, NJ, requiring 12-20 years of experience in SQL, Azure, ADF, and Commercial Insurance. Key skills include ETL development and data modeling, with preferred Azure certifications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Warren, NJ
-
🧠 - Skills detailed
#ADLS (Azure Data Lake Storage) #DataOps #Spark (Apache Spark) #Data Modeling #DevOps #ML (Machine Learning) #PySpark #Data Engineering #Databricks #Azure Data Factory #Datasets #SQL (Structured Query Language) #Azure SQL #GIT #Scala #Synapse #AI (Artificial Intelligence) #ADF (Azure Data Factory) #Azure DevOps #Azure #Data Pipeline #ETL (Extract, Transform, Load)
Role description
Job Title: Data Engineer – SQL, Azure, ADF (Commercial Insurance)
Location: Warren, NJ (Hybrid)
Experience: 12-20 Years
Job Type: Contract
Required Skills: SQL, Azure, ADF, Commercial Insurance
Position Overview
We are seeking a highly skilled Data Engineer with strong experience in SQL, Azure Data Platform, and Azure Data Factory, preferably within the Insurance domain. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines, integrating data from multiple insurance systems, and enabling analytical and reporting capabilities for underwriting, claims, policy, billing, and risk management teams.
Required Skills & Experience
• 12+ years of experience in Data Engineering or related roles.
• Strong expertise in:
  • SQL, T-SQL, PL/SQL
  • Azure Data Factory (ADF)
  • Azure SQL, Synapse, ADLS
• Data modeling for relational and analytical systems.
• Hands-on experience with ETL/ELT development and complex pipeline orchestration (a minimal sketch follows this list).
• Experience with Azure DevOps, Git, CI/CD pipelines, and DataOps practices.
• Understanding of insurance domain datasets: policy, premium, claims, exposures, brokers, reinsurers, underwriting workflows.
• Strong analytical and problem-solving skills, with the ability to handle large datasets and complex transformations.
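To give a concrete sense of the orchestration work described above, here is a minimal sketch of triggering and monitoring an ADF pipeline run from Python using the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are all hypothetical placeholders, not details of this role.

```python
# Minimal sketch: trigger and monitor an ADF pipeline run.
# All resource names below are hypothetical.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"        # placeholder
RESOURCE_GROUP = "rg-insurance-data"         # hypothetical resource group
FACTORY_NAME = "adf-commercial-insurance"    # hypothetical data factory
PIPELINE_NAME = "pl_load_policy_claims"      # hypothetical pipeline

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, passing a run date the pipeline could use to
# scope incremental policy/claims extracts.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"run_date": "2025-12-10"},
)

# Poll until the run leaves the in-progress states.
status = "InProgress"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = client.pipeline_runs.get(
        RESOURCE_GROUP, FACTORY_NAME, run.run_id
    ).status

print(f"Pipeline {PIPELINE_NAME} finished with status: {status}")
```

In practice this kind of polling loop would usually live in a scheduler or CI/CD job rather than a script, but it shows the create-run/monitor pattern the role's orchestration work involves.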
Preferred Qualifications
• Experience with Databricks / PySpark for large-scale transformations (see the sketch after this list).
• Knowledge of Commercial Property & Casualty (P&C) insurance.
• Experience integrating data from Guidewire ClaimCenter/PolicyCenter, DuckCreek, or similar platforms.
• Exposure to ML/AI pipelines for underwriting or claims analytics.
• Azure certifications such as:
  • DP-203 (Azure Data Engineer)
  • AZ-900, AZ-204, AI-900
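As a rough illustration of the Databricks/PySpark work referenced above, the following minimal sketch joins policy and claims extracts to compute a per-policy loss ratio, a common underwriting metric. All storage paths, table layouts, and column names are invented for illustration.

```python
# Minimal PySpark sketch: join policy and claims data to compute a
# per-policy loss ratio. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-loss-ratio").getOrCreate()

# Hypothetical ADLS Gen2 locations for curated policy and claims extracts.
policies = spark.read.parquet(
    "abfss://curated@<storage>.dfs.core.windows.net/policy/"
)
claims = spark.read.parquet(
    "abfss://curated@<storage>.dfs.core.windows.net/claims/"
)

# Aggregate incurred losses per policy, then join back to written premium.
losses = claims.groupBy("policy_id").agg(
    F.sum("incurred_amount").alias("total_incurred")
)

loss_ratio = (
    policies.join(losses, "policy_id", "left")
    .withColumn("total_incurred", F.coalesce(F.col("total_incurred"), F.lit(0.0)))
    .withColumn("loss_ratio", F.col("total_incurred") / F.col("written_premium"))
    .select("policy_id", "written_premium", "total_incurred", "loss_ratio")
)

# Land the result where downstream underwriting reports can pick it up.
loss_ratio.write.mode("overwrite").parquet(
    "abfss://analytics@<storage>.dfs.core.windows.net/loss_ratio/"
)
```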