Wall Street Consulting Services LLC

Azure Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer with 12+ years of experience, focusing on SQL, Azure Data Factory, and data modeling in the commercial insurance domain. Contract length and pay rate are unspecified; remote work is permitted.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
January 15, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Warren, NJ
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Azure Data Factory #Azure #Azure DevOps #Data Engineering #Data Modeling #ADF (Azure Data Factory) #Datasets #Data Pipeline #ADLS (Azure Data Lake Storage) #DevOps #JSON (JavaScript Object Notation) #SQL (Structured Query Language) #Synapse #Scala #DataOps #Azure SQL #GIT
Role description
Azure Data Engineer – SQL, Azure, JSON, ADF (Commercial Insurance)

Position Overview
We are seeking a highly skilled Data Engineer with strong experience in SQL, the Azure data platform, and Azure Data Factory, preferably within the insurance domain. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines, integrating data from multiple insurance systems, and enabling analytical and reporting capabilities for underwriting, claims, policy, billing, and risk management teams.

Required Skills & Experience
• Minimum of 12 years of experience in Data Engineering or related roles.
• Strong expertise in:
  • SQL, T-SQL, PL/SQL
  • Azure Data Factory (ADF)
  • Azure SQL, Synapse, ADLS
• Data modeling for relational and analytical systems.
• Hands-on experience with ETL/ELT development and complex pipeline orchestration (a brief illustration follows below).
• Experience with Azure DevOps, Git, CI/CD pipelines, and DataOps practices.
• Understanding of insurance domain datasets: policy, premium, claims, exposures, brokers, reinsurers, and underwriting workflows.
• Strong analytical and problem-solving skills, with the ability to handle large datasets and complex transformations.
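To give a flavor of the SQL + JSON work this role describes, here is a minimal T-SQL sketch that flattens a JSON claims payload into a relational shape using OPENJSON (available in Azure SQL and SQL Server 2016+). The payload shape, table alias, and column names are hypothetical examples, not taken from the posting.

```sql
-- Hypothetical claims payload; field names and values are illustrative only.
DECLARE @claims NVARCHAR(MAX) = N'[
  {"claimId": "C-1001", "policyId": "P-77", "lossDate": "2025-11-02", "reserve": 12500.00},
  {"claimId": "C-1002", "policyId": "P-81", "lossDate": "2025-12-14", "reserve": 4300.50}
]';

-- Project each JSON object into typed relational columns via OPENJSON ... WITH.
SELECT c.claimId, c.policyId, c.lossDate, c.reserve
FROM OPENJSON(@claims)
WITH (
    claimId  VARCHAR(20)   '$.claimId',
    policyId VARCHAR(20)   '$.policyId',
    lossDate DATE          '$.lossDate',
    reserve  DECIMAL(12,2) '$.reserve'
) AS c;
```

In practice, a transformation like this would typically run as one step inside an ADF-orchestrated pipeline, landing the flattened rows in Azure SQL or Synapse for downstream underwriting and claims reporting.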