

Senior Azure Data Engineer – Insurance Analytics
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Azure Data Engineer in Insurance Analytics, working a hybrid schedule in Princeton, NJ. The contract length is unspecified, and the listed day rate is $640 USD. It requires 5+ years of experience in Azure BI and data engineering, plus insurance domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
640
🗓️ - Date discovered
July 5, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Princeton, NJ
🧠 - Skills detailed
#Normalization #Documentation #SQL (Structured Query Language) #Scrum #Databricks #Deployment #MongoDB #PySpark #Azure SQL #Data Layers #Data Warehouse #Azure Data Factory #Databases #Automation #Azure Stream Analytics #BI (Business Intelligence) #SSRS (SQL Server Reporting Services) #ADF (Azure Data Factory) #Data Pipeline #SSIS (SQL Server Integration Services) #Azure Databricks #Data Engineering #dbt (data build tool) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #ADLS (Azure Data Lake Storage) #SSAS (SQL Server Analysis Services) #Data Mining #Data Ingestion #Azure Event Hubs #Big Data #Data Lake #Azure Analysis Services #Microsoft Azure #Azure #Agile
Role description
Job Summary: We are seeking a highly skilled Senior Azure Data Engineer with deep expertise in Azure BI technologies and strong experience in data engineering and data warehouse development. The ideal candidate will be well-versed in Microsoft Azure data services and have a working knowledge of insurance or actuarial data, particularly in the reinsurance and analytics space. This role requires an individual capable of driving end-to-end data solutions from ingestion to insights while collaborating in an Agile team environment.
Location: Princeton, NJ (Hybrid, 3 Days Onsite Per Week)
Technical Skills Required
Business Intelligence & Big Data:
• Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (AAS)
• Azure Data Lake Analytics, Azure Data Lake Store (ADLS)
• Azure Integration Runtime, Azure Event Hubs, Azure Stream Analytics
• dbt (data build tool)
Databases:
• Azure SQL
• MongoDB
• PySpark (a brief ingestion sketch follows this list)
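As a rough, non-authoritative illustration of how the tools above fit together, the sketch below shows a PySpark ingestion step as it might appear in an Azure Databricks notebook: raw CSV files are read from ADLS Gen2 and landed as a Delta table. The storage account, container, paths, and table names are placeholders, and cluster access to the storage account is assumed to be configured already.

```python
# Minimal ingestion sketch for an Azure Databricks notebook (PySpark).
# The storage account, container, paths, and table names are hypothetical,
# and ADLS authentication is assumed to be configured on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/policies/"

# Read raw CSV files from Azure Data Lake Storage (ADLS Gen2)
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Land the data as-is into a Delta table in the bronze layer
# (assumes a "bronze" schema/database already exists)
raw_df.write.format("delta").mode("append").saveAsTable("bronze.policies_raw")
```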
Experience Requirements
• Proven experience in implementing Microsoft BI/Azure BI solutions including ADF, Databricks, SSAS, SSIS, and SSRS.
• Strong understanding of Azure Big Data technologies (ADLS, ADF, Data Lake Analytics) and data movement using U-SQL jobs.
• Expertise in data warehouse development from inception to deployment, using normalization and denormalization techniques.
• Experience developing data lake layers (staging, bronze, silver, gold) for structured and unstructured data ingestion and curation (see the sketch after this list).
• Skilled in developing notebooks in Databricks to support ETL workflows and data transformation.
• Experienced in building and deploying Azure Analysis Services tabular models and automating scheduling using Azure Automation Runbooks.
• Deep knowledge in developing SSAS cubes (tabular and multidimensional), aggregations, KPIs, partitions, and data mining models.
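To make the data lake layering and Databricks ETL bullets above more concrete, here is a hedged sketch of a bronze-to-silver promotion; the table and column names are invented for illustration, not taken from this posting.

```python
# Hypothetical bronze -> silver promotion in a Databricks notebook (PySpark).
# All table and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze_df = spark.read.table("bronze.policies_raw")

silver_df = (
    bronze_df
    .dropDuplicates(["policy_id"])                               # drop reprocessed rows
    .withColumn("effective_date", F.to_date("effective_date"))   # enforce types
    .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
    .filter(F.col("policy_id").isNotNull())                      # basic data-quality gate
)

silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.policies")
```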
Preferred Domain Knowledge (Insurance/Reinsurance)
• Familiarity with reinsurance broking data including placements, treaty structures, client hierarchies, and renewals.
• Understanding of actuarial rating data inputs and outputs: exposures, experience data, layers, and tagging structures.
• Experience building data pipelines that support actuarial models, pricing tools, and client-facing reporting.
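Purely as an illustration of the kind of pipeline output the last bullet describes, the sketch below rolls a hypothetical exposure table up to treaty/layer grain for downstream actuarial models; every table and column name here is invented for the example.

```python
# Illustrative roll-up of exposure data to treaty/layer grain (PySpark).
# The "silver.exposures" table and all column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

exposures = spark.read.table("silver.exposures")

# Aggregate exposures by treaty and layer for pricing/actuarial consumption
exposure_summary = (
    exposures
    .groupBy("treaty_id", "layer")
    .agg(
        F.sum("exposure_amount").alias("total_exposure"),
        F.countDistinct("policy_id").alias("policy_count"),
    )
)

exposure_summary.write.format("delta").mode("overwrite").saveAsTable("gold.exposure_by_layer")
```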
Soft Skills & Team Collaboration
• Strong analytical and interpersonal skills with a clear understanding of the Software Development Life Cycle (SDLC).
• Experience working in Agile/SCRUM methodologies.
• Proven ability to work independently and collaboratively to meet business objectives.
• Effective communication and technical documentation skills.
• Excellent decision-making ability in high-pressure or complex situations.
Must-Have Qualifications
• 5+ years of experience with Microsoft Azure BI and Big Data technologies.
• Demonstrated experience with Azure Data Factory, Databricks, and SSAS.
• Hands-on experience in ETL design and development, especially across heterogeneous systems (see the sketch after this list).
• Practical knowledge of data warehouse architecture and multi-layer data lake modeling.
• Experience in the insurance or actuarial domain is a strong advantage.
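Since the role calls out ETL across heterogeneous systems, the final sketch below combines an Azure SQL source (read over JDBC) with a MongoDB source (read through the MongoDB Spark connector, which is assumed to be installed on the cluster). Connection strings, credential handling, and all table, collection, and column names are placeholders, not details from this posting.

```python
# Sketch: combining Azure SQL and MongoDB sources in one PySpark job.
# Connection details and all names are placeholders; the MongoDB Spark
# connector (v10+) is assumed to be installed on the cluster, and secrets
# would normally come from a secret scope or Key Vault, not literals.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Azure SQL source via JDBC
claims_sql = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-server.database.windows.net:1433;database=claims")
    .option("dbtable", "dbo.claims")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .load()
)

# MongoDB source via the MongoDB Spark connector
client_docs = (
    spark.read.format("mongodb")
    .option("connection.uri", "mongodb+srv://example-cluster.mongodb.net")
    .option("database", "brokerage")
    .option("collection", "clients")
    .load()
)

# Conform and join the two sources before landing a curated table
combined = claims_sql.join(client_docs, on="client_id", how="left")
combined.write.format("delta").mode("overwrite").saveAsTable("silver.claims_enriched")
```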