Talentmatics

Azure Data Engineer (C2C Role) - Property and Casualty (P&C) Insurance

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer (C2C) focused on Property and Casualty insurance, requiring 10-12 years of experience. Located in Cheshire, CT, it offers a competitive pay rate and demands expertise in Azure Data Factory, Synapse, and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 9, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Corp-to-Corp (C2C)
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cheshire, CT
-
🧠 - Skills detailed
#Scala #Spark (Apache Spark) #Azure Data Factory #PySpark #Big Data #Azure Synapse Analytics #Data Lake #Datasets #Data Ingestion #Microsoft Power BI #SQL (Structured Query Language) #Data Processing #ETL (Extract, Transform, Load) #BI (Business Intelligence) #DevOps #Azure #Synapse #Storage #ADF (Azure Data Factory) #ADLS (Azure Data Lake Storage) #Data Engineering #Azure SQL #Scrum #Agile
Role description
🚀 Hiring: Tech Lead – Azure Data Engineer (C2C Role) - Property and Casualty (P&C) Insurance
📍 Location: Cheshire, CT (Work From Office)
🧑‍💻 Experience Required: 10–12 Years
🏢 Client: ValueMomentum
💼 Engagement: C2C

About ValueMomentum
ValueMomentum is a leading solutions provider for the global Property & Casualty (P&C) insurance industry, backed by deep domain expertise and advanced technology solutions. We support major insurers with scalable, modern data and digital platforms.

Role Overview
We are seeking an experienced Azure Data Engineer / Tech Lead with strong expertise in Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake. You will be responsible for designing, developing, and maintaining advanced data engineering solutions to support analytics and BI initiatives. This role requires deep technical knowledge, strong SQL and ETL experience, and familiarity with Agile environments, DevOps practices, and P&C insurance domain data structures.

Required Skills
Primary Skills
• Azure Data Factory (ADF)
• Azure Synapse
• Azure Data Lake
• Advanced SQL
Secondary Skills
• T-SQL
• Performance Optimization
• Power BI (Preferred)
• PySpark (Preferred)

Responsibilities
• Design and develop ETL solutions for large-scale data processing.
• Build and optimize simple to complex pipelines using ADF.
• Work with the Azure data stack: ADLS, Azure SQL DW, Azure Functions, Synapse, etc.
• Write advanced SQL and PySpark code.
• Develop transformation logic, mapping rules, and source-to-target specifications.
• Perform functional data validation to ensure accuracy and integrity.
• Work with business stakeholders to interpret data and ETL requirements.
• Manage DevOps/CI-CD pipelines and participate in Agile/Scrum ceremonies.
• Communicate project status and technical findings effectively.

Requirements
✔ 7–10+ years of experience in big data, data engineering, or data warehouse development
✔ Strong experience with Azure data services (ADF, ADLS, Synapse, Azure SQL DB, Blob Storage, Storage Explorer)
✔ Hands-on experience with ETL/ELT, data ingestion, data processing, and scalable pipelines
✔ Proficiency in SQL performance tuning on large datasets
✔ Power BI reporting experience preferred
✔ Experience with DevOps, CI/CD pipelines, and Agile methodology
✔ Must have domain experience in Property & Casualty insurance (Claims, Policy, Underwriting, etc.)
✔ Excellent communication, troubleshooting, and stakeholder collaboration skills