

Talentmatics
Azure Data Engineer (C2C Role) - Property and Casualty (P&C) Insurance
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer (C2C) focused on Property and Casualty insurance, requiring 10-12 years of experience. Located in Cheshire, CT, it offers a competitive pay rate and demands expertise in Azure Data Factory, Synapse, and SQL.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
December 9, 2025
Duration
Unknown
Location
On-site
Contract
Corp-to-Corp (C2C)
Security
Unknown
Location detailed
Cheshire, CT
Skills detailed
#Scala #Spark (Apache Spark) #Azure Data Factory #PySpark #Big Data #Azure Synapse Analytics #Data Lake #Datasets #Data Ingestion #Microsoft Power BI #SQL (Structured Query Language) #Data Processing #ETL (Extract, Transform, Load) #BI (Business Intelligence) #DevOps #Azure #Synapse #Storage #ADF (Azure Data Factory) #ADLS (Azure Data Lake Storage) #Data Engineering #Azure SQL #Scrum #Agile
Role description
Hiring: Tech Lead – Azure Data Engineer (C2C Role) - Property and Casualty (P&C) Insurance
Location: Cheshire, CT (Work From Office)
Experience Required: 10–12 Years
Client: ValueMomentum
Engagement: C2C
About ValueMomentum
ValueMomentum is a leading solutions provider for the global Property & Casualty (P&C) insurance industry, backed by deep domain expertise and advanced technology solutions. We support major insurers with scalable, modern data and digital platforms.
Role Overview
We are seeking an experienced Azure Data Engineer / Tech Lead with strong expertise in Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake. You will be responsible for designing, developing, and maintaining advanced data engineering solutions to support analytics and BI initiatives.
This role requires deep technical knowledge, strong SQL and ETL experience, and familiarity with Agile environments, DevOps practices, and P&C insurance domain data structures.
Required Skills
Primary Skills
• Azure Data Factory (ADF)
• Azure Synapse
• Azure Data Lake
• Advanced SQL
Secondary Skills
• T-SQL
• Performance Optimization
• Power BI (Preferred)
• PySpark (Preferred)
Responsibilities
• Design and develop ETL solutions for large-scale data processing.
• Build and optimize simple to complex pipelines using ADF.
• Work with the Azure data stack: ADLS, Azure SQL DW, Azure Functions, Synapse, etc.
• Write advanced SQL and PySpark code.
• Develop transformation logic, mapping rules, and source-to-target specifications.
• Perform functional data validation ensuring accuracy and integrity.
• Work with business stakeholders to interpret data and ETL requirements.
• Manage DevOps/CI-CD pipelines and participate in Agile/Scrum ceremonies.
• Communicate project status and technical findings effectively.
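The mapping-rule and validation responsibilities above can be sketched in plain Python. This is a simplified illustration only: field names such as policy_number and claim_amount are hypothetical, and a real pipeline would implement this logic with PySpark DataFrames or ADF mapping data flows rather than dicts.

```python
# Illustrative sketch: apply source-to-target field mappings to raw
# records and run a basic validation pass (required fields non-null).
# All field names here are hypothetical examples, not real schemas.

def apply_mapping(rows, mapping, required):
    """Rename source fields to target fields and validate each record.

    rows     -- list of dicts (source records)
    mapping  -- {source_field: target_field}
    required -- target fields that must be present and non-null
    """
    out, errors = [], []
    for i, row in enumerate(rows):
        target = {tgt: row.get(src) for src, tgt in mapping.items()}
        missing = [f for f in required if target.get(f) is None]
        if missing:
            errors.append((i, missing))  # record index + failed fields
        else:
            out.append(target)
    return out, errors

# Usage: map raw claim records to a target layout and flag bad rows.
raw = [
    {"pol_num": "P-100", "clm_amt": 2500.0},
    {"pol_num": None,    "clm_amt": 900.0},
]
mapping = {"pol_num": "policy_number", "clm_amt": "claim_amount"}
good, bad = apply_mapping(raw, mapping, required=["policy_number"])
print(len(good), bad)  # one row passes; the second fails on policy_number
```

The same pattern (declarative mapping plus a reject path for failed records) carries over directly to DataFrame-based transformations.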
Requirements
✔ 7–10+ years of experience in big data, data engineering, or warehouse development
✔ Strong experience with Azure data services (ADF, ADLS, Synapse, Azure SQL DB, Blob Storage, Storage Explorer)
✔ Hands-on experience with ETL/ELT, data ingestion, processing, and scalable pipelines
✔ Proficiency in SQL performance tuning and working with large datasets
✔ Power BI reporting experience preferred
✔ Experience with DevOps, CI/CD pipelines, and Agile methodology
✔ Must have domain experience in Property & Casualty Insurance (Claims, Policy, Underwriting, etc.)
✔ Excellent communication, troubleshooting, and stakeholder collaboration skills






