

New York Technology Partners
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Chicago, IL, on a contract basis. Requires 10+ years in data engineering, strong ETL and SQL skills, P&C insurance domain experience, and proficiency in cloud platforms. Excellent communication is essential.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
October 11, 2025
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Chicago, IL
-
Skills detailed
#GCP (Google Cloud Platform) #Data Modeling #DataStage #Informatica #Snowflake #Oracle #Big Data #AWS (Amazon Web Services) #Cloud #Datasets #Data Quality #Data Engineering #Computer Science #Azure #Predictive Modeling #Redshift #Documentation #Data Framework #Matillion #Data Integration #Talend #Data Warehouse #Python #Databricks #Data Management #MDM (Master Data Management) #Scala #Spark (Apache Spark) #Compliance #GDPR (General Data Protection Regulation) #Data Lake #Synapse #Data Pipeline #Hadoop #BigQuery #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Mapping #Databases #PostgreSQL #Data Science #ML (Machine Learning) #Monitoring #Data Governance #SQL Server
Role description
Title: Data Engineer - P&C Insurance
Location: Chicago, IL
Employment Type: Contract
Job Summary:
• P&C insurance domain experience.
• Strong skills in both data ETL and SQL.
• 10+ years of experience.
• Responsible for both development and testing.
• Excellent communication skills.
Key Responsibilities:
• Design, build, and maintain scalable and reliable ETL/ELT pipelines to ingest, transform, and integrate data from policy, claims, billing, and external insurance data sources (a minimal sketch of such a pipeline appears after this list).
• Collaborate with business stakeholders, actuaries, underwriters, and data scientists to translate P&C insurance domain requirements into robust data models.
• Develop and optimize data warehouses, data lakes, and cloud-based platforms (AWS/Azure/GCP) to support reporting and analytics.
• Work with structured and unstructured data, including exposure, risk, and claims data.
• Ensure data quality, governance, and lineage are maintained across data ecosystems.
• Collaborate with cross-functional teams to support predictive modeling, loss reserving, fraud detection, and pricing analytics.
• Automate data workflows and monitoring for high performance and reliability.
• Maintain documentation of data pipelines, dictionaries, and insurance data mappings.
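For illustration only: a minimal sketch of the kind of ETL pipeline described in the responsibilities above, assuming PySpark and using hypothetical source paths, column names, and output locations (none of these are specified in the posting).
from pyspark.sql import SparkSession, functions as F

# Hypothetical Spark session for a policy/claims integration job.
spark = SparkSession.builder.appName("pc-insurance-etl-sketch").getOrCreate()

# Ingest: placeholder raw policy and claims extracts.
policies = spark.read.parquet("s3://raw-zone/policies/")
claims = spark.read.parquet("s3://raw-zone/claims/")

# Transform: basic cleansing of the claims feed, then a policy-to-claims join
# keyed on a hypothetical policy_number column.
claims_clean = (
    claims
    .dropDuplicates(["claim_id"])
    .withColumn("loss_date", F.to_date("loss_date"))
    .filter(F.col("claim_amount") >= 0)
)

policy_claims = (
    policies
    .join(claims_clean, on="policy_number", how="left")
    .withColumn("load_ts", F.current_timestamp())
)

# Load: write a curated table for downstream reporting and analytics.
policy_claims.write.mode("overwrite").parquet("s3://curated-zone/policy_claims/")
A production pipeline would additionally cover the data quality, lineage, monitoring, and documentation responsibilities listed above.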
Required Skills & Qualifications:
• Bachelor's/Master's degree in Computer Science, Data Engineering, Information Systems, or related field.
• 10+ years of experience in data engineering roles, preferably in the P&C insurance domain.
• Strong hands-on experience with ETL tools (Informatica, Talend, DataStage, Matillion, etc.) or custom ETL using Python/Scala/Spark.
• Proficiency with SQL, data modeling, and relational databases (Oracle, SQL Server, PostgreSQL, etc.).
• Experience with cloud data platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake).
• Familiarity with insurance data standards (ACORD, ISO, policy, claims, billing datasets).
• Knowledge of big data frameworks (Hadoop, Spark, Databricks).
• Strong understanding of data governance, lineage, and master data management (MDM) in an insurance context.
• Excellent communication and collaboration skills.
Preferred Qualifications:
• Experience in Actuarial/Underwriting data integration.
• Familiarity with Guidewire, Duck Creek, or other P&C core platforms.
• Exposure to machine learning pipelines for predictive modeling in insurance.
• Knowledge of regulatory and compliance requirements for insurance data (NAIC, GDPR, HIPAA, SOX).