

Data Architect with Property Insurance
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect in Property Insurance and requires reinsurance domain experience. Contract length and pay rate are unspecified. Key skills include Azure data ingestion, ADF, Databricks, Python, and Airflow. Insurance and actuarial knowledge are essential.
Country: United States
Currency: $ USD
Day rate: Unspecified
Date discovered: July 1, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed: #Databricks #Data Layers #ADF (Azure Data Factory) #Python #Spark (Apache Spark) #Data Ingestion #Data Modeling #Airflow #Deployment #Data Engineering #Azure #Data Architecture #Data Pipeline
Role description
Our understanding is that we are looking for a Data Engineer with reinsurance domain experience.
The primary skills are:
- Sound knowledge of data ingestion and data engineering on the Azure platform
- Hands-on experience in data pipeline design, development, testing, and deployment
- Proficiency in Azure ADF, Databricks, and notebooks, with Python or Spark skills
- Experience orchestrating data pipelines using Airflow or other scheduling tools
- Experience in analytics in the areas of policy administration, claims, billing, actuarial work, etc.
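For context on the orchestration skill above, the core pattern behind schedulers like Airflow is a DAG of dependent tasks executed in topological order. This is a minimal plain-Python sketch of that idea (no Airflow dependency; the task names and return values are hypothetical placeholders for real ADF/Databricks steps):

```python
# Sketch of DAG-style orchestration: tasks declare upstream dependencies
# and run in topological order. Task names are hypothetical examples.
from graphlib import TopologicalSorter

def ingest_policies():      # e.g. land raw policy files via an ADF copy activity
    return "policies_raw"

def transform_claims():     # e.g. a Databricks notebook curating claims data
    return "claims_curated"

def publish_actuarial():    # e.g. publish a mart consumed by actuarial tools
    return "actuarial_mart"

tasks = {
    "ingest_policies": ingest_policies,
    "transform_claims": transform_claims,
    "publish_actuarial": publish_actuarial,
}

# Each task maps to the set of tasks it depends on.
dag = {
    "ingest_policies": set(),
    "transform_claims": {"ingest_policies"},
    "publish_actuarial": {"transform_claims"},
}

def run(dag, tasks):
    """Execute tasks in an order that respects every dependency."""
    order = TopologicalSorter(dag).static_order()
    return [tasks[name]() for name in order]

results = run(dag, tasks)
```

In Airflow itself the same dependencies would be expressed with operators and `>>` chaining inside a DAG definition, but the execution-order guarantee is the same.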
Secondary skills:
- Insurance and reinsurance experience with good knowledge of policy administration and maintenance
- Understanding of actuarial routines and how data will be consumed as a service
- Integration with downstream applications for reporting and analytics, including policy agents, brokers, and clients
Ideally, the resource should also be skilled in:
- Familiarity with reinsurance broking data, including placements, treaty structures, client hierarchies, and renewal workflows
- Understanding of actuarial rating inputs and outputs, including exposure and experience data, layers, tags, and program structures
- Experience building data pipelines that support actuarial analytics, pricing tools, and downstream reporting for brokers and clients
- Azure Data Architect experience with data modeling
- Data ingestion and engineering on Azure using Databricks notebooks, ADF, etc.
- Experience dealing with underwriting in insurance
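To make the "layers" concept in the list above concrete, here is a simple illustration of how a recovery to a single excess-of-loss treaty layer is computed. The figures and function name are hypothetical and only illustrate the standard attachment/limit arithmetic:

```python
def layer_recovery(loss, attachment, limit):
    """Recovery to an excess-of-loss layer: the portion of the loss
    above the attachment point, capped at the layer limit."""
    return max(0.0, min(loss - attachment, limit))

# Hypothetical 5M xs 5M layer: an 8M gross loss recovers 3M from this layer.
recovered = layer_recovery(8_000_000, 5_000_000, 5_000_000)
```

Real reinsurance programs stack several such layers into a program structure, which is why the role calls for familiarity with layers, tags, and program hierarchies in the rating data.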