

Data Engineering
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineering position in Princeton, NJ, offering a hybrid work model. Contract length and pay rate are unspecified. Key skills include Azure Data Factory and Azure Databricks; insurance domain experience is preferred.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 28, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Princeton, NJ
Skills detailed
#BI (Business Intelligence) #Azure Analysis Services #Data Pipeline #MongoDB #SSRS (SQL Server Reporting Services) #ADLS (Azure Data Lake Storage) #Data Lake #Azure Databricks #Data Engineering #Azure Event Hubs #Data Extraction #Data Mining #Microsoft Azure #Scrum #Databricks #Normalization #Automation #SQL (Structured Query Language) #Data Warehouse #Spark (Apache Spark) #ADF (Azure Data Factory) #SSAS (SQL Server Analysis Services) #SQL Server #Data Layers #Databases #Big Data #Azure Data Factory #Azure SQL #PySpark #Azure #ETL (Extract, Transform, Load) #Agile #Azure Stream Analytics #SSIS (SQL Server Integration Services) #dbt (data build tool)
Role description
Hi,
I hope you are doing well.
Please let me know if you are available for the position below.
W2 only; insurance domain experience is preferred.
Job Title: Data Engineering
Location: Princeton, NJ
Remote or Onsite: Hybrid
Technical Skills:
Business Intelligence: Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (SSAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Azure Integration Runtime, Azure Event Hubs, Azure Stream Analytics, dbt
Database Technologies: Azure SQL, MongoDB, PySpark
Experience Required:
Data Engineering
Experience implementing Microsoft BI/Azure BI solutions such as Azure Data Factory, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, and SQL Server Reporting Services. Strong understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory, including moving data from flat files and SQL Server using U-SQL jobs.
• Expert in data warehouse development from inception through implementation and ongoing support, with a strong understanding of BI application design and development principles using normalization and de-normalization techniques. Experience developing the staging zone and the bronze, silver, and gold data layers.
• Good knowledge of implementing business rules for Extract, Transform, and Load (ETL) between homogeneous and heterogeneous systems using Azure Data Factory (ADF).
• Experience developing notebooks that move data from raw to stage and then to curated zones using Databricks.
• Experience developing complex Azure Analysis Services tabular databases, deploying them in Microsoft Azure, and scheduling cube processing through Azure Automation runbooks.
• Extensive experience developing tabular and multidimensional SSAS cubes, including aggregations, KPIs, measures, cube partitioning, and data mining models, and deploying and processing SSAS objects.
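The staging/bronze/silver/gold layering mentioned above is the medallion pattern; on Databricks it is usually implemented as PySpark notebooks writing to Delta tables. As a rough illustration only, the flow can be sketched in plain Python (the policy/premium fields and zone logic here are hypothetical, not part of this role's actual pipelines):

```python
# Minimal sketch of a medallion-style flow (bronze -> silver -> gold).
# In practice each step would be a Databricks notebook using PySpark;
# the record fields below are hypothetical insurance-flavored examples.

def to_bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{**row, "_source": "flat_file"} for row in raw_rows]

def to_silver(bronze_rows):
    """Silver: clean and conform (drop incomplete rows, cast types)."""
    silver = []
    for row in bronze_rows:
        if row.get("policy_id") and row.get("premium") is not None:
            silver.append({"policy_id": row["policy_id"],
                           "premium": float(row["premium"])})
    return silver

def to_gold(silver_rows):
    """Gold: aggregate into a reporting-ready shape."""
    return {"policy_count": len(silver_rows),
            "total_premium": sum(r["premium"] for r in silver_rows)}

raw = [{"policy_id": "P1", "premium": "1000"},
       {"policy_id": None, "premium": "250"},   # incomplete: dropped in silver
       {"policy_id": "P2", "premium": "500"}]
gold = to_gold(to_silver(to_bronze(raw)))
```

Each layer only reads from the one before it, so a bad cleaning rule can be fixed and replayed from bronze without re-ingesting source systems.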
Domain Knowledge (Preferred)
Experience with actuarial tools or insurance is preferred. The intent is familiarity with data terminology and the hierarchy of data in the insurance domain, specifically in the areas below:
• Familiarity with reinsurance broking data, including placements, treaty structures, client hierarchies, and renewal workflows.
• Understanding of actuarial rating inputs and outputs, including exposure and experience data, layers, tags, and program structures.
• Experience building data pipelines that support actuarial analytics, pricing tools, and downstream reporting for brokers and clients.
Team skills
• Team builder with strong analytical and interpersonal skills, good knowledge of the Software Development Life Cycle (SDLC), and proficiency in technical writing.
• Experience with Agile software development and the Scrum methodology.
• Ability to work independently and as part of a team to accomplish critical business objectives, with good decision-making skills in high-pressure, complex scenarios.
Thanks and regards,
Nishant Saurabh | Team Lead
Direct: (732) 426-0654 Ext. 574
Email: nishant@jobilitytalent.com
Jobility Talent Solutions
MBE, WOSB, SWAM, NMSDC
2 Lincoln Highway, Suite 401, Edison, New Jersey 08820| USA