

Data Engineer – ADF, SSAS, Databricks [6-Month Contract]
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with strong experience in Azure Data Factory, Databricks, and SSAS, on a 6-month contract in Princeton, NJ (Hybrid). Candidates must be authorized to work in the U.S.; insurance domain experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
July 5, 2025
🕒 - Project duration
6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Documentation #SQL (Structured Query Language) #Databricks #MongoDB #PySpark #Azure SQL #Version Control #Azure Data Factory #Data Modeling #Data Framework #BI (Business Intelligence) #SSRS (SQL Server Reporting Services) #ADF (Azure Data Factory) #Data Pipeline #SSIS (SQL Server Integration Services) #Data Engineering #dbt (data build tool) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #SSAS (SQL Server Analysis Services) #Unit Testing #Azure Event Hubs #Scala #Data Lake #Azure #Agile
Role description
Note
• If shortlisted, we will contact you via email or WhatsApp; please respond promptly.
• This is a W2 position, not a C2C opportunity.
• Candidates must be authorized to work in the U.S. (No visa sponsorship provided).
Work Type: Contract (6 Months) | Location: Princeton, NJ (Hybrid)
Compensation: Hourly (Contact for details)
Visa Status: Must be authorized to work in the US (No sponsorship)
Job Summary
We are seeking an experienced Data Engineer to support actuarial and insurance analytics solutions. You’ll design and implement modern Azure BI pipelines and contribute to data modeling and transformation projects across the insurance domain.
Key Responsibilities
• Build and maintain data pipelines using Azure Data Factory and Databricks
• Develop Azure BI solutions (ADF, SSAS, dbt, SQL, Data Lake)
• Design ETL frameworks from staging to gold layer (a minimal sketch follows this list)
• Support actuarial data modeling and reporting layers
• Implement business rules and transformations using PySpark and U-SQL
• Collaborate with business teams to gather and translate requirements
• Work closely with offshore/onsite team members in Agile settings
• Perform unit testing, troubleshooting, and performance tuning
• Create and maintain documentation for processes and data workflows
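For illustration only, here is a minimal PySpark sketch of the kind of bronze-to-silver transformation these responsibilities describe in Databricks. All table names, columns, and the sample business rule are hypothetical placeholders, not the client's actual schema.

```python
# Minimal bronze -> silver sketch for a Databricks notebook (PySpark).
# Table names, columns, and the business rule are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Read raw policy records landed in the bronze layer.
bronze = spark.read.table("bronze.policy_raw")

# Cleanse on the way to silver: deduplicate, standardize types,
# and flag records that fail a sample business rule (missing premium).
silver = (
    bronze
    .dropDuplicates(["policy_id"])
    .withColumn("effective_date", F.to_date("effective_date", "yyyy-MM-dd"))
    .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
    .withColumn("premium_missing", F.col("premium").isNull())
)

# Persist as a managed Delta table for downstream gold-layer aggregation.
silver.write.mode("overwrite").saveAsTable("silver.policy_clean")
```

A gold-layer step would typically aggregate this cleansed table into reporting-ready facts that SSAS tabular models consume.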
Technical Requirements (Must-Have)
• Strong experience with Azure Data Factory, Databricks, SSAS, Azure SQL
• Hands-on with ETL/BI tools: dbt, SSIS, SSRS
• Experience with PySpark, MongoDB, and data lake architecture
• Familiarity with insurance or actuarial data structures
• Experience in staging/bronze/silver/gold layer modeling
• Proficient in creating and scheduling tabular models in Azure (see the refresh sketch after this list)
• Ability to build scalable, reusable pipelines and data frameworks
• Strong SDLC and Agile practices, version control familiarity
• Excellent documentation and communication skills
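As a hedged sketch of the tabular-model scheduling skill above: Azure Analysis Services exposes an asynchronous refresh REST API that can be called from Python or, more commonly, from an ADF Web activity. The server, model, region, and token below are placeholders, not details of this engagement.

```python
# Sketch: trigger an asynchronous refresh of an Azure Analysis Services
# tabular model via its REST API. All identifiers are placeholders.
import requests

SERVER = "myserver"          # placeholder AAS server name
MODEL = "ActuarialModel"     # placeholder tabular model name
REGION = "eastus"            # placeholder Azure region rollout
TOKEN = "<bearer-token>"     # acquire via a service principal (e.g., MSAL) in practice

url = f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes"
body = {"Type": "Full", "CommitMode": "transactional", "MaxParallelism": 2}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
# 202 Accepted: the refresh runs asynchronously; poll the Location header for status.
print(resp.status_code, resp.headers.get("Location"))
```

For recurring refreshes, this call is usually wired into a scheduled ADF pipeline trigger rather than run ad hoc.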
Preferred Experience
• Prior experience in insurance, actuarial analytics, or reinsurance data
• Knowledge of Metarisk, rating tools, and data hierarchies in insurance
• Exposure to Azure Event Hubs, Stream Analytics, or CI/CD tools
• Strong collaboration and communication with both technical and business users