Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in P&C Insurance, based onsite in Piscataway, NJ. Contract length exceeds 6 months, with a pay rate of $75-$80/hour. Requires 15+ years in data architecture, expertise in Azure Data Factory, Informatica PowerCenter, and knowledge of the P&C insurance domain.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date discovered
June 5, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
1099 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Piscataway, NJ 08854
-
🧠 - Skills detailed
#Spark (Apache Spark) #SQL (Structured Query Language) #Python #ETL (Extract, Transform, Load) #Cloud #Informatica PowerCenter #Data Lake #Hadoop #Big Data #SQL Server #Scala #Data Access #Azure #Data Architecture #Informatica #Metadata #Data Pipeline #ADF (Azure Data Factory) #Azure Data Factory
Role description
JOB TITLE: DATA ARCHITECT – P&C INSURANCE (ONSITE)
LOCATION: Piscataway, New Jersey
CONTRACT: 1099, including benefits
CITIZENSHIP REQUIREMENT: U.S. citizens or green card holders only

JOB DESCRIPTION:
We are seeking an experienced DATA ARCHITECT to support a large-scale data modernization initiative for a Property & Casualty (P&C) insurance client. This is an onsite role based in Piscataway, NJ, and requires hands-on experience with Azure Data Factory (ADF), Informatica PowerCenter, and Guidewire CDA. You'll work closely with stakeholders and technical teams to design and implement modern, scalable, and efficient data solutions using cloud-native and big data tools.

RESPONSIBILITIES:
- Design and implement enterprise metadata-driven data pipelines using ADF and Informatica PowerCenter (see the illustrative PySpark sketch after this section)
- Build and manage an Operational Data Store (ODS) sourced from Azure Data Lake
- Integrate Guidewire Claim Data Access (CDA) into the existing architecture for advanced reporting and analytics
- Optimize and troubleshoot complex data workflows and pipelines
- Leverage Python, T-SQL, and Spark to develop custom processes
- Design serverless solutions using Azure Functions for scheduled and event-driven workflows (see the Azure Functions sketch after this section)
- Ensure cloud cost-efficiency and performance optimization
- Provide guidance on Hadoop-based big data solutions where applicable
- Collaborate with cross-functional teams and communicate solutions clearly to stakeholders

REQUIRED SKILLS:
- 15+ years of experience in data architecture, engineering, or ETL development
- Strong hands-on expertise with Azure Data Lake, ADF, and SQL Server
- Advanced experience with Informatica PowerCenter for ETL
- Proficiency in Python, T-SQL, and Spark
- Hands-on experience building ODS systems from data lakes
- Knowledge of the P&C insurance domain
- Proven ability to troubleshoot and optimize data pipelines
- Excellent communication skills (verbal and written)
- Experience with cloud cost optimization strategies

NICE TO HAVE:
- Familiarity with Guidewire
- Experience with Hadoop-based solutions

HOW TO APPLY:
Submit your resume and contact information through Indeed. We're looking to fill this role quickly, so apply today if you're ready for a new challenge with a dynamic and impactful project.

#TRFGroupInc #TRFTechnologies #PenmarkGlobalGroupInc #WeAreHiring #HiringNow #JoinUs #JoinOurTeam #JoinOurJourney #JoinOurMission

Job Types: Full-time, Contract
Pay: $75.00 - $80.00 per hour
Benefits: 401(k), paid time off
Work Location: In person
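To give candidates a concrete feel for the "metadata-driven pipeline" and ODS work described above, here is a minimal PySpark sketch of the pattern: a control table drives which lake paths load into which ODS tables. This is an illustration only, not the client's actual design; all paths, table names, and the metadata layout are hypothetical.

```python
# Illustrative only: a minimal metadata-driven load from a data lake into ODS tables.
# All paths, table names, and the control-table layout are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ods-load-sketch").getOrCreate()

# Hypothetical control metadata: one entry per source entity to ingest.
pipeline_metadata = [
    # (source_path, ods_table, load_mode)
    ("abfss://lake@account.dfs.core.windows.net/raw/claims", "ods.claims", "overwrite"),
    ("abfss://lake@account.dfs.core.windows.net/raw/policies", "ods.policies", "append"),
]

for source_path, ods_table, load_mode in pipeline_metadata:
    df = spark.read.parquet(source_path)  # read raw Parquet from the data lake
    df = df.dropDuplicates()  # example cleanup step before landing
    df.write.mode(load_mode).saveAsTable(ods_table)  # land in the ODS schema
```

In a production version, the control entries would typically live in a database or config store rather than in code, so new sources can be onboarded without redeploying the pipeline.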
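Likewise, the serverless responsibility mentions Azure Functions for scheduled workflows. A minimal timer-triggered function in the Azure Functions Python v2 programming model might look like the following; the schedule, function name, and function body are placeholder assumptions, not the client's setup.

```python
# Illustrative only: a scheduled Azure Function (Python v2 programming model)
# that could kick off a nightly pipeline run. Schedule and body are placeholders.
import logging

import azure.functions as func

app = func.FunctionApp()

# NCRONTAB format: second minute hour day month day-of-week -> 02:30 every day.
@app.timer_trigger(schedule="0 30 2 * * *", arg_name="timer", run_on_startup=False)
def nightly_ods_refresh(timer: func.TimerRequest) -> None:
    if timer.past_due:
        logging.warning("Timer is past due; the previous run may have overlapped.")
    # In a real solution, this is where an ADF pipeline run would be triggered,
    # for example via the Azure Data Factory SDK or REST API.
    logging.info("Nightly ODS refresh triggered.")
```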