

ValueMomentum
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in NYC, with a contract length of "unknown" and a pay rate of "unknown." Requires 12+ years of experience, expertise in P&C insurance, Azure Data Factory, Azure Synapse, and Advanced SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, United States
-
🧠 - Skills detailed
#Data Engineering #Deployment #Scrum #Data Lake #ETL (Extract, Transform, Load) #Stories #ADF (Azure Data Factory) #Big Data #Data Warehouse #Data Processing #Agile #Storage #Microsoft Power BI #Azure #Programming #Data Integration #Synapse #Migration #Data Migration #ADLS (Azure Data Lake Storage) #Spark (Apache Spark) #PySpark #Data Accuracy #Data Ingestion #SQL (Structured Query Language) #DevOps #Databases #Azure Synapse Analytics #Azure Data Factory #Azure SQL #BI (Business Intelligence) #Jira
Role description
Job Title: Senior Data Engineer
Location: NYC
Mode of work: Work from Office
Experience: 12+ Years.
Primary skills: Property and Casualty (P&C) insurance, Azure Data Factory, Azure Synapse, Azure Data Lake, Advanced SQL
Secondary skills: T-SQL, Performance Optimization
About the job
We are seeking a highly skilled Azure Data & BI Engineer with expertise in Azure Data Factory (ADF) and Azure Synapse Analytics to join our dynamic team within our organization. This role involves integrating seamlessly with Internal Development Platforms (IDP) and other tools to enhance the developer's experience on Azure. The ideal candidate will have a strong background in data integration, ETL processes, and data warehousing. This role involves designing, developing, and maintaining data solutions that support our business intelligence and analytics initiatives.
Responsibilities
• Design and develop ETL processes.
• Strong experience writing SQL and PySpark programs.
• Create pipelines (simple and complex) using ADF.
• Work with other Azure stack services such as Azure Data Lake and Azure SQL Data Warehouse.
• Must be extremely well versed in working with large volumes of data.
• Understand the business requirements for data-flow processes.
• Understand requirements, functionality, and technical specification documents.
• Develop mapping documents and source-to-target transformation business rules per scope and requirements.
• Responsible for continuous formal and informal communication on project status.
• Good understanding of the JIRA story process for SQL development activities.
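As an illustrative sketch only (not part of the posting), the source-to-target mapping and data-validation work described above typically looks like the following in code. The field names (`policy_id`, `premium`) are hypothetical examples of P&C insurance data, and plain Python stands in for the PySpark/ADF stack the role actually uses:

```python
# Minimal sketch of an ETL-style transform with a functional
# data-validation step. Field names are hypothetical.
from decimal import Decimal, InvalidOperation

def transform_policies(rows):
    """Normalize raw policy records; reject rows that fail validation."""
    clean, rejected = [], []
    for row in rows:
        try:
            premium = Decimal(str(row["premium"]))
            if premium < 0:
                raise ValueError("negative premium")
            clean.append({
                "policy_id": row["policy_id"].strip().upper(),
                "premium": premium,
            })
        except (KeyError, ValueError, InvalidOperation) as exc:
            rejected.append((row, str(exc)))
    return clean, rejected

raw = [
    {"policy_id": " pc-1001 ", "premium": "1250.50"},
    {"policy_id": "PC-1002", "premium": "-10"},
]
good, bad = transform_policies(raw)
```

In a real ADF/Synapse pipeline, the same normalize-then-validate pattern would be expressed as PySpark DataFrame transformations or mapping data flows, with rejected rows routed to an error sink for review.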
Requirements (Must Have):
• Must have domain expertise in Property and Casualty (P&C) insurance, with a solid understanding of industry-specific data, workflows, and business processes.
• 7 to 10 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases, and data warehouse solutions.
• Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse, Azure SQL DB, etc.
• Experience with Datasets, DataFrames, Azure Blob Storage, and Storage Explorer.
• Proficient in Power BI, with hands-on experience in SQL performance tuning for optimized data retrieval and reporting.
• Demonstrated expertise in functional data validation, ensuring data accuracy and integrity across business processes.
• Well versed in DevOps and CI/CD deployments.
• Have a good understanding of Agile/Scrum methodologies.
• Strong attention to detail in high-pressure situations.
• Experience in the insurance (e.g., underwriting, claims, policy issuance) or financial industry preferred.
• Excellent problem-solving skills and the ability to troubleshoot complex issues in distributed systems.
• Effective communication and collaboration skills, with the ability to effectively interact with stakeholders at all levels.






