ValueMomentum

Technical Lead

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Technical Lead specializing in Azure, with a contract length of "unknown" and a pay rate of "unknown." The position is on-site in NYC, and candidates must have 10-12 years of experience in P&C insurance, Azure Data Factory, and advanced SQL.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 30, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#Jira #Migration #Data Warehouse #ADLS (Azure Data Lake Storage) #Databases #ETL (Extract, Transform, Load) #Big Data #Azure Synapse Analytics #Data Accuracy #Storage #DevOps #Azure Data Factory #Synapse #Deployment #ADF (Azure Data Factory) #Azure #Data Migration #Programming #Data Ingestion #Azure SQL #PySpark #Data Integration #BI (Business Intelligence) #Data Processing #Agile #Spark (Apache Spark) #SQL (Structured Query Language) #Data Lake #Microsoft Power BI #Stories #Scrum
Role description
Job Title: Tech Lead, Azure
Location: NYC
Mode of work: Work from Office
Experience: 10 to 12 years
Primary skills: Property and Casualty (P&C) insurance, Azure Data Factory, Azure Synapse, Azure Data Lake, Advanced SQL
Secondary skills: T-SQL, Performance Optimization

About the job

We are seeking a highly skilled Azure Data & BI Engineer with expertise in Azure Data Factory (ADF) and Azure Synapse Analytics to join our dynamic team. This role involves integrating seamlessly with Internal Development Platforms (IDP) and other tools to enhance the developer experience on Azure. The ideal candidate will have a strong background in data integration, ETL processes, and data warehousing, and will design, develop, and maintain data solutions that support our business intelligence and analytics initiatives.

Responsibilities
• Design and develop ETL processes.
• Write SQL and PySpark programs.
• Create simple and complex pipelines using ADF.
• Work with other Azure stack modules such as Azure Data Lake and SQL DW.
• Work comfortably with large volumes of data.
• Understand the business requirements behind data flow processes.
• Understand the requirements, functionality, and technical specifications in project documents.
• Develop mapping documents and source-to-target transformation business rules per scope and requirements.
• Communicate project status continuously, both formally and informally.
• Follow the JIRA story process for SQL development activities.

Requirements

Must have:
• Domain expertise in Property and Casualty (P&C) insurance, with a solid understanding of industry-specific data, workflows, and business processes.
• 7 to 10 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases, and data warehouse solutions.
• Extensive hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse, Azure SQL DB, etc.
• Experience with datasets, DataFrames, Azure Blob Storage, and Storage Explorer.
• Proficiency in Power BI, with hands-on experience in SQL performance tuning for optimized data retrieval and reporting.
• Demonstrated expertise in functional data validation, ensuring data accuracy and integrity across business processes.
• Strong working knowledge of DevOps and CI/CD deployments.
• A good understanding of Agile/Scrum methodologies.
• Strong attention to detail in high-pressure situations.
• Experience in the insurance (e.g., underwriting, claims, policy issuance) or financial industry preferred.
• Excellent problem-solving skills and the ability to troubleshoot complex issues in distributed systems.
• Effective communication and collaboration skills, with the ability to interact with stakeholders at all levels.