

Wall Street Consulting Services LLC
Senior Azure Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Azure Data Engineer on a contract basis, hybrid in Warren, NJ, requiring 10+ years of IT experience, 5+ years with Azure Data Services, and P&C insurance domain expertise. Pay rate is "unknown."
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 10, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Warren, NJ
🧠 - Skills detailed
#Data Ingestion #SQL Queries #Python #API (Application Programming Interface) #DevOps #JSON (JavaScript Object Notation) #Scripting #Data Engineering #Cloud #Data Architecture #Azure Data Factory #Databricks #Datasets #Data Lake #Microsoft Azure #SQL (Structured Query Language) #Azure Synapse Analytics #GIT #Scala #Synapse #Deployment #Azure DevOps #ADF (Azure Data Factory) #Data Processing #Azure #Data Pipeline #ETL (Extract, Transform, Load)
Role description
Job description:
Job Title: Senior Azure Data Engineer
Location: Hybrid role in Warren, NJ
Employment Type: Contract
Experience Level: 10+ Years
Job Summary: We are seeking a highly skilled Senior Azure Data Engineer to design, develop, and optimize data pipelines and solutions within the Microsoft Azure ecosystem. The ideal candidate will have strong experience with Azure Data Factory (ADF), SQL, JSON data processing, and Python-based ETL pipelines, along with an understanding of Property & Casualty (P&C) insurance data models and business processes.
Key Responsibilities:
• Design, build, and maintain data ingestion, transformation, and integration pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics.
• Develop scalable ETL/ELT processes to extract and transform structured and semi-structured data (JSON, CSV, Parquet) from diverse data sources (a brief Python sketch of this kind of JSON flattening follows this list).
• Write complex SQL queries, stored procedures, and performance tuning scripts for data validation, cleansing, and reporting.
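The JSON-handling work described above usually comes down to flattening nested records into tabular rows before they are loaded. Below is a minimal, hedged Python sketch of that step: the claim and policy field names are invented for illustration and are not taken from this posting, and in practice the same transform would typically run in a Databricks notebook or an ADF data flow writing to a Data Lake path rather than a local CSV file.

```python
import csv
import json

# Hypothetical input: a JSON export of P&C claim records in which each record
# nests policy and loss details. All field names here are illustrative only.
RAW = """
[
  {"claimId": "C-1001",
   "policy": {"policyNumber": "P-77", "lineOfBusiness": "Auto"},
   "loss": {"date": "2025-11-02", "reserve": 1250.00}},
  {"claimId": "C-1002",
   "policy": {"policyNumber": "P-91", "lineOfBusiness": "Homeowners"},
   "loss": {"date": "2025-11-15", "reserve": 8400.50}}
]
"""


def flatten(record: dict) -> dict:
    """Project a nested claim record into a flat, tabular row."""
    return {
        "claim_id": record["claimId"],
        "policy_number": record["policy"]["policyNumber"],
        "line_of_business": record["policy"]["lineOfBusiness"],
        "loss_date": record["loss"]["date"],
        "reserve_amount": record["loss"]["reserve"],
    }


def main() -> None:
    rows = [flatten(r) for r in json.loads(RAW)]
    # Write the flattened rows to CSV; a real pipeline would land these in a
    # Data Lake zone or a Delta table instead of a local file.
    with open("claims_flat.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    main()
```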
Required Skills & Qualifications:
• 10+ years of IT industry experience and 8+ years of hands-on experience in Data Engineering and Analytics.
• 5+ years of experience working with Microsoft Azure Data Services (ADF, Synapse, Data Lake, Databricks).
• Proficiency in SQL development and optimization across large datasets (a brief SQL validation sketch follows this list).
• Strong experience with Python scripting for ETL, API integration, and data processing.
• Experience working with JSON and semi-structured data within Azure and cloud environments.
• Proven experience in P&C Insurance domain (policy, claims, premium, underwriting data).
• Strong understanding of data architecture, pipelines, and data warehousing concepts.
• Familiarity with Azure DevOps, Git, and CI/CD pipelines for data deployments.
• Excellent problem-solving, analytical, and communication skills.
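For the SQL development and optimization requirement, validation checks are often scripted in Python so they can run from the same Azure DevOps CI/CD flow as the pipelines. The sketch below is hypothetical: the connection string, table name, and column names are placeholders rather than details from this posting, and it simply counts rows that fail two illustrative data-quality rules.

```python
import pyodbc

# Hypothetical connection string for an Azure SQL / Synapse SQL endpoint; the
# server, database, and authentication settings are placeholders.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-synapse.sql.azuresynapse.net;"
    "DATABASE=claims_dw;"
    "Authentication=ActiveDirectoryInteractive;"
)

# A simple validation query: claims with a negative reserve or a missing
# policy reference, bucketed by line of business. Table and columns are invented.
VALIDATION_SQL = """
SELECT line_of_business,
       COUNT(*) AS suspect_rows
FROM   dbo.claims_flat
WHERE  reserve_amount < 0
   OR  policy_number IS NULL
GROUP BY line_of_business;
"""


def run_validation() -> None:
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        cursor.execute(VALIDATION_SQL)
        for line_of_business, suspect_rows in cursor.fetchall():
            print(f"{line_of_business}: {suspect_rows} suspect rows")
    finally:
        conn.close()


if __name__ == "__main__":
    run_validation()
```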