

Azure Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Engineer in Blue Bell, PA (Hybrid – 3 Days Onsite) with a focus on the commercial insurance domain. It requires expertise in Databricks, Azure Data Factory, PySpark, and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
June 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Blue Bell, PA
🧠 - Skills detailed
#Databases #Security #Azure SQL Database #Programming #Data Management #ADF (Azure Data Factory) #Spark (Apache Spark) #Compliance #Data Engineering #Data Lake #ETL (Extract, Transform, Load) #Azure Databricks #Storage #Data Processing #Azure #Azure SQL #Databricks #Data Science #Python #PySpark #SQL (Structured Query Language) #Azure Data Factory #Big Data #Computer Science #Data Governance #Data Quality #Data Pipeline #Scala
Role description
Job Title: Azure Data Engineer
Location: Blue Bell, PA (Hybrid – 3 Days Onsite)
Interview Process: Final round will be in person
Job Summary
We are seeking a skilled Azure Data Engineer to join our IT department. The role focuses on leveraging Databricks, Azure Data Factory, and related toolsets to enhance systems integration within the commercial insurance domain. The ideal candidate combines strong technical problem-solving skills with deep Databricks expertise and a solid grounding in data management and engineering principles.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Databricks (a minimal sketch follows this list).
• Ensure data quality, integrity, and consistency across all pipelines.
• Integrate data from multiple sources, including Azure Data Lake, SQL databases, and APIs.
• Leverage Databricks to integrate with different Large and Small Language Models.
• Collaborate with data scientists, analysts, and stakeholders to gather and understand data requirements.
• Develop and maintain ETL processes using Azure Databricks.
• Implement Data Quality, Audit, Balance, and Control measures in workflows.
• Optimize data processing performance and apply best practices for storage and retrieval.
• Ensure security and compliance with industry standards and data governance policies.
• Provide technical support, training, and guidance to team members and stakeholders.
• Document data engineering processes, workflows, and best practices.
• Generate reports and dashboards to provide insights into pipeline performance and data quality.
• Stay updated with industry trends and identify opportunities for process improvements.
• Demonstrate commitment to the Company’s Code of Business Conduct and Ethics.
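To make the pipeline responsibilities above concrete, below is a minimal PySpark sketch of the kind of workflow the role describes: extract from Azure Data Lake, a light transform, a simple balance-and-control audit, and a Delta load. The storage path, table names, and columns are illustrative assumptions, not details taken from this posting.

```python
# Minimal sketch of a Databricks ETL pipeline with a balance-and-control
# check. All paths, schemas, and table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Extract: read raw policy records from Azure Data Lake (hypothetical container/path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/policies/")

# Transform: drop records without a key and normalize types.
clean = (
    raw.filter(F.col("policy_id").isNotNull())
       .withColumn("premium", F.col("premium").cast("decimal(12,2)"))
       .withColumn("effective_date", F.to_date("effective_date"))
)

# Balance and control: record source/target row counts for auditing
# before the curated data is published.
source_rows, target_rows = raw.count(), clean.count()
audit = spark.createDataFrame(
    [(source_rows, target_rows, source_rows - target_rows)],
    "source_rows long, target_rows long, rejected_rows long",
)
audit.write.mode("append").saveAsTable("audit.policy_load_control")  # hypothetical audit table

# Load: publish the curated data as a Delta table.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.policies")
```

Recording row counts on both sides of the transform is one common way to implement the "Balance and Control" measure named above; rejected-row tracking and schema checks would layer onto the same pattern.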
Required Qualifications
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer with a focus on Azure Databricks.
• Strong knowledge of Azure Databricks and its ecosystem.
• Experience with data pipeline and workflow management tools.
• Proficiency in PySpark and in programming languages such as Python, Scala, and SQL.
• Familiarity with big data technologies such as Spark.
• Experience with Azure services, including Azure Data Lake and Azure SQL Database.
• Knowledge of data warehousing concepts and best practices.
• Excellent problem-solving skills and strong attention to detail.
• Strong communication and collaboration skills.
Preferred Qualifications
• Experience integrating Databricks with Large and Small Language Models (see the sketch after this list).
• Experience implementing Data Quality, Audit, Balance, and Control measures.
• Familiarity with the commercial insurance domain.
• Experience in developing and optimizing data solutions in a fast-paced environment.
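For the first preferred qualification, the sketch below shows one plausible way to call a language model from Databricks: querying a model serving endpoint through MLflow's deployments client. The endpoint name, prompt, and payload shape are assumptions for illustration, not details from this posting.

```python
# Hypothetical sketch: querying a Databricks model serving endpoint
# (e.g., a hosted LLM) via MLflow's deployments client.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

response = client.predict(
    endpoint="databricks-meta-llama-3-1-8b-instruct",  # assumed endpoint name
    inputs={
        "messages": [
            {"role": "user", "content": "Summarize this commercial policy endorsement."}
        ],
        "max_tokens": 128,
    },
)
print(response)
```

The same client call works against custom serving endpoints hosting smaller task-specific models, which is presumably what "Small Language Models" refers to here.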