

100% Remote - Databricks Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a 100% remote Databricks Data Engineer position offering a long-term contract; the pay rate is not listed. Candidates should have 7-12 years of data engineering experience, strong proficiency in PySpark, Python, and SQL, and familiarity with the insurance industry.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 1, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: United States
Skills detailed:
#Data Lake #Data Analysis #Data Pipeline #Datasets #SQL (Structured Query Language) #ADF (Azure Data Factory) #Automation #Agile #Data Modeling #Compliance #Spark (Apache Spark) #PySpark #Python #Data Engineering #Scrum #Databricks #Scala #Azure #Azure Data Factory #Azure Databricks #ETL (Extract, Transform, Load) #Azure cloud #Computer Science #Delta Lake #Cloud
Role description
Job title: Databricks Data Engineer
Duration: Full-Time Permanent / Long-Term Contract
Location: Remote preferred (US or Canada)
Job Overview:
We are seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for developing and optimizing data pipelines, implementing robust data checks, and ensuring the accuracy and integrity of data flows. This role is critical in supporting data-driven decision-making, especially in the context of our insurance-focused business operations.
Key Responsibilities:
• Collaborate with data analysts, the reporting team, and business advisors to gather requirements and define data models that effectively support business needs.
• Develop and maintain scalable, efficient data pipelines to ensure seamless data flow across systems, and address any issues or bottlenecks in existing pipelines.
• Implement robust data checks to ensure the accuracy and integrity of data; summarize and validate large datasets to confirm they meet quality standards (a sketch of such a check follows this list).
• Monitor data jobs for successful completion; troubleshoot and resolve issues to minimize downtime and ensure continuity of data processes.
• Regularly review and audit data processes and pipelines to ensure compliance with internal standards and regulatory requirements.
• Work within Agile methodologies: Scrum, sprint planning, backlog refinement, etc.
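A minimal sketch of the kind of data-quality check described above, in PySpark. The table and column names (policies_raw, policy_id, premium) are hypothetical placeholders, not part of the listing; real checks would reflect the team's own schemas and standards.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("policies_raw")  # hypothetical source table

# Summarize and validate the dataset: row count, null keys,
# and out-of-range values in a single aggregation pass.
checks = df.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("policy_id").isNull().cast("int")).alias("null_policy_ids"),
    F.sum((F.col("premium") < 0).cast("int")).alias("negative_premiums"),
).first()

# Fail the job early so downstream consumers never see bad data.
if checks["null_policy_ids"] or checks["negative_premiums"]:
    raise ValueError(f"Data-quality check failed: {checks.asDict()}")
```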
Candidate Profile:
• 7-12 years of experience in a data engineering role working with Databricks and cloud technologies.
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Strong proficiency in PySpark, Python, and SQL.
• Strong experience in data modeling, ETL/ELT pipeline development, and automation.
• Hands-on experience with performance tuning of data pipelines and workflows.
• Proficient with Azure cloud components: Azure Data Factory, Azure Databricks, Azure Data Lake, etc.
• Experience with data modeling, ETL processes, Delta Lake, and data warehousing.
• Experience with Delta Live Tables, Auto Loader, and Unity Catalog (see the ingestion sketch after this list).
• Preferred: knowledge of the insurance industry and its data requirements.
• Strong analytical skills, with the ability to collect, organize, analyze, and disseminate large amounts of information with attention to detail and accuracy.
• Excellent communication and problem-solving skills to work effectively with diverse teams.
• Ability to work under tight deadlines.
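A hedged sketch of an Auto Loader ingestion stream writing to a Delta table, of the kind the profile above calls for. It assumes a Databricks runtime (the cloudFiles source is Databricks-specific); the paths and the bronze.claims table name are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally discover new files in a landing zone via Auto Loader.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/lake/_schemas/claims")  # hypothetical path
    .load("/mnt/lake/raw/claims")  # hypothetical landing zone
)

# Write to a Delta table; the checkpoint makes the stream restartable,
# and availableNow runs it as a batch-style incremental job.
(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/claims")
    .trigger(availableNow=True)
    .toTable("bronze.claims")  # hypothetical target table
)
```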