

Data Engineer (Databricks & YAML Must)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer specializing in Azure, Databricks, and YAML, offering a remote contract position with a competitive pay rate. Key skills include Python, PySpark, Azure SQL, and Power BI. A degree in computer science and relevant certifications are preferred.
Country: United States
Currency: $ USD
Day rate: 400
Date discovered: June 19, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed
#Databases #Security #Azure SQL Database #Visualization #Database Administration #Microsoft Power BI #Data Extraction #ADF (Azure Data Factory) #Azure Cloud #Cloud #Data Analysis #Data Integrity #Spark (Apache Spark) #Data Engineering #Data Lake #ETL (Extract, Transform, Load) #Data Transformations #Database Performance #Data Processing #Azure #Azure Analysis Services #Azure SQL #Data Orchestration #Databricks #YAML (YAML Ain't Markup Language) #Python #PySpark #SQL (Structured Query Language) #Azure Data Factory #DAX #Big Data #Computer Science #Data Governance #BI (Business Intelligence) #Data Pipeline
Role description
Azure Data Engineer
Remote/Nearshore, EST hours
About the Role
MUST-HAVE SKILLS: Python, PySpark, Databricks, Azure, YAML, Databricks Asset Bundles
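The must-have list pairs YAML with Databricks Asset Bundles, which are defined in a `databricks.yml` file. A minimal sketch might look like the following; the bundle name, workspace host, notebook path, and cluster sizing are illustrative placeholders, not details taken from this posting:

```yaml
# Minimal databricks.yml sketch -- all names and values are placeholders.
bundle:
  name: example_etl_bundle

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1234567890.azuredatabricks.net  # placeholder workspace URL

resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py
          new_cluster:
            spark_version: 13.3.x-scala2.12
            node_type_id: Standard_DS3_v2  # Azure VM type, matching the Azure focus
            num_workers: 2
```

A bundle like this is typically validated and deployed with `databricks bundle validate` and `databricks bundle deploy -t dev`.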
Responsibilities
• Design, implement, and manage data solutions within the Azure cloud environment.
• Develop, analyze, and optimize Power Query scripts for efficient data extraction, transformation, and loading (ETL) processes.
• Design and implement robust Azure SQL databases to support diverse data needs.
• Ensure data integrity, security, and optimal performance within Azure SQL.
• Implement and execute SQL optimizations to enhance database performance and query efficiency.
• Apply a detailed understanding of data warehousing techniques and implementation guidelines.
• Analyze, architect, and implement data pipelines using Azure Data Factory for seamless data orchestration and movement.
• Design and maintain Azure Analysis Services models for efficient data analysis and reporting.
• Utilize Databricks with a focus on PySpark for big data processing and analytics.
• Develop and optimize PySpark scripts for large-scale data transformations and analytics.
• Apply Unity Catalog and other data governance techniques, backed by proven experience.
• Develop Power BI reports to requirements, including DAX optimizations.
• Implement optimizations for Power BI reports to ensure efficient and responsive data visualization.
Qualifications
• Bachelor's or master's degree in computer science, information technology, or a related field.
• Proven experience as a lead or senior data engineer.
• In-depth expertise in Azure cloud infrastructure and services.
• Strong proficiency in Power Query for ETL processes.
• Extensive knowledge of Azure SQL, SQL optimizations, and database administration.
• Hands-on experience with Azure Data Factory for data pipeline development.
• Proficiency in Azure Analysis Services and Databricks with a focus on PySpark.
• Strong experience with Power BI for data visualization and reporting.
• Strong problem-solving and analytical skills.
• Excellent communication and collaboration abilities.
Preferred Skills
• Certification in Azure Data Engineering or a related field.
• Familiarity with data governance and security best practices.
• Experience in implementing and maintaining data lakes on Azure.