

Talent Groups
Azure Databricks Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Databricks Engineer in Louisville, Kentucky, requiring 12+ years of IT experience and on-site work from day one. Key skills include SQL, Python, Databricks, Snowflake, and Azure services such as Azure Data Factory and Azure Data Lake. The contract length is unspecified; the listed day rate is $480 USD.
Country: United States
Currency: $ USD
Day rate: 480
Date: November 7, 2025
Duration: Unknown
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Louisville, KY
Skills detailed: #Data Modeling #Databricks #Cloud #Version Control #Data Engineering #"ETL (Extract, Transform, Load)" #Python #Automation #ADF (Azure Data Factory) #Azure cloud #Azure Data Factory #Azure #SQL (Structured Query Language) #Data Quality #Azure Databricks #Snowflake #Data Lake
Role description
This position is based in Louisville, Kentucky, and requires candidates to work on-site from day one. Candidates should have 12+ years of experience in IT.
Job Description:
Key Responsibilities:
• Design, develop, and optimize robust ETL/ELT pipelines using Databricks, Python, and SQL (a minimal sketch follows this list).
• Write clean, efficient, and reusable code in SQL and Python for data transformation and automation.
• Collaborate with business stakeholders to understand data needs and deliver high-quality solutions.
• Ensure data quality, integrity, and governance across all data platforms.
• Monitor and troubleshoot data workflows and performance issues.
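For context only, the sketch below illustrates the kind of ETL/ELT work described in the responsibilities above: a hypothetical PySpark job on Azure Databricks that reads raw files from Azure Data Lake Storage, applies a simple transformation, runs a basic data-quality check, and writes a curated Delta table. The storage account, paths, and table names are placeholders, not details taken from this listing.

# Hypothetical ETL/ELT sketch for Azure Databricks (PySpark).
# Storage account, container, paths, and table names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # Databricks notebooks provide this session automatically

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"

# Extract: read raw JSON files landed in the data lake.
orders = spark.read.json(raw_path)

# Transform: normalize types and derive a small daily aggregate.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_revenue"),
        F.count(F.lit(1)).alias("order_count"),
    )
)

# Data quality: fail fast if key fields come back null.
bad_rows = daily_revenue.filter(
    F.col("order_date").isNull() | F.col("total_revenue").isNull()
).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} rows with null keys or totals")

# Load: publish the curated result as a Delta table for downstream use.
daily_revenue.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")

In a production pipeline the load step would more likely be incremental (for example, a Delta MERGE) and the quality checks broader, but the extract, transform, check, load shape matches the responsibilities above.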
Required Qualifications:
• 8+ years of experience in data engineering or a similar role.
• Proficiency in SQL, Python, and automation.
• Hands-on experience with Databricks, Snowflake, and Azure cloud services (e.g., Azure Data Lake, Azure Data Factory, Azure Functions); see the sketch after this list.
• Strong understanding of data modeling, data warehousing, and ETL/ELT best practices.
• Experience with version control and CI/CD practices.
• Strong problem-solving skills and ability to work independently in a fast-paced environment.
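Likewise illustrative only: a minimal sketch of pulling a Snowflake table into Databricks with the Snowflake Spark connector, which the qualifications above touch on. Every connection value is a placeholder; in practice credentials would come from a Databricks secret scope rather than literals.

# Hypothetical example: read a Snowflake table from a Databricks notebook.
# All connection values are placeholders; use dbutils.secrets.get(...) for real credentials.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

sf_options = {
    "sfURL": "exampleaccount.snowflakecomputing.com",
    "sfUser": "example_user",
    "sfPassword": "example_password",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
}

# The "snowflake" source is bundled with Databricks runtimes;
# elsewhere the connector's full name is net.snowflake.spark.snowflake.
customers = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "CUSTOMERS")
    .load()
)

customers.show(5)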






