

ScaleneWorks INC
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer contract in Cincinnati, OH, requiring hands-on experience with Azure Databricks, Spark, and Python, alongside strong SQL skills. Familiarity with data governance tools and Terraform is essential. On-site work is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#Azure #Automation #Data Governance #Azure cloud #Version Control #Data Engineering #SQL (Structured Query Language) #GitHub #Azure Databricks #Terraform #Python #Databricks #Cloud #Spark (Apache Spark) #Monitoring #Distributed Computing
Role description
Job Title: Data Engineer
Location: Cincinnati, OH
Job Type: Contract
Requirements
• Hands-on experience with Azure Databricks, Spark, and Python
• Experience with Delta Live Tables (DLT) & Databricks SQL (a minimal DLT sketch appears after this list)
• Strong SQL and database background
• Experience with Azure Functions, messaging services, or orchestration tools
• Familiarity with data governance, lineage, or cataloging tools (e.g., Purview, Unity Catalog)
• Experience monitoring and optimizing Databricks clusters or workflows
• Experience with Azure cloud data services and an understanding of how they integrate with Databricks and enterprise data platforms
• Experience with Terraform for cloud infrastructure provisioning
• Experience with GitHub and GitHub Actions for version control and CI/CD automation
• Strong understanding of distributed computing concepts (partitions, joins, shuffles, cluster behavior); see the PySpark sketch below
• Familiarity with SDLC and modern engineering practices
• Ability to balance multiple priorities, work independently, and stay organized
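For candidates weighing the Delta Live Tables requirement above, the snippet below is a minimal, hypothetical sketch of a DLT pipeline in Python, not code from ScaleneWorks or the client. The table names, storage path, and data-quality rule are invented for illustration; the `dlt` module and the `spark` session are provided by the Databricks DLT runtime.

```python
# Minimal Delta Live Tables sketch (runs inside a Databricks DLT pipeline).
# Table names, the source path, and the expectation rule are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from cloud storage (placeholder path)")
def raw_orders():
    # Auto Loader incrementally ingests new JSON files from the landing location.
    return (
        spark.readStream.format("cloudFiles")   # `spark` is supplied by the DLT runtime
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/orders")   # placeholder location
    )

@dlt.table(comment="Cleaned orders with a basic data-quality expectation")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows that fail the check
def clean_orders():
    return (
        dlt.read_stream("raw_orders")
        .withColumn("ingested_at", F.current_timestamp())
    )
```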
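The final requirement calls out partitions, joins, shuffles, and cluster behavior. Below is a small, self-contained PySpark sketch of those concepts; the dataset paths, table names, and columns (orders, regions, region_id, amount, order_ts) are made up for illustration and are not part of the posting.

```python
# PySpark sketch of shuffle and partition behavior. All data locations and
# column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("shuffle-demo").getOrCreate()

orders = spark.read.parquet("/data/orders")    # large fact table (placeholder path)
regions = spark.read.parquet("/data/regions")  # small dimension table (placeholder path)

# A plain join on two large inputs triggers a shuffle: rows are redistributed
# across the cluster so matching keys land in the same partition.
shuffled = orders.join(regions, "region_id")

# Broadcasting the small table avoids that shuffle by copying it to every executor.
broadcast_joined = orders.join(broadcast(regions), "region_id")

# Repartitioning by the grouping key controls partition count and skew
# before a wide operation such as groupBy.
daily_totals = (
    orders.repartition(200, "region_id")
    .groupBy("region_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"))
)
```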