Databricks Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Engineer on a 6-month, fully remote contract with a salary range of $106k-$124k. Key skills required include Data Engineering, Python, Databricks, PySpark, Hive, AWS EMR/S3, and DataStage. A Bachelor's degree in Computer Science or a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
563.64
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Corp-to-Corp (C2C)
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Python #"ETL (Extract, Transform, Load)" #Data Modeling #AWS EMR (Amazon Elastic MapReduce) #Computer Science #DataStage #Data Pipeline #AWS (Amazon Web Services) #Databricks #Scala #Automation #Data Engineering #Lambda (AWS Lambda) #Data Warehouse #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #PySpark #DevOps #Data Processing
Role description
Title: Databricks Engineer
Location: 100% remote
Duration: 6 months
Must have: Data Engineering, hands-on Python development, Databricks, PySpark, Hive, AWS EMR/S3, and DataStage…

Key Responsibilities
• Design, build, and optimize data pipelines and ETL processes for high-volume data environments.
• Develop and maintain solutions leveraging Databricks, Spark, Hive, AWS EMR/S3, and DataStage (or similar systems).
• Implement scalable and efficient data processing workflows in AWS environments, including Lambda and S3.
• Apply strong Python development skills to build reusable components and automation.
• Collaborate with DevOps teams to integrate solutions into CI/CD pipelines and follow modern engineering best practices.
• Work on data modeling within a data warehouse environment to ensure accuracy, consistency, and performance.
• Partner with cross-functional teams to analyze requirements, troubleshoot issues, and deliver high-quality solutions.

Required Skills & Experience
• 10+ years of Data Engineering experience (or equivalent demonstrated experience).
• 6+ years of hands-on Python development experience.
• Proven expertise with Databricks, Spark, Hive, AWS EMR/S3, and DataStage or equivalent tools.
• Strong working knowledge of AWS technologies (Lambda, S3).
• Experience with modern build pipelines, CI/CD tools, and automation frameworks.
• Strong background in data modeling within a data warehouse environment.

Required: Bachelor's degree in Computer Science or a related field.

This role is open to W2 employees or those seeking Corp-to-Corp employment. The salary range for this role is $106k-$124k; for Corp-to-Corp rates, please contact the recruiter. In addition to other benefits, Accion Labs offers a comprehensive benefits package, with Accion covering 65% of the medical, dental, and vision premiums for employees, their spouses, and dependent children enrolled in the Accion-provided plans.