

Databricks Developer | Quincy, MA (100% Onsite) | Contract
Featured Role | Apply direct with Data Freelance Hub
This role is for a "Databricks Developer" in Quincy, MA, offering a 6-12+ month contract. Required skills include Databricks, Hadoop, Python, Spark, and Airflow. Candidates should have 12+ years of experience, with 8 years in relevant technologies and AWS services.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 16, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Quincy, MA
Skills detailed: #S3 (Amazon Simple Storage Service) #Databricks #RDS (Amazon Relational Database Service) #Scala #API (Application Programming Interface) #Java #Spark (Apache Spark) #Kafka (Apache Kafka) #SQL (Structured Query Language) #RDBMS (Relational Database Management System) #PySpark #Hadoop #ML (Machine Learning) #Spark SQL #Database Architecture #Big Data #Airflow #EC2 #Lambda (AWS Lambda) #Programming #Migration #AWS (Amazon Web Services) #Data Engineering #BI (Business Intelligence) #Python
Role description
Job Title: Databricks Developer
Location: Quincy, MA (100% Onsite)
Duration: 6-12+ Months Contract (with possible extension)
Databricks Engineer: 12+ years of total experience, with 8 years of relevant experience in the mandatory skills.
• Mandatory Skills: Databricks, Hadoop, Python, Spark, Spark SQL, PySpark, Airflow, and IBM StreamSets
Required Skills & Experience
• Develop data engineering and ML pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda, to build serverless applications
• Solid understanding of Databricks fundamentals/architecture, with hands-on experience setting up Databricks clusters and working across Databricks modules (Data Engineering, ML, and SQL Warehouse)
• Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks
• Experience migrating data from on-prem Hadoop to Databricks/AWS
• Understanding of core AWS services, their uses, and AWS architecture best practices
• Hands-on experience in multiple domains, such as database architecture, business intelligence, machine learning, advanced analytics, and big data
• Solid knowledge of Airflow
• Solid knowledge of CI/CD pipelines in AWS technologies
• Application migration experience: RDBMS, Java/Python applications, model code, Elastic, etc.
• Solid programming background in Scala, Python, and SQL
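For candidates unfamiliar with the medallion architecture mentioned above, the layering idea (bronze = raw ingested data, silver = cleaned/deduplicated, gold = aggregates ready for BI) can be sketched with plain SQL. This is a minimal illustration using Python's stdlib sqlite3 as a stand-in, not the Databricks/Spark SQL APIs; all table and column names are hypothetical.

```python
import sqlite3

# Illustrative medallion-style layering (bronze -> silver -> gold) using
# stdlib sqlite3 as a stand-in for Spark SQL. Table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw ingested events, including duplicates and a bad (NULL-amount) row.
cur.execute("CREATE TABLE bronze_events (event_id INT, user_id TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO bronze_events VALUES (?, ?, ?)",
    [(1, "u1", 10.0), (1, "u1", 10.0), (2, "u2", None), (3, "u1", 5.0)],
)

# Silver: deduplicated, invalid rows filtered out.
cur.execute("""
    CREATE TABLE silver_events AS
    SELECT DISTINCT event_id, user_id, amount
    FROM bronze_events
    WHERE amount IS NOT NULL
""")

# Gold: per-user aggregates ready for BI consumption.
cur.execute("""
    CREATE TABLE gold_user_totals AS
    SELECT user_id, SUM(amount) AS total_amount, COUNT(*) AS event_count
    FROM silver_events
    GROUP BY user_id
""")

totals = {
    row[0]: (row[1], row[2])
    for row in cur.execute("SELECT user_id, total_amount, event_count FROM gold_user_totals")
}
print(totals)  # u1 keeps two valid events; u2's NULL row never reaches gold
```

In Databricks itself, the same pattern would typically be expressed with Delta tables and DLT pipelines rather than hand-written CREATE TABLE AS statements, but the layer boundaries are the same.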