

Asamanta Technologies LLC
JD: Databricks Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Databricks Engineer in Quincy, MA, with a contract length of "unknown" and a pay rate of "unknown." Key skills include Python, Spark, Scala, and Databricks. Requires a Bachelor’s degree and 5+ years in cloud technologies. AWS and/or Databricks certification is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Quincy, MA
-
🧠 - Skills detailed
#Deployment #ML (Machine Learning) #RDS (Amazon Relational Database Service) #PySpark #SQL (Structured Query Language) #RDBMS (Relational Database Management System) #Scala #Docker #Data Engineering #Airflow #Database Architecture #Databricks #BI (Business Intelligence) #S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka) #Python #Computer Science #Spark (Apache Spark) #Lambda (AWS Lambda) #Big Data #Kubernetes #AWS (Amazon Web Services) #Cloud #Migration #Programming #EC2 #Java #Hadoop #API (Application Programming Interface)
Role description
Spark/Databricks/Scala/Python is a Must Have.
Sr. Databricks Engineer
Quincy, MA
Key Skills: Python/PySpark, Spark, Scala, Databricks, AWS
Education and Experience Qualifications
Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience
5+ years of experience as a developer with cloud technologies
AWS and/or Databricks certification is a plus
Roles & Responsibilities
Assess the current application infrastructure and suggest new approaches to improve performance
Document best practices and strategies for application deployment and infrastructure support
Produce reusable, efficient, and scalable programs, as well as cost-effective migration strategies
Develop data engineering and ML pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda, to build serverless applications
Work with the IT team and other departments to migrate data engineering and ML applications to Databricks/AWS
Comfortable working on tight timelines when required
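The serverless-pipeline responsibility above can be sketched as a minimal AWS Lambda handler for Kinesis events. This is a hedged illustration only, not the employer's code: the event shape follows the standard Kinesis-to-Lambda record format (base64-encoded payloads under `Records[].kinesis.data`), and the validation logic is an assumption chosen for the example.

```python
import base64
import json

def handler(event, context=None):
    """Hypothetical Lambda handler: count how many Kinesis records carry
    valid JSON payloads. Field names follow the standard Kinesis event
    format; the 'count valid records' logic is illustrative only."""
    valid = 0
    for record in event.get("Records", []):
        # Kinesis delivers payloads base64-encoded inside the event.
        payload = base64.b64decode(record["kinesis"]["data"])
        try:
            json.loads(payload)
            valid += 1
        except json.JSONDecodeError:
            pass  # skip malformed payloads rather than failing the batch
    return {"validRecords": valid}

# Local smoke test with a synthetic event (no AWS account required).
sample_event = {"Records": [
    {"kinesis": {"data": base64.b64encode(b'{"id": 1}').decode()}},
    {"kinesis": {"data": base64.b64encode(b"not json").decode()}},
]}
print(handler(sample_event))  # {'validRecords': 1}
```

In a real deployment this function would be wired to a Kinesis stream trigger, with the valid records forwarded on to S3 or Databricks.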
Skill Sets Required
Good decision-making and problem-solving skills
Solid understanding of Databricks fundamentals and architecture, with hands-on experience setting up Databricks clusters and working across Databricks modules (Data Engineering, ML, and SQL Warehouse)
Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks
Experience migrating data from on-prem Hadoop to Databricks/AWS
Understanding of core AWS services, their use cases, and AWS architecture best practices
Hands-on experience across domains such as database architecture, business intelligence, machine learning, advanced analytics, and big data
Solid knowledge of Airflow
Solid knowledge of CI/CD pipelines on AWS technologies
Application migration experience with RDBMS, Java/Python applications, model code, Elastic, etc.
Solid programming background in Scala and Python
Experience with Docker and Kubernetes is a plus
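The medallion (bronze/silver/gold) layering called out above can be sketched in plain Python. This is a minimal illustration of the pattern only: dicts stand in for Spark DataFrames, and in Databricks each layer would instead be a Delta Live Tables table; all function and field names here are assumptions.

```python
# Medallion-architecture sketch: raw records flow bronze -> silver -> gold.
# Plain Python stands in for Spark/DLT; layer responsibilities are the point.

def bronze(raw_records):
    """Bronze: land raw data as-is, tagging each record with its source."""
    return [dict(r, _source="kinesis") for r in raw_records]

def silver(bronze_records):
    """Silver: clean and validate - drop records missing a user_id."""
    return [r for r in bronze_records if r.get("user_id") is not None]

def gold(silver_records):
    """Gold: aggregate to a business-level metric (events per user)."""
    counts = {}
    for r in silver_records:
        counts[r["user_id"]] = counts.get(r["user_id"], 0) + 1
    return counts

raw = [{"user_id": 1}, {"user_id": None}, {"user_id": 1}, {"user_id": 2}]
print(gold(silver(bronze(raw))))  # {1: 2, 2: 1}
```

The same bronze/silver/gold split maps directly onto `@dlt.table` definitions when run inside a Databricks DLT pipeline.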
Please share resumes at: careers@asamanta.com




