

Databricks Data Engineer (Hybrid Onsite - W2 Contract)
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Databricks Data Engineer on a long-term W2 contract in Boston, MA (hybrid onsite 3-4 days/week). Key skills include Databricks, AWS services, data migration, and programming in Python and Scala.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 27, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Boston, MA
Skills detailed: #SQL (Structured Query Language) #Database Architecture #Data Engineering #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Migration #Kubernetes #RDBMS (Relational Database Management System) #S3 (Amazon Simple Storage Service) #Java #ML (Machine Learning) #RDS (Amazon Relational Database Service) #Docker #Scala #Hadoop #Databricks #BI (Business Intelligence) #Programming #Airflow #Deployment #API (Application Programming Interface) #Big Data #EC2 #Lambda (AWS Lambda) #Python
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Xoriant Corporation, is seeking the following. Apply via Dice today!
Job Title: AWS Databricks Data Engineer
Location: Boston, MA (3-4 days a week hybrid onsite)
Duration: Long-term contract
Contract: W2 / C2C IC (own individual corp only)
Job Description:
Roles & Responsibilities
Assess the current application infrastructure and recommend improvements to performance
Document best practices and strategies for application deployment and infrastructure support
Produce reusable, efficient, and scalable programs, along with cost-effective migration strategies
Develop data engineering and machine learning pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda
Work jointly with the IT team and other departments to migrate data engineering and machine learning applications to Databricks/AWS
Comfortable working on tight timelines when required
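To illustrate the kind of pipeline step described above, here is a minimal, hypothetical sketch in plain Python: it parses raw CSV records (as they might arrive from S3 or Kinesis) into typed JSON-ready dicts. The schema and field names are invented for illustration; a production job would run on Databricks with Spark rather than on in-memory strings.

```python
import csv
import io
import json

# Hypothetical raw payload, standing in for an object fetched from S3.
RAW = "user_id,event,ts\n42,login,2025-06-27T10:00:00\n43,logout,2025-06-27T10:05:00\n"

def transform(raw_csv):
    """Parse a raw CSV payload into typed records ready for downstream loading."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [
        {"user_id": int(row["user_id"]), "event": row["event"], "ts": row["ts"]}
        for row in reader
    ]

records = transform(RAW)
print(json.dumps(records[0]))
# {"user_id": 42, "event": "login", "ts": "2025-06-27T10:00:00"}
```

The same parse-then-type step would typically be expressed as a Spark DataFrame read with an explicit schema in Databricks; the sketch only shows the shape of the transformation.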
Skill Sets Required
Good decision-making and problem-solving skills
Solid understanding of Databricks fundamentals and architecture, with hands-on experience in the Databricks modules (Data Engineering, Machine Learning, and SQL Warehouse)
Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks
Knowledge of the machine learning model development process
Experience migrating data from on-prem Hadoop to Databricks/AWS
Understanding of core AWS services, their uses, and AWS architecture best practices
Hands-on experience across domains such as database architecture, business intelligence, machine learning, advanced analytics, and big data
Solid knowledge of Airflow
Solid knowledge of CI/CD pipelines in AWS
Experience with application migration: RDBMS, Java/Python applications, model code, Elastic, etc.
Solid programming background in Scala and Python
Experience with Docker and Kubernetes is a plus
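For candidates unfamiliar with the medallion architecture mentioned above, here is a minimal sketch of the bronze → silver → gold pattern using plain Python dicts. The record fields are hypothetical; a real Databricks implementation would use Delta Live Tables or Spark DataFrames, with each layer persisted as a Delta table.

```python
def to_silver(bronze_rows):
    """Silver layer: clean raw bronze rows (drop malformed records, normalize types)."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # quarantine/drop malformed records at the silver layer
        silver.append({
            "order_id": int(row["order_id"]),
            "amount": float(row.get("amount", 0.0)),
            "region": str(row.get("region", "unknown")).lower(),
        })
    return silver

def to_gold(silver_rows):
    """Gold layer: aggregate cleaned rows into a business-level summary per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": "1", "amount": "10.5", "region": "East"},
    {"order_id": None, "amount": "99"},  # malformed: dropped in silver
    {"order_id": "2", "amount": "4.5", "region": "east"},
]
print(to_gold(to_silver(bronze)))  # {'east': 15.0}
```

The key design point the pattern captures: raw data lands untouched in bronze, validation and typing happen once in silver, and business aggregates in gold are rebuilt from silver rather than from raw inputs.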