Data Engineer with Databricks

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with Databricks in Boston, MA, on a contract basis. Requires 12+ years of experience, including 8 years in Databricks, Hadoop, Python, Spark, and Airflow. On-site work only; candidates requiring relocation will not be considered.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 7, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #EC2 #Airflow #Spark (Apache Spark) #Spark SQL #SQL (Structured Query Language) #Database Architecture #API (Application Programming Interface) #Java #Kafka (Apache Kafka) #RDBMS (Relational Database Management System) #ML (Machine Learning) #Databricks #S3 (Amazon Simple Storage Service) #Python #AWS (Amazon Web Services) #DevOps #BI (Business Intelligence) #RDS (Amazon Relational Database Service) #Migration #Data Engineering #PySpark #Programming #Hadoop #Scala #Big Data
Role description
Job Type: Contract
Job Category: IT

Role: Data Engineer with Databricks
Location: Boston, MA (locals only)
Contract

Please share Databricks Data Engineer profiles who are already based in Boston. Candidates who would need to relocate will not be considered.

Databricks Engineer: 12+ years of total experience, including 8 years of relevant experience in the mandatory skills.

Mandatory Skills: Databricks, Hadoop, Python, Spark, Spark SQL, PySpark, Airflow, and IBM StreamSets

Required Skills & Experience:
• Develop data engineering and ML pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda, to build serverless applications
• Solid understanding of Databricks fundamentals and architecture, with hands-on experience setting up Databricks clusters and working in the Databricks modules (Data Engineering, ML, and SQL Warehouse)
• Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks
• Experience migrating data from on-prem Hadoop to Databricks/AWS
• Understanding of core AWS services, their uses, and AWS architecture best practices
• Hands-on experience across domains such as database architecture, business intelligence, machine learning, advanced analytics, and big data
• Solid knowledge of Airflow
• Solid knowledge of CI/CD pipelines in AWS technologies
• Application migration of RDBMS, Java/Python applications, model code, Elastic, etc.
• Solid programming background in Scala, Python, and SQL

Required Skills: DevOps Engineer
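For context on one of the listed requirements: the "medallion architecture" refers to the bronze → silver → gold layering pattern common in Databricks lakehouse pipelines (raw ingest, cleaned/validated data, aggregated business views). A minimal, hypothetical plain-Python sketch of that layering follows; a real Databricks pipeline would express these steps as Spark or DLT transformations, and every name below is illustrative, not taken from the posting.

```python
# Illustrative bronze -> silver -> gold flow using plain Python structures
# as a stand-in for Spark DataFrames. All record fields are hypothetical.

def to_silver(bronze_rows):
    """Clean raw (bronze) events: drop malformed rows, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("user_id") is None or row.get("amount") is None:
            continue  # discard records that fail basic validation
        silver.append({
            "user_id": str(row["user_id"]).strip(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned (silver) rows into a per-user (gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + row["amount"]
    return totals

bronze = [
    {"user_id": "a1", "amount": "10.5"},
    {"user_id": None, "amount": "3.0"},  # malformed: dropped at the silver layer
    {"user_id": "a1", "amount": "4.5"},
]
print(to_gold(to_silver(bronze)))  # {'a1': 15.0}
```

In Databricks itself, each layer would typically be a Delta table, with DLT expectations handling the validation step shown in `to_silver`.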