

ContractStaffingRecruiters.com
Data Engineer - Databricks - REMOTE
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with Databricks expertise, offering a remote contract. Key skills include ETL process design, cloud technologies (Azure), and programming in Python and SQL. Experience with data lakes and big data technologies is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Mart #Programming #Azure #Python #Azure SQL #"ETL (Extract, Transform, Load)" #Cloud #Data Engineering #Databricks #SQL (Structured Query Language) #Big Data #SQL Server #Databases #Documentation #Synapse #MySQL #Data Lake #Oracle
Role description
Databricks experience is a must. We are looking for a leader who loves Databricks and can work on a fast-moving team.
• Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases
• Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes
• Use additional cloud technologies (e.g., cloud services such as Azure SQL Server)
• Maintain comprehensive project documentation
• Aptitude to learn new technologies and the ability to perform continuous research, analysis, and process improvement.
• Strong interpersonal and communication skills for working in a team environment, including customer and contractor technical staff, end users, and management team members.
• Manage multiple projects, responsibilities, and competing priorities.
Experience Needed:
• Programming languages, frameworks, and file formats such as: Python, SQL, PL/SQL, and VB
• Database platforms such as: Oracle, SQL Server, MySQL
• Big data concepts and technologies such as Synapse & Databricks
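A minimal sketch of the kind of ETL task described above: loading a flat file (CSV) into a SQL database. The file contents, table name, and use of SQLite are illustrative assumptions, not from the posting; in this role the equivalent pipeline would typically run on Databricks/Spark against Azure SQL.

```python
# Hypothetical illustration of a flat-file-to-SQL load; names and data
# are made up for the example.
import csv
import io
import sqlite3

def load_csv_to_sql(csv_text: str, conn: sqlite3.Connection, table: str) -> int:
    """Create `table` from the CSV header, insert all rows, return row count."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    # Extract: derive a simple all-TEXT schema from the header row.
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    # Transform/Load: insert the remaining rows as tuples.
    placeholders = ", ".join("?" for _ in header)
    rows = [tuple(r) for r in reader]
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', rows)
    conn.commit()
    return len(rows)

raw = "id,name\n1,alpha\n2,beta\n"
conn = sqlite3.connect(":memory:")
n = load_csv_to_sql(raw, conn, "staging")
```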