

Databricks Data Engineer - REMOTE
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a remote Databricks Data Engineer position on a one-year contract, requiring expertise in Databricks, ETL processes, and cloud technologies such as Azure SQL. Proficiency in Python, SQL, and big data concepts is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 7, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Cloud #Documentation #MySQL #SQL Server #Big Data #ETL (Extract, Transform, Load) #Azure SQL #SQL (Structured Query Language) #Azure #Databricks #Data Lake #Python #Programming #Data Engineering #Databases #Data Mart #Synapse #Oracle
Role description
This is a remote, long-term, one-year-plus contract position with our client.
Databricks experience is a must. We are looking for a leader who loves Databricks and can work on a fast-moving team.
• Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases (see the sketch after this list)
• Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes
• Use additional cloud technologies (e.g., understand cloud services such as Azure SQL)
• Maintain comprehensive project documentation
• Aptitude to learn new technologies and the ability to perform continuous research, analysis, and process improvement
• Strong interpersonal and communication skills to work in a team environment that includes customer and contractor technical staff, end users, and management team members
• Manage multiple projects, responsibilities, and competing priorities
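For context, here is a minimal sketch of the kind of ETL pattern described above: reading a raw flat file with PySpark on Databricks, applying a light transform, and loading the result into an Azure SQL table over JDBC. All paths, table names, and credentials below are illustrative placeholders, not details from this posting.

```python
# Illustrative sketch only: flat file -> light transform -> Azure SQL via JDBC.
# Assumes a Databricks cluster with the SQL Server JDBC driver available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Extract: read a raw CSV flat file from the data lake (path is a placeholder)
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/orders.csv")
)

# Transform: cast the amount column, stamp the load time, drop rows missing the key
clean_df = (
    raw_df
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .withColumn("load_ts", F.current_timestamp())
    .dropna(subset=["order_id"])
)

# Load: append into an Azure SQL database over JDBC (connection details are placeholders;
# in practice credentials would come from a Databricks secret scope)
jdbc_url = "jdbc:sqlserver://example-server.database.windows.net:1433;database=example_db"
(
    clean_df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.orders_staging")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .mode("append")
    .save()
)
```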
Experience Needed:
• Programming languages and frameworks such as Python, SQL, PL/SQL, and VB
• Database platforms such as Oracle, SQL Server, and MySQL
• Big data concepts and technologies such as Synapse and Databricks