ContractStaffingRecruiters.com

Data Engineer - Databricks - 65/hr Remote

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with expertise in Databricks, offering $65/hr for a remote contract. Key skills include Python, SQL, ETL processes, and cloud technologies (AWS, Azure). Experience with data lakes and big data concepts is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
January 30, 2026
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Big Data #Oracle #MySQL #Azure #Data Lake #Data Ingestion #Programming #Azure SQL #SQL Server #Integration Testing #ETL (Extract, Transform, Load) #Batch #SQL (Structured Query Language) #Python #BI (Business Intelligence) #Data Engineering #Scala #dbt (data build tool) #Databases #Databricks #AWS (Amazon Web Services) #Cloud #Azure cloud #Documentation #Synapse #Data Mart
Role description
The Opportunity: We are seeking a Data Engineer with experience in projects emphasizing complex data solutions. This person will be responsible for supporting data ingestion, transformation, and distribution to end consumers. The candidate will perform requirements analysis, design and develop process flows, and carry out unit/integration testing to deliver complex data products.
• Work with the Business Intelligence team and operational stakeholders to design and implement both the data presentation layer available to the user community and the underlying technical architecture of the data warehousing environment.
• Develop scalable and reliable data solutions to move data across systems from multiple sources in both real-time and batch modes.
• Design and develop database objects: tables, stored procedures, views, etc.
• Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases.
• Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes.
• Use additional cloud technologies (e.g., cloud services such as Azure SQL Server).
• Maintain comprehensive project documentation.
• Aptitude to learn new technologies, with the ability to perform continuous research, analysis, and process improvement.
• Strong interpersonal and communication skills to work in a team environment with customer and contractor technical staff, end users, and management.
• Manage multiple projects, responsibilities, and competing priorities.
Experience Needed:
• Programming languages, frameworks, and file formats such as: Python, SQL, PL/SQL, dbt
• Database platforms such as: Oracle, SQL Server, MySQL
• Big data concepts and technologies such as Synapse and Databricks
• AWS and Azure cloud computing