

Data Engineer - Databricks
Featured Role | Apply direct with Data Freelance Hub
This is a remote Data Engineer - Databricks contract position of unspecified duration, paying $100-130/hr. It requires a Bachelor's degree, 7+ years building enterprise solutions, cloud expertise, and deep SQL skills.
Country: United States
Currency: $ USD
Day rate: 1040
Date discovered: June 9, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed: #RDS (Amazon Relational Database Service) #SQL (Structured Query Language) #Data Governance #AWS RDS (Amazon Relational Database Service) #Data Engineering #Data Modeling #Database Management #Hadoop #Computer Science #Data Pipeline #BigQuery #Python #Redshift #Server Administration #Big Data #Databricks #MIS Systems (Management Information Systems) #Bash #Teradata #Presto #Agile #Databases #Security #Scripting #Talend #Data Warehouse #Kafka (Apache Kafka) #Informatica #ETL (Extract, Transform, Load) #Cloud #DynamoDB #Spark (Apache Spark) #AWS (Amazon Web Services) #Unix
Role description
A financial firm is looking for a Data Engineer - Databricks to join their team. This role is remote.
Pay: $100-130/hr
US Citizens or GC Holders Only
No c2c - W2 Only
Qualifications:
- A Bachelor's Degree in Computer Science, Management Information Systems, Computer Engineering, or a related field
- 7+ years of experience designing and building large-scale solutions in an enterprise setting
- 3+ years designing and building solutions in the cloud
- Expertise in building and managing cloud databases such as AWS RDS, DynamoDB, DocumentDB, or analogous architectures
- Expertise in building cloud database management systems in Databricks Lakehouse or analogous architectures
- Deep SQL expertise, data modeling, and experience with data governance in relational databases
- Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
- Refined skills using one or more scripting languages (e.g., Python, bash)
- Embraces data platform thinking; designs and develops data pipelines with security, scale, uptime, and reliability in mind (see the sketch after this list)
- Expertise in relational and dimensional data modeling
- UNIX admin and general server administration experience
- Experience leveraging CI/CD pipelines
- Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation experience is a plus
- Experience using Spark, Kafka, Hadoop, or similar distributed data technologies is a plus
- Experience with Agile methodologies and the ability to work in an Agile manner is preferred
- Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus
- Expertise in cloud data warehouses such as Redshift, BigQuery, or analogous architectures is a plus
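To make the pipeline expectations above concrete, here is a minimal sketch of the kind of Databricks-style batch job the role describes, written with PySpark and Spark SQL. It is illustrative only: the source path, column names, and the analytics.daily_trade_summary target table are hypothetical, and the Delta write assumes a Databricks or otherwise Delta-enabled Spark environment.

```python
from pyspark.sql import SparkSession

# Minimal illustrative sketch only -- paths, table names, and schema are hypothetical.
spark = SparkSession.builder.appName("daily-trade-summary").getOrCreate()

# Ingest raw trade events (assumed to land as Parquet files in cloud storage).
raw_trades = spark.read.parquet("s3://example-bucket/raw/trades/")  # hypothetical path

# Register a temp view so the transformation can be expressed in Spark SQL.
raw_trades.createOrReplaceTempView("raw_trades")

# Aggregate to a daily summary; a simple data-quality filter guards against bad rows.
daily_summary = spark.sql("""
    SELECT
        trade_date,
        symbol,
        COUNT(*)      AS trade_count,
        SUM(notional) AS total_notional
    FROM raw_trades
    WHERE notional IS NOT NULL AND notional > 0
    GROUP BY trade_date, symbol
""")

# Write the result as a Delta table (the default table format on Databricks);
# partitioning by trade_date keeps daily reprocessing cheap and idempotent.
(daily_summary.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("trade_date")
    .saveAsTable("analytics.daily_trade_summary"))  # hypothetical target table

spark.stop()
```

In practice a job like this would typically be scheduled through Databricks Workflows or another orchestrator and promoted via the CI/CD pipelines the listing mentions, with retries and data-quality checks covering the security, scale, uptime, and reliability concerns called out above.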