

Databricks Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Databricks Engineer, offering a contract longer than 6 months at a pay rate of "TBD". It requires skills in Databricks, Spark, SQL, Python, and cloud platforms, with experience in ETL and data governance. Work location is hybrid in Atlanta, USA.
Country
United States
Currency
$ USD
Day rate
288.81
Date discovered
July 4, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Georgia, United States
Skills detailed
#Cloud #GCP (Google Cloud Platform) #Business Analysis #ETL (Extract, Transform, Load) #Databricks #Delta Lake #AWS (Amazon Web Services) #Data Pipeline #Azure #PySpark #Data Quality #Python #Spark (Apache Spark) #Scala #SQL (Structured Query Language)
Role description
Hiring: Databricks Resources (Engineer | Architect | BA | QA)
We're expanding our Databricks team and looking for skilled professionals across roles, including Engineers, Architects, Business Analysts, and QA specialists, to deliver scalable data solutions on modern cloud platforms.
Location: [Remote/Hybrid/Onsite] - Atlanta, USA
Type: [Full-Time / Contract]
Key Skills:
• Databricks, Spark (PySpark/Scala), Delta Lake, Unity Catalog
• SQL, Python, Cloud Platforms (AWS/Azure/GCP)
• Experience in ETL, Data Quality, Testing, or Business Analysis
• Strong understanding of data pipelines, governance, and performance tuning