Sr. Data Pipeline Engineer (AWS/Databricks)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Pipeline Engineer (AWS/Databricks) on a 12+ month contract in San Jose, CA. Key skills include building data pipelines with Databricks, Spark/PySpark, Airflow, and Kafka. Local candidates only; a face-to-face interview is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
720
-
πŸ—“οΈ - Date discovered
August 2, 2025
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Observability #PySpark #Azure #S3 (Amazon Simple Storage Service) #Data Warehouse #AI (Artificial Intelligence) #Base #Storage #GCP (Google Cloud Platform) #Logging #Data Pipeline #Apache Spark #Airflow #Kafka (Apache Kafka) #Data Processing #Python #Scala #Cloud #Data Engineering #Security #AWS (Amazon Web Services) #Datasets #ML (Machine Learning) #Spark (Apache Spark) #Databricks #Apache Airflow #API (Application Programming Interface) #Compliance #Batch
Role description

This is a 12+ month on-site contract in San Jose, CA for a Sr. Data Pipeline Engineer working on AWS and Databricks. The role centers on designing and building data pipelines with Databricks and Spark/PySpark, orchestrating workflows with Apache Airflow, and working with Kafka for data streaming. Local candidates only; a face-to-face interview is required.

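As a rough illustration of the stack this listing names, below is a minimal PySpark Structured Streaming sketch that reads events from Kafka and writes them to S3. The broker address, topic, bucket, and checkpoint paths are hypothetical placeholders, not details from the role, and the job assumes the spark-sql-kafka connector is available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative only: all names, servers, and paths below are placeholders.
spark = (
    SparkSession.builder
    .appName("kafka-to-s3-example")  # hypothetical app name
    .getOrCreate()
)

# Read a stream of raw events from Kafka (requires the spark-sql-kafka package).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings for downstream processing.
decoded = events.select(
    F.col("key").cast("string").alias("key"),
    F.col("value").cast("string").alias("value"),
    F.col("timestamp"),
)

# Write the decoded stream to S3 as Parquet, checkpointing so the job can resume.
query = (
    decoded.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/events/")              # placeholder bucket
    .option("checkpointLocation", "s3a://example-bucket/_chk/")  # placeholder path
    .trigger(processingTime="1 minute")
    .start()
)

query.awaitTermination()
```

On Databricks this would typically run as a notebook or job with an existing SparkSession, and the sink would more often be a Delta table than raw Parquet; the sketch keeps to plain open-source Spark APIs so it stands on its own.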