Big Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Engineer in Phoenix, AZ (Hybrid), on a contract of unspecified length, offering $42.00 - $45.00 per hour. Key skills include GCP, Hadoop, NiFi, and Java, along with experience building data pipelines and REST APIs.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
360
-
πŸ—“οΈ - Date discovered
July 31, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
W2 Contractor
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Phoenix, AZ 85003
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #GitHub #RDBMS (Relational Database Management System) #Scrum #Dataflow #Big Data #Splunk #Spark (Apache Spark) #AWS (Amazon Web Services) #Programming #Deployment #Jenkins #Dynatrace #Computer Science #REST API #Data Pipeline #Cloud #Docker #Java #Data Engineering #Hadoop #GCP (Google Cloud Platform) #NiFi (Apache NiFi) #Storage #Agile #Shell Scripting #Scripting #Debugging #SQL (Structured Query Language) #Kubernetes #REST (Representational State Transfer)
Role description
