

Big Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Engineer in Phoenix, AZ (Hybrid) on a contract of unspecified duration, paying $42.00 - $45.00 per hour. Key skills include GCP, Hadoop, NiFi, and Java, plus experience with data pipelines and REST APIs.
Country: United States
Currency: $ USD
Day rate: 360
Date discovered: July 31, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Phoenix, AZ 85003
Skills detailed: #Kafka (Apache Kafka) #GitHub #RDBMS (Relational Database Management System) #Scrum #Dataflow #Big Data #Splunk #Spark (Apache Spark) #AWS (Amazon Web Services) #Programming #Deployment #Jenkins #Dynatrace #Computer Science #REST API #Data Pipeline #Cloud #Docker #Java #Data Engineering #Hadoop #GCP (Google Cloud Platform) #NiFi (Apache NiFi) #Storage #Agile #Shell Scripting #Scripting #Debugging #SQL (Structured Query Language) #Kubernetes #REST (Representational State Transfer)
Role description
Hi,
We need a very strong Big Data Engineer; the role is located in Phoenix, AZ (Hybrid).
Big Data Engineer
Location: Phoenix, AZ (Hybrid)
Contract: W2
Visas: USC, GC/GC EAD, H4 EAD
Work Setup: Hybrid; expected to be in-office Tuesday through Thursday, per company policy
Job Description
We are looking for a seasoned Software Engineer with big data and full-stack development expertise. This role will be pivotal in maintaining and modifying existing NiFi jobs and migrating existing big data, NiFi, and Java REST APIs to the Google Cloud Platform (GCP).
Minimum Qualifications
BS in computer science, computer engineering, or other technical discipline, or equivalent work experience.
Hands-on software development experience with GCP, AWS or other Cloud/Big Data solutions.
Experience working with Hadoop, MapR, Hive, Spark, shell scripting, GCP clusters and distributed (multi-tiered) systems.
Proficiency in developing and optimizing data pipelines using NiFi or GCP Cloud Dataflow.
Experience building event processing pipelines with Kafka or GCP Pub/Sub.
Hands-on experience with SQL and HSQL, and multiple storage technologies including RDBMS, document stores, and search indices.
Hands-on experience with cloud services for application development and deployment, such as Kubernetes, Docker, etc.
Experience in developing REST APIs using Spring Boot or Apache Camel.
Hands-on experience setting up instrumentation, analyzing performance, distributed tracing, and debugging using tools like Dynatrace, Splunk, etc.
Strong Object-Oriented Programming skills, SOLID principles, and design patterns; preferably Java.
Good knowledge of CI/CD pipelines and source code management tools (XLR, Jenkins, GitHub).
Familiarity with Agile and Scrum ceremonies.
Preferred Qualifications
A cloud platform certification (GCP Professional Data Engineer) is a plus.
Salesforce knowledge or prior experience integrating with the Salesforce platform is a major plus.
Best Regards,
Sarvan Singh
Sr Technical Recruiter| eHub Global
sarvan.s@ehub.global | 214-751-8343
ehub.Global
Job Type: Contract
Pay: $42.00 - $45.00 per hour
Work Location: Hybrid remote in Phoenix, AZ 85003