

Big Data Developer (GCP, Apache NiFi, Pub/Sub) W2
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Developer (GCP, Apache NiFi, Pub/Sub) on a W2 contract; the contract length and hourly pay rate are not listed. Key skills include GCP, NiFi, Java REST APIs, and experience with big data technologies. GCP Professional Data Engineer certification is a plus.
Country
United States
Currency
$ USD
Day rate
Not listed
Date discovered
July 31, 2025
Project duration
Unknown
Location type
Unknown
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Phoenix, AZ
Skills detailed
#Kafka (Apache Kafka) #GitHub #RDBMS (Relational Database Management System) #Scrum #Dataflow #Big Data #Splunk #Spark (Apache Spark) #AWS (Amazon Web Services) #Programming #Deployment #Jenkins #Dynatrace #Computer Science #REST API #Data Pipeline #Cloud #Docker #Java #Data Engineering #Hadoop #GCP (Google Cloud Platform) #NiFi (Apache NiFi) #Storage #Apache NiFi #Agile #Shell Scripting #Scripting #Debugging #SQL (Structured Query Language) #Kubernetes #REST (Representational State Transfer)
Role description
Full Stack Developer - GCP
• Certification in a cloud platform (GCP Professional Data Engineer) is a plus.
Job Description
Looking for a seasoned Software Engineer with big data and full-stack development expertise. This role will be pivotal in maintaining and modifying existing NiFi jobs and in migrating existing big data workloads, NiFi flows, and Java REST APIs to Google Cloud Platform (GCP).
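For a sense of the Java REST API work described above, here is a minimal Spring Boot sketch of the kind of service such a migration would involve. The /v1/records endpoint and ApiRecord payload are hypothetical illustrations, not details from the posting.

// Minimal Spring Boot REST API sketch. The endpoint path and payload
// type are hypothetical examples only.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
@RequestMapping("/v1/records")
public class RecordApi {

    // Simple response payload for the example; serialized to JSON.
    record ApiRecord(String id, String payload) {}

    // GET /v1/records/{id} returns a stub record.
    @GetMapping("/{id}")
    public ApiRecord get(@PathVariable String id) {
        return new ApiRecord(id, "example-payload");
    }

    public static void main(String[] args) {
        SpringApplication.run(RecordApi.class, args);
    }
}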
Minimum Qualifications
• BS in computer science, computer engineering, or another technical discipline, or equivalent work experience.
• Hands-on software development experience with GCP, AWS, or other cloud/big data solutions.
• Experience working with Hadoop, MapR, Hive, Spark, shell scripting, GCP clusters, and distributed (multi-tiered) systems.
• Proficiency in developing and optimizing data pipelines using NiFi or GCP Cloud Dataflow.
• Experience building event processing pipelines with Kafka or GCP Pub/Sub (a minimal publisher sketch follows this list).
• Hands-on experience with SQL and HSQL, and with multiple storage technologies including RDBMS, document stores, and search indices.
• Hands-on experience with cloud services for application development and deployment, such as Kubernetes and Docker.
• Experience in developing REST APIs using Spring Boot or Apache Camel.
• Hands-on experience setting up instrumentation, analyzing performance, distributed tracing, and debugging using tools like Dynatrace and Splunk.
• Strong object-oriented programming skills, SOLID principles, and design patterns, preferably in Java.
• Good knowledge of CI/CD pipelines and source code management tools (XLR, Jenkins, GitHub).
• Familiarity with Agile and Scrum ceremonies.
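As a sketch of the event-pipeline requirement above, here is a minimal GCP Pub/Sub publisher using the google-cloud-pubsub Java client. The project and topic IDs are placeholders, and production code would batch messages and handle errors more carefully.

// Minimal Pub/Sub publisher sketch (google-cloud-pubsub Java client).
// "example-project" and "example-topic" are placeholder IDs.
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;
import java.util.concurrent.TimeUnit;

public class EventPublisher {
    public static void main(String[] args) throws Exception {
        TopicName topic = TopicName.of("example-project", "example-topic");
        Publisher publisher = Publisher.newBuilder(topic).build();
        try {
            PubsubMessage message = PubsubMessage.newBuilder()
                    .setData(ByteString.copyFromUtf8("hello from the pipeline"))
                    .build();
            // publish() is asynchronous; get() blocks until the service
            // acknowledges the message and returns its server-assigned ID.
            String messageId = publisher.publish(message).get();
            System.out.println("Published message " + messageId);
        } finally {
            // Flush pending messages and release client resources.
            publisher.shutdown();
            publisher.awaitTermination(30, TimeUnit.SECONDS);
        }
    }
}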
Preferred Qualifications
• Salesforce knowledge or prior experience integrating with the Salesforce platform is a major plus.