Big Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Engineer on a 1099 contract in New York, NY, or Phoenix, AZ, offering $50.00 - $55.00 per hour. Requires strong experience with GCP Big Data tools, Java, REST APIs, SQL, and DevOps.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
440
🗓️ - Date discovered
April 27, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
1099 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Phoenix, AZ 85074
🧠 - Skills detailed
#DevOps #GitHub #Cloud #Data Engineering #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Databases #NoSQL #REST API #Scala #Data Pipeline #Big Data #API (Application Programming Interface) #Java #Security #REST (Representational State Transfer) #Dataflow #Kubernetes #BigQuery #Spring Boot
Role description
Big Data Engineer

PLEASE READ THOROUGHLY TO AVOID INSTANT REJECTION.

Eligibility: USC & GC only; passport number required
Location: New York, NY & Phoenix, AZ (in-person interview & onsite work)
Skills: Big Data, ETL - Big Data / Data Warehousing, GCP, Java, REST APIs, Spring Boot, SQL, Cloud Infrastructure

Role Overview:
Seeking a skilled Big Data Engineer to build real-time, scalable data pipelines and REST APIs using Java and GCP. This role supports critical data-driven initiatives and requires deep technical expertise in Big Data, cloud-native tools, and backend API development. Onsite work and an in-person interview are mandatory.

Qualifications:
- Strong experience with GCP Big Data tools (BigQuery, Dataflow, Dataproc, Pub/Sub)
- Skilled in backend development with Java, Spring Boot, and RESTful API creation
- Proficient in SQL, ETL, and GCP SDK/API usage
- Experience with DevOps practices and CI/CD (Cloud Build, GitHub Actions, etc.)
- Familiarity with Kubernetes, GKE, and containerization (preferred)
- Strong communication and cross-functional collaboration skills
- Ability to problem-solve and innovate in a fast-paced, evolving tech environment

Key Responsibilities:
- Design, implement, and optimize scalable Big Data pipelines using GCP tools
- Develop backend services and APIs using Java (Spring Boot)
- Implement authentication and security using JWT/OAuth
- Work with SQL and NoSQL databases
- Monitor and maintain production data flows and CI/CD pipelines
- Collaborate closely with analysts, stakeholders, and fellow engineers
- Continuously explore and adapt to emerging technologies

Job Type: Contract
Pay: $50.00 - $55.00 per hour
Compensation Package: 1099 contract
Schedule: 8-hour shift
Work Location: In person