AVS LLC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Iselin, NJ, on a 40-hour/week contract paying $51.84 - $62.43 per hour. It requires 9 years of experience; proficiency in Java, Kafka, GCP, PySpark, and SQL; and familiarity with Agile practices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
496
-
🗓️ - Date
November 8, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Iselin, NJ 08830
-
🧠 - Skills detailed
#Spark (Apache Spark) #SQL (Structured Query Language) #Unit Testing #Data Processing #Data Transformations #TypeScript #MongoDB #MySQL #Python #Data Science #Agile #Angular #Java #Scala #GCP (Google Cloud Platform) #React #Microservices #JavaScript #Databases #Kafka (Apache Kafka) #Apache Airflow #Storage #PySpark #Cloud #Airflow #Compliance #PostgreSQL #Spring Boot #Oracle #Security #Databricks #Data Engineering #ETL (Extract, Transform, Load) #Apache Kafka #Dataflow #Code Reviews
Role description
Note: Apply only if you are willing to work on our W2 payroll and you hold USC, GC, GC-EAD, or H4-EAD status.

Role: Data Engineer with Java and Kafka
Location: Iselin, NJ

We are looking for a Data Engineer with expertise in Java, Kafka, Python, PySpark, Scala, SQL, and Google Cloud Platform (GCP).

Key Responsibilities:
• Develop and maintain ETL processes using PySpark on GCP (see the PySpark sketch below).
• Optimize and troubleshoot data processing jobs for performance and reliability.
• Implement data transformations and create pipelines to support analytics.
• Collaborate with data scientists and analysts to deliver business insights.
• Monitor and maintain cloud infrastructure related to data processing on GCP.
• Document technical solutions and provide support for data-related issues.
• Build and maintain enterprise-grade applications using Java, Spring Boot, and microservices.
• Implement event-driven architectures leveraging Apache Kafka for real-time data processing (see the Kafka sketch below).
• Utilize Apache BeanUtils for efficient object manipulation and data transformation.
• Develop responsive UI components using React.js or Angular.
• Collaborate with cross-functional teams to define architecture and best practices.
• Optimize application performance and ensure security compliance.
• Participate in code reviews, unit testing, and CI/CD pipeline implementation.

Required Skills:
• Hands-on experience with GCP services (BigQuery, Dataproc, Dataflow, Cloud Storage, Pub/Sub).
• Proficiency in PySpark, Scala, and SQL for large-scale data processing.
• Experience with workflow orchestration tools such as Apache Airflow or Cloud Composer (see the Airflow sketch below).
• Familiarity with Databricks and CI/CD practices.
• Strong problem-solving and analytical skills.
• Core Java, J2EE, Spring Boot, microservices.
• Apache Kafka (Producer/Consumer APIs, Streams, Connect).
• Apache BeanUtils or similar frameworks.
• Frontend: React.js / Angular, HTML5, CSS3, JavaScript, TypeScript.
• Databases: Oracle, MySQL, MongoDB, PostgreSQL.
• Familiarity with Agile, TDD, and SDLC best practices.

Job Type: Contract
Pay: $51.84 - $62.43 per hour
Expected hours: 40 per week

Application Question(s):
• Are you comfortable providing your passport number for submission?
• Do you have USC, GC, GC-EAD, or H4-EAD status?

Experience: Data Engineer: 9 years (Required)

Work Location: In person
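
To illustrate the kind of PySpark-on-GCP ETL work listed above, here is a minimal sketch. The bucket paths, column names, and filter condition are hypothetical, and reading gs:// paths assumes the Cloud Storage connector (bundled by default on Dataproc):

```python
from pyspark.sql import SparkSession, functions as F

# Minimal PySpark ETL sketch. Bucket paths and column names are
# hypothetical; gs:// access assumes the GCS connector is configured.
spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Extract: read raw order events from Cloud Storage.
orders = spark.read.parquet("gs://example-raw-bucket/orders/")

# Transform: keep completed orders and aggregate revenue per day.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write the aggregate back to Cloud Storage, partitioned by date.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-curated-bucket/daily_revenue/"
)

spark.stop()
```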
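For the Kafka Producer/Consumer work, a minimal Python sketch using the confluent-kafka client; the broker address, topic name, and consumer group are hypothetical (the posting's Java/Spring stack would use the equivalent Java Producer/Consumer APIs):

```python
import json
from confluent_kafka import Producer, Consumer

# Hypothetical broker and topic names, for illustration only.
BROKERS = "localhost:9092"
TOPIC = "order-events"

# Produce one event.
producer = Producer({"bootstrap.servers": BROKERS})
producer.produce(TOPIC, json.dumps({"order_id": 42, "status": "COMPLETED"}).encode())
producer.flush()  # block until delivery is confirmed

# Consume events from the same topic.
consumer = Consumer({
    "bootstrap.servers": BROKERS,
    "group.id": "revenue-aggregator",   # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))
consumer.close()
```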
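And for orchestration with Apache Airflow (or Cloud Composer, its managed GCP counterpart), a minimal daily DAG sketch; the DAG id, schedule, and submit command are hypothetical placeholders, assuming Airflow 2.x:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal Airflow 2.x DAG sketch; dag_id, schedule, and the submit
# command are hypothetical placeholders.
with DAG(
    dag_id="daily_revenue_etl",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="run_pyspark_job",
        # On GCP this would typically be a Dataproc job submission;
        # shown here as a generic spark-submit for brevity.
        bash_command="spark-submit /opt/jobs/daily_revenue_etl.py",
    )
```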