Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position in Alpharetta, GA or Menlo Park, CA, lasting 12+ months at $77.30/hr. It requires 5 years of application development experience, including at least 2 years of data engineering with Kafka, and proficiency in Python, SQL, and Linux.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
616
-
🗓️ - Date discovered
August 29, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Menlo Park, CA
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Elasticsearch #Hadoop #AWS (Amazon Web Services) #C++ #Cloud #Java #Snowflake #Deployment #Data Engineering #Python #Ruby #SQL (Structured Query Language) #Linux #Kafka (Apache Kafka)
Role description
Please find the job position details below.

Position: Data Engineer
Location: Alpharetta, GA or Menlo Park, CA (3 days onsite, hybrid)
Duration: 12+ months (with possible extension)
Pay Rate: $77.30/hr on W2

Job Description:
• RTOI Data Engineer
• The Real-Time Operational Intelligence (RTOI) team in Client Enterprise Computing is responsible for streaming terabytes of data daily.
• We have built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop.
• Our applications run both on-prem and in the cloud. Hundreds of dashboards built for business and operations teams provide real-time insight and actionable items.
• The client is looking for a streaming data engineer who can:
• Understand distributed systems architecture, design, and tradeoffs.
• Design and develop ETL pipelines with a wide range of technologies.
• Work on the full development cycle, including requirements definition, design, implementation, testing, and deployment.
• Collaborate with various teams, with strong communication skills.
• Learn new technologies and work independently.

Requirements:
• 5 years of application development experience, including at least 2 years of data engineering with Kafka
• Working experience writing and running applications on Linux
• 5 years of coding experience in at least one of: Python, Ruby, Java, C/C++, Go
• SQL and database experience

Optional:
• AWS or other cloud technologies
• Elasticsearch (ELK)
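For illustration only (this sketch is not part of the listing): the core of the ETL pipeline work described above is typically a transform step that parses raw streamed events, reshapes them, and drops malformed records. The event fields (`ts`, `host`, `latency_ms`) and the 500 ms "slow" threshold below are hypothetical; a production pipeline would consume from Kafka and write to Elasticsearch or Snowflake rather than passing Python lists around.

```python
import json
from datetime import datetime, timezone

def transform(raw_line):
    """Parse one raw JSON event and reshape it for downstream indexing.

    Hypothetical event shape: {"ts": <unix seconds>, "host": str, "latency_ms": number}
    """
    event = json.loads(raw_line)
    return {
        "timestamp": datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
        "host": event["host"],
        "latency_ms": float(event["latency_ms"]),
        "slow": event["latency_ms"] > 500,  # flag surfaced on operational dashboards
    }

def run_pipeline(lines):
    """Apply the transform to a stream of raw lines, skipping malformed events."""
    out = []
    for line in lines:
        try:
            out.append(transform(line))
        except (json.JSONDecodeError, KeyError, TypeError, ValueError):
            continue  # drop bad events; a real pipeline would dead-letter them
    return out
```

In an actual deployment the `lines` iterable would be a Kafka consumer and `out` a bulk write to the sink, but the parse/reshape/validate pattern is the same.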