Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 5+ years of application development and 2+ years of Kafka experience. It is a long-term hybrid contract in Alpharetta, GA or Menlo Park, CA, offering a competitive pay rate.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 18, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#Deployment #Requirements Gathering #SQL (Structured Query Language) #Python #Data Engineering #Java #Hadoop #Snowflake #Programming #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Ruby #C++ #Linux #ETL (Extract, Transform, Load) #Elasticsearch #Cloud
Role description
W2 Opportunity | Data Engineer – Real-Time Operations Intelligence (RTOI)
📍 Hybrid: Alpharetta, GA OR Menlo Park, CA (3 days onsite required)
💼 Investment Banking Client | Long-Term Contract

About the Team:
The Real-Time Operations Intelligence (RTOI) team within Client Enterprise Computing is responsible for streaming terabytes of data daily. We design large-scale ETL pipelines leveraging Kafka, Elasticsearch (ELK), Snowflake, and Hadoop, delivering hundreds of dashboards that provide real-time insights and actionable intelligence for business and operations.

Role Overview:
We are seeking a Streaming Data Engineer with strong distributed-systems knowledge to help build and scale enterprise-level ETL frameworks. The ideal candidate will be hands-on, collaborative, and comfortable working across the full SDLC, from requirements gathering to deployment.

Key Responsibilities:
• Design and develop distributed ETL pipelines using modern data technologies.
• Build applications that run both on-prem and in cloud environments.
• Work across the full development lifecycle: requirements, design, coding, testing, and deployment.
• Collaborate with cross-functional teams and communicate technical solutions effectively.
• Continuously learn and adapt to emerging technologies.

Requirements:
✔ 5+ years of application development experience
✔ 2+ years of data engineering experience with Kafka
✔ Strong experience with Linux environments
✔ Proficiency in at least one programming language: Python, Ruby, Java, C/C++, or Go
✔ Solid SQL and database background

Nice-to-Have Skills:
• Cloud experience (AWS or similar)
• Elasticsearch / ELK Stack

Additional Details:
• Must be local to Alpharetta, GA OR Menlo Park, CA
• Resume must state which location the candidate is applying for
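For candidates unfamiliar with the stack, the role description centers on streaming ETL: consuming records from Kafka, transforming them, and landing them in sinks such as Elasticsearch or Snowflake. The sketch below illustrates only the transform step of such a pipeline in Python; the field names (`service`, `level`, `ts`) and output schema are hypothetical assumptions, not the client's actual data model.

```python
import json


def transform_event(raw_line):
    """Parse one raw JSON log record and reshape it for indexing.

    Assumption: fields named "service", "level", and "ts" exist in the
    incoming record. A real RTOI pipeline would define its own schema.
    """
    event = json.loads(raw_line)
    return {
        "service": event.get("service", "unknown"),
        "severity": event.get("level", "INFO").upper(),
        "timestamp": event["ts"],
    }


# In production, transform_event would run inside a Kafka consumer loop,
# and the result would be bulk-indexed into Elasticsearch; both ends are
# omitted here to keep the sketch self-contained and runnable.
record = '{"service": "auth", "level": "warn", "ts": "2025-09-18T12:00:00Z"}'
print(transform_event(record))
```

Keeping the transform a pure function like this (no Kafka or Elasticsearch client inside it) makes the pipeline's core logic unit-testable without any running infrastructure, which matters at the terabytes-per-day scale the posting describes.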