

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5+ years of application development and 2+ years of Kafka experience. It is a long-term hybrid contract in Alpharetta, GA or Menlo Park, CA, offering a competitive pay rate.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 18, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Alpharetta, GA
Skills detailed
#Deployment #Requirements Gathering #SQL (Structured Query Language) #Python #Data Engineering #Java #Hadoop #Snowflake #Programming #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Ruby #C++ #Linux #"ETL (Extract, Transform, Load)" #Elasticsearch #Cloud
Role description
W2 Opportunity | Data Engineer - Real-Time Operations Intelligence (RTOI)
Hybrid: Alpharetta, GA or Menlo Park, CA (3 days onsite required)
Investment Banking Client | Long-Term Contract
About the Team:
The Real-Time Operations Intelligence (RTOI) team within Client Enterprise Computing is responsible for streaming terabytes of data daily. We design large-scale ETL pipelines leveraging Kafka, the Elasticsearch (ELK) stack, Snowflake, and Hadoop, delivering hundreds of dashboards that provide real-time insights and actionable intelligence for business and operations teams.
Role Overview:
We are seeking a Streaming Data Engineer with strong distributed-systems knowledge to help build and scale enterprise-level ETL frameworks. The ideal candidate will be hands-on, collaborative, and comfortable working across the full SDLC, from requirements gathering to deployment.
Key Responsibilities:
• Design and develop distributed ETL pipelines using modern data technologies.
• Build applications that run both on-prem and in cloud environments.
• Work across the full development lifecycle: requirements, design, coding, testing, and deployment.
• Collaborate with cross-functional teams and communicate technical solutions effectively.
• Continuously learn and adapt to emerging technologies.
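For illustration only (no code appears in the posting itself), the per-message transform step of the kind of Kafka-fed ETL pipeline described above might look like this minimal Python sketch. The event schema, field names, and the "ops" event type are all assumptions, not details from the role:

```python
import json

def transform(raw: bytes):
    """Parse one raw message value (e.g. from a Kafka topic), keep only
    operational events, and reshape the record for downstream indexing.
    All field names here are hypothetical."""
    event = json.loads(raw)
    if event.get("type") != "ops":  # drop non-operational events
        return None
    return {
        "host": event["host"],
        "metric": event["metric"],
        "value": float(event["value"]),
    }

# In a real pipeline these bytes would arrive via a Kafka consumer loop,
# and the transformed rows would be bulk-indexed into Elasticsearch or
# loaded into Snowflake; here we just run the transform over a sample.
sample = [
    b'{"type": "ops", "host": "a1", "metric": "cpu", "value": "0.42"}',
    b'{"type": "audit", "host": "a1", "metric": "login", "value": "1"}',
]
rows = [r for m in sample if (r := transform(m)) is not None]
print(rows)  # the "audit" event is filtered out
```

The filter-then-reshape pattern keeps the per-message work small, which matters when the pipeline is streaming terabytes per day.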
Requirements:
✔ 5+ years of application development experience
✔ 2+ years of data engineering experience with Kafka
✔ Strong experience with Linux environments
✔ Proficiency in at least one programming language: Python, Ruby, Java, C/C++, or Go
✔ Solid SQL and database background
Nice-to-Have Skills:
• Cloud experience (AWS or similar)
• Elasticsearch / ELK stack
Additional Details:
• Must be local to Alpharetta, GA or Menlo Park, CA
• Resume must state which location the candidate is applying for