

Clevanoo LLC
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 12+ month contract in Alpharetta, GA, and Menlo Park, CA (Hybrid). Requires 5 years of application development, 2 years in data engineering with Kafka, and proficiency in Python, SQL, and Linux.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 25, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #C++ #Snowflake #Linux #Data Engineering #Ruby #Cloud #Python #SQL (Structured Query Language) #Hadoop #Java #Elasticsearch #Kafka (Apache Kafka) #AWS (Amazon Web Services)
Role description
Position: Data Engineer
Location: Alpharetta, GA AND Menlo Park, CA (Hybrid)
Duration: 12+ Month Contract
Visas Accepted: Green Card, GC EAD, and USC
Must be comfortable working on W2
The client has built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop.
• Understanding of distributed systems architecture, design, and trade-offs
• Design and develop ETL pipelines with a wide range of technologies
• 5 years of application development experience, with at least 2 years of data engineering with Kafka
• Working experience writing and running applications on Linux
• 5 years of coding experience with at least one of the following languages: Python, Ruby, Java, C/C++, Go
• SQL and database experience
Optional:
• AWS or other cloud technologies
• Elasticsearch (ELK)





