Strivernet IT Services

Big Data Developer

⭐ - Featured Role
This role is for a Big Data Developer in Alpharetta, GA, offering a contract for 40 hours per week at $55.00 - $60.00 per hour. It requires 8 years of experience in Data Engineering, expertise in GCP and Dataflow, and proficiency in SQL and in Python or Java.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
480
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Alpharetta, GA 30009
🧠 - Skills detailed
#Scala #Data Processing #Data Quality #Java #Data Engineering #Data Modeling #Kafka (Apache Kafka) #Dataflow #Python #Security #Data Pipeline #Cloud #Programming #Data Security #Batch #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Hadoop #GCP (Google Cloud Platform) #Datasets #SQL (Structured Query Language) #Big Data
Role description
Position: Big Data Engineer (GCP & Dataflow)
Location: Alpharetta, GA (onsite)
Preference: Local to GA candidates
Client Interview: Next week
Relocation: No
Tech Stack: Big Data, Google Cloud Platform (GCP), Dataflow
Visa: USC or GC (US citizen or green card holder)

Job Description
We are seeking an experienced Senior Data Engineer with strong expertise in Big Data technologies, Google Cloud Platform (GCP), and Dataflow. The ideal candidate will have around 8 years of hands-on experience designing, building, and maintaining scalable data pipelines and cloud-based data solutions.

Key Responsibilities
- Design, develop, and optimize scalable Big Data pipelines on GCP
- Build and manage data processing workflows using Google Cloud Dataflow
- Work with large datasets for batch and real-time data processing
- Ensure data quality, reliability, and performance across pipelines
- Collaborate with cross-functional teams, including analytics, product, and engineering
- Monitor, troubleshoot, and resolve data pipeline issues
- Implement best practices for data security and governance on GCP

Required Skills & Qualifications
- ~8 years of experience in Data Engineering / Big Data
- Strong hands-on experience with GCP services
- Expertise in Google Cloud Dataflow
- Experience with Big Data tools and frameworks (e.g., Spark, Hadoop, Kafka)
- Strong knowledge of SQL and data modeling
- Proficiency in programming languages such as Python or Java
- Experience building ETL/ELT pipelines

Job Type: Contract
Pay: $55.00 - $60.00 per hour
Expected hours: 40 per week
Work Location: In person
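For a sense of the hands-on work this role describes, below is a minimal sketch of a Dataflow-style pipeline using Apache Beam's Python SDK (the programming model that Cloud Dataflow executes). The bucket paths, CSV layout, and step names are illustrative assumptions, not details from the posting.

```python
# Minimal Apache Beam batch pipeline sketch (illustrative only).
# Assumed input: CSV lines of the form "user_id,event,amount".
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner executes locally; targeting Dataflow would mean
    # runner="DataflowRunner" plus project, region, and temp_location options.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")  # assumed path
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "KeepValid" >> beam.Filter(lambda row: len(row) == 3)  # simple data-quality gate
            | "KeyAmounts" >> beam.Map(lambda row: (row[0], float(row[2])))
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/totals")  # assumed path
        )


if __name__ == "__main__":
    run()
```

The same Beam code covers both the batch and real-time work mentioned above: swapping the bounded text source for an unbounded one (e.g., Kafka or Pub/Sub) turns it into a streaming pipeline without rewriting the transforms.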