Yorkshire Global Solutions Inc.
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Senior Data Engineer role requiring 13+ years of IT experience, with a focus on GCP and Python. The position is remote and W2 only (duration not specified). Key skills include Spark, Kubernetes, Docker, and ML. H4 EAD or US citizenship is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Programming #Data Pipeline #Code Reviews #Cloud #Data Engineering #Docker #GCP (Google Cloud Platform) #Jenkins #Kubernetes #Scala #Big Data #Python #ML (Machine Learning) #Hadoop #Airflow #BigQuery #Spark (Apache Spark)
Role description
Job Title: Lead GCP Data Engineer (Remote)
Experience: 13+ Years
Visa: H4 EAD or US Citizen
Contract: W2 Only (No C2C/1099)
Required Experience & Skills:
• 13+ years of total IT experience
• Strong Python programming expertise
• Extensive GCP experience: Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Functions, Serverless, GCR
• Hands-on experience tuning Spark jobs & building scalable data pipelines
• Design & manage complex DAGs using Cloud Composer (Airflow)
• Big Data experience (Hadoop/Dataproc) including admin-level familiarity
• Kubernetes (GKE/EKS) and Docker experience
• CI/CD pipelines using Jenkins or similar
• Strong understanding of ML model lifecycle, algorithms & performance metrics
• Lead solution design discussions and code reviews, and mentor junior team members