

VeriiPro
GCP Data Engineer | W2
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer on a W2 contract, requiring 3–6+ years of data engineering experience, strong GCP and SQL skills, and proficiency in Python or Spark. Preferred: GCP Professional Data Engineer certification.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Python #SQL (Structured Query Language) #Cloud #Terraform #BigQuery #Spark (Apache Spark) #Storage #Looker #Security #Apache Beam #DevOps #ETL (Extract, Transform, Load) #Airflow #Data Quality #Version Control #Dataflow #Data Engineering #GCP (Google Cloud Platform)
Role description
Key Responsibilities
• Develop and maintain ETL/ELT pipelines using GCP tools (Dataflow, BigQuery, Cloud Composer, Dataproc, etc.); see the pipeline sketch after this list.
• Design and optimize data models and storage solutions.
• Implement data quality, governance, and security best practices.
• Collaborate with cross-functional teams to enable data-driven insights.
• Monitor and troubleshoot data workflows and performance issues.
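To give a sense of the pipeline work described above, here is a minimal Apache Beam sketch in Python of the kind that runs on Dataflow. The project ID, bucket paths, table name, and CSV schema are hypothetical placeholders, not details from this posting.
```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line):
    # Turn one CSV line into a BigQuery-ready dict (hypothetical two-column schema).
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",             # use "DirectRunner" for local testing
        project="my-gcp-project",            # hypothetical project ID
        region="us-east1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://my-bucket/orders.csv", skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_csv)
            | "FilterValid" >> beam.Filter(lambda row: row["amount"] > 0)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.orders",  # target table assumed to exist
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

if __name__ == "__main__":
    run()
```
The same pipeline code can be validated locally with the DirectRunner before submitting it to Dataflow.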
Required Skills & Qualifications
• 3–6+ years of experience in data engineering.
• Strong proficiency with GCP data services and SQL (see the query sketch after this list).
• Experience with Python, Apache Beam, or Spark.
• Familiarity with CI/CD, version control, and DevOps practices.
• Excellent problem-solving and communication skills.
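As a small illustration of the GCP-plus-SQL requirement, the sketch below runs a standard-SQL aggregation against BigQuery from Python using the google-cloud-bigquery client. The project, dataset, and table names are assumptions for illustration only.
```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names used for illustration only.
client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT order_date, SUM(amount) AS daily_revenue
    FROM `my-gcp-project.analytics.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# Submit the query job and iterate over the result rows.
for row in client.query(query).result():
    print(row.order_date, row.daily_revenue)
```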
Preferred
• GCP Professional Data Engineer certification.
• Experience with Airflow, Terraform, or Looker (a minimal Airflow sketch follows).
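For the Airflow preference, a typical Cloud Composer workload looks like the hypothetical DAG below, which refreshes a reporting table once a day. It assumes Airflow 2.4+ with the Google provider package installed; the DAG ID, schedule, and table names are placeholders.
```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily rollup DAG; table and project names are placeholders.
with DAG(
    dag_id="daily_orders_rollup",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `my-gcp-project.analytics.daily_orders` AS "
                    "SELECT order_date, SUM(amount) AS daily_revenue "
                    "FROM `my-gcp-project.analytics.orders` "
                    "GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )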