

A2C
Data Engineer W/ GCP
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 3–5+ years of experience, focusing on GCP, BigQuery, and Dataflow. Contract length and pay rate are unspecified. Strong skills in Python, SQL, and ETL/ELT are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#dbt (data build tool) #ML (Machine Learning) #BigQuery #Data Modeling #GCP (Google Cloud Platform) #Security #Terraform #Cloud #Infrastructure as Code (IaC) #Data Engineering #Docker #Python #ETL (Extract, Transform, Load) #Airflow #Data Science #Data Pipeline #Data Quality #Kafka (Apache Kafka) #Dataflow #Scala #Spark (Apache Spark) #Data Governance #Batch #AI (Artificial Intelligence) #SQL (Structured Query Language) #Monitoring
Role description
We’re looking for a Data Engineer to build and maintain scalable data pipelines and cloud data infrastructure on GCP. The role focuses on BigQuery, Dataflow, and modern ETL/ELT to support analytics and ML workflows.
MUST HAVES
• A problem solver able to analyze and research complex issues and propose actionable solutions and strategies.
• Solid understanding of, and hands-on experience with, major cloud platforms.
• Experience designing and implementing data pipelines.
• Strong Python, SQL, and GCP skills.
Responsibilities
• Build and optimize batch/streaming pipelines using Dataflow, Pub/Sub, and Composer (see the pipeline sketch after this list).
• Develop and tune BigQuery models, queries, and ingestion processes.
• Implement IaC (Terraform), CI/CD, monitoring, and data quality checks.
• Ensure data governance, security, and reliable pipeline operations.
• Collaborate with data science teams and support Vertex AI–based ML workflows.
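For a sense of the day-to-day pipeline work, here is a minimal Apache Beam sketch of a streaming job that reads JSON events from Pub/Sub and appends them to a BigQuery table, suitable for running on Dataflow. The project, subscription, table, and field names are hypothetical placeholders, not details from this posting.

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resource names, for illustration only.
PROJECT = "example-project"
SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/events-sub"
TABLE = f"{PROJECT}:analytics.events"

def parse_event(message: bytes) -> dict:
    # Decode a Pub/Sub message payload into a BigQuery-ready row.
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": event.get("user_id"),
        "event_type": event.get("event_type"),
        "ts": event.get("ts"),
    }

def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, etc. on the command line.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="user_id:STRING, event_type:STRING, ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()

In practice a job like this would be orchestrated from Composer (Airflow), with its target table modeled and tuned in BigQuery, per the responsibilities above.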
Must-Have
• Strong Python, SQL, and GCP skills.
• 3–5+ years of data engineering experience.
• Hands-on GCP experience (BigQuery, Dataflow, Pub/Sub).
• Solid ETL/ELT and data modeling experience.
Nice-to-Have
• GCP certifications, Spark, Kafka, Airflow, dbt/Dataform, Docker/K8s.