

CloudIngest
GCP Data Engineer (W2 Only)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Atlanta, GA (hybrid), with an unspecified contract length and pay rate. Candidates must have strong experience in Python, BigQuery, and Cloud Data Fusion, and must be able to work on a W2 basis.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 11, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Georgia, United States
-
🧠 - Skills detailed
#Cloud #GIT #SQL (Structured Query Language) #AI (Artificial Intelligence) #Python #Data Ingestion #Monitoring #Data Architecture #Datasets #Snowflake #Model Deployment #ML (Machine Learning) #BigQuery #Data Engineering #Automation #Storage #Scala #DevOps #GCP (Google Cloud Platform) #Spark (Apache Spark) #Logging #Data Modeling #Data Pipeline #ETL (Extract, Transform, Load) #Data Quality #Hadoop #Deployment
Role description
Professionals with suitable experience and expertise may send their up-to-date resume to Dilip@cloudingest.com
Note: Only candidates who can work on W2 will be considered.
Job Description: GCP Data Engineer
Location: Atlanta, GA (On-site/Hybrid as applicable)
Summary: We are seeking a highly skilled GCP Data Engineer to design, build, and optimize cloud-native data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate has strong experience with Python, BigQuery, Cloud Data Fusion, and core GCP services such as Cloud Composer, Cloud Storage, Cloud Functions, and Pub/Sub. This role requires a strong foundation in data warehousing concepts and scalable data engineering practices.
Responsibilities
• Design, develop, and maintain robust ETL/ELT pipelines on Google Cloud Platform.
• Build and optimize data workflows using Cloud Data Fusion, BigQuery, and Cloud Composer.
• Write efficient and maintainable Python code to support data ingestion, transformation, and automation (see the Python sketch after this list).
• Develop optimized BigQuery SQL for analytics, reporting, and large-scale data modeling.
• Utilize GCP services such as Cloud Storage, Pub/Sub, and Cloud Functions to build event-driven and scalable data solutions.
• Ensure data quality, governance, and reliability across all pipelines.
• Collaborate with cross-functional teams to deliver clean, trusted, production-ready datasets.
• Monitor, troubleshoot, and resolve performance issues in cloud data pipelines and workflows.
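For illustration only, here is a minimal sketch of the kind of Python ingestion work described in the list above, using the google-cloud-bigquery client library to load a Cloud Storage export into BigQuery. The project, bucket, dataset, and table names are hypothetical placeholders, not part of this role's actual environment.

```python
from google.cloud import bigquery

# Hypothetical project, bucket, dataset, and table names used only for illustration.
client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load a CSV export from Cloud Storage into a BigQuery table and wait for completion.
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders.csv",
    "example_dataset.orders",
    job_config=job_config,
)
load_job.result()

table = client.get_table("example_dataset.orders")
print(f"Loaded {table.num_rows} rows into example_dataset.orders")
```

In practice this pattern would typically be wrapped in an orchestrated pipeline (for example, a Cloud Composer DAG) rather than run as a standalone script.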
Must-Have Skills
• Strong experience with GCP BigQuery (data modeling, SQL development, performance tuning).
• Proficiency in Python for data engineering and pipeline automation.
• Hands-on experience with Cloud Data Fusion for ETL/ELT development.
• Working experience with key GCP services: Cloud Composer, Cloud Storage, Cloud Functions, and Pub/Sub (see the Composer DAG sketch after this list).
• Strong understanding of data warehousing concepts, star/snowflake schemas, and best practices.
• Solid understanding of cloud data architecture and distributed processing.
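As a sketch of the Cloud Composer requirement above, the following is a minimal Airflow 2.x DAG of the kind commonly deployed to Composer: it stages a Cloud Storage export into BigQuery and then runs a standard-SQL aggregation. All DAG, project, bucket, dataset, and table names are hypothetical, and the operators shown assume the Google provider package bundled with Composer.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# Hypothetical names throughout; they illustrate the pattern, not a real environment.
with DAG(
    dag_id="daily_orders_pipeline",
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    # Stage raw CSV exports from Cloud Storage into a BigQuery landing table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-bucket",
        source_objects=["exports/orders_*.csv"],
        destination_project_dataset_table="example_dataset.raw_orders",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the landing table into a reporting table with standard SQL.
    build_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM `example_dataset.raw_orders` GROUP BY order_date"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "example_dataset",
                    "tableId": "daily_order_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> build_summary
```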
Good-to-Have Skills
• Experience with Vertex AI for ML pipeline integration or model deployment.
• Familiarity with Dataproc (Spark/Hadoop) for large-scale processing (see the PySpark sketch after this list).
• Knowledge of CI/CD workflows, Git, and DevOps best practices.
• Experience with Cloud Logging/Monitoring tools.
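For the Dataproc item above, here is a minimal, hypothetical PySpark job of the kind typically submitted to a Dataproc cluster: it aggregates CSV exports from Cloud Storage and writes Parquet output for downstream loading into BigQuery. The paths and column names are illustrative only.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

# Hypothetical gs:// paths and columns; Dataproc clusters read Cloud Storage via the GCS connector.
orders = spark.read.option("header", True).csv("gs://example-bucket/exports/orders_*.csv")

daily_totals = (
    orders.groupBy("order_date")
    .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Write curated Parquet output back to Cloud Storage for downstream loading into BigQuery.
daily_totals.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_order_totals/")

spark.stop()
```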