

TekValue IT Solutions
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer; the contract length and pay rate are unspecified. Key skills include GCP tools, Python, Java, SQL, data architecture, and leadership. Industry experience in data engineering and compliance is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 10, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Leadership #ETL (Extract, Transform, Load) #Metadata #Data Pipeline #Python #Data Quality #Data Architecture #Programming #Storage #Terraform #Logging #Cloud #Infrastructure as Code (IaC) #API (Application Programming Interface) #SQL (Structured Query Language) #Data Engineering #Data Management #Data Science #Scala #Compliance #Batch #Data Ingestion #GCP (Google Cloud Platform) #Monitoring #BigQuery #Dataflow #Java
Role description
• Lead the design and implementation of data ingestion, transformation, and processing pipelines.
• Develop and operate scalable distributed data systems using GCP tools such as BigQuery, Cloud Data Fusion, Dataflow, Dataproc, Cloud Spanner, Cloud SQL, Pub/Sub, and Cloud Storage.
• Strong programming skills in languages such as Python, Java, or Scala, along with proficiency in SQL.
• Build solutions to support batch and streaming data workflows, including API interfaces (see the pipeline sketch after this list).
• Guide and mentor data engineers, establish best practices, and ensure high-quality code and system performance.
• Enable data quality, governance, and compliance with industry standards.
• Troubleshoot and optimize data pipelines and infrastructure for performance and reliability.
• Collaborate with data scientists, analysts, and business teams to understand and fulfill their data needs.
• Implement logging, monitoring, and alerting for data jobs and infrastructure (a minimal logging sketch also follows this list).
• Drive the adoption of Infrastructure as Code (IaC) practices using tools like Terraform.
• Solid understanding of data architecture, ETL/ELT processes, data warehousing, and metadata management.
• Strong problem-solving, communication, and team leadership skills.
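To give candidates a concrete sense of the pipeline work described above, here is a minimal sketch of a streaming ingestion pipeline using the Apache Beam Python SDK, which is the SDK behind Dataflow. The project, subscription, table, and schema names are placeholders invented for illustration, not details from this posting:

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # streaming=True targets an unbounded source; on Dataflow you would also
    # pass runner, project, and region options.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Placeholder subscription path; replace with a real one.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Pub/Sub messages arrive as bytes; decode and parse each one.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

The same pipeline shape covers the batch case by swapping the Pub/Sub source for a bounded one (for example, files in Cloud Storage) and dropping the streaming flag.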
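For the logging, monitoring, and alerting bullet, a small hedged example using the google-cloud-logging client library; the log name and payload fields are illustrative assumptions:

from google.cloud import logging as gcp_logging

client = gcp_logging.Client()  # authenticates via Application Default Credentials
logger = client.logger("pipeline-jobs")  # hypothetical log name

# Structured entries like this can back a log-based metric, which an
# alerting policy in Cloud Monitoring can then fire on.
logger.log_struct(
    {"job": "events-ingest", "status": "FAILED", "records_written": 0},
    severity="ERROR",
)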