

TekValue IT Solutions
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer; the contract length and pay rate are unspecified. Key skills include Python, Java, Scala, SQL, and GCP data tools. Industry experience in data architecture and ETL/ELT processes is required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: November 10, 2025
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Atlanta, GA
Skills detailed: #Scala #ETL (Extract, Transform, Load) #Data Ingestion #Data Management #Programming #Data Engineering #SQL (Structured Query Language) #Python #Monitoring #BigQuery #Data Quality #Storage #Java #Data Architecture #Terraform #Logging #Data Science #Leadership #GCP (Google Cloud Platform) #Cloud #Infrastructure as Code (IaC) #Metadata #Batch #Data Pipeline #Dataflow #API (Application Programming Interface) #Compliance
Role description
• Lead the design and implementation of data ingestion, transformation, and processing pipelines.
• Develop and operate scalable distributed data systems using GCP tools such as BigQuery, Cloud Data Fusion, Dataflow, Dataproc, Cloud Spanner, Cloud SQL, Pub/Sub, and Cloud Storage.
• Strong programming skills in languages such as Python, Java, and Scala, plus proficiency in SQL.
• Build solutions to support batch and streaming data workflows, including API interfaces (see the pipeline sketch after this list).
• Guide and mentor data engineers, establish best practices, and ensure high-quality code and system performance.
• Ensure data quality, governance, and compliance with industry standards.
• Troubleshoot and optimize data pipelines and infrastructure for performance and reliability.
• Collaborate with data scientists, analysts, and business teams to understand and fulfill their data needs.
• Implement logging, monitoring, and alerting for data jobs and infrastructure (see the logging sketch after this list).
• Drive the adoption of Infrastructure as Code (IaC) practices using tools like Terraform.
• Solid understanding of data architecture, ETL/ELT processes, data warehousing, and metadata management.
• Strong problem-solving, communication, and team leadership skills.
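
For illustration only, here is a minimal Apache Beam (Python) sketch of the kind of streaming pipeline the role describes: it reads JSON events from Pub/Sub, applies a basic data-quality filter, and appends rows to BigQuery. The project, topic, table, and schema names are hypothetical, not taken from this posting.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> validate -> BigQuery.
# All resource names (project, topic, table, schema) are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda e: "user_id" in e)  # basic quality gate
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

In practice a pipeline like this would be submitted with the Dataflow runner; the DirectRunner suffices for local smoke tests.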
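
Similarly, a hedged sketch of the structured job logging mentioned above, using the google-cloud-logging client; a log-based metric or alerting policy could then match on the entry's severity and fields. The project and log names are hypothetical.

```python
# Hedged sketch: emit a structured log entry for a pipeline job so that a
# log-based metric or alerting policy can match on its fields.
# Project and log names are hypothetical.
from google.cloud import logging

client = logging.Client(project="my-project")  # hypothetical project
logger = client.logger("pipeline-jobs")        # hypothetical log name

logger.log_struct(
    {"job": "events-ingest", "status": "FAILED", "rows_processed": 0},
    severity="ERROR",
)
```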





