

Lead GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead GCP Data Engineer; the contract length and pay rate are unspecified. It requires 10+ years in software development, strong Python skills, and experience with GCP services. Leadership and ETL expertise are essential.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 15, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Concord, NC
Skills detailed
#Storage #NoSQL #Deployment #Data Storage #DevOps #Databases #GCP (Google Cloud Platform) #Code Reviews #Dataflow #BigQuery #Datasets #Data Modeling #Leadership #Data Pipeline #Schema Design #Git #Python #Cloud #Data Integration #Programming #Data Science #Scala #Data Engineering #Data Quality #SQL (Structured Query Language) #ETL (Extract, Transform, Load)
Role description
Job Summary:
We are seeking a highly skilled Lead GCP Data Engineer to design, build, and optimize large-scale data pipelines and solutions on the Google Cloud Platform. The ideal candidate will bring extensive experience in ETL development, Python programming, and cloud-based architectures, with a proven ability to lead development teams and deliver high-quality, scalable solutions.
Experience Required: 10+ years in software development (including at least 2 years in a senior/lead role)
Key Responsibilities:
• Lead the design, development, and deployment of data pipelines, ETL workflows, and data integration solutions on GCP.
• Architect efficient, secure, and cost-effective data storage and processing systems.
• Collaborate with data scientists, analysts, and stakeholders to define data requirements and implement solutions.
• Implement best practices for data quality, governance, and performance optimization.
• Mentor and guide junior engineers, providing technical leadership and code reviews.
• Troubleshoot and resolve complex data engineering and pipeline issues.
• Stay current with emerging technologies, tools, and best practices in data engineering and GCP services.
Required Skills & Qualifications:
• 10+ years of professional software development experience, with at least 2 years in a senior or lead role.
• Strong proficiency in Python for data engineering tasks.
• Hands-on experience with Google Cloud Platform services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer.
• Proven expertise in building and maintaining ETL pipelines for large datasets.
• Strong SQL skills and experience with relational and NoSQL databases.
• Knowledge of data modeling, schema design, and data warehousing concepts.
• Familiarity with CI/CD pipelines, Git, and DevOps practices for data engineering.
• Excellent problem-solving, communication, and leadership skills.