

Programmers.io
Lead GCP Data Engineer/Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead GCP Data Engineer/Architect, with a contract length of "unknown," offering a pay rate of "$X per hour." Key skills include GCP, data engineering, ETL, and technical leadership. Industry experience in data architecture and compliance is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
496
-
🗓️ - Date
November 19, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richardson, TX
-
🧠 - Skills detailed
#Data Management #Scala #Data Quality #Strategy #Documentation #Compliance #Data Engineering #Cloud #ETL (Extract, Transform, Load) #Storage #Leadership #Dataflow #BigQuery #Deployment #Data Architecture #Data Ingestion #Spark (Apache Spark) #Code Reviews #Data Pipeline #Data Integration #GCP (Google Cloud Platform) #Data Security #Security #Metadata
Role description
Job Summary
We are seeking a highly experienced Lead GCP Data Engineer to design, build, and optimize scalable data engineering solutions on Google Cloud Platform. The ideal candidate will take ownership of building robust data pipelines, ensuring best practices, and leading engineering teams to deliver high-quality data solutions for analytics, reporting, and business operations.
Key Responsibilities
• Lead the design, development, and deployment of data pipelines and data integration workflows on GCP.
• Build and optimize data ingestion, transformation, and storage using tools such as Dataflow, Dataproc, Pub/Sub, Composer, BigQuery, Cloud Storage, and Cloud Functions.
• Collaborate with data architects, analysts, and business teams to translate requirements into technical solutions.
• Develop and maintain ETL/ELT frameworks, ensuring scalability, performance, and reliability.
• Implement and enforce best practices around data quality, data validation, metadata management, and documentation.
• Conduct performance tuning for BigQuery, Dataflow, Spark jobs, and data pipelines.
• Drive cost optimization strategies for GCP data workloads.
• Ensure compliance with data security, governance, and access control policies.
• Provide technical leadership, mentoring, and code reviews for the data engineering team.
• Contribute to architecture discussions and technology strategy for cloud data platforms.



