E-Solutions

GCP Data Lead Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Lead Engineer, a remote contract position of 3+ months. It requires expertise in GCP services, SQL, and Python, plus strong analytical skills. Candidates must have completed 2-3 end-to-end projects, with a focus on data pipeline development and cloud security.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Monitoring #Scala #Infrastructure as Code (IaC) #BigQuery #Database Design #Security #DevOps #Batch #Data Quality #Leadership #Compliance #Data Engineering #Data Modeling #Data Science #Kubernetes #Data Pipeline #Cloud #GCP (Google Cloud Platform) #Docker #Data Lake #Automation #SQL (Structured Query Language) #Python #Dataflow #Data Integration #Data Integrity #ETL (Extract, Transform, Load) #NoSQL #Storage
Role description
Sr. GCP Data Engineer (Remote)

Designs, builds, and maintains scalable data pipelines and architectures on Google Cloud, transforming raw data into accessible insights using tools such as BigQuery, Dataflow, and Cloud Storage, while ensuring quality, security, and performance. Collaborates with analysts and data scientists to meet business needs. Key responsibilities include ETL/ELT, data warehousing, monitoring, and optimizing cloud data solutions, requiring skills in SQL, Python, and GCP services, along with strong analytical and communication abilities.

Responsibilities
• Pipeline Development: Design, build, and maintain robust, scalable data pipelines (batch and streaming) using GCP services (a minimal loading sketch follows the qualifications list).
• Architecture: Develop and manage data warehousing and data lake solutions (e.g., BigQuery, Cloud Storage).
• Data Quality & Governance: Ensure data integrity, consistency, security, and compliance.
• Collaboration: Work with data scientists, analysts, and business stakeholders to define requirements and deliver solutions.
• Optimization: Monitor and optimize data infrastructure for performance, cost, and reliability.
• Automation: Implement CI/CD, Infrastructure as Code (IaC), and automation for data workflows.

Core Skills & Qualifications
• 3+ years of experience with GCP data services (BigQuery, Dataflow, Dataproc, Cloud Storage, etc.).
• Must have delivered at least 2-3 end-to-end projects as a data engineer using GCP services.
• Strong understanding of database design and data modeling (relational, dimensional, NoSQL).
• Expertise in data integration, ETL/ELT, and data pipeline development.
• Knowledge of cloud security best practices, identity management, and networking.
• Familiarity with DevOps, CI/CD, and containerization (Docker, Kubernetes).
• Excellent communication, problem-solving, and leadership skills.
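To illustrate the kind of batch pipeline work the role describes, here is a minimal sketch of one common ETL step on GCP: loading raw files from Cloud Storage into BigQuery with the google-cloud-bigquery Python client. All project, bucket, dataset, and table names are hypothetical placeholders, and a production pipeline would add schema management, error handling, and monitoring on top of this.

```python
from google.cloud import bigquery

# Hypothetical identifiers for illustration only.
PROJECT_ID = "example-project"
SOURCE_URI = "gs://example-bucket/raw/events.csv"
DEST_TABLE = "example_dataset.events"

client = bigquery.Client(project=PROJECT_ID)

# Configure a batch CSV load: skip the header row, infer the schema,
# and overwrite the destination table on each run.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, DEST_TABLE, job_config=job_config)
load_job.result()  # block until the load job completes (raises on failure)

table = client.get_table(DEST_TABLE)
print(f"Loaded {table.num_rows} rows into {DEST_TABLE}")
```

In practice a candidate would be expected to go well beyond this single load job, e.g. orchestrating such steps in streaming or scheduled pipelines (Dataflow, Cloud Composer) and provisioning the datasets via IaC rather than by hand.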