

NLB Services
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Senior Data Engineer for a 6-month contract in Issaquah, WA, offering a competitive pay rate. Requires 8+ years in data engineering, 6+ years in GCP, and 4+ years in Python. GCP certification preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Issaquah, WA
-
🧠 - Skills detailed
#Scrum #Airflow #Datasets #Apache Beam #BigQuery #Scala #Compliance #Data Quality #Infrastructure as Code (IaC) #Security #Data Processing #ETL (Extract, Transform, Load) #Data Modeling #Terraform #Dataflow #GitHub #Spark (Apache Spark) #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Docker #DevOps #Kubernetes #Python #Storage #Code Reviews #Data Pipeline #Leadership #Agile #Cloud #Programming #Data Engineering
Role description
Technical Lead – GCP Data Engineering (Python)
Issaquah, WA (3 days onsite every week)
Mandatory Areas
Skill 1 – 6+ years of experience in GCP
Skill 2 – 8+ years of experience in Data Engineering
Skill 3 – 4+ years of experience in Python
Job Summary
We are seeking an experienced Technical Lead with strong expertise in Google Cloud Platform (GCP), data engineering, and Python development. The ideal candidate will lead the design, development, and optimization of scalable data pipelines and cloud-based data solutions. This role requires both hands-on technical expertise and leadership capabilities to mentor team members and drive best practices.
Key Responsibilities
• Lead architecture, design, and implementation of scalable data solutions on GCP
• Develop and maintain data pipelines using Python and GCP services
• Work with large-scale structured and unstructured datasets
• Implement ETL/ELT workflows using modern data engineering best practices
• Optimize data processing performance and ensure data quality
• Collaborate with stakeholders, architects, and product teams to translate business requirements into technical solutions
• Conduct code reviews and enforce coding standards and best practices
• Mentor and guide data engineers and junior developers
• Ensure security, governance, and compliance standards are maintained
• Provide technical leadership in troubleshooting and production support
• Develop CI/CD pipelines for data workflows using GitHub, Terraform, and Cloud Build
• Perform performance tuning, root cause analysis, and system optimization to improve data flow efficiency
• Work closely with DevOps and security teams to maintain compliance, security, and governance across data systems
• Stay current with emerging technologies, tools, and best practices in data engineering and cloud computing
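To illustrate the kind of pipeline work these responsibilities describe, here is a minimal ETL sketch in plain Python. It is illustrative only: the function names (`extract_records`, `transform`, `load`) are hypothetical, and Beam/Dataflow-specific APIs are deliberately omitted; the load step merely aggregates in memory where a real pipeline would write to BigQuery.

```python
# Illustrative ETL sketch in plain Python (hypothetical names, no GCP APIs).

def extract_records(raw_rows):
    """Extract: parse raw CSV-like strings into dicts."""
    for row in raw_rows:
        user_id, amount = row.split(",")
        yield {"user_id": user_id.strip(), "amount": float(amount)}

def transform(records):
    """Transform: drop invalid rows -- a simple data-quality gate."""
    return [r for r in records if r["amount"] >= 0]

def load(records):
    """Load: aggregate per user, standing in for a warehouse write."""
    totals = {}
    for r in records:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals

raw = ["u1, 10.0", "u2, -5.0", "u1, 2.5"]
print(load(transform(extract_records(raw))))  # {'u1': 12.5}
```

In an actual Dataflow job, each stage would become a Beam transform and the data-quality gate would typically route rejected rows to a dead-letter output rather than silently dropping them.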
Required Skills & Qualifications
• 8+ years of experience in Data Engineering
• 3+ years of hands-on experience with GCP services such as:
◦ BigQuery
◦ Dataflow
◦ Pub/Sub
◦ Cloud Storage
◦ Cloud Composer (Airflow)
◦ Dataproc
• Strong programming skills in Python
• Experience building ETL/ELT pipelines
• Strong SQL knowledge
• Experience with distributed data processing frameworks (Apache Beam / Spark)
• Experience with CI/CD pipelines and DevOps practices
• Knowledge of data modeling and data warehousing concepts
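The SQL and data-modeling skills listed above can be sketched with a toy star schema. This is illustrative only: it uses the standard-library `sqlite3` module so the example is self-contained (BigQuery's dialect differs), and the table and column names are invented for the sketch.

```python
# Illustrative warehouse-style query: a fact table rolled up by a
# dimension attribute, using sqlite3 so the sketch runs anywhere.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Star-schema join: aggregate fact rows grouped by the dimension.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 15.0), ('games', 7.5)]
```

The same fact/dimension split and GROUP BY rollup pattern carries over directly to BigQuery, where partitioning and clustering choices then drive query cost.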
Preferred Qualifications
• Experience with containerization (Docker, Kubernetes)
• Familiarity with Terraform or Infrastructure as Code
• Experience with real-time streaming architecture
• GCP certifications (Professional Data Engineer preferred)
• Experience working in Agile/Scrum environments