

Infosoft, Inc.
Lead GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead GCP Data Engineer with a 12-month contract, paying $65-70/HR. Located in Charlotte, NC or Dallas, TX, it requires 10+ years of experience in GCP, data architecture, Python, and ETL/ELT pipeline development.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Scala #Data Modeling #SQL (Structured Query Language) #Data Architecture #Airflow #DevOps #Cloud #SSIS (SQL Server Integration Services) #ETL (Extract, Transform, Load) #GitHub #Storage #Data Lake #Data Framework #Data Transformations #GCP (Google Cloud Platform) #PySpark #Jenkins #SAS #BigQuery #Data Quality #Data Pipeline #Dataflow #Data Lakehouse #Automation #Leadership #Spark (Apache Spark) #Delta Lake #Deployment #GitLab #Python #Data Engineering
Role description
Job Title: Lead GCP Data Engineer
Pay Rate: $65-70/HR on W-2
Duration: 12 months
Location: Charlotte, NC or Dallas, TX (5 days onsite) (Onsite interview)
Key Skills/Experience: GCP (BigQuery, Dataflow, Dataplex), Delta Lake, Python, Airflow, Spark (10+ years overall experience)
We are looking for a Lead GCP Data Engineer for a banking client with strong hands-on technical depth, data architecture and design experience, and proven technical leadership skills. This role will play a key part in modernizing the enterprise data platform, leading complex data engineering initiatives, and guiding design decisions as workloads transition from on-prem to Google Cloud Platform (GCP).
The ideal candidate is comfortable operating as a technical lead, owning end-to-end data solutions, mentoring team members, and partnering closely with product owners and stakeholders.
Required Qualifications & Skills
Core Technical Skills
• Expert-level data engineering experience in modern data stacks
• Strong hands-on expertise with:
  • SQL (advanced transformations and optimization)
  • Python
  • PySpark / Spark
• Deep experience building ETL / ELT pipelines at scale
• Strong understanding of data transformations, data quality, and lineage
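As a rough illustration of the transformation and data-quality work these bullets describe, here is a minimal sketch using only the Python standard library (the hypothetical column names and rules are for illustration; on the actual platform this logic would live in Spark/BigQuery pipelines):

```python
# Minimal sketch (illustrative only) of a transform step with
# data-quality checks: cast balances to float and route rows that
# fail validation to a reject list for lineage/quality reporting.
import csv
import io

RAW = """account_id,balance
A1,100.50
A2,
A3,-20.00
"""

def transform(rows):
    """Split rows into cleaned records and quality rejects."""
    clean, rejects = [], []
    for row in rows:
        try:
            bal = float(row["balance"])
        except (TypeError, ValueError):
            rejects.append({**row, "reason": "missing/invalid balance"})
            continue
        if bal < 0:
            rejects.append({**row, "reason": "negative balance"})
            continue
        clean.append({"account_id": row["account_id"], "balance": bal})
    return clean, rejects

clean, rejects = transform(csv.DictReader(io.StringIO(RAW)))
print(len(clean), len(rejects))  # 1 clean row, 2 rejects
```

The same pattern (validate, cast, quarantine bad records) scales up directly to PySpark DataFrames.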
Cloud & Platform (Must Have)
• Strong Google Cloud Platform (GCP) experience, including:
  • Designing and deploying data pipelines on GCP
  • Cloud-native storage and processing patterns
  • Performance, cost optimization, and scalability considerations
Data Architecture & Design
• Strong experience in:
  • Data modeling (operational and analytical)
  • Delta Lake / modern data lakehouse concepts
  • Designing cloud-based data architectures for reporting and downstream systems
• Ability to translate business requirements into clean, scalable technical designs
DevOps / CI-CD
• Hands-on experience with CI/CD pipelines using tools such as:
  • Jenkins
  • GitHub Actions
  • GitLab
• Experience implementing automation and deployment standards for data platforms
Additional Experience
• Familiarity with tools like SSIS, SAS, or similar, to understand and migrate legacy code
• Configuration-driven frameworks and understanding how enterprise data frameworks operate
• Strong problem-solving, critical thinking, and adaptability
• Proven ability to operate in a lead / senior capacity, influencing technical direction
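The "configuration-driven frameworks" bullet above can be sketched in a few lines: pipeline steps are declared as data and dispatched to handlers rather than hard-coded. All names here are hypothetical, standard-library-only illustrations of the pattern, not the client's actual framework:

```python
# Hypothetical configuration-driven pipeline: each step is a dict,
# and a dispatcher maps its "step" key to a handler function.
CONFIG = [
    {"step": "filter", "column": "status", "equals": "active"},
    {"step": "rename", "from": "amt", "to": "amount"},
]

def filter_rows(rows, column, equals, **_):
    """Keep only rows whose `column` equals `equals`."""
    return [r for r in rows if r.get(column) == equals]

def rename_col(rows, **kw):
    """Rename column kw['from'] to kw['to'] in every row."""
    src, dst = kw["from"], kw["to"]
    return [{(dst if k == src else k): v for k, v in r.items()} for r in rows]

HANDLERS = {"filter": filter_rows, "rename": rename_col}

def run(rows, config):
    """Apply each configured step in order."""
    for spec in config:
        rows = HANDLERS[spec["step"]](rows, **spec)
    return rows

data = [{"status": "active", "amt": 10}, {"status": "closed", "amt": 5}]
result = run(data, CONFIG)
print(result)  # [{'status': 'active', 'amount': 10}]
```

The point of the pattern: adding a new transformation means adding a config entry and (at most) one handler, which is how enterprise data frameworks keep pipeline logic declarative and auditable.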






