

Persistent Systems
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer with AI/ML integration, requiring 8+ years of experience. The contract is onsite in Irving, Texas, offering a competitive pay rate. Key skills include BigQuery, Google Cloud Storage, Python, and data pipeline engineering.
Country
United States
Currency
$ USD
-
Day rate
566
-
Date
March 27, 2026
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Irving, TX
-
Skills detailed
#Clustering #Data Pipeline #Predictive Modeling #GitHub #Airflow #Scala #Google Cloud Storage #Data Governance #Big Data #Storage #Data Warehouse #Cloud #Apache Iceberg #GCP (Google Cloud Platform) #Data Lake #Java #Programming #ML (Machine Learning) #Security #Terraform #BigQuery #Capacity Management #Data Orchestration #Data Lakehouse #AI (Artificial Intelligence) #Data Engineering #Data Science #Python #SQL (Structured Query Language) #Apache Airflow
Role description
About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create unique competitive advantage for our clients by giving them the power to see beyond and rise above.
We are experiencing tremendous growth, with $566 million in revenue in FY21, representing 12.9% year-over-year growth. Along with that growth, we onboarded over 3,000 new employees in the past year, bringing our total employee count to over 15,000 people located in 18 countries across the globe.
At Persistent, our values are more than a list of ideals to improve our corporate image. We're dedicated to building an inclusive culture that reflects what's important to our employees and is based on what they value. As a result, 95% of our employees approve of the CEO and 83% recommend working at Persistent to a friend.
About Position:
We are seeking a GCP Data Engineer with deep, hands-on architectural and development
experience in Google Cloud Platform's big data ecosystem. You will be responsible for
designing, building, and optimizing a modern data lakehouse architecture. Your primary focus
will be leveraging BigLake, BigQuery, Google Cloud Storage (GCS), and Vertex AI to create
seamless, scalable data pipelines and machine learning integrations that drive business
intelligence and predictive analytics.
Position Details:
Role: GCP Data Engineer with AI/ML Integration
Location: Irving, Texas 75039 (100% onsite)
Hire Type: contract
Experience: 8+ years of experience
What You'll Do:
Design and manage complex, highly scalable data models within BigQuery.
Perform deep performance tuning and cost optimization of BigQuery jobs utilizing clustering, partitioning, materialized views, and slot capacity management.
Collaborate with Data Scientists to operationalize machine learning models using Vertex AI.
Build robust data pipelines to feed the Vertex AI Feature Store, manage model training workflows, and deploy ML models into production.
Utilize BigQuery ML (BQML) for in-database predictive modeling and analytics where appropriate.
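As a rough illustration of the partitioning, clustering, and BQML responsibilities above, the SQL patterns involved might look like the following sketch. Every dataset, table, column, and model name here is a hypothetical placeholder, not something taken from this posting:

```python
# Illustrative BigQuery SQL held as Python strings; all object names are
# hypothetical placeholders.

# Partitioned + clustered table: partition pruning limits the bytes a
# date-filtered query scans, and clustering co-locates rows so selective
# filters on user_id read fewer blocks.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts TIMESTAMP,
  user_id  STRING,
  payload  JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id
OPTIONS (partition_expiration_days = 90);
"""

# In-database predictive modeling with BigQuery ML (BQML): the model trains
# where the data lives, with no export step to a separate ML environment.
bqml_sql = """
CREATE OR REPLACE MODEL analytics.churn_model
OPTIONS (
  model_type = 'logistic_reg',
  input_label_cols = ['churned']
) AS
SELECT tenure_days, plan_tier, churned
FROM analytics.customers;
"""
```

In practice these statements would be submitted through a BigQuery client or an orchestrated job rather than run by hand.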
Expertise You'll Bring:
Lakehouse Architecture & Development:
Architect and maintain a scalable data lakehouse using Google Cloud Storage
(GCS) as the foundational data lake and BigLake to unify data warehouses and data lakes.
Implement fine-grained security (row-level and column-level access controls) and
data governance across open file formats (Parquet, Iceberg, ORC) using BigLake.
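The row-level access control mentioned above can be expressed with BigQuery's row access policy DDL; this is a minimal sketch, and the table, policy name, region value, and group address are all hypothetical:

```python
# Illustrative row-level security statement; every name below is a
# hypothetical placeholder, not from this posting.
row_policy = """
CREATE ROW ACCESS POLICY us_analysts_only
ON analytics.orders
GRANT TO ('group:us-analysts@example.com')
FILTER USING (region = 'US');
"""
```

Column-level controls follow a different mechanism (policy tags applied via Data Catalog taxonomies) rather than DDL of this form.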
Data Warehousing & Optimization:
Data Pipeline Engineering:
BigQuery: Expert-level knowledge of BigQuery architecture, advanced SQL, analytical functions, query profiling, and optimization techniques.
BigLake: Proven experience utilizing BigLake for multi-cloud or lakehouse architectures, managing open-source formats (e.g., Apache Iceberg/Parquet), and enforcing unified security policies.
GCS: Deep understanding of GCS storage classes, object lifecycle management, and optimizing GCS for big data workloads.
Vertex AI: Hands-on experience with Vertex AI pipelines, endpoints, feature stores, or deploying ML models into scalable data environments.
Programming Skills: Advanced proficiency in Python and SQL. Familiarity with Java, Scala, or Go is a plus.
Data Orchestration & CI/CD: Experience with orchestration tools (e.g., Apache Airflow,
Cloud Composer) and modern CI/CD pipelines (e.g., GitHub Actions, Terraform, Cloud Build).
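At its core, the orchestration work described above is about declaring task dependencies and letting the scheduler resolve a valid execution order. A minimal sketch of that idea in plain Python (task names are hypothetical; a real deployment would define these as Airflow operators in a DAG file):

```python
# Dependency resolution of the kind an orchestrator such as Apache Airflow or
# Cloud Composer performs; task names here are hypothetical placeholders.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it can run.
deps = {
    "load_to_gcs": set(),
    "stage_in_bigquery": {"load_to_gcs"},
    "refresh_feature_store": {"stage_in_bigquery"},
    "train_vertex_model": {"refresh_feature_store"},
}

# static_order() yields tasks so every task appears after its prerequisites.
order = list(TopologicalSorter(deps).static_order())
```

In Airflow the same dependencies would be written with the `>>` operator between tasks, and the scheduler handles retries, backfills, and parallelism on top of this ordering.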
Benefits:
• Competitive salary and benefits package
• Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
• Opportunity to work with cutting-edge technologies
• Employee engagement initiatives such as project parties, flexible work hours, and "Long Service" awards
• Annual health check-ups as well as insurance:
• Group term life insurance
• Personal accident insurance
• Mediclaim hospitalization insurance for self, spouse, two children, and parents
Why Persistent is an employer of choice
• Technology Innovation: culture of innovation using cutting-edge technology to bring value to clients.
• Growth and Career Progression: learning opportunities for growth, including quarterly promotion cycles.
• One Persistent Culture: global outlook with diversity and inclusion at its core.
• Mental and Physical Wellness: employee health and mindfulness programs






