

LTIMindtree
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a contract basis in Philadelphia, PA, requiring expertise in GCP services, data engineering tools, and programming languages like Java and Python. GCP Certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Philadelphia, PA
-
🧠 - Skills detailed
#Deployment #Dataflow #BigQuery #Monitoring #GitHub #Terraform #Airflow #DevOps #Docker #Python #Data Engineering #Java #Scala #Jenkins #SQL (Structured Query Language) #Data Pipeline #Cloud #GCP (Google Cloud Platform) #Security #DMS (Data Migration Service) #Hadoop #Storage #Kubernetes #Apache Airflow
Role description
🔹 Job Details
Job Title : GCP Data Engineer
Job Location : Philadelphia, PA
Job Type : Contract
Client : LTIMindtree
🔹 Role Overview
Seeking a GCP Data Engineer with expertise in GCP services and data engineering tools for a media domain client.
Skills Required:
• GCP Cloud Storage, BigQuery, Dataproc, Cloud Composer, DMS
• Apache Airflow, Dataflow, Pub/Sub
• Java, Python, Scala
• ANSI SQL, Hadoop ecosystem
Skills Nice-to-Have:
• Terraform or Cloud Deployment Manager
• Docker, Kubernetes
• Jenkins, GitHub Actions, Cloud Build
• GCP Certification (Professional Data Engineer)
Responsibilities:
• Design and deploy scalable systems on GCP
• Build data pipelines and workflows
• Implement DevOps CI/CD pipelines
• Ensure monitoring and security using GCP tools