

GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Phoenix, AZ, with a contract length of "unknown" and a pay rate of "unknown." It requires 6 years of data engineering experience, strong Python skills, and expertise in GCP technologies, data governance, and ML Ops.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Phoenix, AZ
🧠 - Skills detailed
#Public Cloud #Data Engineering #DevOps #Scala #Data Pipeline #Dataflow #Compliance #Cloud #Data Governance #Metadata #Airflow #AI (Artificial Intelligence) #Documentation #Python #ETL (Extract, Transform, Load) #Data Lake #Data Warehouse #ML (Machine Learning) #GCP (Google Cloud Platform) #Data Management #Storage
Role description
Job Title – GCP Data Engineer
Location – Phoenix, AZ (Onsite)
Responsibilities -
• Solid experience with, and understanding of, the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must
• Create a framework for ML Ops and LLM Ops
• Define the processes and workflow for the AI compliance and governance framework
• Monitor the data lake continuously and ensure that the appropriate support teams are engaged at the right times
• Create reports to monitor usage data for billing and SLA tracking
• Work with business and cross-functional teams to gather and document requirements to meet business needs
• Provide support as required to ensure the availability and performance of ETL/ELT jobs
• Provide technical assistance and cross-training to business and internal team members
• Collaborate with business partners for continuous improvement opportunities
Required Skills -
• 6 years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics
• Very strong hands-on Python expertise
• Strong Data and GCP Vertex AI knowledge
• 6 years of experience with one of the leading public clouds
• 6 years of experience in the design and build of scalable data pipelines that handle extraction, transformation, and loading
• 6 years of experience with Python/Scala, with working knowledge of notebooks
• 6 years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.)
• At least 5 years of experience in Data governance and Metadata Management
• Ability to work independently, solve problems, and keep stakeholders updated
• Analyze, design, develop, and deploy solutions per business requirements
• Strong understanding of relational and dimensional data modelling
• Experience in DevOps and CI/CD-related technologies
• Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives
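For illustration only (this sketch is not part of the posting): the pipeline work listed above typically combines Cloud Composer/Airflow, Cloud Storage, and BigQuery. A minimal Airflow DAG of that kind might look like the following, where the bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal sketch of a daily GCS-to-BigQuery load on Cloud Composer/Airflow.
# Bucket, dataset, and table names below are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="example_gcs_to_bigquery",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",                        # hypothetical bucket
        source_objects=["events/{{ ds }}/*.csv"],               # files partitioned by run date
        destination_project_dataset_table="analytics.events",   # hypothetical dataset.table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )
```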