

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Phoenix, AZ, on a 6+ month W2 contract, hybrid (3 days onsite). Requires 6+ years of data engineering experience, 2+ years with GCP, and proficiency in SQL, Python, and Apache Beam.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 21, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Phoenix, AZ
Skills detailed: #Dataflow #SQL (Structured Query Language) #Data Security #Data Integration #Data Pipeline #Data Engineering #Looker #Apache Beam #Monitoring #Data Analysis #Terraform #Python #Security #dbt (data build tool) #BI (Business Intelligence) #Scala #Airflow #Data Modeling #Automation #Storage #GCP (Google Cloud Platform) #Cloud #Data Orchestration #ETL (Extract, Transform, Load) #BigQuery #Infrastructure as Code (IaC)
Role description
Hiring Now: GCP Data Engineer | Hybrid - Phoenix, AZ
Location: Phoenix, Arizona (Hybrid - 3 days onsite per week)
Job Type: W2 Contract
Start Date: ASAP
Duration: 6+ Months (with possible extension)
About the Role:
We are looking for a highly skilled GCP Data Engineer to join our team in Phoenix, AZ. This hybrid position requires 3 days onsite per week and involves building scalable, secure, and high-performance data solutions using Google Cloud Platform (GCP) technologies.
Key Responsibilities:
• Design and implement large-scale data pipelines using GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions
• Develop ETL/ELT workflows and automate data integration processes
• Collaborate with data analysts, scientists, and product teams to deliver clean, reliable data
• Optimize performance and cost of data infrastructure
• Ensure data security, monitoring, and governance
• Use tools like Apache Beam, Airflow (Cloud Composer), and Terraform for orchestration and automation
Required Qualifications:
• 6+ years of experience in Data Engineering, including 2+ years with Google Cloud Platform
• Proficiency in SQL, Python, and Apache Beam
• Hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage
• Familiarity with data orchestration tools (e.g., Cloud Composer / Airflow)
• Experience with CI/CD and Infrastructure as Code (e.g., Terraform)
• Strong understanding of data warehousing, data modeling, and performance tuning
Preferred Qualifications:
• GCP Professional Data Engineer Certification
• Experience with real-time streaming data solutions
• Exposure to Looker, dbt, or other analytics/BI tools
• Background in healthcare, finance, or regulated environments is a plus