GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer in Dallas, TX, on a contract basis. It requires 5–8 years of experience, including 3+ years with Databricks & GCP, and Databricks/GCP certification. Key skills include ETL/ELT, Python, SQL, and data migration.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 18, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Data Warehouse #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Databricks #Data Lake #SQL (Structured Query Language) #Programming #Apache Beam #Storage #BigQuery #Dataflow #Cloud #Data Engineering #ADF (Azure Data Factory) #Airflow #Data Ingestion #Data Migration #Python #Synapse #Migration
Role description
One of my clients is looking for a GCP Data Engineer in Dallas, TX for a contract role.
Note: Independent candidates only – no C2C or third parties. Certification in Databricks/GCP is required.
Required Skills & Qualifications:
• 5–8 years of overall experience, with 3+ years of hands-on work in Databricks & GCP.
• Strong expertise in building data migration pipelines using Databricks, ADF, and Synapse.
• Proven ability to design and implement data ingestion, transformation, and storage solutions using GCP-native tools.
• Expertise with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions.
• Experience developing and optimizing ETL/ELT pipelines with Dataflow (Apache Beam), Composer (Airflow), and BigQuery.
• Skilled in managing and maintaining data warehouses and data lakes in GCP.
• Strong programming experience in Python, SQL, and Apache Beam.
If interested, please send your resume to harsh@hireplusinfotech.com.