

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Dallas, TX, with a long-term contract. Requires 12+ years of experience, strong Python skills, healthcare industry experience, and familiarity with GCP, BigQuery, and PySpark. Face-to-face interviews are mandatory.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 12, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas, TX
Skills detailed
#Python #SAS #Spark (Apache Spark) #Migration #BigQuery #Java #SQL (Structured Query Language) #PySpark #GCP (Google Cloud Platform) #Data Engineering #Teradata #Cloud #Computer Science
Role description
Hi
Our client is looking for a GCP Data Engineer for a long-term project in Dallas, TX; below is the detailed requirement.
Job Title: GCP Data Engineer
Location: Dallas, TX
Duration: Long-term Contract
Mandatory skills: The candidate should have strong experience in Python and in the healthcare industry.
Mode of Interview: Face-to-face
Job Description:
• Bachelor's degree in Computer Science or equivalent, with a minimum of 12+ years of relevant experience.
• Development: Python (the primary skill; must be very strong), PySpark, GCP Cloud Storage (GCS), BigQuery, BigQuery Migration Service SQL Translation; some knowledge of Cloud Composer; some knowledge of SAS coding (can also be picked up by a good Python programmer); familiarity with Teradata.
• In the future, the role may shift to Java/Spring Boot development, depending on which CIM tool is selected.