

ISITE TECHNOLOGIES
GCP Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer in Dallas, Texas; the contract length and pay rate are unspecified. Candidates should have 8 to 10 years of experience in the healthcare domain and strong skills in GCP, FastAPI, Python, and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Data Quality #Cloud #GCP (Google Cloud Platform) #FastAPI #Data Science #API (Application Programming Interface) #Python #Airflow #Data Access #Kubernetes #SQL (Structured Query Language) #BigQuery #Version Control #Data Processing #Scala #Docker #GIT #Data Governance #Security #Dataflow #Data Engineering #Datasets #Data Pipeline #Storage
Role description
JOB ROLE: GCP Data Engineer
Location: Dallas, Texas, United States
Experience: 8 to 10 years
Mandatory skills: GCP, FastAPI
Domain experience: Healthcare
Responsibilities:
• Design and implement scalable data pipelines using GCP-native tools.
• Develop and maintain RESTful APIs using FastAPI for data access and integration (see the sketch after this list).
• Work with large-scale datasets from various sources (structured and unstructured).
• Optimize data workflows for performance, scalability, and reliability.
• Collaborate with cross-functional teams including data scientists, analysts, and backend engineers.
• Ensure data quality, integrity, and security across all pipelines and systems.
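As a rough sketch of the FastAPI responsibility above, the snippet below exposes a read-only endpoint backed by BigQuery. The project, dataset, table, and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: a FastAPI service exposing read-only access to a BigQuery table.
# Project, dataset, table, and column names are hypothetical placeholders.
from fastapi import FastAPI, HTTPException
from google.cloud import bigquery

app = FastAPI(title="Data Access API")
bq = bigquery.Client()  # authenticates via Application Default Credentials

TABLE = "my-project.analytics.encounters"  # placeholder table

@app.get("/encounters/{encounter_id}")
def get_encounter(encounter_id: str):
    # Parameterized query guards against SQL injection.
    job = bq.query(
        f"SELECT * FROM `{TABLE}` WHERE encounter_id = @id LIMIT 1",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("id", "STRING", encounter_id)
            ]
        ),
    )
    rows = list(job.result())
    if not rows:
        raise HTTPException(status_code=404, detail="encounter not found")
    return dict(rows[0].items())
```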
Required Skills:
• Strong experience with FastAPI and general API development.
• Proficiency in Python and SQL.
• Hands-on experience with GCP services such as:
o BigQuery
o Cloud Storage
o Cloud Functions
o Pub/Sub
o Dataflow
o Composer (Airflow on GCP; see the DAG sketch after this list)
• Familiarity with CI/CD pipelines and version control (Git).
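To illustrate how Composer ties these services together, here is a minimal Airflow 2.x DAG that loads newline-delimited JSON from Cloud Storage into BigQuery; the bucket, dataset, and table names are assumptions for illustration only.

```python
# Minimal sketch: a Composer (Airflow 2.x) DAG loading GCS files into BigQuery.
# Bucket, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_claims_load",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    GCSToBigQueryOperator(
        task_id="load_claims",
        bucket="my-raw-bucket",                     # placeholder bucket
        source_objects=["claims/{{ ds }}/*.json"],  # one folder per run date
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="my-project.analytics.claims",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```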
Preferred Qualifications:
• Experience with containerization (Docker, Kubernetes).
• Understanding of data governance and security best practices.
• Exposure to real-time data processing and streaming architectures (see the subscriber sketch below).
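For the streaming item above, a bare-bones Pub/Sub subscriber in Python might look like the following; the project and subscription IDs are placeholders.

```python
# Minimal sketch: a streaming Pub/Sub subscriber.
# Project and subscription IDs are hypothetical placeholders.
import json

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    event = json.loads(message.data.decode("utf-8"))
    print("received:", event)  # stand-in for real processing/enrichment
    message.ack()              # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(subscription, callback=callback)
try:
    streaming_pull.result()    # block and process messages indefinitely
except KeyboardInterrupt:
    streaming_pull.cancel()    # stop the background pull on Ctrl-C
```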