

Data Engineer with FastAPI & GCP Experience (Ex-CVS Candidates Only, 15+ Years)
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract position in Dallas, TX, requiring mid/senior-level expertise. Candidates must have 15+ years of experience, specifically with FastAPI and GCP. Key skills include Python, SQL, and data pipeline development.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 9, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Dallas, TX
Skills detailed
#Version Control #Storage #Airflow #Kubernetes #Scala #Data Pipeline #Data Quality #Data Processing #API (Application Programming Interface) #Data Governance #Data Engineering #Dataflow #SQL (Structured Query Language) #Security #Data Science #GCP (Google Cloud Platform) #BigQuery #Data Access #Datasets #FastAPI #Python #GIT #Docker #Cloud
Role description
Job Title: Data Engineer with FastAPI & GCP Experience
Location: Dallas, TX (3 days onsite)
Type: Contract
Experience Level: Mid/Senior
Note: Ex-CVS candidates only (candidates with prior CVS experience).
Job Summary:
We are seeking a talented Data Engineer with strong experience in FastAPI and a solid background in Google Cloud Platform (GCP) services. This role involves building scalable data pipelines, developing APIs, and leveraging GCP tools to support data infrastructure and analytics.
Key Responsibilities:
• Design and implement scalable data pipelines using GCP-native tools.
• Develop and maintain RESTful APIs using FastAPI for data access and integration (see the sketch after this list).
• Work with large-scale datasets from various sources (structured and unstructured).
• Optimize data workflows for performance, scalability, and reliability.
• Collaborate with cross-functional teams, including data scientists, analysts, and backend engineers.
• Ensure data quality, integrity, and security across all pipelines and systems.
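By way of illustration, here is a minimal sketch of the kind of FastAPI data-access endpoint the second responsibility describes. Everything beyond FastAPI itself is an assumption: the Record model, the in-memory _STORE standing in for a real data backend, and the /records route are hypothetical.

```python
# Minimal FastAPI data-access sketch. Record, _STORE, and the /records
# route are hypothetical illustrations, not details from this posting.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Data Access API")

class Record(BaseModel):
    id: int
    value: float

# Hypothetical in-memory store standing in for a real backend
# (e.g., BigQuery or Cloud Storage in this role's stack).
_STORE = {1: Record(id=1, value=42.0)}

@app.get("/records/{record_id}", response_model=Record)
def get_record(record_id: int) -> Record:
    """Return one record by ID, or 404 if it does not exist."""
    record = _STORE.get(record_id)
    if record is None:
        raise HTTPException(status_code=404, detail="record not found")
    return record
```

Saved as main.py, this runs with `uvicorn main:app --reload` and serves GET /records/1.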
Required Skills:
• Strong experience with FastAPI and general API development.
• Proficiency in Python and SQL.
• Hands-on experience with GCP services such as (see the BigQuery sketch after this list):
  • BigQuery
  • Cloud Storage
  • Cloud Functions
  • Pub/Sub
  • Dataflow
  • Composer (Airflow on GCP)
• Familiarity with CI/CD pipelines and version control (Git).
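As a sketch of the BigQuery skill above, the snippet below runs an aggregate query with the official google-cloud-bigquery client. The project ID my-project and the analytics.events table are assumptions for illustration; credentials are expected via Application Default Credentials.

```python
# BigQuery query sketch using the official google-cloud-bigquery client.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project ID

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`  -- assumed table
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

# client.query() submits the job; .result() blocks until rows are ready.
for row in client.query(query).result():
    print(row.user_id, row.events)
```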
Preferred Qualifications:
• Experience with containerization (Docker, Kubernetes).
• Understanding of data governance and security best practices.
• Exposure to real-time data processing and streaming architectures (see the Pub/Sub sketch below).
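For the streaming qualification, a minimal streaming-pull subscriber sketch with google-cloud-pubsub; the project and subscription names are hypothetical.

```python
# Pub/Sub streaming-pull subscriber sketch (google-cloud-pubsub).
# Project and subscription names are hypothetical.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    """Handle one message, then ack it so it is not redelivered."""
    print("received:", message.data.decode("utf-8"))
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull.result(timeout=30.0)  # listen for 30 seconds
    except TimeoutError:
        streaming_pull.cancel()  # stop pulling
        streaming_pull.result()  # wait for shutdown to finish
```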