

Aptino, Inc.
GCP API Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP API Developer with a contract length of "unknown," offering a pay rate of "$XX/hour." Requires 5+ years in Data Engineering, 3+ years in API development, and strong skills in GCP, Python/Java/Scala, and SQL.
Country
United States
Currency
$ USD
Day rate
560
Date
May 7, 2026
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
Nashville, TN
Skills detailed
#GCP (Google Cloud Platform) #API (Application Programming Interface) #Python #Scala #BigQuery #Data Quality #Cloud #Kafka (Apache Kafka) #SQL (Structured Query Language) #Data Engineering #Automation #Data Pipeline #FastAPI #Java #ML (Machine Learning) #Data Architecture #Spark (Apache Spark) #AI (Artificial Intelligence)
Role description
We're looking for a Senior Staff Data Engineer to design, build, and scale API-driven data solutions on GCP. You'll work with cross-functional teams to develop robust data pipelines, enable self-service analytics, and drive best practices in API development, CI/CD, and data architecture.
Key Responsibilities:
• Design and build scalable data pipelines and APIs
• Develop cloud-based solutions using GCP (BigQuery, Cloud Run, GKE, etc.)
• Implement automation, CI/CD, and data quality best practices
• Collaborate with data teams to deliver insights and AI/ML solutions
• Mentor engineers and promote engineering excellence
Requirements:
• 5+ years in Data Engineering, 3+ years in API development
• Strong experience with GCP, Python/Java/Scala, SQL
• Hands-on with APIs (FastAPI, Apigee), Kafka/Spark, CI/CD
• Experience with modern data architecture and cloud best practices
