Aptino, Inc.

GCP API Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP API Developer with a contract length of "unknown," offering a pay rate of "$XX/hour." Requires 5+ years in Data Engineering, 3+ years in API development, and strong skills in GCP, Python/Java/Scala, and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
πŸ—“οΈ - Date
May 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Nashville, TN
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #API (Application Programming Interface) #Python #Scala #BigQuery #Data Quality #Cloud #Kafka (Apache Kafka) #SQL (Structured Query Language) #Data Engineering #Automation #Data Pipeline #FastAPI #Java #ML (Machine Learning) #Data Architecture #Spark (Apache Spark) #AI (Artificial Intelligence)
Role description
We're looking for a Senior Staff Data Engineer to design, build, and scale API-driven data solutions on GCP. You'll work with cross-functional teams to develop robust data pipelines, enable self-service analytics, and drive best practices in API development, CI/CD, and data architecture.
Key Responsibilities:
• Design and build scalable data pipelines and APIs
• Develop cloud-based solutions using GCP (BigQuery, Cloud Run, GKE, etc.)
• Implement automation, CI/CD, and data quality best practices
• Collaborate with data teams to deliver insights and AI/ML solutions
• Mentor engineers and promote engineering excellence
Requirements:
• 5+ years in Data Engineering; 3+ years in API development
• Strong experience with GCP, Python/Java/Scala, and SQL
• Hands-on experience with APIs (FastAPI, Apigee), Kafka/Spark, and CI/CD
• Experience with modern data architecture and cloud best practices