

Acumenz Consulting
GCP/API Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP/API Data Engineer in Danbury, CT, on a contract basis. Key skills include GCP services, SQL, Python, Java, and strong API expertise. Preferred qualifications include GCP certifications and experience with streaming technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 5, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Danbury, CT
-
🧠 - Skills detailed
#Compliance #Automation #Cloud #Data Engineering #Docker #API (Application Programming Interface) #Apache Spark #Data Pipeline #GDPR (General Data Protection Regulation) #IAM (Identity and Access Management) #Swagger #Kafka (Apache Kafka) #Apache Beam #Security #Dataflow #Python #Spark (Apache Spark) #Clustering #Scala #Airflow #BigQuery #Data Modeling #Distributed Computing #Batch #Data Ingestion #Spring Boot #GCP (Google Cloud Platform) #Data Processing #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Java #Storage #Kubernetes
Role description
Position: GCP/API - Data Engineer
Location: Danbury, CT (Onsite)
Contract
Skills:
• Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery.
• Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.
• Optimize BigQuery performance through partitioning, clustering, and query tuning.
• Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.
• Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.
• Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).
• Proficiency in SQL, Python, and Java for data processing and automation.
• Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.
• Strong understanding of data modeling, warehousing, and distributed computing.
• Experience with real-time and batch processing architectures.
• Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).
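The BigQuery optimization bullet above maps directly to DDL with `PARTITION BY` and `CLUSTER BY` clauses. The sketch below builds such a statement in Python (one of the languages the role calls for); the dataset, table, and column names are hypothetical, for illustration only:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with date partitioning
    and clustering. All identifiers here are illustrative placeholders."""
    cluster_defs = ", ".join(f"{c} STRING" for c in cluster_cols)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  event_id STRING,\n"
        f"  {partition_col} DATE,\n"
        f"  {cluster_defs}\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"        # prunes scanned partitions
        f"CLUSTER BY {', '.join(cluster_cols)}"  # co-locates filtered columns
    )

ddl = partitioned_table_ddl("analytics.events", "event_date", ["country", "device"])
print(ddl)
```

Partitioning on the date column limits how much data a filtered query scans (and therefore its cost), while clustering on frequently filtered columns speeds up the remaining scan.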
Strong API Skills:
• Strong Core Java & Spring Boot
• Apigee & API security patterns
• Swagger/OpenAPI design
• Microservice architecture and patterns
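To make the Swagger/OpenAPI design bullet concrete, here is a minimal OpenAPI 3.0 skeleton, expressed as a Python dict so the example stays self-contained. The endpoint, parameter, and API-key scheme are hypothetical; in an Apigee deployment the key check would typically live in a proxy policy rather than in application code:

```python
import json

# Minimal OpenAPI 3.0 document for a hypothetical orders endpoint.
# Path, parameter, and security-scheme names are illustrative only.
spec = {
    "openapi": "3.0.3",
    "info": {"title": "Orders API", "version": "1.0.0"},
    "paths": {
        "/orders/{orderId}": {
            "get": {
                "summary": "Fetch one order",
                "parameters": [{
                    "name": "orderId", "in": "path",
                    "required": True, "schema": {"type": "string"},
                }],
                "responses": {"200": {"description": "The requested order"}},
                "security": [{"apiKey": []}],  # references the scheme below
            }
        }
    },
    "components": {
        "securitySchemes": {
            "apiKey": {"type": "apiKey", "in": "header", "name": "x-api-key"}
        }
    },
}

print(json.dumps(spec, indent=2))
```

Designing the contract first this way (spec before code) is what the Swagger bullet usually implies: the same document drives generated server stubs, client SDKs, and gateway configuration.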
Preferred Qualifications:
• GCP certifications (e.g., Professional Data Engineer, Associate Cloud Engineer).
• Exposure to Kafka, NiFi, or other streaming technologies.
• Experience with containerization and orchestration (Docker, Kubernetes, GKE)
