

ISITE TECHNOLOGIES
GCP Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Danbury, CT, with a contract of unspecified duration. The position requires 10 years of experience, a mandatory Google Cloud Platform certification, and expertise in data pipelines, APIs, SQL, Python, and Java.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Danbury, CT
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #POSTMAN #Docker #Documentation #Data Pipeline #Spring Boot #Compliance #SQL (Structured Query Language) #Apache Spark #Flask #Distributed Computing #GDPR (General Data Protection Regulation) #Cloud #Microservices #Security #Airflow #Dataflow #Data Modeling #Deployment #Apache Beam #Batch #Scala #Data Engineering #Java #Swagger #API (Application Programming Interface) #Clustering #BigQuery #Spark (Apache Spark) #Data Processing #REST (Representational State Transfer) #Storage #Data Ingestion #GCP (Google Cloud Platform) #IAM (Identity and Access Management) #Automation #Python #GraphQL #FastAPI
Role description
Job Role: Google Cloud Platform Data Engineer
Job Location: Danbury, CT
Experience: 10 years
NOTE: Google Cloud Platform certification is mandatory
Job Description:
1. Designing or building REST/GraphQL APIs
2. Working with API frameworks (FastAPI, Flask, Express, Spring Boot)
3. API Gateway configuration, rate limiting, or traffic management
4. Authentication/authorization implementation (OAuth 2.0, JWT)
5. API documentation (Swagger/OpenAPI)
6. Container-based API deployment (Cloud Run, GKE, Docker)
7. API testing (Postman, Newman, contract testing)
8. Microservices design patterns
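The authentication/authorization item above (OAuth 2.0, JWT) can be sketched with a stdlib-only HS256 token signer and verifier. This is an illustrative sketch, not the posting's required stack: the function names `sign_jwt`/`verify_jwt` are hypothetical, and in practice a vetted library such as PyJWT would handle claims validation and expiry.

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # JWTs use URL-safe base64 with the trailing '=' padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def sign_jwt(payload: dict, secret: str) -> str:
    """Build a signed HS256 JWT (header.payload.signature) from a claims dict."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"


def verify_jwt(token: str, secret: str) -> bool:
    """Recompute the HMAC over header.payload and compare in constant time."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return False  # malformed token: wrong number of segments
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)
```

`hmac.compare_digest` is used instead of `==` so signature comparison does not leak timing information.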
• Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery.
• Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.
• Optimize BigQuery performance through partitioning, clustering, and query tuning.
• Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.
• Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.
• Strong expertise in Google Cloud Platform services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).
• Proficiency in SQL, Python, and Java for data processing and automation.
• Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.
• Strong understanding of data modeling, warehousing, and distributed computing.
• Experience with real-time and batch processing architectures.
• Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).
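The BigQuery optimization bullet above (partitioning, clustering, query tuning) typically translates into table DDL. A minimal sketch, assuming a hypothetical `analytics.events` table and columns: the helper below only assembles the statement as a string rather than calling the BigQuery client, so it runs without GCP credentials.

```python
def build_partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list[str]) -> str:
    """Assemble a BigQuery CREATE TABLE statement that partitions by a DATE
    column (so queries filtering on it scan only matching partitions) and
    clusters by common filter/join keys (hypothetical schema for illustration)."""
    clustering = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{table}` (\n"
        "  event_date DATE,\n"
        "  user_id STRING,\n"
        "  payload JSON\n"
        ")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clustering}"
    )


ddl = build_partitioned_table_ddl(
    "analytics.events",   # hypothetical dataset.table
    "event_date",         # partition pruning happens on this column
    ["user_id"],          # cluster on high-cardinality filter keys
)
print(ddl)
```

Queries that filter on `event_date` then bill only the scanned partitions, which is the main cost/performance lever this bullet refers to.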
Strong API Skills:
• Strong Core Java & Spring Boot
• APIGEE & Security patterns
• Swagger/OpenAPI design
• Microservice Architecture and Patterns
• Google Cloud Platform certifications (e.g., Professional Data Engineer, Associate Cloud Engineer).