

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a 6-month contract-to-hire engagement, offering a hybrid work environment (3 days onsite). It requires GCP Professional Data Engineer certification, 5+ years in data engineering, and expertise in Python, SQL, and cloud technologies.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 10, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed
#Monitoring #SQL (Structured Query Language) #Snowflake #Storage #Apache Beam #Data Pipeline #Data Migration #Scala #AWS (Amazon Web Services) #Python #Terraform #Cloud #Migration #GCP (Google Cloud Platform) #Airflow #Data Quality #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #BigQuery #Datasets #Data Engineering #Dataflow
Role description
Local candidates only, as an onsite interview is required.
Senior Data Engineer
Location: Hybrid (3 days a week onsite)
Duration: 6-month contract to hire
We're looking for a Senior Data Engineer to lead the development of scalable, cloud-native data pipelines. You'll be responsible for building efficient ETL/ELT workflows, guiding cloud migrations, and delivering clean, high-performance data systems to support business goals.
What You'll Do
• Design and maintain pipelines using Dataflow (Apache Beam), Pub/Sub, and BigQuery (a minimal pipeline sketch follows this list)
• Orchestrate workflows with Cloud Composer (Airflow)
• Lead data migrations from Snowflake/AWS to GCP
• Enforce data quality, monitoring, and performance optimization
• Collaborate with cross-functional teams to align data solutions with business needs
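To make the Dataflow/Beam responsibility above concrete, here is a minimal sketch, assuming a streaming job that reads JSON events from Pub/Sub and appends them to an existing BigQuery table; the project, topic, and table names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch only: a streaming Apache Beam pipeline of the kind described above.
# All project, topic, and table names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/example-project/topics/events")
        | "ParseJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",  # assumes the destination table already exists
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same code runs locally with the DirectRunner for testing and on GCP with the DataflowRunner, selected through the pipeline options.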
What You Bring
• GCP Certification (Professional Data Engineer required)
• 5+ years in data engineering with strong Python and SQL skills
• Deep experience with GCP: BigQuery, Dataflow, Pub/Sub, and Cloud Storage
• Background in Airflow, CI/CD pipelines, and Infrastructure as Code (Terraform); a minimal DAG sketch follows this list
• Proven ability to manage large datasets and drive data performance
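As an illustration of the Airflow background listed above, here is a minimal sketch of a DAG that Cloud Composer could schedule, assuming a simple daily BigQuery rollup; the DAG id, schedule, and query are hypothetical and not taken from this posting.

```python
# Minimal sketch only: an Airflow DAG of the kind Cloud Composer would schedule.
# The DAG id, schedule, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT event_date, COUNT(*) AS event_count "
                    "FROM `example-project.analytics.events` "
                    "GROUP BY event_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```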
This hybrid role requires onsite presence 3 days a week. If you're certified and excited to shape cloud data infrastructure at scale, apply now to join a collaborative and forward-thinking team.
If you are interested or have any referrals, please share your resume at mukul@brightmindsol.com.