

Remote Sr. GCP DevOps/Data Engineer
Featured Role | Apply direct with Data Freelance Hub
Country: United States
Currency: $ USD
Day rate: 520
Date discovered: September 9, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #SQL (Structured Query Language) #Azure #Dataflow #AWS (Amazon Web Services) #BigQuery #Python #Databases #GCP (Google Cloud Platform) #Cloud #GitLab #Deployment #Terraform #Infrastructure as Code (IaC) #Airflow #ETL (Extract, Transform, Load) #Data Access #Jenkins #Kubernetes #Migration #Public Cloud #Security #Data Engineering #GitHub #AI (Artificial Intelligence) #DevOps
Role description
JOB DESCRIPTION
• Architect, build, and manage a secure GCP infrastructure platform to reduce infrastructure provisioning time and improve operational consistency
• Collaborate with fellow DevOps and Data Engineers on an AI-driven initiative to automate healthcare data access and flows
• Manage cloud resources via Terraform and sync with developers, QA, and operations teams on deployments and issues
• Tune queries, manage partitions, and reduce compute costs within the GCP environment (see the sketch after this list)
• Orchestrate continuous migration of multiple databases to GCP
• Lead security audit and remediation efforts
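For context on the query-tuning and partition-management responsibility above, here is a minimal sketch of how BigQuery scan costs are typically compared via dry runs; the project, dataset, table, and column names are hypothetical, not taken from the posting:

```python
# Minimal sketch (hypothetical names throughout): estimate BigQuery scan cost
# with dry runs, comparing a full table scan against a partition-pruned query.
from google.cloud import bigquery

client = bigquery.Client(project="my-healthcare-project")  # hypothetical project ID

def estimate_bytes(sql: str) -> int:
    """Dry-run a query and return the bytes it would scan (no cost incurred)."""
    cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=cfg)
    return job.total_bytes_processed

# Without a partition filter, BigQuery scans the whole table.
full_scan = estimate_bytes("SELECT * FROM `my-healthcare-project.claims.events`")

# Filtering on the partitioning column (assumed here to be event_date) lets the
# engine prune partitions, which is the main lever for reducing compute cost.
pruned = estimate_bytes("""
    SELECT patient_id, event_type
    FROM `my-healthcare-project.claims.events`
    WHERE event_date >= '2025-08-01'
""")

print(f"full scan: {full_scan} bytes, pruned: {pruned} bytes")
```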
REQUIRED SKILLS AND EXPERIENCE
• 6+ years of experience as a DevOps Engineer, specific to data in public cloud environments
• 3+ years of experience in GCP, with an understanding of the platform's tools, including BigQuery, DataProc, DataFlow, Cloud Composer, Airflow, and Google Kubernetes Engine (GKE) (a minimal Composer DAG sketch follows this list)
• Strong data engineering experience leveraging SQL and Python tech stacks
• Experience with Terraform for IaC provisioning of cloud resources and building CI/CD pipelines via Jenkins/GitLab/GitHub Actions
• Experience interfacing with a Gen AI team/project
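To illustrate the Cloud Composer/Airflow orchestration this role calls for, a minimal Airflow 2.x DAG sketch for a batch database-migration step; the DAG ID, task callables, and schedule are assumptions for illustration, not from the posting:

```python
# A minimal sketch of a Cloud Composer (Airflow 2.x) DAG; all names are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_source_batch(**context):
    # Placeholder: pull a batch from the legacy database being migrated.
    print("extracting batch for", context["ds"])

def load_to_bigquery(**context):
    # Placeholder: load the staged batch into a partitioned BigQuery table.
    print("loading batch for", context["ds"])

with DAG(
    dag_id="db_migration_batch",  # hypothetical DAG name
    schedule="@daily",            # Airflow 2.4+ keyword; older versions use schedule_interval
    start_date=datetime(2025, 9, 1),
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_batch)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)
    extract >> load
```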
NICE TO HAVE SKILLS AND EXPERIENCE
• ETL/ELT pipeline creation experience
• Experience with GCP/AWS/Azure AI tooling
• Hands-on RAG and automated pipeline engineering experience (a toy retrieval sketch follows this list)
• GCP certifications
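As a toy illustration of the retrieval step in a RAG pipeline (the nice-to-have above), a self-contained sketch in which a fake character-frequency "embedding" stands in for a real embedding model; every name and document here is hypothetical:

```python
# Toy RAG retrieval sketch: rank documents by cosine similarity to a query.
# The embed() function is a stand-in; a real pipeline would call an embedding
# model (e.g. one hosted on Vertex AI) instead.
import math

def embed(text: str) -> list[float]:
    # Fake embedding: 26-dim character-frequency vector.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

docs = [
    "Terraform module for provisioning BigQuery datasets",
    "Runbook for rotating GKE service-account keys",
    "Guide to partitioning strategies for claims tables",
]
index = [(doc, embed(doc)) for doc in docs]

query = "how do we partition BigQuery tables?"
q = embed(query)
best = max(index, key=lambda pair: cosine(q, pair[1]))
print("retrieved context:", best[0])  # would be prepended to the LLM prompt
```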
Pay Rate: $60-65/hr
• This is a 12-month contract role