

Yeshnex IT Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract with an undisclosed pay rate. The position requires expertise in GCP, GoLang, Terraform, Airflow, and Spark, along with 7+ years of relevant experience in data engineering. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Virginia, United States
-
🧠 - Skills detailed
#Kubernetes #Scala #GitHub #GCP (Google Cloud Platform) #Big Data #Airflow #API (Application Programming Interface) #Golang #Data Engineering #ETL (Extract, Transform, Load) #Data Pipeline #Terraform #Data Processing #Data Integration #Spark (Apache Spark) #Infrastructure as Code (IaC) #IAM (Identity and Access Management) #Security #Cloud
Role description
Description:
We are seeking a highly skilled GCP Data Engineer with strong expertise in Google Cloud Platform, GoLang, and data engineering architectures. The ideal candidate will design and build scalable data pipelines, implement cloud-native solutions, and support API-driven data integrations in a secure and high-performance environment.
Key Responsibilities:
Design, develop, and maintain data pipelines and ETL workflows on Google Cloud Platform (GCP).
Build and manage solutions using GCP services, GKE, and Big Data technologies.
Develop backend services and integrations using GoLang and APIs.
Implement Infrastructure as Code (IaC) using Terraform or equivalent tools.
Manage and optimize Airflow workflows for scheduling and orchestration.
Work with Spark and large-scale data processing frameworks.
Ensure secure and scalable cloud architecture, including IAM, networking, and third-party integrations.
Develop and maintain CI/CD pipelines using GitHub Actions.
Troubleshoot and optimize data pipelines for performance and reliability.
Collaborate with cross-functional teams and stakeholders to deliver data solutions.
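To give a concrete flavor of the "backend services and integrations using GoLang and APIs" responsibility, here is a minimal sketch of a JSON ingestion endpoint in Go. The `Record` schema, `parseRecord` helper, and `/ingest` route are illustrative assumptions, not part of this role's actual codebase; a real service would forward accepted records to a queue or warehouse sink.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// Record is a hypothetical payload shape for an ingestion endpoint;
// the real schema would come from the team's data contracts.
type Record struct {
	ID    string  `json:"id"`
	Value float64 `json:"value"`
}

// parseRecord decodes one JSON payload and rejects records with no ID.
func parseRecord(body []byte) (Record, error) {
	var rec Record
	if err := json.Unmarshal(body, &rec); err != nil {
		return Record{}, fmt.Errorf("decode: %w", err)
	}
	if rec.ID == "" {
		return Record{}, fmt.Errorf("missing id")
	}
	return rec, nil
}

// ingestHandler validates an incoming record and acknowledges it.
func ingestHandler(w http.ResponseWriter, r *http.Request) {
	body, err := io.ReadAll(r.Body)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	rec, err := parseRecord(body)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(map[string]string{"status": "accepted", "id": rec.ID})
}
```

In a full service, `ingestHandler` would be registered with `http.HandleFunc("/ingest", ingestHandler)` and served behind GKE ingress with IAM-backed authentication, per the cloud-architecture responsibilities above.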
Required Skills & Qualifications:
7+ years of experience in data engineering on Google Cloud Platform (GCP).
Strong hands-on experience with GCP services and architecture.
Proficiency in GoLang development and API integrations.
Experience with Terraform (Infrastructure as Code).
Strong knowledge of GKE (Google Kubernetes Engine).
Experience with Airflow for workflow orchestration.
Hands-on experience with Spark and big data processing.
Strong understanding of cloud networking, IAM, and security best practices.
Experience with CI/CD pipelines (GitHub Actions or similar).
Excellent communication and interpersonal skills.
Key Skills / Keywords:
GCP Data Engineering
Terraform / Infrastructure as Code
GKE (Kubernetes)
GoLang / API Development
Airflow / ETL Pipelines
Spark / Big Data
Cloud Networking & IAM
CI/CD (GitHub Actions)
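The "ETL Pipelines" skill above can be illustrated with a toy transform stage in Go. The `RawEvent`/`CleanEvent` types and the normalization rules are assumptions for the sketch; production transforms would typically run in Spark or an Airflow-orchestrated job rather than in-process.

```go
package main

import (
	"strings"
)

// RawEvent is an illustrative source-side record.
type RawEvent struct {
	UserID string
	Action string
}

// CleanEvent is the normalized sink-side record.
type CleanEvent struct {
	UserID string
	Action string
}

// transform trims and lowercases fields and drops records with no
// user ID, mirroring the shape of a typical ETL transform stage.
func transform(raw []RawEvent) []CleanEvent {
	out := make([]CleanEvent, 0, len(raw))
	for _, e := range raw {
		id := strings.TrimSpace(e.UserID)
		if id == "" {
			continue // skip unusable records
		}
		out = append(out, CleanEvent{
			UserID: id,
			Action: strings.ToLower(strings.TrimSpace(e.Action)),
		})
	}
	return out
}
```

The same extract-transform-load shape scales up naturally: the loop body becomes a Spark map/filter, and the surrounding orchestration becomes an Airflow DAG task.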






