

Tenth Revolution Group
Senior Data Engineer (GCP)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (GCP) on a minimum 6-month contract with likely extension, starting in January. The day rate is $560 (USD). Requires strong GCP, BigQuery, Python, SQL, and Apache Airflow skills. Hybrid work: 2 days per week onsite in Charlotte, 3 days remote.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
December 20, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Charlotte Metro
🧠 - Skills detailed
#Docker #Data Modeling #Prometheus #BigQuery #Spark (Apache Spark) #Apache Airflow #Observability #Apache Beam #Data Architecture #ETL (Extract, Transform, Load) #Grafana #Data Engineering #Kubernetes #Data Pipeline #AWS (Amazon Web Services) #Azure #Databricks #SQL (Structured Query Language) #Kafka (Apache Kafka) #Dataflow #Cloud #GCP (Google Cloud Platform) #Airflow #PySpark #Python #Monitoring #Batch #Terraform
Role description
We are working with a well-established organization seeking an experienced Senior Data Engineer to support the design, build, and optimization of large-scale data platforms. This role is highly hands-on and suited to someone who has built end-to-end data pipelines across both batch and real-time workloads in cloud-native environments.
Hours per week: 40
Start Date: January
Length of Contract: Minimum 6 months with likely extension
Onsite requirements: 2 days per week in Charlotte, 3 days remote
Required / Core Skills
• Strong experience as a Data Engineer in cloud-based environments
• Solid hands-on experience with Google Cloud Platform (GCP)
• Strong experience with BigQuery, including data modeling and performance optimization
• Proficiency in Python for building and maintaining data pipelines
• Experience building streaming and batch data pipelines (for example using Pub/Sub, Kafka, or similar technologies)
• Strong SQL skills for data transformation and analytics use cases
• Experience orchestrating workflows using Apache Airflow (or Cloud Composer); a minimal sketch follows this list
• Experience working with modern data architectures and end-to-end pipelines
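To make the orchestration requirement concrete, below is a minimal, illustrative Airflow DAG of the kind this role describes: a Python extract task followed by an in-BigQuery transform. It assumes Airflow 2.4+ with the Google provider package installed; the DAG ID, project, dataset, and SQL are hypothetical placeholders, not details from this client.

```python
# Illustrative sketch only; names and SQL are placeholders, not client details.
# Assumes Airflow 2.4+ and the apache-airflow-providers-google package.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator


def extract_to_gcs(**context):
    """Placeholder extract step: real logic would land source data in GCS."""
    pass


with DAG(
    dag_id="daily_events_pipeline",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_to_gcs",
        python_callable=extract_to_gcs,
    )

    # Run the transformation inside BigQuery rather than on the worker.
    transform = BigQueryInsertJobOperator(
        task_id="transform_in_bigquery",
        configuration={
            "query": {
                "query": "SELECT event_id, ts FROM `my-project.raw.events`",  # placeholder SQL
                "useLegacySql": False,
            }
        },
    )

    extract >> transform  # batch dependency: extract first, then transform
```

On Cloud Composer the same file would simply be uploaded to the environment's DAGs bucket; only real connection and project settings would change.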
Preferred Skills
• Exposure to Dataflow, Dataproc, or Apache Beam (a short Beam sketch follows this list)
• Experience with Databricks or Spark / PySpark
• Infrastructure-as-Code experience using Terraform
• CI/CD exposure for data engineering workflows
• Containerization experience with Docker and Kubernetes
• Monitoring and observability tools such as Prometheus or Grafana
• Exposure to other cloud platforms (AWS or Azure)
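For the Dataflow / Apache Beam exposure above, here is a minimal Beam pipeline sketch in Python; the bucket paths and event fields are hypothetical. With no flags it runs on the local DirectRunner, and pointing it at Dataflow is a matter of pipeline options (--runner=DataflowRunner plus project, region, and temp_location), not code changes.

```python
# Illustrative sketch only; bucket paths and field names are hypothetical.
# Assumes the apache-beam[gcp] package is installed.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # No flags: local DirectRunner. Pass --runner=DataflowRunner
    # (plus project/region/temp_location) to execute on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            # Input assumed to be newline-delimited JSON, one event per line.
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.Map(json.loads)
            | "Filter" >> beam.Filter(lambda e: e.get("event_type") == "purchase")
            | "Format" >> beam.Map(json.dumps)
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/purchases")
        )


if __name__ == "__main__":
    run()
```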