TMT IT Solutions

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience, focused on GCP, ETL/ELT workflows, and data pipeline development. Contract length is unspecified, the day rate is $480, and work is on-site in Plano. Key skills include Python, SQL, and Apache Spark.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
February 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas-Fort Worth Metroplex
-
🧠 - Skills detailed
#Airflow #ETL (Extract, Transform, Load) #Kafka (Apache Kafka) #Storage #GCP (Google Cloud Platform) #Deployment #Dataflow #Automation #Apache Spark #Infrastructure as Code (IaC) #Python #Cloud #Datasets #Hadoop #Data Lake #BigQuery #Scala #Data Engineering #GitHub #Data Pipeline #DevOps #SQL (Structured Query Language) #Spark (Apache Spark) #BI (Business Intelligence) #Terraform #Data Processing #Jenkins #Big Data
Role description
Role Overview

We are seeking a skilled Data Engineer with a minimum of 6 years of experience to join our team in Plano. You will be responsible for designing, building, and maintaining the scalable data infrastructure that powers our analytics and business intelligence. The ideal candidate is a GCP enthusiast who thrives on building robust ETL/ELT workflows and managing infrastructure through code.

Key Responsibilities
• Pipeline Development: Design and deploy high-performance, scalable data pipelines to process large-scale datasets.
• GCP Management: Architect solutions leveraging the full Google Cloud suite, specifically BigQuery for warehousing and Cloud Storage for data lakes.
• Orchestration & Transformation: Build and manage complex workflows using Cloud Composer (Airflow) and develop efficient ETL/ELT processes using Python and SQL.
• Big Data Processing: Utilize Apache Spark and Hadoop (via Dataproc) to handle distributed data processing tasks.
• Automation (IaC): Provision and manage cloud resources using Terraform or Cloud Deployment Manager to ensure environment consistency.
• DevOps Integration: Maintain and improve CI/CD pipelines using Cloud Build, GitHub Actions, or Jenkins to streamline deployments.
• Real-time Streaming: Implement messaging and stream-processing systems using Pub/Sub, Dataflow, or Kafka.
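To illustrate the "ETL/ELT processes using Python and SQL" responsibility, here is a minimal extract-transform-load sketch using only the standard library. All record and table names (`raw_events`, `orders`) are hypothetical; a production pipeline on this stack would typically extract from Cloud Storage and load into BigQuery rather than an in-memory SQLite database.

```python
import sqlite3

# Extract: raw event records (a stand-in for files landed in a data lake)
raw_events = [
    {"user_id": "u1", "amount": "19.99", "status": "complete"},
    {"user_id": "u2", "amount": "5.00",  "status": "failed"},
    {"user_id": "u1", "amount": "42.50", "status": "complete"},
]

# Transform: cast string amounts to floats and drop failed events
rows = [
    (e["user_id"], float(e["amount"]))
    for e in raw_events
    if e["status"] == "complete"
]

# Load: write the cleaned rows into a SQL table, then aggregate per user
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
totals = dict(
    conn.execute("SELECT user_id, SUM(amount) FROM orders GROUP BY user_id")
)
print(totals)  # only 'complete' events are aggregated
```

The same extract/transform/load separation carries over directly when the steps become Airflow tasks in Cloud Composer, with each stage reading from and writing to durable storage instead of local variables.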