A2C

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer; contract length and pay rate are unknown. Key skills include GCP, Python, SQL, and data modeling. Requires 5+ years of data engineering experience; GCP certification is preferred. Experience in the Energy Sector is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#Agile #IAM (Identity and Access Management) #Data Modeling #Data Quality #Dataflow #Scrum #Spark (Apache Spark) #ML (Machine Learning) #Cloud #Automation #GCP (Google Cloud Platform) #Security #Azure DevOps #Data Engineering #Computer Science #Monitoring #Code Reviews #Data Processing #Python #Data Governance #DevOps #Data Security #ETL (Extract, Transform, Load) #Scala #SQL (Structured Query Language) #Azure #BigQuery #Documentation #Data Pipeline #Logging #Terraform
Role description
Job Summary
The Data Engineer will design, build, and optimize scalable data solutions across cloud (GCP) and on-prem environments. This role focuses on re-platforming data services, enabling real-time streaming, and ensuring data quality, performance, and reliability. You’ll help shape our enterprise data and analytics platform to support modern analytics lifecycles, enabling data monetization, feature engineering, model training, reporting, and predictive insights.

Key Responsibilities
• Design and implement cloud-native data pipelines and architectures using Google Cloud Platform (BigQuery, Pub/Sub, Dataflow, Dataform, Bigtable, Cloud Composer, Cloud Run, IAM, Terraform).
• Develop and optimize ETL processes, data curation, and modeling using Python and SQL.
• Lead data solution design sessions, code reviews, and CI/CD pipeline automation (Azure DevOps, Terraform/Terragrunt).
• Collaborate with architecture and analytics teams to define scalable, secure data foundations and frameworks for both certified and self-service analytics.
• Improve system efficiency, data quality, and SLA management for high-performance data processing and ML model support.
• Ensure strong governance, monitoring, and data security practices (Secret Manager, IAM, Logging, Monitoring).

Qualifications
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 5+ years in data engineering, modeling, and architecture; 3+ years in data analytics solution design.
• Proven expertise in GCP data tools, Python, SQL, and distributed frameworks such as Spark.
• Strong understanding of data modeling, data governance, and cloud data infrastructure.
• Experience with Agile/Scrum delivery, CI/CD, and DevOps best practices.

Preferred
• GCP Professional Data Engineer certification.
• Experience in the Energy Sector.
• Strong documentation and communication skills to translate technical concepts for business stakeholders.