A2C

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown" and a pay rate of "$XX/hour". It requires 5+ years in data engineering, expertise in GCP tools, and advanced Python and SQL skills; industry experience in energy or utilities is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 1, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Alpharetta, GA
🧠 - Skills detailed
#RDBMS (Relational Database Management System) #Databases #SQL (Structured Query Language) #Data Quality #NumPy #Data Modeling #Computer Science #Statistics #Python #Dataflow #Scala #Security #Cloud #Automation #Compliance #Azure #Data Engineering #DevOps #ML (Machine Learning) #ETL (Extract, Transform, Load) #Data Processing #Data Pipeline #BigQuery #Deployment #Data Architecture #Pandas #Terraform #GCP (Google Cloud Platform) #AI (Artificial Intelligence) #TensorFlow #Azure DevOps #NoSQL
Role description
Role Overview
We are seeking a Senior Data Engineer to design and deliver scalable data solutions with a strong focus on Google Cloud Platform (GCP). You will lead key initiatives such as re-platforming data services to the cloud, building real-time streaming pipelines, and optimizing large-scale data processing systems. This role is accountable for the quality, performance, and usability of enterprise data solutions.

Key Responsibilities
• Design and build data pipelines and foundations using GCP tools (BigQuery, Dataflow, Pub/Sub, Datastream, GCS); see the streaming-pipeline sketch after this description.
• Develop scalable ETL/ELT pipelines with Python and SQL across RDBMS and NoSQL databases (see the ELT sketch below).
• Drive data modeling, profiling, and curation for analytics and AI/ML workloads.
• Partner with business and technical teams to gather requirements and translate them into data architecture.
• Support CI/CD pipelines with Terraform/Terragrunt and Azure DevOps.
• Collaborate with architects to define optimal cloud data architectures.
• Build and support MLOps pipelines for AI/ML deployment, testing, and performance tuning.
• Ensure data quality, governance, and compliance standards are met.

Required Skills & Experience
• 5+ years in data engineering, including 4+ years in data modeling and architecture.
• Deep expertise in GCP data engineering tools (BigQuery, Pub/Sub, Dataflow, Datastream, GCS).
• Advanced Python and SQL development skills.
• Strong knowledge of data warehousing, governance, and security practices.
• Experience with CI/CD, infrastructure-as-code, and cloud deployment automation.
• Solid understanding of statistics and analytics to support AI/ML initiatives.

Preferred Qualifications
• Master's degree in Computer Science, Engineering, or a related field.
• Industry experience in energy or utilities.
• Familiarity with machine learning frameworks and libraries (TensorFlow, scikit-learn, pandas, NumPy; see the ML sketch below).
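To make the streaming-pipeline responsibility concrete, here is a minimal sketch of a Dataflow-style Apache Beam job in Python that reads from Pub/Sub and appends to BigQuery. Every resource name (project, subscription, dataset, table) and the message schema are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch: streaming Pub/Sub -> BigQuery pipeline with Apache Beam
# (the SDK behind Dataflow). All project/subscription/table/field names
# below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_message(payload: bytes) -> dict:
    """Decode a JSON Pub/Sub payload into a row matching the BigQuery schema."""
    record = json.loads(payload.decode("utf-8"))
    return {
        "meter_id": record["meter_id"],
        "reading_kwh": float(record["reading_kwh"]),
        "event_ts": record["event_ts"],
    }


def run() -> None:
    # Runner, project, and region flags would normally come from the CLI;
    # streaming=True is required for an unbounded Pub/Sub source.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/meter-readings")
            | "ParseJson" >> beam.Map(parse_message)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:energy.meter_readings",
                schema="meter_id:STRING,reading_kwh:FLOAT,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )


if __name__ == "__main__":
    run()
```

The same code runs locally on the DirectRunner for testing and on Dataflow when launched with `--runner=DataflowRunner` plus project, region, and staging-location flags.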
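The ETL/ELT bullet pairs Python with SQL; a common GCP pattern is to let BigQuery do the heavy lifting and drive it from Python. Below is a minimal ELT sketch using the google-cloud-bigquery client, again with hypothetical project, dataset, and table names.

```python
# Minimal sketch: an ELT step that curates raw data inside BigQuery itself,
# orchestrated from Python. Project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# Aggregate raw meter readings into a curated daily table.
sql = """
CREATE OR REPLACE TABLE `my-project.energy_curated.meter_readings_daily` AS
SELECT
  meter_id,
  DATE(event_ts) AS reading_date,
  SUM(reading_kwh) AS total_kwh
FROM `my-project.energy.meter_readings`
GROUP BY meter_id, reading_date
"""

job = client.query(sql)  # starts an asynchronous query job
job.result()             # block until the job finishes
print(f"Job {job.job_id} processed {job.total_bytes_processed} bytes")
```

Keeping the transformation in SQL and the orchestration in Python is one straightforward way to satisfy both skill requirements at once.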
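For the AI/ML-facing items, the listing names TensorFlow, scikit-learn, pandas, and NumPy. Here is a minimal sketch of handing curated data to a scikit-learn model; the DataFrame is synthetic and every column name is a hypothetical placeholder (in practice it might come from `client.query(...).to_dataframe()`).

```python
# Minimal sketch: feeding curated data into a scikit-learn model, the kind of
# handoff the MLOps responsibilities imply. Data and columns are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Stand-in for a curated BigQuery extract.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "avg_temp_c": rng.normal(15, 8, 500),
    "total_kwh": rng.normal(30, 5, 500),
})
df["total_kwh"] += 0.8 * df["avg_temp_c"]  # inject a synthetic relationship

X_train, X_test, y_train, y_test = train_test_split(
    df[["avg_temp_c"]], df["total_kwh"], test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")
```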