

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Alpharetta, GA, on a contract basis for 8 hours per week, requiring 6-8 years of data engineering experience and expertise in GCP services, SQL, Python, and Terraform. Google Cloud certifications and domain experience in energy or utilities are a plus.
Country: United States
Currency: Unknown
Day rate: -
Date discovered: June 18, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Alpharetta, GA 30004
Skills detailed: #Data Quality #Scala #Statistics #Cloud #Automation #Computer Science #Docker #Data Integrity #BigQuery #AI (Artificial Intelligence) #GCP (Google Cloud Platform) #ML (Machine Learning) #Clustering #Apache Airflow #Storage #Kubernetes #Spark (Apache Spark) #Airflow #Terraform #SQL (Structured Query Language) #Data Engineering #Jenkins #dbt (data build tool) #Data Science #ETL (Extract, Transform, Load) #Data Pipeline #Schema Design #Snowflake #Security #Data Architecture #Apache Spark #Data Modeling #Batch #Kafka (Apache Kafka) #Python #Dataflow #Databricks #Infrastructure as Code (IaC)
Role description
Job Title: Data Engineer
Location: Alpharetta, GA
Core Responsibilities:
Data Pipeline Development: Build scalable batch and real-time pipelines using Dataflow, Pub/Sub, Cloud Composer, and Dataproc (a minimal pipeline sketch follows this list).
Data Warehousing: Design and optimize analytical models in BigQuery, implementing best practices in schema design and performance tuning.
Infrastructure & CI/CD: Deploy data infrastructure with Terraform; create and manage CI/CD pipelines for workflow automation.
Data Quality & Governance: Implement validation checks, ensure data integrity, and enforce security and governance practices.
AI/ML Integration: Collaborate with data scientists to support Vertex AI workflows and explore the use of Gemini models (via BigQuery ML or Vertex AI APIs) for advanced data transformation.
Cross-Team Collaboration: Work with analysts, scientists, and business stakeholders to deliver impactful data solutions.
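As a rough, non-authoritative illustration of the pipeline work described above, the sketch below shows a minimal streaming Dataflow job in Python with Apache Beam that reads JSON events from a Pub/Sub subscription and appends them to BigQuery. The project, subscription, table, and schema names are invented placeholders, not details from this role.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder pipeline options; a real job would take these from configuration.
    options = PipelineOptions(
        streaming=True,
        project="example-project",
        region="us-east1",
        runner="DataflowRunner",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw bytes from a placeholder Pub/Sub subscription.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            # Decode each message into a dict whose keys match the table schema.
            | "DecodeJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append rows to a placeholder BigQuery table, creating it if needed.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

For local testing, the same code runs unchanged by swapping the runner option to DirectRunner.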
Technical Skills:
Strong experience with GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage
Expertise in SQL, Python, and ETL/ELT development
Knowledge of Infrastructure as Code tools (e.g., Terraform)
Familiarity with CI/CD tools: Google Cloud Build, Jenkins
Understanding of data modeling, partitioning, clustering, and materialized views (a table-definition sketch follows this list)
Working knowledge of data quality frameworks and governance principles
Experience with Vertex AI, or a strong interest in ML/AI workflows on GCP
Apache Spark, Kafka, Apache Airflow
dbt or Dataform for transformations
Docker and Kubernetes for containerization
Snowflake, Databricks, or other modern platforms
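As a hedged sketch of the partitioning and clustering concepts listed above (not code from this posting), the snippet below defines a day-partitioned, clustered BigQuery table with the google-cloud-bigquery Python client. The project, dataset, and field names are invented for illustration.

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

table = bigquery.Table(
    "example-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)

# Partition by day on the event timestamp so queries can prune old partitions,
# and cluster on customer_id so frequently filtered rows are co-located.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")

A materialized view over such a table would be created separately (for example with a CREATE MATERIALIZED VIEW statement) and is omitted here.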
Soft Skills:
Strong communication and collaboration skills
Ability to manage priorities and work independently
Analytical thinking and problem-solving mindset
Preferred Skills (Bonus):
Google Cloud certifications (e.g., Professional Data Engineer, ML Engineer)
Domain experience in energy, utilities, or industrial sectors
Experience Requirements:
Education: Bachelor's degree in Computer Science, Engineering, Statistics, or a related technical field (or equivalent experience)
Professional Experience:
6-8 years of experience in data engineering
Proven hands-on experience with GCP and modern data architecture
Job Type: Contract
Expected hours: 8 per week
Schedule:
8-hour shift
Experience:
GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage): 7 years (Required)
SQL, Python, and ETL/ELT development: 7 years (Required)
Infrastructure as Code tools (e.g., Terraform): 8 years (Required)
Location:
Alpharetta, GA 30004 (Required)
Work Location: In person