

Curate Partners
Entry Level Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is an Entry Level Data Engineer position for 6 months, with an unspecified pay rate. Remote work is available. Key skills include Python, SQL, Apache Airflow, and GCP experience. A background in Data Engineering or Data Platforms is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 28, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Quality #GCP (Google Cloud Platform) #Python #ML (Machine Learning) #Spark (Apache Spark) #Scripting #Data Modeling #Data Processing #Apache Airflow #Cloud #Automation #ETL (Extract, Transform, Load) #DevOps #Data Transformations #Data Engineering #AI (Artificial Intelligence) #Data Science #Scala #Airflow #Data Pipeline #BigQuery #SQL (Structured Query Language)
Role description
Overview
We are seeking an entry- to mid-level Data Engineer to join our engineering team and help design and build scalable data pipelines and automation frameworks that support analytics, quality engineering, and platform reliability initiatives.
This is a hands-on engineering role, not a traditional QA/testing position. You will focus on building data workflows, orchestration pipelines, and automation systems using modern cloud and data engineering tools within GCP.
This is a great opportunity for early-career engineers who enjoy building, automating, and solving real-world data problems while working closely with platform and engineering teams.
What You’ll Do
• Design, build, and maintain data pipelines and ETL/ELT workflows
• Develop and schedule Airflow DAGs for automated processing (a minimal sketch follows this list)
• Write scalable data transformations using Python, SQL, and Spark
• Work within GCP services to deploy and manage data workflows
• Build automation frameworks to support data validation and system reliability
• Partner with engineering teams to improve data quality and operational efficiency
• Contribute to POCs and internal tools, including AI/ML or data-driven initiatives when applicable
• Support occasional cross-team efforts when high-priority delivery requires collaboration
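To give a sense of the day-to-day orchestration work described above, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ (the version typically available on a managed scheduler such as Cloud Composer). The DAG id, task names, and sample data are hypothetical and purely illustrative; a real pipeline on this team would load into an actual sink such as BigQuery rather than printing.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step: in practice this might pull rows from an
    # API or a GCS landing bucket. The return value is pushed to XCom.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform(**context):
    # Pull the extract output from XCom and apply a simple transformation.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value_doubled": row["value"] * 2} for row in rows]


def load(**context):
    # Placeholder load step: a production DAG would write to a warehouse
    # table (e.g. via a BigQuery provider operator or client library).
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",  # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # run once per day (Airflow 2.4+ parameter name)
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Dependency chain: extract -> transform -> load
    extract_task >> transform_task >> load_task
```

The same pattern extends to the heavier transformations mentioned above: the Python callables can be replaced with Spark or BigQuery tasks while the DAG keeps handling scheduling, retries, and dependencies.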
Required Skills
• 1–3 years of hands-on experience in Data Engineering or Data Platforms
• Strong Python (data processing, scripting, automation)
• Strong SQL
• Experience building ETL/ELT pipelines
• Experience with Apache Airflow (DAG creation, scheduling, orchestration)
• Familiarity with Google Cloud Platform (GCP) services
• Basic experience with Spark or distributed data processing
• Understanding of data modeling and pipeline best practices
Nice to Have
• Experience with cloud-native workflows in GCP (Composer, BigQuery, GCS, etc.)
• Exposure to ML/data science workflows
• Experience building internal automation or reliability tooling
• Academic or project experience building pipelines end-to-end
• CI/CD or DevOps fundamentals






