

Python Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Data Engineer, remote for 12 months, offering a competitive pay rate. Key skills include strong Python and SQL proficiency, plus experience with data pipelines, REST APIs, cloud data platforms such as Snowflake, and orchestration with Airflow.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 13, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Airflow #Data Engineering #Scala #Data Pipeline #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #BigQuery #Cloud #REST API #Data Warehouse #Databricks #REST (Representational State Transfer) #Python #Data Lake #Redshift #Data Quality #S3 (Amazon Simple Storage Service) #Snowflake
Role description
Position: Python Data Engineer
Location: Remote
Duration: 12 Months
Job Description:
The Enterprise Data Platform team is seeking a Senior Data Engineer with proven experience in designing and building data pipelines for cloud data warehouses and data lakes.
Responsibilities:
• Collaborate with cross-functional stakeholders to gather and understand data requirements.
• Design and develop robust data pipelines in Python to ingest, transform, and load data into Snowflake and data lake environments (a minimal sketch follows this list).
• Ensure data quality and integrity throughout the pipeline lifecycle.
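To make the pipeline responsibilities concrete, here is a minimal extract-transform-load sketch, assuming the requests and snowflake-connector-python packages; the API URL, credentials, and table name are hypothetical placeholders, not details from this role.

```python
# Minimal ETL sketch: pull JSON records from a REST API and load them into Snowflake.
# The endpoint, credentials, and table below are hypothetical placeholders.
import requests
import snowflake.connector

def extract(url: str) -> list[dict]:
    # Fetch one page of records from the source API.
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]

def transform(records: list[dict]) -> list[tuple]:
    # Keep only the fields the target table expects.
    return [(r["id"], r["name"], r["updated_at"]) for r in records]

def load(rows: list[tuple]) -> None:
    # Insert rows into Snowflake via the official Python connector.
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical
        user="etl_user",        # hypothetical
        password="***",         # use a secrets manager in practice
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    cur = conn.cursor()
    try:
        cur.executemany(
            "INSERT INTO customers (id, name, updated_at) VALUES (%s, %s, %s)",
            rows,
        )
    finally:
        cur.close()
        conn.close()

if __name__ == "__main__":
    load(transform(extract("https://api.example.com/v1/customers")))
```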
Required:
• Strong proficiency in Python and SQL.
• Demonstrated experience building scalable data pipelines in Python.
• Hands-on experience extracting data from REST APIs and ingesting it into cloud data warehouses or data lakes.
• Experience working with Amazon S3 and building, scheduling, and maintaining DAGs in Airflow (see the DAG sketch after this list).
• Practical experience programmatically interacting with a major cloud data warehouse or data lake platform (e.g., Snowflake, Redshift, BigQuery, Databricks).
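To ground the Airflow and S3 requirements, below is a minimal daily DAG sketch, assuming a recent Airflow 2.x (TaskFlow API) and boto3; the endpoint, bucket, and key names are hypothetical.

```python
# Minimal daily DAG sketch (Airflow 2.x TaskFlow API): REST API -> S3 landing zone.
# The endpoint, bucket, and key are hypothetical placeholders.
import json
from datetime import datetime

import boto3
import requests
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def api_to_s3():
    @task
    def extract() -> list[dict]:
        # Pull the latest records from the source API (hypothetical URL).
        resp = requests.get("https://api.example.com/v1/events", timeout=30)
        resp.raise_for_status()
        return resp.json()["results"]

    @task
    def load(records: list[dict]) -> None:
        # Land the raw records in S3 as newline-delimited JSON.
        body = "\n".join(json.dumps(r) for r in records)
        boto3.client("s3").put_object(
            Bucket="my-data-lake",         # hypothetical bucket
            Key="raw/events/events.json",  # a real DAG would partition by run date
            Body=body.encode("utf-8"),
        )

    load(extract())

api_to_s3()
```

In production such a DAG would typically key S3 paths by the logical run date and read credentials from Airflow connections rather than the default boto3 chain.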