Python Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Data Engineer, working remotely on a 12-month contract at a competitive pay rate. Key skills include strong Python and SQL proficiency, experience building data pipelines and consuming REST APIs, familiarity with cloud data warehouses such as Snowflake, and orchestration with Airflow.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 13, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
Remote
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Airflow #Data Engineering #Scala #Data Pipeline #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #BigQuery #Cloud #REST API #Data Warehouse #Databricks #REST (Representational State Transfer) #Python #Data Lake #Redshift #Data Quality #S3 (Amazon Simple Storage Service) #Snowflake
Role description
Position: Python Data Engineer
Location: Remote
Duration: 12 Months

Job Description:
The Enterprise Data Platform team is seeking a Senior Data Engineer with proven experience in designing and building data pipelines for cloud data warehouses and data lakes.

Responsibilities:
• Collaborate with cross-functional stakeholders to gather and understand data requirements.
• Design and develop robust data pipelines in Python to import, transform, and load data into Snowflake and data lake environments.
• Ensure data quality and integrity throughout the pipeline lifecycle.

Required:
• Strong proficiency in Python and SQL.
• Demonstrated experience building scalable data pipelines in Python.
• Hands-on experience extracting data from REST APIs and ingesting it into cloud data warehouses or data lakes.
• Experience working with Amazon S3 and building, scheduling, and maintaining DAGs in Airflow.
• Practical experience programmatically interacting with a major cloud data warehouse or data lake (e.g., Snowflake, Redshift, BigQuery, Databricks).
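For candidates sizing up the stack, a minimal sketch of the kind of pipeline described above might look like the following, assuming Airflow 2.x with the TaskFlow API, plus requests, boto3, and the Snowflake Python connector. The endpoint, bucket, stage, table, and credential names are hypothetical placeholders, not details from this role.

```python
# A minimal sketch (not the team's actual code): pull records from a REST API,
# stage them in S3, and COPY them into Snowflake on a daily Airflow schedule.
import json
from datetime import datetime

import boto3
import requests
import snowflake.connector
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def rest_api_to_snowflake():
    @task
    def extract_to_s3(ds=None) -> str:
        # Pull one day of records from a hypothetical REST endpoint.
        resp = requests.get(
            "https://api.example.com/orders", params={"date": ds}, timeout=30
        )
        resp.raise_for_status()
        key = f"raw/orders/{ds}.json"
        # Stage the raw payload in the (hypothetical) data lake bucket.
        boto3.client("s3").put_object(
            Bucket="example-data-lake", Key=key, Body=json.dumps(resp.json())
        )
        return key

    @task
    def load_to_snowflake(key: str) -> None:
        # COPY the staged file into Snowflake via a pre-created external stage.
        conn = snowflake.connector.connect(
            account="example_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
        )
        try:
            conn.cursor().execute(
                f"COPY INTO RAW.ORDERS FROM @lake_stage/{key} "
                "FILE_FORMAT = (TYPE = 'JSON')"
            )
        finally:
            conn.close()

    load_to_snowflake(extract_to_s3())


rest_api_to_snowflake()
```

In a production setup the S3 and Snowflake credentials would typically come from Airflow connections or a secrets backend rather than being hard-coded as they are in this sketch.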