

Recurring Decimal
Python Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Data Engineer; the contract length and pay rate are unspecified. Key skills include 6+ years in Data Engineering, 4+ years in Python and data pipeline design, and 2+ years in GCP implementations.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Phoenix, AZ
-
🧠 - Skills detailed
#Airflow #Data Pipeline #Dataflow #Cloud #Data Engineering #DevOps #Docker #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Kubernetes #Python #Public Cloud #Storage #Kafka (Apache Kafka) #Data Modeling #Documentation
Role description
Key Skills Required:
• 6+ years of experience in Data Engineering, with an emphasis on Data Warehousing and Data Analytics.
• 4+ years of experience with Python, including working knowledge of notebooks.
• 4+ years of experience designing and building scalable data pipelines that handle extraction, transformation, and loading (ETL).
• 4+ years of experience with one of the leading public clouds, including 2+ years with GCP.
• 2+ years of hands-on experience on GCP data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.); see the Airflow sketch after this list.
• 2+ years of experience with Kafka, Pub/Sub, Docker, and Kubernetes; see the Pub/Sub sketch after this list.
• 2+ years of architecture design and documentation experience.
• Troubleshoot and optimize data platform capabilities.
• Ability to work independently, solve problems, and keep stakeholders updated.
• Analyze, design, develop, and deploy solutions per business requirements.
• Strong understanding of relational and dimensional data modeling.
• Experience with DevOps and CI/CD technologies.
• Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.
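As a rough illustration of the Cloud Composer/Airflow work named above, here is a minimal sketch of a daily ETL pipeline: an Airflow DAG that loads newline-delimited JSON files from Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; every bucket, project, dataset, and table name is a hypothetical placeholder, not something taken from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

# Bucket, project, dataset, and table names are hypothetical placeholders.
with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Append the day's newline-delimited JSON files from Cloud Storage
    # into a BigQuery table, letting BigQuery infer the schema.
    load_events = GCSToBigQueryOperator(
        task_id="load_events_to_bq",
        bucket="example-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.analytics.events",
        source_format="NEWLINE_DELIMITED_JSON",
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```

In a Composer environment this file would simply be dropped into the DAGs bucket; the `{{ ds }}` template keeps each daily run scoped to that day's files.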
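For the streaming side (Kafka / Pub/Sub), a minimal Pub/Sub consumer in Python might look like the following sketch. It assumes the google-cloud-pubsub client library, and the project and subscription IDs are hypothetical placeholders.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Hypothetical project and subscription IDs -- not from the posting.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "events-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the payload, then ack so Pub/Sub stops redelivering it.
    print(f"Received: {message.data!r}")
    message.ack()


# Open a streaming pull; the client invokes callback on a background thread.
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=30)  # listen for 30 seconds
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # block until shutdown completes
```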