

Python Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Data Engineer with 6+ years in Data Engineering, 4+ years in Python and data pipeline design, and 2+ years in GCP. Requires experience with Kafka, Docker, Kubernetes, and strong communication skills. Contract length and pay rate are unspecified.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 11, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Phoenix, AZ
Skills detailed
#Data Modeling #Kafka (Apache Kafka) #Python #Dataflow #ETL (Extract, Transform, Load) #Kubernetes #Docker #GCP (Google Cloud Platform) #Cloud #Public Cloud #Data Pipeline #Storage #Documentation #Data Engineering #DevOps #Airflow
Role description
Key Skills Required:
• 6+ years of experience in Data Engineering, with an emphasis on Data Warehousing and Data Analytics.
• 4+ years of experience with Python, including working knowledge of notebooks.
• 4+ years of experience designing and building scalable data pipelines for extraction, transformation, and loading (ETL).
• 4+ years of experience with one of the leading public clouds, including 2+ years with GCP.
• 2+ years of hands-on experience on GCP data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
• 2+ years of experience with Kafka, Pub/Sub, Docker, and Kubernetes.
• 2+ years of architecture design and documentation experience.
• Ability to troubleshoot and optimize data platform capabilities.
• Ability to work independently, solve problems, and keep stakeholders updated.
• Ability to analyze, design, develop, and deploy solutions per business requirements.
• Strong understanding of relational and dimensional data modeling.
• Experience with DevOps and CI/CD technologies.
• Excellent written and verbal communication skills, including experience in technical documentation and the ability to communicate with senior business managers and executives.