
GCP Python Data Engineer with AI/ML Experience
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Python Data Engineer with AI/ML experience; the contract length and pay rate are not specified. Required skills include Python, GCP services, ETL/ELT, SQL, and experience with AI/ML model deployment.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 12, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Deployment #Python #Compliance #Storage #Data Governance #AI (Artificial Intelligence) #GCP (Google Cloud Platform) #Data Ingestion #Computer Science #Model Deployment #Data Storage #SQL (Structured Query Language) #TensorFlow #Microservices #Automation #Apache Beam #Docker #Data Processing #Airflow #Kafka (Apache Kafka) #GitLab #Dataflow #BigQuery #Datasets #Data Engineering #Jenkins #Data Science #ML (Machine Learning) #ETL (Extract, Transform, Load) #PyTorch #Scala #Data Pipeline #Security #Kubernetes #DevOps #Cloud
Role description
Job Title: GCP Python Data Engineer with AI/ML Experience
Job Summary
We are seeking a highly skilled Data Engineer with expertise in Google Cloud Platform (GCP), Python, and AI/ML pipelines to design, build, and optimize scalable data solutions. The ideal candidate will have hands-on experience with data ingestion, transformation, and orchestration, and with integrating AI/ML models into production systems.
Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines on GCP using Python, BigQuery, Dataflow, and other GCP services (a minimal pipeline sketch follows this list).
• Collaborate with Data Scientists to operationalize AI/ML models using Vertex AI, AI Platform, or similar tools.
• Implement best practices for data storage, processing, and governance.
• Develop APIs and microservices to expose AI/ML model predictions and data outputs.
• Optimize pipelines for performance, scalability, and cost efficiency.
• Work with DevOps teams to implement CI/CD for data and ML pipelines.
• Monitor and troubleshoot production data pipelines and model deployments.
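To give a concrete picture of the pipeline work described in the first responsibility, here is a minimal, hypothetical sketch of a batch ETL job built with Apache Beam and run on Dataflow. The project ID, bucket paths, table name, and field names are placeholders, not details taken from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Parse one JSON line, keeping only the fields the target table expects."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "amount": float(record["amount"])}

options = PipelineOptions(
    runner="DataflowRunner",   # use "DirectRunner" to test locally first
    project="my-project",      # placeholder project ID
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadRawEvents" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.json")
        | "ParseJson" >> beam.Map(parse_event)
        | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",   # placeholder dataset and table
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )

The same code runs unchanged under the DirectRunner, which is the usual way to validate a pipeline locally before submitting it to Dataflow.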
Required Skills & Qualifications
• 3–7 years of professional experience in Data Engineering.
• Strong proficiency in Python for data processing and automation.
• Solid hands-on experience with GCP services such as:
  • BigQuery
  • Cloud Storage
  • Dataflow / Apache Beam
  • Pub/Sub
  • Composer (Airflow)
  • Vertex AI / AI Platform
• Experience with AI/ML model deployment and serving in production (see the deployment sketch after this list).
• Strong knowledge of SQL and performance tuning for large datasets.
• Experience with Docker and Kubernetes.
• Familiarity with CI/CD tools (Cloud Build, Jenkins, GitLab CI).
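As a rough illustration of the model deployment and serving requirement, the sketch below registers a trained model artifact with Vertex AI and deploys it to a managed online endpoint via the google-cloud-aiplatform SDK. The display name, artifact URI, serving container image, and feature vector are all assumed placeholder values.

from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # placeholders

# Register a saved model (e.g. scikit-learn) with the Vertex AI Model Registry.
model = aiplatform.Model.upload(
    display_name="churn-model",
    artifact_uri="gs://my-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy behind a managed endpoint, then request an online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")
prediction = endpoint.predict(instances=[[0.2, 5, 1, 130.0]])
print(prediction.predictions)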
Preferred Skills
• Experience with TensorFlow, PyTorch, or scikit-learn.
• Knowledge of data governance, security, and compliance on cloud platforms.
• Exposure to MLOps best practices.
• Experience with streaming data processing (Kafka, Pub/Sub); a minimal consumer sketch follows below.
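For the streaming item above, here is a hypothetical sketch of a Pub/Sub consumer using the google-cloud-pubsub client library; the project and subscription names are placeholders.

from concurrent import futures

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "events-sub")  # placeholders

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would parse, enrich, and route the payload here.
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(subscription, callback=handle)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds in this sketch
except futures.TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # block until the streaming pull shuts down cleanly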
Education
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.