GCP AI Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP AI Data Engineer with 8+ years of data engineering experience, including 1+ year in Gen AI, ideally in insurance. Contract length is unspecified, with a pay rate of $70-80/hr. Key skills include GCP, Python, and ETL tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
640
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Public Cloud #Deployment #GCP (Google Cloud Platform) #Deep Learning #Spark (Apache Spark) #Snowflake #TensorFlow #DevOps #PyTorch #AWS (Amazon Web Services) #Compliance #ETL (Extract, Transform, Load) #ML (Machine Learning) #Cloud #SQL (Structured Query Language) #Azure #Python #NoSQL #Data Pipeline #Scala #Programming #AI (Artificial Intelligence) #Data Science #Data Engineering
Role description
REQUIRED SKILLS AND EXPERIENCE
• 8+ years of experience across data engineering, covering data solutions, SQL and NoSQL, Snowflake, ELT/ETL tools, CI/CD, big data, GCP, Python/Spark, data mesh, data lake, and data fabric
• 1+ year of hands-on experience supporting Gen AI initiatives in a data environment, ideally with an insurance background
• Experience building in a public cloud environment, ideally GCP, though AWS or Azure experience is acceptable
• Strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow

JOB DESCRIPTION
• Implement data pipelines that bring together structured, semi-structured, and unstructured data to support AI and agentic solutions, including pre-processing with extraction, chunking, embedding, and grounding strategies to get the data ready (a minimal sketch of this step appears below)
• Develop GCP-driven systems to improve data capabilities, ensuring compliance with industry best practices for insurance-specific data use cases and challenges
• Develop data domains and data products for various consumption archetypes, including reporting, data science, AI/ML, and analytics
• Collaborate closely with DevOps and infrastructure teams to ensure seamless deployment, operation, and maintenance of data systems
• Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the GCP environment

$70-80/hr, based on experience.
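For context on the first responsibility, the following is a minimal, illustrative Python sketch of the chunking-and-embedding pre-processing step. The `Chunk` record, the chunk sizes, and `embed_fn` are hypothetical stand-ins, not part of the role's stated stack; a real GCP pipeline would swap in a hosted embedding model (e.g., via Vertex AI) for the placeholder embedder.

```python
# Illustrative sketch: chunk a raw document and attach embeddings so the
# chunks are ready for grounding/retrieval. All names and sizes here are
# assumptions for demonstration, not the employer's actual pipeline.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Chunk:
    doc_id: str
    text: str
    embedding: List[float]


def chunk_text(text: str, max_chars: int = 1000, overlap: int = 200) -> List[str]:
    """Split a document into overlapping character windows so context
    survives chunk boundaries (window and overlap sizes are illustrative)."""
    chunks, start = [], 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # step back so adjacent chunks share context
    return chunks


def preprocess(doc_id: str, raw_text: str,
               embed_fn: Callable[[str], List[float]]) -> List[Chunk]:
    """Chunk a document and embed each chunk for downstream retrieval."""
    return [Chunk(doc_id, c, embed_fn(c)) for c in chunk_text(raw_text)]


if __name__ == "__main__":
    # Hypothetical embedder: a real pipeline would call a hosted model here.
    fake_embed = lambda s: [float(len(s)), float(sum(map(ord, s)) % 997)]
    for c in preprocess("policy-001", "Example insurance policy text... " * 50, fake_embed):
        print(c.doc_id, len(c.text), c.embedding)
```

The overlapping windows are a common design choice for Gen AI data prep: they keep sentences that straddle a boundary retrievable from at least one chunk, at the cost of some duplicated storage.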